datasetId | card
---|---
dhanyabahadur/ddpm-butterflies-128 | ---
license: mit
language:
- en
--- |
SolaireOfTheSun/SAPFICODATASET | ---
license: bigscience-openrail-m
---
|
thisisanshgupta/Pycode | ---
license: mit
---
|
open-llm-leaderboard/details_jondurbin__bagel-8x7b-v0.2 | ---
pretty_name: Evaluation run of jondurbin/bagel-8x7b-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/bagel-8x7b-v0.2](https://huggingface.co/jondurbin/bagel-8x7b-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__bagel-8x7b-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-06T04:05:05.899101](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__bagel-8x7b-v0.2/blob/main/results_2024-01-06T04-05-05.899101.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6937196740742246,\n\
\ \"acc_stderr\": 0.030405501341035,\n \"acc_norm\": 0.7063691103588217,\n\
\ \"acc_norm_stderr\": 0.031125133352099654,\n \"mc1\": 0.4320685434516524,\n\
\ \"mc1_stderr\": 0.01734120239498825,\n \"mc2\": 0.6003433287827963,\n\
\ \"mc2_stderr\": 0.015137869033462238\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6518771331058021,\n \"acc_stderr\": 0.013921008595179344,\n\
\ \"acc_norm\": 0.6825938566552902,\n \"acc_norm_stderr\": 0.013602239088038169\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6750647281418044,\n\
\ \"acc_stderr\": 0.00467393483715045,\n \"acc_norm\": 0.8631746664011153,\n\
\ \"acc_norm_stderr\": 0.003429605106216367\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.03167473383795719,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.03167473383795719\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6978723404255319,\n \"acc_stderr\": 0.030017554471880557,\n\
\ \"acc_norm\": 0.6978723404255319,\n \"acc_norm_stderr\": 0.030017554471880557\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n\
\ \"acc_stderr\": 0.045796394220704355,\n \"acc_norm\": 0.6140350877192983,\n\
\ \"acc_norm_stderr\": 0.045796394220704355\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.04043461861916747,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.04043461861916747\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47619047619047616,\n \"acc_stderr\": 0.025722097064388525,\n \"\
acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.025722097064388525\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8096774193548387,\n \"acc_stderr\": 0.02233170761182307,\n \"\
acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.02233170761182307\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6108374384236454,\n \"acc_stderr\": 0.03430462416103872,\n \"\
acc_norm\": 0.6108374384236454,\n \"acc_norm_stderr\": 0.03430462416103872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.029311188674983127,\n\
\ \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.029311188674983127\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942088,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942088\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n\
\ \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 0.01599622932024412\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6974358974358974,\n \"acc_stderr\": 0.023290888053772725,\n\
\ \"acc_norm\": 0.6974358974358974,\n \"acc_norm_stderr\": 0.023290888053772725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881564,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881564\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8109243697478992,\n \"acc_stderr\": 0.02543511943810536,\n \
\ \"acc_norm\": 0.8109243697478992,\n \"acc_norm_stderr\": 0.02543511943810536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.48344370860927155,\n \"acc_stderr\": 0.0408024418562897,\n \"\
acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.0408024418562897\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8788990825688073,\n \"acc_stderr\": 0.013987618292389713,\n \"\
acc_norm\": 0.8788990825688073,\n \"acc_norm_stderr\": 0.013987618292389713\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5925925925925926,\n \"acc_stderr\": 0.03350991604696044,\n \"\
acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.03350991604696044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884562,\n \
\ \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884562\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.757847533632287,\n\
\ \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.757847533632287,\n\
\ \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934725,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934725\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\
\ \"acc_stderr\": 0.046161430750285455,\n \"acc_norm\": 0.6160714285714286,\n\
\ \"acc_norm_stderr\": 0.046161430750285455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.019875655027867447,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.019875655027867447\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8735632183908046,\n\
\ \"acc_stderr\": 0.01188448890589555,\n \"acc_norm\": 0.8735632183908046,\n\
\ \"acc_norm_stderr\": 0.01188448890589555\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.022289638852617897,\n\
\ \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.022289638852617897\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40670391061452515,\n\
\ \"acc_stderr\": 0.016428811915898865,\n \"acc_norm\": 0.40670391061452515,\n\
\ \"acc_norm_stderr\": 0.016428811915898865\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.02417084087934086,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.02417084087934086\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8070739549839229,\n\
\ \"acc_stderr\": 0.022411516780911363,\n \"acc_norm\": 0.8070739549839229,\n\
\ \"acc_norm_stderr\": 0.022411516780911363\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8487654320987654,\n \"acc_stderr\": 0.019935086092149872,\n\
\ \"acc_norm\": 0.8487654320987654,\n \"acc_norm_stderr\": 0.019935086092149872\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291474,\n \
\ \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291474\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.529335071707953,\n\
\ \"acc_stderr\": 0.012748238397365552,\n \"acc_norm\": 0.529335071707953,\n\
\ \"acc_norm_stderr\": 0.012748238397365552\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7720588235294118,\n \"acc_stderr\": 0.025483081468029804,\n\
\ \"acc_norm\": 0.7720588235294118,\n \"acc_norm_stderr\": 0.025483081468029804\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.75,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644286,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644286\n },\n\
\ \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7714285714285715,\n\
\ \"acc_stderr\": 0.026882144922307744,\n \"acc_norm\": 0.7714285714285715,\n\
\ \"acc_norm_stderr\": 0.026882144922307744\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306042,\n\
\ \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306042\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n\
\ \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n\
\ \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.024103384202072867,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.024103384202072867\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.4320685434516524,\n \"mc1_stderr\": 0.01734120239498825,\n\
\ \"mc2\": 0.6003433287827963,\n \"mc2_stderr\": 0.015137869033462238\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.8129439621152328,\n\
\ \"acc_stderr\": 0.01095971643524291\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.04700530705079606,\n \"acc_stderr\": 0.005829898355937209\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/bagel-8x7b-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|arc:challenge|25_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|arc:challenge|25_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|gsm8k|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|gsm8k|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hellaswag|10_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hellaswag|10_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T04-02-43.736147.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T04-05-05.899101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T04-05-05.899101.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- '**/details_harness|winogrande|5_2024-01-06T04-02-43.736147.parquet'
- split: 2024_01_06T04_05_05.899101
path:
- '**/details_harness|winogrande|5_2024-01-06T04-05-05.899101.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-06T04-05-05.899101.parquet'
- config_name: results
data_files:
- split: 2024_01_06T04_02_43.736147
path:
- results_2024-01-06T04-02-43.736147.parquet
- split: 2024_01_06T04_05_05.899101
path:
- results_2024-01-06T04-05-05.899101.parquet
- split: latest
path:
- results_2024-01-06T04-05-05.899101.parquet
---
# Dataset Card for Evaluation run of jondurbin/bagel-8x7b-v0.2
Dataset automatically created during the evaluation run of model [jondurbin/bagel-8x7b-v0.2](https://huggingface.co/jondurbin/bagel-8x7b-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__bagel-8x7b-v0.2",
"harness_winogrande_5",
split="train")
```
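Because the timestamped split names use a fixed-width, ISO-like format, they sort lexicographically in chronological order, so the run that the `latest` split points to is simply the maximum of the split names. A minimal sketch using the two run timestamps from this card:

```python
# The two timestamped splits present in this dataset's configurations.
splits = ["2024_01_06T04_02_43.736147", "2024_01_06T04_05_05.899101"]

# Fixed-width timestamp names sort lexicographically == chronologically,
# so max() yields the run that the "latest" split aliases.
latest = max(splits)
print(latest)  # → 2024_01_06T04_05_05.899101
```

To load that specific run instead of the alias, pass the timestamped name as the `split` argument to `load_dataset`.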
## Latest results
These are the [latest results from run 2024-01-06T04:05:05.899101](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__bagel-8x7b-v0.2/blob/main/results_2024-01-06T04-05-05.899101.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6937196740742246,
"acc_stderr": 0.030405501341035,
"acc_norm": 0.7063691103588217,
"acc_norm_stderr": 0.031125133352099654,
"mc1": 0.4320685434516524,
"mc1_stderr": 0.01734120239498825,
"mc2": 0.6003433287827963,
"mc2_stderr": 0.015137869033462238
},
"harness|arc:challenge|25": {
"acc": 0.6518771331058021,
"acc_stderr": 0.013921008595179344,
"acc_norm": 0.6825938566552902,
"acc_norm_stderr": 0.013602239088038169
},
"harness|hellaswag|10": {
"acc": 0.6750647281418044,
"acc_stderr": 0.00467393483715045,
"acc_norm": 0.8631746664011153,
"acc_norm_stderr": 0.003429605106216367
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795719,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795719
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6978723404255319,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.6978723404255319,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.045796394220704355,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.045796394220704355
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.04043461861916747,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.04043461861916747
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.025722097064388525,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.025722097064388525
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.02233170761182307,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.02233170761182307
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6108374384236454,
"acc_stderr": 0.03430462416103872,
"acc_norm": 0.6108374384236454,
"acc_norm_stderr": 0.03430462416103872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.029311188674983127,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.029311188674983127
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.023253157951942088,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.023253157951942088
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6974358974358974,
"acc_stderr": 0.023290888053772725,
"acc_norm": 0.6974358974358974,
"acc_norm_stderr": 0.023290888053772725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881564,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881564
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8109243697478992,
"acc_stderr": 0.02543511943810536,
"acc_norm": 0.8109243697478992,
"acc_norm_stderr": 0.02543511943810536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.0408024418562897,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.0408024418562897
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8788990825688073,
"acc_stderr": 0.013987618292389713,
"acc_norm": 0.8788990825688073,
"acc_norm_stderr": 0.013987618292389713
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.03350991604696044,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.03350991604696044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884562,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884562
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.757847533632287,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.757847533632287,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934725,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934725
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.046161430750285455,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.046161430750285455
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867447,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867447
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8735632183908046,
"acc_stderr": 0.01188448890589555,
"acc_norm": 0.8735632183908046,
"acc_norm_stderr": 0.01188448890589555
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7803468208092486,
"acc_stderr": 0.022289638852617897,
"acc_norm": 0.7803468208092486,
"acc_norm_stderr": 0.022289638852617897
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40670391061452515,
"acc_stderr": 0.016428811915898865,
"acc_norm": 0.40670391061452515,
"acc_norm_stderr": 0.016428811915898865
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.02417084087934086,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.02417084087934086
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8070739549839229,
"acc_stderr": 0.022411516780911363,
"acc_norm": 0.8070739549839229,
"acc_norm_stderr": 0.022411516780911363
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8487654320987654,
"acc_stderr": 0.019935086092149872,
"acc_norm": 0.8487654320987654,
"acc_norm_stderr": 0.019935086092149872
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5460992907801419,
"acc_stderr": 0.029700453247291474,
"acc_norm": 0.5460992907801419,
"acc_norm_stderr": 0.029700453247291474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.529335071707953,
"acc_stderr": 0.012748238397365552,
"acc_norm": 0.529335071707953,
"acc_norm_stderr": 0.012748238397365552
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7720588235294118,
"acc_stderr": 0.025483081468029804,
"acc_norm": 0.7720588235294118,
"acc_norm_stderr": 0.025483081468029804
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.75,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.75,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7714285714285715,
"acc_stderr": 0.026882144922307744,
"acc_norm": 0.7714285714285715,
"acc_norm_stderr": 0.026882144922307744
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306042,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072867,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072867
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4320685434516524,
"mc1_stderr": 0.01734120239498825,
"mc2": 0.6003433287827963,
"mc2_stderr": 0.015137869033462238
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.01095971643524291
},
"harness|gsm8k|5": {
"acc": 0.04700530705079606,
"acc_stderr": 0.005829898355937209
}
}
```
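As a sketch (not part of the official evaluation tooling), the per-task metrics in a results block like the one above can be aggregated directly from the raw JSON, e.g. to rank the MMLU (`hendrycksTest`) subtasks by accuracy. The `results_json` excerpt below is a small hypothetical sample in the same shape:

```python
import json

# Hypothetical excerpt in the same shape as the results JSON above.
results_json = """
{
  "harness|hendrycksTest-computer_security|5": {"acc": 0.75, "acc_stderr": 0.0435},
  "harness|hendrycksTest-virology|5": {"acc": 0.5301204819277109, "acc_stderr": 0.0389},
  "harness|gsm8k|5": {"acc": 0.04700530705079606, "acc_stderr": 0.0058}
}
"""

results = json.loads(results_json)

# Keep only the MMLU (hendrycksTest) subtasks, mapping subtask name -> accuracy.
mmlu = {
    name.split("-", 1)[1].split("|")[0]: metrics["acc"]
    for name, metrics in results.items()
    if name.startswith("harness|hendrycksTest-")
}

# Print subtasks from highest to lowest accuracy.
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task}: {acc:.3f}")
```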
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
distilled-one-sec-cv12-each-chunk-uniq/chunk_194 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1016130868.0
num_examples: 197999
download_size: 1031967918
dataset_size: 1016130868.0
---
# Dataset Card for "chunk_194"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
barto17/gtzan_all_preprocessed_kaggle_version | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': blues
'1': classical
'2': country
'3': disco
'4': hiphop
'5': jazz
'6': metal
'7': pop
'8': reggae
'9': rock
- name: input_values
sequence: float32
- name: attention_mask
sequence: int32
splits:
- name: train
num_bytes: 3452159816
num_examples: 899
- name: test
num_bytes: 384000696
num_examples: 100
download_size: 1923103931
dataset_size: 3836160512
---
# Dataset Card for "gtzan_all_preprocessed_kaggle_version"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
odunola/french-audio-preprocessed | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: english_transcript
dtype: string
- name: labels
sequence: int64
- name: input_features
sequence:
sequence: float32
splits:
- name: train
num_bytes: 12478074884.75
num_examples: 11386
download_size: 3441305010
dataset_size: 12478074884.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
benchang1110/humantw | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: article
dtype: string
splits:
- name: train
num_bytes: 427517649
num_examples: 86860
download_size: 298703936
dataset_size: 427517649
---
# Dataset Card for "humantw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_73 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1308822220
num_examples: 257035
download_size: 1334313748
dataset_size: 1308822220
---
# Dataset Card for "chunk_73"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
facebook/tgve_plus | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: task_name
dtype: string
- name: input_caption
dtype: string
- name: output_caption
dtype: string
- name: instruction
dtype: string
- name: video_path
dtype: string
splits:
- name: train
num_bytes: 287914
num_examples: 1418
download_size: 115426
dataset_size: 287914
---
# Dataset Card for the TGVE+ Test Set
## Dataset Description
- **Homepage: https://fdd-video-edit.github.io/**
- **Paper: https://arxiv.org/abs/2403.09334**
### Dataset Summary
We extend the widely used Text Guided Video Editing (TGVE) benchmark with additional editing tasks. The dataset now comprises seven editing tasks in total:
four from the original TGVE and three new tasks, namely (i) object removal (Remove), (ii) object addition (Add), and
(iii) texture alterations (Texture). The new tasks utilize the same 76 videos from the original TGVE benchmark.
Each row in the dataset consists of the instruction, input/output captions, and the relative path of the video in [TGVE](https://drive.google.com/file/d/1D7ZVm66IwlKhS6UINoDgFiFJp_mLIQ0W/view).
For more details please see our [paper](https://arxiv.org/abs/2403.09334) and [project page](https://fdd-video-edit.github.io/).
We'd like to thank [InstructVid2Vid](https://github.com/amazon-science/instruct-video-to-video) for creating instructions for the original TGVE tasks.
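Since each row stores only the video's relative path, resolving it against a local copy of the TGVE video archive is left to the user. A minimal sketch follows; the sample row and the root directory are hypothetical placeholders (real rows come from `load_dataset("facebook/tgve_plus")`):

```python
from pathlib import Path

# from datasets import load_dataset
# ds = load_dataset("facebook/tgve_plus", split="train")  # requires network access

# Hypothetical row, in the shape described above.
row = {
    "idx": 0,
    "task_name": "Remove",
    "input_caption": "a dog runs on the beach",
    "output_caption": "the beach, empty",
    "instruction": "remove the dog from the video",
    "video_path": "videos/dog_beach.mp4",  # relative to the TGVE download
}

# Join the relative video_path with wherever the TGVE archive was extracted.
tgve_root = Path("/data/tgve")
video_file = tgve_root / row["video_path"]
print(video_file)
```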
### Licensing Information
Licensed under the CC-BY-NC 4.0 license, available [here](https://creativecommons.org/licenses/by-nc/4.0/legalcode?fbclid=IwAR2SYZjLRywwUMblkWg0LyAxHVVTloIFlvC-ju3BthIYtOM2jpQHgbeXOsM).
### Citation Information
```
@inproceedings{Singer2024VideoEV,
title={Video Editing via Factorized Diffusion Distillation},
author={Uriel Singer and Amit Zohar and Yuval Kirstain and Shelly Sheynin and Adam Polyak and Devi Parikh and Yaniv Taigman},
year={2024},
url={https://api.semanticscholar.org/CorpusID:268385300}
}
``` |
Eitanli/abstracts_cleaned | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: recall
dtype: int64
- name: article_title
dtype: string
- name: topic
dtype: string
- name: abstract
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 137515873.22056717
num_examples: 79863
- name: test
num_bytes: 17189699.389716417
num_examples: 9983
- name: valid
num_bytes: 17189699.389716417
num_examples: 9983
download_size: 92795013
dataset_size: 171895272.0
---
# Dataset Card for "abstracts_cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_chargoddard__servile-harpsichord-cdpo | ---
pretty_name: Evaluation run of chargoddard/servile-harpsichord-cdpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/servile-harpsichord-cdpo](https://huggingface.co/chargoddard/servile-harpsichord-cdpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__servile-harpsichord-cdpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-10T06:44:09.091422](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__servile-harpsichord-cdpo/blob/main/results_2023-12-10T06-44-09.091422.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6467821760747017,\n\
\ \"acc_stderr\": 0.032099406932013255,\n \"acc_norm\": 0.6493833410875584,\n\
\ \"acc_norm_stderr\": 0.032737739125074355,\n \"mc1\": 0.4369645042839657,\n\
\ \"mc1_stderr\": 0.017363844503195978,\n \"mc2\": 0.6061030127349698,\n\
\ \"mc2_stderr\": 0.015471882890395387\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6407849829351536,\n \"acc_stderr\": 0.014020224155839157,\n\
\ \"acc_norm\": 0.6732081911262798,\n \"acc_norm_stderr\": 0.013706665975587331\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6618203545110536,\n\
\ \"acc_stderr\": 0.004721231637092722,\n \"acc_norm\": 0.851822346146186,\n\
\ \"acc_norm_stderr\": 0.0035454991695580435\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880277,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880277\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634286,\n \
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634286\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8532110091743119,\n \"acc_stderr\": 0.015173141845126253,\n \"\
acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.015173141845126253\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546837,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546837\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.0165136760311796,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.0165136760311796\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45241199478487615,\n\
\ \"acc_stderr\": 0.012712265105889135,\n \"acc_norm\": 0.45241199478487615,\n\
\ \"acc_norm_stderr\": 0.012712265105889135\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \
\ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4369645042839657,\n\
\ \"mc1_stderr\": 0.017363844503195978,\n \"mc2\": 0.6061030127349698,\n\
\ \"mc2_stderr\": 0.015471882890395387\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987729\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5708870356330553,\n \
\ \"acc_stderr\": 0.013633369425647234\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/servile-harpsichord-cdpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|arc:challenge|25_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|gsm8k|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hellaswag|10_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T06-44-09.091422.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T06-44-09.091422.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- '**/details_harness|winogrande|5_2023-12-10T06-44-09.091422.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-10T06-44-09.091422.parquet'
- config_name: results
data_files:
- split: 2023_12_10T06_44_09.091422
path:
- results_2023-12-10T06-44-09.091422.parquet
- split: latest
path:
- results_2023-12-10T06-44-09.091422.parquet
---
# Dataset Card for Evaluation run of chargoddard/servile-harpsichord-cdpo
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/servile-harpsichord-cdpo
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/servile-harpsichord-cdpo](https://huggingface.co/chargoddard/servile-harpsichord-cdpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__servile-harpsichord-cdpo",
"harness_winogrande_5",
split="train")
```
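As the config list above shows, each config name is simply the harness task id from the results JSON (e.g. `harness|hendrycksTest-abstract_algebra|5`) with `|`, `:`, and `-` replaced by underscores. A small helper (illustrative only, not part of any official API) can derive the config name to pass to `load_dataset`:

```python
def harness_config_name(task: str) -> str:
    """Map a harness task id as it appears in the results JSON
    (e.g. 'harness|hendrycksTest-abstract_algebra|5') to the
    corresponding dataset config name."""
    for ch in "|:-":
        task = task.replace(ch, "_")
    return task

# e.g. harness_config_name("harness|truthfulqa:mc|0") -> "harness_truthfulqa_mc_0"
```

This mirrors the mapping visible in the `configs:` section of the card metadata, so the same helper works for any of the 63 per-task configurations.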
## Latest results
These are the [latest results from run 2023-12-10T06:44:09.091422](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__servile-harpsichord-cdpo/blob/main/results_2023-12-10T06-44-09.091422.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6467821760747017,
"acc_stderr": 0.032099406932013255,
"acc_norm": 0.6493833410875584,
"acc_norm_stderr": 0.032737739125074355,
"mc1": 0.4369645042839657,
"mc1_stderr": 0.017363844503195978,
"mc2": 0.6061030127349698,
"mc2_stderr": 0.015471882890395387
},
"harness|arc:challenge|25": {
"acc": 0.6407849829351536,
"acc_stderr": 0.014020224155839157,
"acc_norm": 0.6732081911262798,
"acc_norm_stderr": 0.013706665975587331
},
"harness|hellaswag|10": {
"acc": 0.6618203545110536,
"acc_stderr": 0.004721231637092722,
"acc_norm": 0.851822346146186,
"acc_norm_stderr": 0.0035454991695580435
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880277,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880277
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634286,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.015173141845126253,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.015173141845126253
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029195,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029195
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546837,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546837
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.0165136760311796,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.0165136760311796
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45241199478487615,
"acc_stderr": 0.012712265105889135,
"acc_norm": 0.45241199478487615,
"acc_norm_stderr": 0.012712265105889135
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4369645042839657,
"mc1_stderr": 0.017363844503195978,
"mc2": 0.6061030127349698,
"mc2_stderr": 0.015471882890395387
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987729
},
"harness|gsm8k|5": {
"acc": 0.5708870356330553,
"acc_stderr": 0.013633369425647234
}
}
```
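As a quick illustration (not part of the leaderboard's own tooling), the aggregate and per-task entries of a results dict shaped like the JSON above can be summarized in a few lines of Python; the task names and values below are copied from that JSON:

```python
# Sketch: summarize a leaderboard results dict; values copied from the JSON above.
results = {
    "all": {"acc": 0.6467821760747017, "acc_norm": 0.6493833410875584},
    "harness|arc:challenge|25": {"acc": 0.6407849829351536},
    "harness|hellaswag|10": {"acc": 0.6618203545110536},
    "harness|gsm8k|5": {"acc": 0.5708870356330553},
}

# Per-task accuracies, excluding the aggregate "all" entry
task_accs = {name: m["acc"] for name, m in results.items() if name != "all"}
best_task = max(task_accs, key=task_accs.get)

print(best_task)                        # highest-accuracy task shown
print(round(results["all"]["acc"], 3))  # aggregate accuracy
```

The same pattern works on the full dict loaded from the results JSON file linked above.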
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ravel365artur/Treinar-voz | ---
license: openrail
---
|
Maverick17/cira_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: eval
path: data/eval-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 778205
num_examples: 1556
- name: test
num_bytes: 100751
num_examples: 196
- name: eval
num_bytes: 95330
num_examples: 194
download_size: 294020
dataset_size: 974286
---
# Dataset Card for "cira_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huggingartists/the-grateful-dead | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/the-grateful-dead"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 2.732505 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/18f21c424e2f02f0c9a59c15bac56406.736x736x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/the-grateful-dead">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Grateful Dead</div>
<a href="https://genius.com/artists/the-grateful-dead">
<div style="text-align: center; font-size: 14px;">@the-grateful-dead</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/the-grateful-dead).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/the-grateful-dead")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|2266| -| -|
The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/the-grateful-dead")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(
    datasets['train']['text'],
    [int(len(datasets['train']['text']) * train_percentage),
     int(len(datasets['train']['text']) * (train_percentage + validation_percentage))],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
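As a sanity check on the split arithmetic, here is the same `np.split` pattern applied to a toy 100-item list standing in for `datasets['train']['text']`; with the 0.9/0.07/0.03 percentages, the split points land at indices 90 and 97:

```python
import numpy as np

# Toy stand-in for datasets['train']['text']
texts = np.array([f"line {i}" for i in range(100)])

train_percentage = 0.9
validation_percentage = 0.07

# Same index computation as in the snippet above: split points at 90% and 97%
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)
print(len(train), len(validation), len(test))  # 90 7 3
```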
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[![Follow](https://img.shields.io/github/followers/AlekseyKorshuk?style=social)](https://github.com/AlekseyKorshuk)
[![Follow](https://img.shields.io/twitter/follow/alekseykorshuk?style=social)](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[![Follow](https://img.shields.io/badge/dynamic/json?color=blue&label=Telegram%20Channel&query=%24.result&url=https%3A%2F%2Fapi.telegram.org%2Fbot1929545866%3AAAFGhV-KKnegEcLiyYJxsc4zV6C-bdPEBtQ%2FgetChatMemberCount%3Fchat_id%3D-1001253621662&style=social&logo=telegram)](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[![GitHub stars](https://img.shields.io/github/stars/AlekseyKorshuk/huggingartists?style=social)](https://github.com/AlekseyKorshuk/huggingartists)
|
mshubhr/arakoo | ---
license: llama2
---
|
NeuralNovel/Neural-Story-v1 | ---
license: apache-2.0
---
# Neural-Story-v1 Dataset
## Overview
The **Neural-Story-v1** dataset is a curated collection of short stories featuring a rich variety of genres and plot settings. Carefully assembled by NeuralNovel, this dataset aims to serve as a valuable resource for testing and fine-tuning small language models using LoRa.
## Data Source
The dataset content is a result of a combination of automated generation by Mixtral 8x7b and manual refinement.
## Purpose
Designed specifically for testing purposes, the dataset facilitates the precise fine-tuning of small language models. The primary objective is to enhance genre variety and elevate creativity and nuance in writing.
## Curation Rationale
This dataset is curated with a deliberate focus on providing a diverse mix of genres. The intention is to inspire and encourage more varied and creative writing outputs.
## Recommendations
While the Neural-Story-v1 dataset serves as an excellent starting point for testing language models, users are advised to exercise caution, as there might be some inherent genre or writing bias.
|
nomic-ai/nomic-bert-2048-pretraining-data | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 38003435808
num_examples: 2647954
download_size: 10083076260
dataset_size: 38003435808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "bert-pretokenized-2048-wiki-2023"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PJMixers/limarp-perscengen-converted-combined | ---
language:
- en
tags:
- not-for-all-audiences
source_datasets: lemonilia/LimaRP
---
The order is reversed so that you give the model a *blind* two-person dialogue and it then spits out the names, character descriptions, and a scenario summary.
I intend to use this to make the bluemoon set usable; I'll add conversion scripts for everything later.
*Note: Many samples contain sus content. Be aware of this before using.* |
liuyanchen1015/MULTI_VALUE_cola_no_preverbal_negator | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 3979
num_examples: 50
- name: test
num_bytes: 3561
num_examples: 45
- name: train
num_bytes: 15201
num_examples: 204
download_size: 16387
dataset_size: 22741
---
# Dataset Card for "MULTI_VALUE_cola_no_preverbal_negator"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-conll2003-conll2003-fb14e9-48103145236 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- conll2003
eval_info:
task: entity_extraction
model: alvarobartt/distilbert-base-cased-ner
metrics: []
dataset_name: conll2003
dataset_config: conll2003
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: alvarobartt/distilbert-base-cased-ner
* Dataset: conll2003
* Config: conll2003
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@alvarobartt](https://huggingface.co/alvarobartt) for evaluating this model. |
hanifabdlh/Setfit-Sample-Dataset | ---
dataset_info:
features:
- name: sample_text
dtype: string
- name: label
dtype:
class_label:
names:
'0': affirm
'1': bot_challenge
'2': deny
'3': goodbye
'4': greet
'5': mood_great
'6': mood_unhappy
splits:
- name: train
num_bytes: 1674
num_examples: 68
download_size: 2301
dataset_size: 1674
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jxm/scidocs__gtr_base__dpr | ---
dataset_info:
features:
- name: text
dtype: string
- name: embeddings_A
sequence: float32
- name: embeddings_B
sequence: float32
splits:
- name: train
num_bytes: 187036558
num_examples: 25657
download_size: 206524641
dataset_size: 187036558
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yuan-sf63/word_label_0.5_96_D | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '31'
dtype: int64
- name: '32'
dtype: int64
- name: '33'
dtype: int64
- name: '34'
dtype: int64
- name: '35'
dtype: int64
- name: '36'
dtype: int64
- name: '37'
dtype: int64
- name: '38'
dtype: int64
- name: '39'
dtype: int64
- name: '40'
dtype: int64
- name: '41'
dtype: int64
- name: '42'
dtype: int64
- name: '43'
dtype: int64
- name: '44'
dtype: int64
- name: '45'
dtype: int64
- name: '46'
dtype: int64
- name: '47'
dtype: int64
- name: '48'
dtype: int64
- name: '49'
dtype: int64
- name: '50'
dtype: int64
- name: '51'
dtype: int64
- name: '52'
dtype: int64
- name: '53'
dtype: int64
- name: '54'
dtype: int64
- name: '55'
dtype: int64
- name: '56'
dtype: int64
- name: '57'
dtype: int64
- name: '58'
dtype: int64
- name: '59'
dtype: int64
- name: '60'
dtype: int64
- name: '61'
dtype: int64
- name: '62'
dtype: int64
- name: '63'
dtype: int64
- name: '64'
dtype: int64
- name: '65'
dtype: int64
- name: '66'
dtype: int64
- name: '67'
dtype: int64
- name: '68'
dtype: int64
- name: '69'
dtype: int64
- name: '70'
dtype: int64
- name: '71'
dtype: int64
- name: '72'
dtype: int64
- name: '73'
dtype: int64
- name: '74'
dtype: int64
- name: '75'
dtype: int64
- name: '76'
dtype: int64
- name: '77'
dtype: int64
- name: '78'
dtype: int64
- name: '79'
dtype: int64
- name: '80'
dtype: int64
- name: '81'
dtype: int64
- name: '82'
dtype: int64
- name: '83'
dtype: int64
- name: '84'
dtype: int64
- name: '85'
dtype: int64
- name: '86'
dtype: int64
- name: '87'
dtype: int64
- name: '88'
dtype: int64
- name: '89'
dtype: int64
- name: '90'
dtype: int64
- name: '91'
dtype: int64
- name: '92'
dtype: int64
- name: '93'
dtype: int64
- name: '94'
dtype: int64
- name: '95'
dtype: int64
splits:
- name: train
num_bytes: 63418468.38393638
num_examples: 71983
- name: validation
num_bytes: 7047279.616063614
num_examples: 7999
download_size: 9813776
dataset_size: 70465748.0
---
# Dataset Card for "word_label_0.5_96_D"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/passionlip_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of passionlip/パッションリップ/Passionlip (Fate/Grand Order)
This is the dataset of passionlip/パッションリップ/Passionlip (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `long_hair, purple_hair, ribbon, hair_ribbon, breasts, very_long_hair, huge_breasts, pink_eyes, purple_eyes, pink_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 764.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/passionlip_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 660.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/passionlip_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1170 | 1.18 GiB | [Download](https://huggingface.co/datasets/CyberHarem/passionlip_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/passionlip_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, claws, looking_at_viewer, o-ring_top, solo, belt_collar, blush, open_mouth, pantyhose |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, belt_collar, claw_(weapon), claws, looking_at_viewer, o-ring_top, parted_lips, solo, sideboob, purple_ribbon |
| 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bare_shoulders, belt_collar, claw_(weapon), claws, o-ring_top, sideboob, solo, white_thighhighs, looking_at_viewer, blush, smile, covered_navel, thighs |
| 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, collar, o-ring_top, simple_background, solo, upper_body, white_background, blush, claws, cleavage, looking_at_viewer, open_mouth, smile |
| 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | black_coat, large_breasts, long_sleeves, neck_ribbon, red_ribbon, smile, white_gloves, 1girl, blush, high-waist_skirt, open_coat, open_mouth, popped_collar, white_leotard, wide_sleeves, black_skirt, closed_eyes, looking_at_viewer, multiple_girls, solo, wand |
| 5 | 33 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | long_sleeves, looking_at_viewer, blue_eyes, 1girl, solo, blue_ribbon, smile, blush, armored_boots, sleeves_past_fingers, navel |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | claws | looking_at_viewer | o-ring_top | solo | belt_collar | blush | open_mouth | pantyhose | claw_(weapon) | parted_lips | sideboob | purple_ribbon | white_thighhighs | smile | covered_navel | thighs | collar | simple_background | upper_body | white_background | cleavage | black_coat | large_breasts | long_sleeves | neck_ribbon | red_ribbon | white_gloves | high-waist_skirt | open_coat | popped_collar | white_leotard | wide_sleeves | black_skirt | closed_eyes | multiple_girls | wand | blue_eyes | blue_ribbon | armored_boots | sleeves_past_fingers | navel |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:--------------------|:-------------|:-------|:--------------|:--------|:-------------|:------------|:----------------|:--------------|:-----------|:----------------|:-------------------|:--------|:----------------|:---------|:---------|:--------------------|:-------------|:-------------------|:-----------|:-------------|:----------------|:---------------|:--------------|:-------------|:---------------|:-------------------|:------------|:----------------|:----------------|:---------------|:--------------|:--------------|:-----------------|:-------|:------------|:--------------|:----------------|:-----------------------|:--------|
| 0 | 16 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | X | X | | | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | X | | X | X | | | | | | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | | X | | X | X | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 5 | 33 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | X | | X | | X | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X |
|
gorar/A-MNIST | ---
license: mit
task_categories:
- image-classification
size_categories:
- 100K<n<1M
--- |
lsh35/test | ---
license: llama2
---
|
ftopal/huggingface-models-embeddings | ---
dataset_info:
features:
- name: sha
dtype: 'null'
- name: last_modified
dtype: 'null'
- name: library_name
dtype: string
- name: text
dtype: string
- name: metadata
dtype: string
- name: pipeline_tag
dtype: string
- name: id
dtype: string
- name: tags
sequence: string
- name: created_at
dtype: string
- name: arxiv
sequence: string
- name: languages
sequence: string
- name: tags_str
dtype: string
- name: text_str
dtype: string
- name: text_lists
sequence: string
- name: processed_texts
sequence: string
- name: tokens_length
sequence: int64
- name: input_texts
sequence: string
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 2528620129
num_examples: 240530
download_size: 1308575820
dataset_size: 2528620129
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Tugay/nsf_title_qa | ---
dataset_info:
features:
- name: id
dtype: 'null'
- name: question
dtype: 'null'
- name: answer
dtype: 'null'
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 904
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ofis_publik | ---
annotations_creators:
- found
language_creators:
- found
language:
- br
- fr
license:
- unknown
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- translation
task_ids: []
paperswithcode_id: null
pretty_name: OfisPublik
dataset_info:
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- br
- fr
config_name: br-fr
splits:
- name: train
num_bytes: 12256825
num_examples: 63422
download_size: 3856983
dataset_size: 12256825
---
# Dataset Card for OfisPublik
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://opus.nlpl.eu/OfisPublik.php
- **Repository:** None
- **Paper:** http://www.lrec-conf.org/proceedings/lrec2012/pdf/463_Paper.pdf
- **Leaderboard:** [More Information Needed]
- **Point of Contact:** [More Information Needed]
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@abhishekkrthakur](https://github.com/abhishekkrthakur) for adding this dataset. |
Circularmachines/batch_indexing_machine_230529_004 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 163377382.0
num_examples: 720
download_size: 163389369
dataset_size: 163377382.0
---
# Dataset Card for "batch_indexing_machine_230529_004"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
waboucay/wikilarge | ---
language:
- en
task_categories:
- text2text-generation
---
# WikiLarge
<!-- Provide a quick summary of the dataset. -->
Hugging Face version of the WikiLarge corpus for sentence simplification, compiled by Xingxing Zhang and Mirella Lapata.
/!\ I am not one of the creators of the dataset; I just needed an HF version of this dataset and uploaded it. I encourage you to read the paper introducing the dataset: [Sentence Simplification with Deep Reinforcement Learning](https://aclanthology.org/D17-1062) (Zhang & Lapata, EMNLP 2017)
<!-- ## Dataset Details
### Dataset Description -->
<!-- Provide a longer summary of what this dataset is. -->
<!-- - **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional] -->
<!-- Provide the basic links for the dataset. -->
<!-- - **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed] -->
## Uses
This dataset can be used to train sentence simplification models.
<!-- ### Direct Use -->
<!-- This section describes suitable use cases for the dataset. -->
<!-- [More Information Needed]
### Out-of-Scope Use -->
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
<!-- [More Information Needed] -->
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
- **Size of the generated dataset:** 69.3 MB
An example of 'train' looks as follows.
```
{
'complex': 'Sensing of both the external and internal environments at the cellular level relies on signal transduction . Many disease processes , such as diabetes , heart disease , autoimmunity , and cancer arise from defects in signal transduction pathways , further highlighting the critical importance of signal transduction to biology , as well as medicine .',
'simple': 'A signal transduction in biology , is a cellular mechanism .'
}
```
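For training, each record can be flattened into a (source, target) pair. A minimal sketch, assuming only the `complex`/`simple` field names shown in the example above (the record below is abbreviated for illustration):

```python
# Sketch: turning WikiLarge-style records into (source, target) pairs
# for a sequence-to-sequence trainer. The field names "complex" and
# "simple" match the example record above; this row is abbreviated.
records = [
    {
        "complex": "Sensing of both the external and internal environments "
                   "at the cellular level relies on signal transduction .",
        "simple": "A signal transduction in biology , is a cellular mechanism .",
    },
]

def to_pairs(rows):
    """Map each record to a (complex_source, simple_target) tuple."""
    return [(row["complex"], row["simple"]) for row in rows]

pairs = to_pairs(records)
print(pairs[0][1])  # -> the simplified target sentence
```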
<!-- ## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
<!-- [More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
<!-- #### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
<!-- [More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
<!-- [More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
<!-- #### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
<!-- [More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
<!-- [More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
<!-- [More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
<!-- [More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
<!-- Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. -->
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```
@InProceedings{D17-1062,
  author    = "Zhang, Xingxing
               and Lapata, Mirella",
  title     = "Sentence Simplification with Deep Reinforcement Learning",
  booktitle = "Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing",
  year      = "2017",
  publisher = "Association for Computational Linguistics",
  pages     = "584--594",
  location  = "Copenhagen, Denmark",
  url       = "http://aclweb.org/anthology/D17-1062"
}
```
**ACL:**
Xingxing Zhang and Mirella Lapata. 2017. Sentence Simplification with Deep Reinforcement Learning. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 584–594, Copenhagen, Denmark. Association for Computational Linguistics.
<!-- ## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
<!-- [More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] --> |
thanhdath/legal_chat | ---
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
dataset_info:
features:
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 646516984
num_examples: 108780
- name: test_sft
num_bytes: 11923316
num_examples: 2000
download_size: 213534245
dataset_size: 658440300
---
# Dataset Card for "legal_chat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter6_1_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2772
num_examples: 9
download_size: 3664
dataset_size: 2772
---
# Dataset Card for "chapter6_1_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TrainingDataPro/email-spam-classification | ---
license: cc-by-nc-nd-4.0
task_categories:
- text-classification
language:
- en
tags:
- code
- legal
- finance
---
# Email Spam Classification
The dataset consists of a collection of emails categorized into two major classes: **spam** and **not spam**. It is designed to facilitate the development and evaluation of spam detection or email filtering systems.
**The spam emails** in the dataset are typically unsolicited and unwanted messages that aim to promote products or services, spread malware, or deceive recipients for various malicious purposes. These emails often contain misleading subject lines, excessive use of advertisements, unauthorized links, or attempts to collect personal information.
The **non-spam emails** in the dataset are genuine and legitimate messages sent by individuals or organizations. They may include personal or professional communication, newsletters, transaction receipts, or any other non-malicious content.
The dataset encompasses emails of varying *lengths, languages, and writing styles*, reflecting the inherent heterogeneity of email communication. This diversity aids in training algorithms that can generalize well to different types of emails, making them robust against different spammer tactics and variations in non-spam email content.
# Get the dataset
### This is just an example of the data
Leave a request on **[https://trainingdata.pro/data-market](https://trainingdata.pro/data-market/spambase?utm_source=huggingface&utm_medium=cpc&utm_campaign=email-spam-classification)** to discuss your requirements, learn about the price and buy the dataset.
### The dataset's possible applications:
- spam detection
- fraud detection
- email filtering systems
- customer support automation
- natural language processing
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6440e71f603214724eb96358/ehYVU_22FnzlFfxw-DHk7.png)
# File with the extension .csv
includes the following information:
- **title**: title of the email,
- **text**: text of the email,
- **type**: type of the email
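A minimal sketch of reading the layout above with the standard library. The two rows are invented for illustration; only the `title`/`text`/`type` column names come from the description:

```python
import csv
import io

# Sketch: parsing the .csv layout described above (title, text, type).
# The rows are invented for illustration; "type" holds the
# spam / not-spam label.
sample = io.StringIO(
    "title,text,type\n"
    "Win a prize!,Click here to claim your reward,spam\n"
    "Meeting notes,Attached are the notes from today,not spam\n"
)
rows = list(csv.DictReader(sample))
spam_rows = [row for row in rows if row["type"] == "spam"]
print(len(spam_rows))  # -> 1
```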
# Email spam might be collected in accordance with your requirements.
## **[TrainingData](https://trainingdata.pro/data-market/spambase?utm_source=huggingface&utm_medium=cpc&utm_campaign=email-spam-classification)** provides high-quality data annotation tailored to your needs |
AbdulMuqtadir/English_Urdu_Generated_Dataset | ---
license: apache-2.0
---
|
TheFinAI/flare-australian | ---
dataset_info:
features:
- name: id
dtype: int64
- name: query
dtype: string
- name: answer
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 434142
num_examples: 482
- name: valid
num_bytes: 62168
num_examples: 69
- name: test
num_bytes: 125227
num_examples: 139
download_size: 107361
dataset_size: 621537
---
# Dataset Card for "flare-australian"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ramitha/open-australian-legal-qa-results-k-test-cosine | ---
dataset_info:
features:
- name: index
dtype: 'null'
- name: normal_bert_pipeline_1_result
dtype: 'null'
- name: normal_bert_pipeline_2_context
dtype: 'null'
- name: normal_bert_pipeline_2_result
dtype: 'null'
- name: normal_bert_pipeline_2_case_indexes
sequence: int64
- name: normal_bert_pipeline_3_context
dtype: 'null'
- name: normal_bert_pipeline_3_result
dtype: 'null'
- name: normal_bert_pipeline_3_case_indexes
dtype: 'null'
- name: normal_bert_pipeline_4_context
dtype: 'null'
- name: normal_bert_pipeline_4_result
dtype: 'null'
- name: normal_bert_pipeline_4_case_indexes
sequence: int64
- name: normal_bert_pipeline_5_context
dtype: 'null'
- name: normal_bert_pipeline_5_result
dtype: 'null'
- name: normal_bert_pipeline_5_case_indexes
dtype: 'null'
- name: normal_bert_pipeline_6_context
dtype: 'null'
- name: normal_bert_pipeline_6_result
dtype: 'null'
- name: normal_bert_pipeline_6_case_indexes
sequence: int64
- name: normal_bert_pipeline_7_context
dtype: 'null'
- name: normal_bert_pipeline_7_result
dtype: 'null'
- name: normal_bert_pipeline_7_case_indexes
dtype: 'null'
- name: legal_bert_pipeline_1_result
dtype: 'null'
- name: legal_bert_pipeline_2_context
dtype: 'null'
- name: legal_bert_pipeline_2_result
dtype: 'null'
- name: legal_bert_pipeline_2_case_indexes
sequence: int64
- name: legal_bert_pipeline_3_context
dtype: 'null'
- name: legal_bert_pipeline_3_result
dtype: 'null'
- name: legal_bert_pipeline_3_case_indexes
dtype: 'null'
- name: legal_bert_pipeline_4_context
dtype: 'null'
- name: legal_bert_pipeline_4_result
dtype: 'null'
- name: legal_bert_pipeline_4_case_indexes
sequence: int64
- name: legal_bert_pipeline_5_context
dtype: 'null'
- name: legal_bert_pipeline_5_result
dtype: 'null'
- name: legal_bert_pipeline_5_case_indexes
dtype: 'null'
- name: legal_bert_pipeline_6_context
dtype: 'null'
- name: legal_bert_pipeline_6_result
dtype: 'null'
- name: legal_bert_pipeline_6_case_indexes
sequence: int64
- name: legal_bert_pipeline_7_context
dtype: 'null'
- name: legal_bert_pipeline_7_result
dtype: 'null'
- name: legal_bert_pipeline_7_case_indexes
dtype: 'null'
- name: angle_bert_pipeline_1_result
dtype: 'null'
- name: angle_bert_pipeline_2_context
dtype: 'null'
- name: angle_bert_pipeline_2_result
dtype: 'null'
- name: angle_bert_pipeline_2_case_indexes
sequence: int64
- name: angle_bert_pipeline_3_context
dtype: 'null'
- name: angle_bert_pipeline_3_result
dtype: 'null'
- name: angle_bert_pipeline_3_case_indexes
dtype: 'null'
- name: angle_bert_pipeline_4_context
dtype: 'null'
- name: angle_bert_pipeline_4_result
dtype: 'null'
- name: angle_bert_pipeline_4_case_indexes
sequence: int64
- name: angle_bert_pipeline_5_context
dtype: 'null'
- name: angle_bert_pipeline_5_result
dtype: 'null'
- name: angle_bert_pipeline_5_case_indexes
dtype: 'null'
- name: angle_bert_pipeline_6_context
dtype: 'null'
- name: angle_bert_pipeline_6_result
dtype: 'null'
- name: angle_bert_pipeline_6_case_indexes
sequence: int64
- name: angle_bert_pipeline_7_context
dtype: 'null'
- name: angle_bert_pipeline_7_result
dtype: 'null'
- name: angle_bert_pipeline_7_case_indexes
dtype: 'null'
- name: question
dtype: string
- name: answer
dtype: string
- name: original_texts
dtype: string
- name: question_normal_bert_matching_embeddings
dtype: string
- name: question_legal_bert_matching_embeddings
dtype: string
- name: question_angle_bert_matching_embeddings
dtype: string
- name: question_normal_bert_retrieval_embeddings
dtype: string
- name: question_legal_bert_retrieval_embeddings
dtype: string
- name: question_angle_bert_retrieval_embeddings
dtype: string
- name: answer_normal_bert_matching_embeddings
dtype: string
- name: answer_legal_bert_matching_embeddings
dtype: string
- name: answer_angle_bert_matching_embeddings
dtype: string
- name: answer_normal_bert_retrieval_embeddings
dtype: string
- name: answer_legal_bert_retrieval_embeddings
dtype: string
- name: answer_angle_bert_retrieval_embeddings
dtype: string
- name: case_index
dtype: float64
- name: normal_bert_pipeline_8_case_indexes
sequence: int64
- name: normal_bert_pipeline_10_case_indexes
sequence: int64
- name: normal_bert_pipeline_12_case_indexes
sequence: int64
- name: legal_bert_pipeline_8_case_indexes
sequence: int64
- name: legal_bert_pipeline_10_case_indexes
sequence: int64
- name: legal_bert_pipeline_12_case_indexes
sequence: int64
- name: angle_bert_pipeline_8_case_indexes
sequence: int64
- name: angle_bert_pipeline_10_case_indexes
sequence: int64
- name: angle_bert_pipeline_12_case_indexes
sequence: int64
splits:
- name: ktestcosine
num_bytes: 17409099
num_examples: 35
download_size: 7318031
dataset_size: 17409099
configs:
- config_name: default
data_files:
- split: ktestcosine
path: data/ktestcosine-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/6561e16e | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1313
dataset_size: 182
---
# Dataset Card for "6561e16e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sogeeking/vqvae_token | ---
dataset_info:
config_name: Burgers_Sols_Nu0.002
features:
- name: parameters
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: mean
sequence: float32
- name: std
sequence: float32
splits:
- name: train
num_bytes: 82800000
num_examples: 10000
download_size: 16914568
dataset_size: 82800000
configs:
- config_name: Burgers_Sols_Nu0.002
data_files:
- split: train
path: Burgers_Sols_Nu0.002/train-*
---
|
plaguss/oss-pref-test | ---
dataset_info:
features:
- name: input
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
list:
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_generation_responses
sequence: string
- name: generations
sequence: string
splits:
- name: train
num_bytes: 165537
num_examples: 10
download_size: 86598
dataset_size: 165537
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibivibiv/alpaca_tasksource2 | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 135759147
num_examples: 253970
download_size: 77133603
dataset_size: 135759147
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_79_1713043108 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2646608
num_examples: 6465
download_size: 1313470
dataset_size: 2646608
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_souvik0306__falcon_7b_3epoch_norobots | ---
pretty_name: Evaluation run of souvik0306/falcon_7b_3epoch_norobots
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [souvik0306/falcon_7b_3epoch_norobots](https://huggingface.co/souvik0306/falcon_7b_3epoch_norobots)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_souvik0306__falcon_7b_3epoch_norobots_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T18:17:00.996113](https://huggingface.co/datasets/open-llm-leaderboard/details_souvik0306__falcon_7b_3epoch_norobots_public/blob/main/results_2023-11-23T18-17-00.996113.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.30608343755546813,\n\
\ \"acc_stderr\": 0.032414744112033704,\n \"acc_norm\": 0.30836499703771436,\n\
\ \"acc_norm_stderr\": 0.03322598255455117,\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396731,\n \"mc2\": 0.36274944744996707,\n\
\ \"mc2_stderr\": 0.01351391478780607,\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.00041913301788269156,\n \"f1\": 0.051564597315436486,\n\
\ \"f1_stderr\": 0.0012887815427970884\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.44112627986348124,\n \"acc_stderr\": 0.014509747749064664,\n\
\ \"acc_norm\": 0.4761092150170648,\n \"acc_norm_stderr\": 0.014594701798071654\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5743875721967735,\n\
\ \"acc_stderr\": 0.0049342503908797785,\n \"acc_norm\": 0.7723561043616809,\n\
\ \"acc_norm_stderr\": 0.004184545675387351\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695248,\n\
\ \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695248\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3179190751445087,\n\
\ \"acc_stderr\": 0.03550683989165582,\n \"acc_norm\": 0.3179190751445087,\n\
\ \"acc_norm_stderr\": 0.03550683989165582\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179326,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179326\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.030135906478517563,\n\
\ \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.030135906478517563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3586206896551724,\n \"acc_stderr\": 0.03996629574876719,\n\
\ \"acc_norm\": 0.3586206896551724,\n \"acc_norm_stderr\": 0.03996629574876719\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525208,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525208\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.03395490020856113,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.03395490020856113\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2806451612903226,\n\
\ \"acc_stderr\": 0.0255606047210229,\n \"acc_norm\": 0.2806451612903226,\n\
\ \"acc_norm_stderr\": 0.0255606047210229\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.031947400722655415,\n\
\ \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.031947400722655415\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3383838383838384,\n \"acc_stderr\": 0.033711241426263014,\n \"\
acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.033711241426263014\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2849740932642487,\n \"acc_stderr\": 0.03257714077709661,\n\
\ \"acc_norm\": 0.2849740932642487,\n \"acc_norm_stderr\": 0.03257714077709661\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.32051282051282054,\n \"acc_stderr\": 0.02366129639396428,\n\
\ \"acc_norm\": 0.32051282051282054,\n \"acc_norm_stderr\": 0.02366129639396428\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.28073394495412846,\n \"acc_stderr\": 0.019266055045871613,\n \"\
acc_norm\": 0.28073394495412846,\n \"acc_norm_stderr\": 0.019266055045871613\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2361111111111111,\n \"acc_stderr\": 0.02896370257079102,\n \"\
acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.02896370257079102\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28431372549019607,\n \"acc_stderr\": 0.03166009679399812,\n \"\
acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.03166009679399812\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03068582059661079,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03068582059661079\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.32286995515695066,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.32286995515695066,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3140495867768595,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2912621359223301,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.2912621359223301,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n\
\ \"acc_stderr\": 0.029202540153431194,\n \"acc_norm\": 0.27350427350427353,\n\
\ \"acc_norm_stderr\": 0.029202540153431194\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.31545338441890164,\n\
\ \"acc_stderr\": 0.01661750173876339,\n \"acc_norm\": 0.31545338441890164,\n\
\ \"acc_norm_stderr\": 0.01661750173876339\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.34104046242774566,\n \"acc_stderr\": 0.025522474632121615,\n\
\ \"acc_norm\": 0.34104046242774566,\n \"acc_norm_stderr\": 0.025522474632121615\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\
\ \"acc_stderr\": 0.014378169884098447,\n \"acc_norm\": 0.2446927374301676,\n\
\ \"acc_norm_stderr\": 0.014378169884098447\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.02678745311190653,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.02678745311190653\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3311897106109325,\n\
\ \"acc_stderr\": 0.026730620728004913,\n \"acc_norm\": 0.3311897106109325,\n\
\ \"acc_norm_stderr\": 0.026730620728004913\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.33024691358024694,\n \"acc_stderr\": 0.026168298456732842,\n\
\ \"acc_norm\": 0.33024691358024694,\n \"acc_norm_stderr\": 0.026168298456732842\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26010430247718386,\n\
\ \"acc_stderr\": 0.011204382887823829,\n \"acc_norm\": 0.26010430247718386,\n\
\ \"acc_norm_stderr\": 0.011204382887823829\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3897058823529412,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.3897058823529412,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.272875816993464,\n \"acc_stderr\": 0.018020474148393577,\n \
\ \"acc_norm\": 0.272875816993464,\n \"acc_norm_stderr\": 0.018020474148393577\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.31840796019900497,\n\
\ \"acc_stderr\": 0.032941184790540944,\n \"acc_norm\": 0.31840796019900497,\n\
\ \"acc_norm_stderr\": 0.032941184790540944\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.03789134424611549,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.03789134424611549\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396731,\n \"mc2\": 0.36274944744996707,\n\
\ \"mc2_stderr\": 0.01351391478780607\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6953433307024467,\n \"acc_stderr\": 0.012935646499325307\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \
\ \"em_stderr\": 0.00041913301788269156,\n \"f1\": 0.051564597315436486,\n\
\ \"f1_stderr\": 0.0012887815427970884\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.015163002274450341,\n \"acc_stderr\": 0.0033660229497263386\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/bagel-8x7b-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|arc:challenge|25_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|drop|3_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|gsm8k|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hellaswag|10_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-17-00.996113.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T18-17-00.996113.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- '**/details_harness|winogrande|5_2023-11-23T18-17-00.996113.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T18-17-00.996113.parquet'
- config_name: results
data_files:
- split: 2023_11_23T18_17_00.996113
path:
- results_2023-11-23T18-17-00.996113.parquet
- split: latest
path:
- results_2023-11-23T18-17-00.996113.parquet
---
# Dataset Card for Evaluation run of souvik0306/falcon_7b_3epoch_norobots
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/souvik0306/falcon_7b_3epoch_norobots
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [souvik0306/falcon_7b_3epoch_norobots](https://huggingface.co/souvik0306/falcon_7b_3epoch_norobots) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_souvik0306__falcon_7b_3epoch_norobots_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-23T18:17:00.996113](https://huggingface.co/datasets/open-llm-leaderboard/details_souvik0306__falcon_7b_3epoch_norobots_public/blob/main/results_2023-11-23T18-17-00.996113.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.30608343755546813,
"acc_stderr": 0.032414744112033704,
"acc_norm": 0.30836499703771436,
"acc_norm_stderr": 0.03322598255455117,
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396731,
"mc2": 0.36274944744996707,
"mc2_stderr": 0.01351391478780607,
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788269156,
"f1": 0.051564597315436486,
"f1_stderr": 0.0012887815427970884
},
"harness|arc:challenge|25": {
"acc": 0.44112627986348124,
"acc_stderr": 0.014509747749064664,
"acc_norm": 0.4761092150170648,
"acc_norm_stderr": 0.014594701798071654
},
"harness|hellaswag|10": {
"acc": 0.5743875721967735,
"acc_stderr": 0.0049342503908797785,
"acc_norm": 0.7723561043616809,
"acc_norm_stderr": 0.004184545675387351
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.25,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2943396226415094,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.2943396226415094,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3179190751445087,
"acc_stderr": 0.03550683989165582,
"acc_norm": 0.3179190751445087,
"acc_norm_stderr": 0.03550683989165582
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179326,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179326
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3586206896551724,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.3586206896551724,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525208,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856113,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856113
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2806451612903226,
"acc_stderr": 0.0255606047210229,
"acc_norm": 0.2806451612903226,
"acc_norm_stderr": 0.0255606047210229
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.031947400722655415,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.031947400722655415
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3383838383838384,
"acc_stderr": 0.033711241426263014,
"acc_norm": 0.3383838383838384,
"acc_norm_stderr": 0.033711241426263014
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2849740932642487,
"acc_stderr": 0.03257714077709661,
"acc_norm": 0.2849740932642487,
"acc_norm_stderr": 0.03257714077709661
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32051282051282054,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.32051282051282054,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28073394495412846,
"acc_stderr": 0.019266055045871613,
"acc_norm": 0.28073394495412846,
"acc_norm_stderr": 0.019266055045871613
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.02896370257079102,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.02896370257079102
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.03166009679399812,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.03166009679399812
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03068582059661079,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03068582059661079
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.32286995515695066,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.32286995515695066,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3140495867768595,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.3140495867768595,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.2912621359223301,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.2912621359223301,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.27350427350427353,
"acc_stderr": 0.029202540153431194,
"acc_norm": 0.27350427350427353,
"acc_norm_stderr": 0.029202540153431194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.31545338441890164,
"acc_stderr": 0.01661750173876339,
"acc_norm": 0.31545338441890164,
"acc_norm_stderr": 0.01661750173876339
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.34104046242774566,
"acc_stderr": 0.025522474632121615,
"acc_norm": 0.34104046242774566,
"acc_norm_stderr": 0.025522474632121615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098447,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098447
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.02678745311190653,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.02678745311190653
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3311897106109325,
"acc_stderr": 0.026730620728004913,
"acc_norm": 0.3311897106109325,
"acc_norm_stderr": 0.026730620728004913
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.33024691358024694,
"acc_stderr": 0.026168298456732842,
"acc_norm": 0.33024691358024694,
"acc_norm_stderr": 0.026168298456732842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.025892151156709405,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.025892151156709405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26010430247718386,
"acc_stderr": 0.011204382887823829,
"acc_norm": 0.26010430247718386,
"acc_norm_stderr": 0.011204382887823829
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3897058823529412,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.3897058823529412,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.272875816993464,
"acc_stderr": 0.018020474148393577,
"acc_norm": 0.272875816993464,
"acc_norm_stderr": 0.018020474148393577
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.31840796019900497,
"acc_stderr": 0.032941184790540944,
"acc_norm": 0.31840796019900497,
"acc_norm_stderr": 0.032941184790540944
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.03789134424611549,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.03789134424611549
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396731,
"mc2": 0.36274944744996707,
"mc2_stderr": 0.01351391478780607
},
"harness|winogrande|5": {
"acc": 0.6953433307024467,
"acc_stderr": 0.012935646499325307
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788269156,
"f1": 0.051564597315436486,
"f1_stderr": 0.0012887815427970884
},
"harness|gsm8k|5": {
"acc": 0.015163002274450341,
"acc_stderr": 0.0033660229497263386
}
}
```
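The per-task entries in the results dict above can be aggregated by hand. A minimal sketch (using a stand-in slice of the results, not the full dict above) of averaging the MMLU (hendrycksTest) sub-task accuracies:

```python
# Stand-in slice of the results dict; the real file has one entry
# per hendrycksTest sub-task plus the other benchmarks.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.28888888888888886},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.25},
    "harness|winogrande|5": {"acc": 0.6953433307024467},
}

# Average accuracy over the MMLU (hendrycksTest) sub-tasks only,
# identified by their "harness|hendrycksTest-" key prefix.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
```

Note that this is an unweighted mean over sub-tasks; the leaderboard's own aggregation may weight tasks differently.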
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Yama/augmath | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: answer
dtype: string
- name: question
dtype: string
- name: id
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 8673014
num_examples: 28386
download_size: 833425
dataset_size: 8673014
---
# Dataset Card for "augmath"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713049791 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 12319
num_examples: 27
download_size: 9194
dataset_size: 12319
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713049791"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mHossain/buet_model_buet_test_data_paraphrase_detection | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 7576560.9
num_examples: 36000
- name: test
num_bytes: 841840.1
num_examples: 4000
download_size: 3715813
dataset_size: 8418401.0
---
# Dataset Card for "buet_model_buet_test_data_paraphrase_detection"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Juanid14317/UrduSentimentAnalysis | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 22541332.2496945
num_examples: 13993
- name: test
num_bytes: 1187233.750305499
num_examples: 737
download_size: 11767554
dataset_size: 23728566.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Atipico1/trivia-top5_preprocessed_with_o-u_case | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: masked_query
dtype: string
- name: original_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: question
dtype: string
- name: unans_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 107507400
num_examples: 10000
- name: test
num_bytes: 121815925
num_examples: 11313
download_size: 138962266
dataset_size: 229323325
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
davanstrien/autotrain-data-flyswot-jan |
hemachandher/final_dataset | ---
dataset_info:
features:
- name: image
struct:
- name: bytes
dtype: binary
- name: path
dtype: 'null'
- name: text
dtype: string
splits:
- name: train
num_bytes: 138098403
num_examples: 1001
download_size: 100680621
dataset_size: 138098403
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Cybersoulja/djscrew | ---
license: artistic-2.0
---
|
EduardoPacheco/FoodSeg103 | ---
license: apache-2.0
task_categories:
- image-segmentation
task_ids:
- semantic-segmentation
size_categories:
- n<1K
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 1125278411.056
num_examples: 4983
- name: validation
num_bytes: 114576466.17
num_examples: 2135
download_size: 1259085777
dataset_size: 1239854877.226
---
# Dataset Card for FoodSeg103
## Table of Contents
- [Dataset Card for FoodSeg103](#dataset-card-for-foodseg103)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Dataset Structure](#dataset-structure)
- [Data categories](#data-categories)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Refinement process](#refinement-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [Dataset homepage](https://xiongweiwu.github.io/foodseg103.html)
- **Repository:** [FoodSeg103-Benchmark-v1](https://github.com/LARC-CMU-SMU/FoodSeg103-Benchmark-v1)
- **Paper:** [A Large-Scale Benchmark for Food Image Segmentation](https://arxiv.org/pdf/2105.05409.pdf)
- **Point of Contact:** [Not Defined]
### Dataset Summary
FoodSeg103 is a large-scale benchmark for food image segmentation. It contains 103 food categories and 7118 images with ingredient-level pixel-wise annotations. The dataset is a curated sample from [Recipe1M](https://github.com/facebookresearch/inversecooking), annotated and refined by human annotators. It is split into two subsets: a training set with 4983 images and a validation set with 2135 images.
### Supported Tasks and Leaderboards
No leaderboard is available for this dataset at the moment.
## Dataset Structure
### Data categories
| id | ingredient |
| --- | ---- |
| 0 | background |
| 1 | candy |
| 2 | egg tart |
| 3 | french fries |
| 4 | chocolate |
| 5 | biscuit |
| 6 | popcorn |
| 7 | pudding |
| 8 | ice cream |
| 9 | cheese butter |
| 10 | cake |
| 11 | wine |
| 12 | milkshake |
| 13 | coffee |
| 14 | juice |
| 15 | milk |
| 16 | tea |
| 17 | almond |
| 18 | red beans |
| 19 | cashew |
| 20 | dried cranberries |
| 21 | soy |
| 22 | walnut |
| 23 | peanut |
| 24 | egg |
| 25 | apple |
| 26 | date |
| 27 | apricot |
| 28 | avocado |
| 29 | banana |
| 30 | strawberry |
| 31 | cherry |
| 32 | blueberry |
| 33 | raspberry |
| 34 | mango |
| 35 | olives |
| 36 | peach |
| 37 | lemon |
| 38 | pear |
| 39 | fig |
| 40 | pineapple |
| 41 | grape |
| 42 | kiwi |
| 43 | melon |
| 44 | orange |
| 45 | watermelon |
| 46 | steak |
| 47 | pork |
| 48 | chicken duck |
| 49 | sausage |
| 50 | fried meat |
| 51 | lamb |
| 52 | sauce |
| 53 | crab |
| 54 | fish |
| 55 | shellfish |
| 56 | shrimp |
| 57 | soup |
| 58 | bread |
| 59 | corn |
| 60 | hamburg |
| 61 | pizza |
| 62 | hanamaki baozi |
| 63 | wonton dumplings |
| 64 | pasta |
| 65 | noodles |
| 66 | rice |
| 67 | pie |
| 68 | tofu |
| 69 | eggplant |
| 70 | potato |
| 71 | garlic |
| 72 | cauliflower |
| 73 | tomato |
| 74 | kelp |
| 75 | seaweed |
| 76 | spring onion |
| 77 | rape |
| 78 | ginger |
| 79 | okra |
| 80 | lettuce |
| 81 | pumpkin |
| 82 | cucumber |
| 83 | white radish |
| 84 | carrot |
| 85 | asparagus |
| 86 | bamboo shoots |
| 87 | broccoli |
| 88 | celery stick |
| 89 | cilantro mint |
| 90 | snow peas |
| 91 | cabbage |
| 92 | bean sprouts |
| 93 | onion |
| 94 | pepper |
| 95 | green beans |
| 96 | French beans |
| 97 | king oyster mushroom |
| 98 | shiitake |
| 99 | enoki mushroom |
| 100 | oyster mushroom |
| 101 | white button mushroom |
| 102 | salad |
| 103 | other ingredients |
### Data Splits
This dataset only contains two splits. A training split and a validation split with 4983 and 2135 images respectively.
## Dataset Creation
### Curation Rationale
Images were selected from a large-scale recipe dataset and annotated with pixel-wise segmentation masks.
### Source Data
The dataset is a curated sample from [Recipe1M](https://github.com/facebookresearch/inversecooking).
#### Initial Data Collection and Normalization
After selecting the source of the data, the following criteria were applied during image selection:
1. Recipe1M contains about 1.5k ingredient categories, but only the top 124 categories plus an 'other' category were selected (later reduced to 103).
2. Images should contain between 2 and 16 ingredients.
3. Ingredients should be visible and easy to annotate.
This resulted in 7118 images.
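The ingredient-count rule above can be sketched as a simple filter (hypothetical helper names and stand-in category set; this is not the authors' code):

```python
def keep_image(ingredient_ids, kept_categories):
    """Keep an image if it has between 2 and 16 ingredients,
    counting only ingredients in the kept category set."""
    n = len(set(ingredient_ids) & kept_categories)
    return 2 <= n <= 16

kept_categories = {1, 2, 3, 4, 5}  # stand-in for the top categories
keep_image([1, 2, 3], kept_categories)  # kept: three kept ingredients
keep_image([1], kept_categories)        # dropped: only one ingredient
```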
### Annotations
#### Annotation process
Third-party annotators were hired to annotate the images according to the following guidelines:
1. Tag ingredients with appropriate categories.
2. Draw pixel-wise masks for each ingredient.
3. Ignore tiny regions (even if they contain ingredients) covering less than 5% of the image area.
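The 5% rule in guideline 3 can be checked mechanically. A minimal sketch (hypothetical mask, not FoodSeg103 data) of computing per-category area fractions from a segmentation mask:

```python
import numpy as np

# Hypothetical 5x5 mask of category ids (0 = background).
mask = np.array([
    [0, 0, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 2, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
])

def area_fractions(mask):
    """Fraction of image pixels covered by each non-background category."""
    ids, counts = np.unique(mask, return_counts=True)
    return {int(i): c / mask.size for i, c in zip(ids, counts) if i != 0}

fracs = area_fractions(mask)
# Regions under 5% of the image would be ignored by the annotators.
annotated = {i: f for i, f in fracs.items() if f >= 0.05}
```

Here category 1 covers 7/25 of the pixels and is kept, while category 2 covers only 1/25 and falls under the 5% threshold.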
#### Refinement process
The refinement process implemented the following steps:
1. Correct mislabelled ingredients.
2. Delete unpopular categories assigned to fewer than 5 images (resulting in 103 categories in the final dataset).
3. Merge visually similar ingredient categories (e.g. orange and citrus).
#### Who are the annotators?
A third-party company that is not named in the paper.
## Additional Information
### Dataset Curators
Authors of the paper [A Large-Scale Benchmark for Food Image Segmentation](https://arxiv.org/pdf/2105.05409.pdf).
### Licensing Information
[Apache 2.0 license.](https://github.com/LARC-CMU-SMU/FoodSeg103-Benchmark-v1/blob/main/LICENSE)
### Citation Information
```bibtex
@inproceedings{wu2021foodseg,
title={A Large-Scale Benchmark for Food Image Segmentation},
author={Wu, Xiongwei and Fu, Xin and Liu, Ying and Lim, Ee-Peng and Hoi, Steven CH and Sun, Qianru},
booktitle={Proceedings of ACM international conference on Multimedia},
year={2021}
}
```
|
pietrolesci/mnli-stats | ---
dataset_info:
- config_name: pietrolesci__bert-base-uncased_mnli_53fb0761e0
features:
- name: epoch
dtype: int32
- name: uid
dtype: int64
- name: logits
sequence: float64
- name: loss
dtype: float64
- name: gamma
dtype: float64
- name: grad_1norm
dtype: float64
- name: grad_2norm
dtype: float64
- name: grad_infnorm
dtype: float64
- name: label
dtype: int32
splits:
- name: epoch1
num_bytes: 33576024
num_examples: 392702
- name: epoch20
num_bytes: 33576024
num_examples: 392702
- name: epoch12
num_bytes: 33576024
num_examples: 392702
- name: epoch6
num_bytes: 33576024
num_examples: 392702
- name: epoch3
num_bytes: 33576024
num_examples: 392702
- name: epoch14
num_bytes: 33576024
num_examples: 392702
- name: epoch17
num_bytes: 33576024
num_examples: 392702
- name: epoch9
num_bytes: 33576024
num_examples: 392702
- name: epoch5
num_bytes: 33576024
num_examples: 392702
- name: epoch11
num_bytes: 33576024
num_examples: 392702
- name: epoch15
num_bytes: 33576024
num_examples: 392702
- name: epoch16
num_bytes: 33576024
num_examples: 392702
- name: epoch19
num_bytes: 33576024
num_examples: 392702
- name: epoch13
num_bytes: 33576024
num_examples: 392702
- name: epoch7
num_bytes: 33576024
num_examples: 392702
- name: epoch8
num_bytes: 33576024
num_examples: 392702
- name: epoch10
num_bytes: 33576024
num_examples: 392702
- name: epoch18
num_bytes: 33576024
num_examples: 392702
- name: epoch2
num_bytes: 33576024
num_examples: 392702
- name: epoch4
num_bytes: 33576024
num_examples: 392702
download_size: 281263306
dataset_size: 671520480
- config_name: pietrolesci__bert-tiny_mnli_cdc7ea0d50
features:
- name: epoch
dtype: int32
- name: uid
dtype: int64
- name: logits
sequence: float64
- name: loss
dtype: float64
- name: gamma
dtype: float64
- name: grad_1norm
dtype: float64
- name: grad_2norm
dtype: float64
- name: grad_infnorm
dtype: float64
- name: label
dtype: int32
splits:
- name: epoch10
num_bytes: 33576024
num_examples: 392702
- name: epoch18
num_bytes: 33576024
num_examples: 392702
- name: epoch2
num_bytes: 33576024
num_examples: 392702
- name: epoch1
num_bytes: 33576024
num_examples: 392702
- name: epoch20
num_bytes: 33576024
num_examples: 392702
- name: epoch12
num_bytes: 33576024
num_examples: 392702
- name: epoch3
num_bytes: 33576024
num_examples: 392702
- name: epoch6
num_bytes: 33576024
num_examples: 392702
- name: epoch4
num_bytes: 33576024
num_examples: 392702
- name: epoch11
num_bytes: 33576024
num_examples: 392702
- name: epoch16
num_bytes: 33576024
num_examples: 392702
- name: epoch15
num_bytes: 33576024
num_examples: 392702
- name: epoch9
num_bytes: 33576024
num_examples: 392702
- name: epoch17
num_bytes: 33576024
num_examples: 392702
- name: epoch14
num_bytes: 33576024
num_examples: 392702
- name: epoch5
num_bytes: 33576024
num_examples: 392702
- name: epoch19
num_bytes: 33576024
num_examples: 392702
- name: epoch7
num_bytes: 33576024
num_examples: 392702
- name: epoch8
num_bytes: 33576024
num_examples: 392702
- name: epoch13
num_bytes: 33576024
num_examples: 392702
download_size: 207493740
dataset_size: 671520480
configs:
- config_name: pietrolesci__bert-base-uncased_mnli_53fb0761e0
data_files:
- split: epoch1
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch1-*
- split: epoch20
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch20-*
- split: epoch12
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch12-*
- split: epoch6
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch6-*
- split: epoch3
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch3-*
- split: epoch14
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch14-*
- split: epoch17
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch17-*
- split: epoch9
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch9-*
- split: epoch5
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch5-*
- split: epoch11
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch11-*
- split: epoch15
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch15-*
- split: epoch16
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch16-*
- split: epoch19
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch19-*
- split: epoch13
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch13-*
- split: epoch7
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch7-*
- split: epoch8
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch8-*
- split: epoch10
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch10-*
- split: epoch18
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch18-*
- split: epoch2
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch2-*
- split: epoch4
path: pietrolesci__bert-base-uncased_mnli_53fb0761e0/epoch4-*
- config_name: pietrolesci__bert-tiny_mnli_cdc7ea0d50
data_files:
- split: epoch10
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch10-*
- split: epoch18
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch18-*
- split: epoch2
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch2-*
- split: epoch1
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch1-*
- split: epoch20
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch20-*
- split: epoch12
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch12-*
- split: epoch3
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch3-*
- split: epoch6
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch6-*
- split: epoch4
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch4-*
- split: epoch11
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch11-*
- split: epoch16
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch16-*
- split: epoch15
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch15-*
- split: epoch9
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch9-*
- split: epoch17
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch17-*
- split: epoch14
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch14-*
- split: epoch5
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch5-*
- split: epoch19
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch19-*
- split: epoch7
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch7-*
- split: epoch8
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch8-*
- split: epoch13
path: pietrolesci__bert-tiny_mnli_cdc7ea0d50/epoch13-*
---
|
FanChen0116/syn_few0_32500_chat_all_data_pvi | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-time
'2': B-date
'3': B-last_name
'4': B-people
'5': I-date
'6': I-people
'7': I-last_name
'8': I-first_name
'9': B-first_name
'10': B-time
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 3934237
num_examples: 22975
- name: validation
num_bytes: 646729
num_examples: 3731
- name: test
num_bytes: 646729
num_examples: 3731
download_size: 0
dataset_size: 5227695
---
# Dataset Card for "syn_few0_32500_chat_all_data_pvi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bruno-cotrim/arch-max-blender-proj | ---
license: apache-2.0
---
|
chargoddard/rpguild | ---
language:
- en
license: cc-by-nc-4.0
size_categories:
- 100K<n<1M
task_categories:
- conversational
- text-generation
dataset_info:
- config_name: default
features:
- name: username
dtype: string
- name: char_name
dtype: string
- name: bio
dtype: string
- name: context
list:
- name: text
dtype: string
- name: username
dtype: string
- name: char_name
dtype: string
- name: reply
dtype: string
- name: has_nameless
dtype: bool
- name: char_confidence
dtype: float64
splits:
- name: train
num_bytes: 1921588254
num_examples: 140469
download_size: 764073630
dataset_size: 1921588254
- config_name: grammar_filtered
features:
- name: username
dtype: string
- name: char_name
dtype: string
- name: bio
dtype: string
- name: context
list:
- name: char_name
dtype: string
- name: text
dtype: string
- name: username
dtype: string
- name: reply
dtype: string
- name: char_confidence
dtype: float64
splits:
- name: train
num_bytes: 371438765
num_examples: 27053
download_size: 166606326
dataset_size: 371438765
- config_name: high_confidence
features:
- name: username
dtype: string
- name: char_name
dtype: string
- name: bio
dtype: string
- name: context
list:
- name: text
dtype: string
- name: username
dtype: string
- name: char_name
dtype: string
- name: reply
dtype: string
- name: has_nameless
dtype: bool
- name: char_confidence
dtype: float64
splits:
- name: train
num_bytes: 949419370.7676569
num_examples: 69403
download_size: 386317057
dataset_size: 949419370.7676569
- config_name: pruned
features:
- name: username
dtype: string
- name: char_name
dtype: string
- name: bio
dtype: string
- name: context
list:
- name: text
dtype: string
- name: username
dtype: string
- name: char_name
dtype: string
- name: reply
dtype: string
- name: has_nameless
dtype: bool
- name: char_confidence
dtype: float64
splits:
- name: train
num_bytes: 782484734.2032762
num_examples: 57200
download_size: 326987882
dataset_size: 782484734.2032762
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: grammar_filtered
data_files:
- split: train
path: grammar_filtered/train-*
- config_name: high_confidence
data_files:
- split: train
path: high_confidence/train-*
- config_name: pruned
data_files:
- split: train
path: pruned/train-*
tags:
- roleplay
- not-for-all-audiences
---
Data scraped from [roleplayerguild](https://www.roleplayerguild.com/) and parsed into prompts with a conversation history and associated character bio. Thanks to an anonymous internet stranger for the original scrape.
As usernames can be associated with multiple character biographies, assignment of characters is a little fuzzy. The `char_confidence` feature reflects how likely this assignment is to be correct. Not all posts in the conversation history necessarily have an associated character name. The column `has_nameless` reflects this.
Each row should fit into 4,096 Llama tokens, depending on your prompt format; there is built-in slack of 128 tokens plus 8 per message.
There are a few configurations available. I *highly* recommend not using the default configuration as it contains a lot of questionable quality data. The options, in order of increasing usefulness:
* `default` - ocean of garbage with some gems
* `high_confidence` - only entries with no nameless posts that are highly likely to be assigned a correct `char_name`/`bio`
* `pruned` - Further filtered from `high_confidence` to remove common types of junk replies
* `grammar_filtered` - run through a grammar checker to remove rows with too many mistakes
The `grammar_filtered` configuration is almost certainly what you want to be using. (Unless you want to do your own processing and filtering.) |
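As a rough illustration, one row can be flattened into a prompt/target pair for fine-tuning. The row below is fabricated; only the field names (`char_name`, `bio`, `context`, `reply`) match the dataset schema, and the prompt layout is one possible choice, not a prescribed format:

```python
# Fabricated example row with the same field names as the dataset schema.
row = {
    "char_name": "Mira",
    "bio": "A wandering cartographer.",
    "context": [
        {"char_name": "Theo", "username": "u1", "text": "The gate is sealed."},
        {"char_name": "Mira", "username": "u2", "text": "Then we go over the wall."},
    ],
    "reply": "Mira tightened the straps on her pack.",
}

# Flatten bio and conversation history into a prompt, with the reply as target.
lines = [f"{row['char_name']}'s bio: {row['bio']}"]
lines += [f"{turn['char_name']}: {turn['text']}" for turn in row["context"]]
prompt = "\n".join(lines) + f"\n{row['char_name']}:"
target = " " + row["reply"]
print(prompt)
```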
togethercomputer/llama-instruct | ---
license: llama2
language:
- en
---
# llama-instruct
This dataset was used to finetune [Llama-2-7B-32K-Instruct](https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct).
We follow the distillation paradigm that is used by [Alpaca](https://crfm.stanford.edu/2023/03/13/alpaca.html), [Vicuna](https://lmsys.org/blog/2023-03-30-vicuna/), [WizardLM](https://arxiv.org/abs/2304.12244), [Orca](https://www.microsoft.com/en-us/research/publication/orca-progressive-learning-from-complex-explanation-traces-of-gpt-4/)
— producing instructions by querying a powerful LLM, which in our case is the [Llama-2-70B-Chat](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf) model released by [Meta](https://ai.meta.com/llama/).
To build [Llama-2-7B-32K-Instruct](https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct), we collect instructions from 19K human inputs extracted from [ShareGPT-90K](https://huggingface.co/datasets/philschmid/sharegpt-raw) (only using human inputs, not ChatGPT outputs).
The actual script handles multi-turn conversations and also supports restarting and caching via a SQLite3 database.
You can find the full script [here](https://github.com/togethercomputer/Llama-2-7B-32K-Instruct/blob/main/scripts/distill.py), with merely 122 lines!
The output of this step is a jsonl file, each line corresponding to one conversation:
```
{"text": "[INST] ... instruction ... [/INST] ... answer ... [INST] ... instruction ... [/INST] ..."}
{"text": "[INST] ... instruction ... [/INST] ... answer ... [INST] ... instruction ... [/INST] ..."}
{"text": "[INST] ... instruction ... [/INST] ... answer ... [INST] ... instruction ... [/INST] ..."}
```
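A minimal sketch of reading such a line back into (instruction, answer) pairs. The regex is an assumption about the `[INST] ... [/INST]` markup shown above, not part of the project's tooling:

```python
import json
import re

# One fabricated line in the jsonl format shown above.
line = '{"text": "[INST] What is 2+2? [/INST] 4 [INST] And 3+3? [/INST] 6"}'
text = json.loads(line)["text"]

# Recover (instruction, answer) turns from the Llama-2 chat markup.
turns = re.findall(r"\[INST\]\s*(.*?)\s*\[/INST\]\s*(.*?)\s*(?=\[INST\]|$)", text)
print(turns)  # → [('What is 2+2?', '4'), ('And 3+3?', '6')]
```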
For more details, please refer to the [Github repo](https://github.com/togethercomputer/Llama-2-7B-32K-Instruct).
## Languages
The language of the data is entirely English. |
saibo/bookcorpus_compact_1024_shard5_of_10_meta | ---
dataset_info:
features:
- name: text
dtype: string
- name: concept_with_offset
dtype: string
- name: cid_arrangement
sequence: int32
- name: schema_lengths
sequence: int64
- name: topic_entity_mask
sequence: int64
- name: text_lengths
sequence: int64
splits:
- name: train
num_bytes: 7507064864
num_examples: 61605
download_size: 1650231022
dataset_size: 7507064864
---
# Dataset Card for "bookcorpus_compact_1024_shard5_of_10_meta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
indonlp/NusaX-senti | ---
pretty_name: NusaX-senti
annotations_creators:
- expert-generated
language_creators:
- expert-generated
license:
- cc-by-sa-4.0
multilinguality:
- multilingual
language:
- ace
- ban
- bjn
- bug
- en
- id
- jv
- mad
- min
- nij
- su
- bbc
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-classification
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: lang
dtype: string
- name: label
dtype:
class_label:
names:
0: negative
1: neutral
2: positive
---
# Dataset Card for NusaX-Senti
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [GitHub](https://github.com/IndoNLP/nusax/tree/main/datasets/sentiment)
- **Paper:** [EACL 2022](https://arxiv.org/abs/2205.15960)
- **Point of Contact:** [GitHub](https://github.com/IndoNLP/nusax/tree/main/datasets/sentiment)
### Dataset Summary
NusaX is a high-quality multilingual parallel corpus that covers 12 languages: Indonesian, English, and 10 Indonesian local languages, namely Acehnese, Balinese, Banjarese, Buginese, Madurese, Minangkabau, Javanese, Ngaju, Sundanese, and Toba Batak.
NusaX-Senti is a 3-label (positive, neutral, negative) sentiment analysis dataset for 10 Indonesian local languages plus Indonesian and English.
### Supported Tasks and Leaderboards
- Sentiment analysis for Indonesian languages
### Languages
- ace: Acehnese
- ban: Balinese
- bjn: Banjarese
- bug: Buginese
- eng: English
- ind: Indonesian
- jav: Javanese
- mad: Madurese
- min: Minangkabau
- nij: Ngaju
- sun: Sundanese
- bbc: Toba Batak
## Dataset Creation
### Curation Rationale
There is a shortage of NLP research and resources for the Indonesian languages, despite the country having over 700 languages. With this in mind, we have created this dataset to support future research for the underrepresented languages in Indonesia.
### Source Data
#### Initial Data Collection and Normalization
NusaX-senti is derived from an Indonesian sentiment analysis dataset and has been expertly translated into each target language by native speakers.
#### Who are the source language producers?
The data was produced by humans (native speakers).
### Annotations
#### Annotation process
NusaX-senti is derived from SmSA, the largest publicly available dataset for Indonesian sentiment analysis, which comprises comments and reviews from multiple online platforms. To ensure the quality of our dataset, we manually reviewed all sentences and removed any abusive language and personally identifying information. To keep the label distribution balanced, we randomly picked 1,000 samples through stratified sampling and then translated them into the corresponding languages.
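The balancing step can be sketched as stratified sampling, i.e. drawing an equal number of examples from each label bucket. The pool below is fabricated, standing in for SmSA, and the per-label count is illustrative:

```python
import random
from collections import defaultdict

random.seed(0)

# Fabricated labelled pool standing in for SmSA.
pool = [(f"sent{i}", label) for i, label in enumerate(
    ["positive"] * 500 + ["neutral"] * 300 + ["negative"] * 400)]

# Group texts by label.
by_label = defaultdict(list)
for text, label in pool:
    by_label[label].append(text)

# Draw the same number of examples from each label bucket.
per_label = 100
sample = {label: random.sample(texts, per_label)
          for label, texts in by_label.items()}
print({label: len(texts) for label, texts in sample.items()})
```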
#### Who are the annotators?
Native speakers of both Indonesian and the corresponding languages.
Annotators were compensated based on the number of translated samples.
### Personal and Sensitive Information
Personal information is removed.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
NusaX is created from review text. These data sources may contain some bias.
### Other Known Limitations
No other known limitations
## Additional Information
### Licensing Information
CC-BY-SA 4.0.
Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
ShareAlike — If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original.
No additional restrictions — You may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.
Please contact authors for any information on the dataset.
### Citation Information
```
@misc{winata2022nusax,
title={NusaX: Multilingual Parallel Sentiment Dataset for 10 Indonesian Local Languages},
author={Winata, Genta Indra and Aji, Alham Fikri and Cahyawijaya,
Samuel and Mahendra, Rahmad and Koto, Fajri and Romadhony,
Ade and Kurniawan, Kemal and Moeljadi, David and Prasojo,
Radityo Eko and Fung, Pascale and Baldwin, Timothy and Lau,
Jey Han and Sennrich, Rico and Ruder, Sebastian},
year={2022},
eprint={2205.15960},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@afaji](https://github.com/afaji) for adding this dataset.
|
yaoandy107/moba-audio | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: conf
dtype: float64
splits:
- name: train
num_bytes: 6540633.0
num_examples: 695
download_size: 3456572
dataset_size: 6540633.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mpingale/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mHossain/final_train_v2_120000 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 9133587.0
num_examples: 27000
- name: test
num_bytes: 1014843.0
num_examples: 3000
download_size: 4454698
dataset_size: 10148430.0
---
# Dataset Card for "final_train_v2_120000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
israfelsr/img-wikipedia-simple | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license: []
multilinguality:
- monolingual
pretty_name: image-wikipedia-simple
size_categories: []
source_datasets: []
task_categories:
- image-to-text
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed] |
Francesco/uno-deck | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': uno-deck
'1': 0
'2': 1
'3': 2
'4': 3
'5': 4
'6': 5
'7': 6
'8': 7
'9': 8
'10': 9
'11': 10
'12': 11
'13': 12
'14': 13
'15': 14
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: uno-deck
tags:
- rf100
---
# Dataset Card for uno-deck
**The original COCO dataset is stored at `dataset.tar.gz`**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/uno-deck
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
uno-deck
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 964043,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
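Since the bboxes are in COCO `[x_min, y_min, width, height]` format, a small conversion is needed for pipelines that expect corner coordinates. This is a generic helper, not part of the dataset loader:

```python
# Convert a COCO-format bbox [x_min, y_min, width, height]
# to corner format [x_min, y_min, x_max, y_max].
def coco_to_corners(bbox):
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# First bbox from the sample instance above.
print(coco_to_corners([302.0, 109.0, 73.0, 52.0]))  # → [302.0, 109.0, 375.0, 161.0]
```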
#### Who are the annotators?
Annotators are Roboflow users.
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/uno-deck
### Citation Information
```
@misc{ uno-deck,
title = { uno deck Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/uno-deck } },
url = { https://universe.roboflow.com/object-detection/uno-deck },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
WmVernon/nfl-wk14-2023-stats | ---
license: apache-2.0
---
|
SaffalPoosh/deepFashion-with-masks | ---
license: apache-2.0
tags:
- code
pretty_name: fashion clothes segmentation
dataset_info:
features:
- name: images
dtype: image
- name: gender
dtype: string
- name: pose
dtype: string
- name: cloth_type
dtype: string
- name: pid
dtype: string
- name: caption
dtype: string
- name: mask
dtype: image
- name: mask_overlay
dtype: image
splits:
- name: train
num_bytes: 1821511821.448
num_examples: 40658
download_size: 1449380618
dataset_size: 1821511821.448
---
# Dataset
The dataset is the DeepFashion2 dataset in raw form with annotations; for the original dataset repo, see `https://github.com/switchablenorms/DeepFashion2`.
This dataset is just the extracted version of the original DeepFashion2 dataset and can be used for training a **ControlNet** model.
Codec-SUPERB/vocalset_unit | ---
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 50680575
num_examples: 3612
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 50680575
num_examples: 3612
- name: academicodec_hifi_24k_320d
num_bytes: 75967935
num_examples: 3612
- name: audiodec_24k_320d
num_bytes: 162173727
num_examples: 3612
- name: dac_16k
num_bytes: 194105311
num_examples: 3612
- name: dac_24k
num_bytes: 763939231
num_examples: 3612
- name: dac_44k
num_bytes: 245367967
num_examples: 3612
- name: encodec_24k_12bps
num_bytes: 304011679
num_examples: 3612
- name: encodec_24k_1_5bps
num_bytes: 38095231
num_examples: 3612
- name: encodec_24k_24bps
num_bytes: 607916191
num_examples: 3612
- name: encodec_24k_3bps
num_bytes: 76083295
num_examples: 3612
- name: encodec_24k_6bps
num_bytes: 152059423
num_examples: 3612
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 405619103
num_examples: 3612
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 405619103
num_examples: 3612
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 405617311
num_examples: 3612
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 203325599
num_examples: 3612
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 405617311
num_examples: 3612
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 203325599
num_examples: 3612
- name: speech_tokenizer_16k
num_bytes: 101484703
num_examples: 3612
download_size: 729684692
dataset_size: 4851689869
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
---
|
AMead10/Universal-Verified-Camel | ---
dataset_info:
features:
- name: conversation
list:
- name: input
dtype: string
- name: output
dtype: string
- name: system
dtype: string
splits:
- name: train
num_bytes: 326725
num_examples: 127
download_size: 168364
dataset_size: 326725
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Universal-Verified-Camel"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wjm123/wjm123 | ---
license: afl-3.0
---
|
AIrtisian/testcsv | ---
license: other
---
|
MatsuoDochiai/Took | ---
license: openrail
---
|
wav2gloss/fieldwork | ---
license: cc-by-nc-sa-4.0
dataset_info:
features:
- name: id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: language
dtype: string
- name: speaker
dtype: string
- name: surface
dtype: string
- name: underlying
dtype: string
- name: gloss
dtype: string
- name: translation
dtype: string
- name: translation_language
dtype: string
- name: length
dtype: float32
- name: discard
dtype: bool
splits:
- name: train
num_bytes: 4841476668.601
num_examples: 48987
- name: validation
num_bytes: 879881255.295
num_examples: 7715
- name: test
num_bytes: 2556166473.915
num_examples: 23759
download_size: 8175211998
dataset_size: 8277524397.811
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Wav2Gloss Fieldwork Corpus
## Description
The Wav2Gloss Fieldwork corpus is a collection of linguistic field recordings that have previously been transcribed and glossed. The dataset is used by the Wav2Gloss project to develop machine learning models that can automatically generate transcriptions, morphological segmentations, glosses, and translations, with the goal of helping linguists annotate field data.
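For instance, the `underlying` and `gloss` fields of a row can be paired morpheme-by-morpheme, assuming (as in standard interlinear glossed text practice) that both are hyphen-separated and aligned one-to-one. The example row is fabricated:

```python
# Fabricated row; only the field names match the dataset schema.
row = {"underlying": "gel-dim", "gloss": "come-PST.1SG"}

# Split both tiers on the morpheme separator and zip them together.
morphs = row["underlying"].split("-")
glosses = row["gloss"].split("-")
aligned = list(zip(morphs, glosses))
print(aligned)  # → [('gel', 'come'), ('dim', 'PST.1SG')]
```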
## Statistics
See below for a breakdown of languages by training and dev/test hours.
| Glottocode | Name | CC Type | Train (h) | Dev+Test (h) |
| ---------- | -------------------- | -------- | --------- | ------------ |
| `beja1238` | Beja | BY-NC | 1.55 | 0.29 |
| `ruul1235` | Ruuli | BY | 0.96 | 0.28 |
| `texi1237` | Texistepec Popoluca | BY | 0.84 | 0.26 |
| `komn1238` | Komnzo | BY | 0.73 | 0.42 |
| `arap1274` | Arapaho | BY | 0.56 | 0.88 |
| `goro1270` | Gorwaa | BY | 0.52 | 0.45 |
| `teop1238` | Teop | BY | 0.52 | 0.52 |
| `nngg1234` | Nǁng | BY | 0.52 | 0.33 |
| `sumi1235` | Sümi | BY | 0.40 | 0.40 |
| `jeju1234` | Jejuan | BY | 0.38 | 0.65 |
| `bora1263` | Bora | BY | 0.23 | 1.44 |
| `apah1238` | Yali (Apahapsili) | BY-NC-SA | 0.18 | 0.27 |
| `port1286` | Daakie | BY | 0.14 | 0.75 |
| `savo1255` | Savosavo | BY | 0.10 | 1.20 |
| `trin1278` | Mojeño Trinitario | BY | - | 1.56 |
| `sout2856` | Nafsan (South Efate) | BY-NC-SA | - | 1.55 |
| `pnar1238` | Pnar | BY-NC | - | 0.91 |
| `kaka1265` | Kakabe | BY | - | 0.90 |
| `vera1241` | Vera'a | BY | 1.02 | 0.97 |
| `tond1251` | Tondano | BY | 0.22 | 0.67 |
| `taul1251` | Tulil | BY | - | 1.18 |
| `arta1239` | Arta | BY | - | 0.91 |
| `nort2641` | Northern Kurdish | BY | - | 0.86 |
| `tehr1242` | Persian | BY | - | 0.82 |
| `taba1259` | Tabasaran | BY | - | 0.79 |
| `sanz1248` | Sanzhi Dargwa | BY | - | 0.67 |
| `kach1280` | Jinghpaw | BY | - | 0.66 |
| `mand1415` | Mandarin | BY | - | 0.66 |
| `sumb1241` | Sumbawa | BY | - | 0.63 |
| `kara1499` | Kalamang | BY | - | 0.59 |
| `slav1254` | Slavomolisano | BY-NC | 1.01 | 0.96 |
| `balk1252` | Balkan Romani | BY-NC-SA | - | 0.35 |
| `dolg1241` | Dolgan | BY-NC-SA | 11.64 | 1.23 |
| `kama1378` | Kamas | BY-NC-SA | 9.91 | 1.15 |
| `selk1253` | Selkup | BY-NC-SA | 1.70 | 1.15 |
| `even1259` | Evenki | BY-NC-SA | 1.54 | 1.13 |
| `ainu1240` | Ainu | BY-SA | 7.12 | 1.13 |
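As a quick sanity check, the column totals of the table above can be tallied with a short script (the hour figures are transcribed directly from the table, in row order; a dash in the Train column is treated as zero):

```python
# (train_hours, dev_test_hours) per corpus, in table order; "-" is recorded as 0.00.
hours = [
    (1.55, 0.29), (0.96, 0.28), (0.84, 0.26), (0.73, 0.42), (0.56, 0.88),
    (0.52, 0.45), (0.52, 0.52), (0.52, 0.33), (0.40, 0.40), (0.38, 0.65),
    (0.23, 1.44), (0.18, 0.27), (0.14, 0.75), (0.10, 1.20), (0.00, 1.56),
    (0.00, 1.55), (0.00, 0.91), (0.00, 0.90), (1.02, 0.97), (0.22, 0.67),
    (0.00, 1.18), (0.00, 0.91), (0.00, 0.86), (0.00, 0.82), (0.00, 0.79),
    (0.00, 0.67), (0.00, 0.66), (0.00, 0.66), (0.00, 0.63), (0.00, 0.59),
    (1.01, 0.96), (0.00, 0.35), (11.64, 1.23), (9.91, 1.15), (1.70, 1.15),
    (1.54, 1.13), (7.12, 1.13),
]
train_total = round(sum(t for t, _ in hours), 2)
dev_test_total = round(sum(d for _, d in hours), 2)
print(f"train: {train_total} h, dev+test: {dev_test_total} h")
```

Summed this way, the 37 corpora contain 41.79 hours of training audio and 29.57 hours of dev+test audio.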
## Citation
```bibtex
```
## Corpora citations
#### Yali (Apahapsili) (apah1238)
```bibtex
@incollection{doreco-apah1238,
address = {Berlin \& Lyon},
author = {Riesberg, Sonja},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Yali (Apahapsili) DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/apah1238},
doi = {10.34847/nkl.9d91nkq2},
urldate = {07/10/2023},
year = {2022}
}
```
#### Arapaho (arap1274)
```bibtex
@incollection{doreco-arap1274,
address = {Berlin \& Lyon},
author = {Cowell, Andrew},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Arapaho DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/arap1274},
doi = {10.34847/nkl.36f5r1b6},
urldate = {07/10/2023},
year = {2022}
}
```
#### Beja (beja1238)
```bibtex
@incollection{doreco-beja1238,
address = {Berlin \& Lyon},
author = {Vanhove, Martine},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Beja DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/beja1238},
doi = {10.34847/nkl.edd011t1},
urldate = {07/10/2023},
year = {2022}
}
```
#### Bora (bora1263)
```bibtex
@incollection{doreco-bora1263,
address = {Berlin \& Lyon},
author = {Seifart, Frank},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Bora DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/bora1263},
doi = {10.34847/nkl.6eaf5laq},
urldate = {07/10/2023},
year = {2022}
}
```
#### Gorwaa (goro1270)
```bibtex
@incollection{doreco-goro1270,
address = {Berlin \& Lyon},
author = {Harvey, Andrew},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Gorwaa DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/goro1270},
doi = {10.34847/nkl.a4b4ijj2},
urldate = {07/10/2023},
year = {2022}
}
```
#### Jejuan (jeju1234)
```bibtex
@incollection{doreco-jeju1234,
address = {Berlin \& Lyon},
author = {Kim, Soung-U},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Jejuan DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/jeju1234},
doi = {10.34847/nkl.06ebrk38},
urldate = {07/10/2023},
year = {2022}
}
```
#### Kakabe (kaka1265)
```bibtex
@incollection{doreco-kaka1265,
address = {Berlin \& Lyon},
author = {Vydrina, Alexandra},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Kakabe DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/kaka1265},
doi = {10.34847/nkl.d5aeu9t6},
urldate = {07/10/2023},
year = {2022}
}
```
#### Komnzo (komn1238)
```bibtex
@incollection{doreco-komn1238,
address = {Berlin \& Lyon},
author = {Döhler, Christian},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Komnzo DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/komn1238},
doi = {10.34847/nkl.c5e6dudv},
urldate = {07/10/2023},
year = {2022}
}
```
#### Nǁng (nngg1234)
```bibtex
@incollection{doreco-nngg1234,
address = {Berlin \& Lyon},
author = {Güldemann, Tom and Ernszt, Martina and Siegmund, Sven and Witzlack-Makarevich, Alena},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Nǁng DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/nngg1234},
doi = {10.34847/nkl.f6c37fi0},
urldate = {07/10/2023},
year = {2022}
}
```
#### Pnar (pnar1238)
```bibtex
@incollection{doreco-pnar1238,
address = {Berlin \& Lyon},
author = {Ring, Hiram},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Pnar DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/pnar1238},
doi = {10.34847/nkl.5ba1062k},
urldate = {07/10/2023},
year = {2022}
}
```
#### Daakie (port1286)
```bibtex
@incollection{doreco-port1286,
address = {Berlin \& Lyon},
author = {Krifka, Manfred},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Daakie DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/port1286},
doi = {10.34847/nkl.efeav5l9},
urldate = {07/10/2023},
year = {2022}
}
```
#### Ruuli (ruul1235)
```bibtex
@incollection{doreco-ruul1235,
address = {Berlin \& Lyon},
author = {Witzlack-Makarevich, Alena and Namyalo, Saudah and Kiriggwajjo, Anatol and Molochieva, Zarina},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Ruuli DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/ruul1235},
doi = {10.34847/nkl.fde4pp1u},
urldate = {07/10/2023},
year = {2022}
}
```
#### Savosavo (savo1255)
```bibtex
@incollection{doreco-savo1255,
address = {Berlin \& Lyon},
author = {Wegener, Claudia},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Savosavo DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/savo1255},
doi = {10.34847/nkl.b74d1b33},
urldate = {07/10/2023},
year = {2022}
}
```
#### Nafsan (South Efate) (sout2856)
```bibtex
@incollection{doreco-sout2856,
address = {Berlin \& Lyon},
author = {Thieberger, Nick},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Nafsan (South Efate) DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/sout2856},
doi = {10.34847/nkl.ba4f760l},
urldate = {07/10/2023},
year = {2022}
}
```
#### Sümi (sumi1235)
```bibtex
@incollection{doreco-sumi1235,
address = {Berlin \& Lyon},
author = {Teo, Amos},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Sümi DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/sumi1235},
doi = {10.34847/nkl.5ad4t01p},
urldate = {07/10/2023},
year = {2022}
}
```
#### Teop (teop1238)
```bibtex
@incollection{doreco-teop1238,
address = {Berlin \& Lyon},
author = {Mosel, Ulrike},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Teop DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/teop1238},
doi = {10.34847/nkl.9322sdf2},
urldate = {07/10/2023},
year = {2022}
}
```
#### Texistepec Popoluca (texi1237)
```bibtex
@incollection{doreco-texi1237,
address = {Berlin \& Lyon},
author = {Wichmann, Søren},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Texistepec Popoluca DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/texi1237},
doi = {10.34847/nkl.c50ck58f},
urldate = {07/10/2023},
year = {2022}
}
```
#### Mojeño Trinitario (trin1278)
```bibtex
@incollection{doreco-trin1278,
address = {Berlin \& Lyon},
author = {Rose, Françoise},
booktitle = {Language Documentation Reference Corpus (DoReCo) 1.2},
editor = {Seifart, Frank and Paschen, Ludger and Stave, Matthew},
publisher = {Leibniz-Zentrum Allgemeine Sprachwissenschaft \& laboratoire Dynamique Du Langage (UMR5596, CNRS \& Université Lyon 2)},
title = {Mojeño Trinitario DoReCo dataset},
url = {https://doreco.huma-num.fr/languages/trin1278},
doi = {10.34847/nkl.cbc3b4xr},
urldate = {07/10/2023},
year = {2022}
}
```
#### Dolgan (dolg1241)
```bibtex
@misc{inel-dolgan,
author = {Däbritz, Chris Lasse and
Kudryakova, Nina and
Stapert, Eugénie},
title = {INEL Dolgan Corpus},
month = nov,
year = 2022,
doi = {10.25592/uhhfdm.11165},
url = {https://doi.org/10.25592/uhhfdm.11165}
}
```
#### Evenki (even1259)
```bibtex
@misc{inel-evenki,
author = {Däbritz, Chris Lasse and
Gusev, Valentin},
title = {INEL Evenki Corpus},
month = dec,
year = 2021,
doi = {10.25592/uhhfdm.9628},
url = {https://doi.org/10.25592/uhhfdm.9628}
}
```
#### Kamas (kama1378)
```bibtex
@misc{inel-kamas,
author = {Gusev, Valentin and
Klooster, Tiina and
Wagner-Nagy, Beáta},
title = {INEL Kamas Corpus},
month = dec,
year = 2019,
doi = {10.25592/uhhfdm.9752},
url = {https://doi.org/10.25592/uhhfdm.9752}
}
```
#### Selkup (selk1253)
```bibtex
@misc{inel-selkup,
author = {Brykina, Maria and
Orlova, Svetlana and
Wagner-Nagy, Beáta},
title = {INEL Selkup Corpus},
month = dec,
year = 2021,
doi = {10.25592/uhhfdm.9754},
url = {https://doi.org/10.25592/uhhfdm.9754}
}
```
#### Arta (arta1239)
```bibtex
@incollection{arta1239,
author = {Kimoto, Yukinori},
title = {{Multi-CAST Arta}},
year = {2019},
editor = {Haig, Geoffrey and Schnell, Stefan},
booktitle = {{Multi-CAST}},
booksubtitle = {{Multilingual corpus of annotated spoken texts}},
  note = {Version 2211},
address = {Bamberg},
publisher = {University of Bamberg},
url = {multicast.aspra.uni-bamberg.de/#arta}
}
```
#### Jinghpaw (kach1280)
```bibtex
@incollection{kach1280,
author = {Kurabe, Keita},
title = {{Multi-CAST Jinghpaw}},
year = {2021},
editor = {Haig, Geoffrey and Schnell, Stefan},
booktitle = {{Multi-CAST}},
booksubtitle = {{Multilingual corpus of annotated spoken texts}},
  note = {Version 2211},
address = {Bamberg},
publisher = {University of Bamberg},
url = {multicast.aspra.uni-bamberg.de/#jinghpaw}
}
```
#### Kalamang (kara1499)
```bibtex
@incollection{kara1499,
author = {Visser, Eline},
title = {{Multi-CAST Kalamang}},
year = {2021},
editor = {Haig, Geoffrey and Schnell, Stefan},
booktitle = {{Multi-CAST}},
booksubtitle = {{Multilingual corpus of annotated spoken texts}},
  note = {Version 2211},
address = {Bamberg},
publisher = {University of Bamberg},
url = {multicast.aspra.uni-bamberg.de/#kalamang}
}
```
#### Mandarin (mand1415)
```bibtex
@incollection{mand1415,
author = {Vollmer, Maria},
title = {{Multi-CAST Mandarin}},
year = {2020},
editor = {Haig, Geoffrey and Schnell, Stefan},
booktitle = {{Multi-CAST}},
booksubtitle = {{Multilingual corpus of annotated spoken texts}},
  note = {Version 2211},
address = {Bamberg},
publisher = {University of Bamberg},
url = {multicast.aspra.uni-bamberg.de/#mandarin}
}
```
#### Northern Kurdish (nort2641)
```bibtex
@incollection{nort2641,
author = {Haig, Geoffrey and Vollmer, Maria and Thiele, Hanna},
title = {{Multi-CAST Northern Kurdish}},
year = {2015},
editor = {Haig, Geoffrey and Schnell, Stefan},
booktitle = {{Multi-CAST}},
booksubtitle = {{Multilingual corpus of annotated spoken texts}},
  note = {Version 2211},
address = {Bamberg},
publisher = {University of Bamberg},
url = {multicast.aspra.uni-bamberg.de/#nkurd}
}
```
#### Sanzhi Dargwa (sanz1248)
```bibtex
@incollection{sanz1248,
author = {Forker, Diana and Schiborr, Nils N.},
title = {{Multi-CAST Sanzhi Dargwa}},
year = {2019},
editor = {Haig, Geoffrey and Schnell, Stefan},
booktitle = {{Multi-CAST}},
booksubtitle = {{Multilingual corpus of annotated spoken texts}},
  note = {Version 2211},
address = {Bamberg},
publisher = {University of Bamberg},
url = {multicast.aspra.uni-bamberg.de/#sanzhi}
}
```
#### Sumbawa (sumb1241)
```bibtex
@incollection{sumb1241,
author = {Shiohara, Asako},
title = {{Multi-CAST Sumbawa}},
year = {2022},
editor = {Haig, Geoffrey and Schnell, Stefan},
booktitle = {{Multi-CAST}},
booksubtitle = {{Multilingual corpus of annotated spoken texts}},
  note = {Version 2211},
address = {Bamberg},
publisher = {University of Bamberg},
url = {multicast.aspra.uni-bamberg.de/#sumbawa}
}
```
#### Tabasaran (taba1259)
```bibtex
@incollection{taba1259,
  author = {Bogomolova, Natalia and Ganenkov, Dmitry and Schiborr, Nils N.},
title = {{Multi-CAST Tabasaran}},
year = {2021},
editor = {Haig, Geoffrey and Schnell, Stefan},
booktitle = {{Multi-CAST}},
booksubtitle = {{Multilingual corpus of annotated spoken texts}},
  note = {Version 2211},
address = {Bamberg},
publisher = {University of Bamberg},
url = {multicast.aspra.uni-bamberg.de/#tabasaran}
}
```
#### Tulil (taul1251)
```bibtex
@incollection{taul1251,
author = {Meng, Chenxi},
title = {{Multi-CAST Tulil}},
year = {2016},
editor = {Haig, Geoffrey and Schnell, Stefan},
booktitle = {{Multi-CAST}},
booksubtitle = {{Multilingual corpus of annotated spoken texts}},
  note = {Version 2211},
address = {Bamberg},
publisher = {University of Bamberg},
url = {multicast.aspra.uni-bamberg.de/#tulil}
}
```
#### Persian (tehr1242)
```bibtex
@incollection{tehr1242,
author = {Adibifar, Shirin},
title = {{Multi-CAST Persian}},
year = {2016},
editor = {Haig, Geoffrey and Schnell, Stefan},
booktitle = {{Multi-CAST}},
booksubtitle = {{Multilingual corpus of annotated spoken texts}},
  note = {Version 2211},
address = {Bamberg},
publisher = {University of Bamberg},
url = {multicast.aspra.uni-bamberg.de/#persian}
}
```
#### Tondano (tond1251)
```bibtex
@incollection{tond1251,
author = {Brickell, Timothy},
title = {{Multi-CAST Tondano}},
year = {2016},
editor = {Haig, Geoffrey and Schnell, Stefan},
booktitle = {{Multi-CAST}},
booksubtitle = {{Multilingual corpus of annotated spoken texts}},
  note = {Version 2211},
address = {Bamberg},
publisher = {University of Bamberg},
url = {multicast.aspra.uni-bamberg.de/#tondano}
}
```
#### Vera'a (vera1241)
```bibtex
@incollection{vera1241,
author = {Schnell, Stefan},
title = {{Multi-CAST Vera'a}},
year = {2015},
editor = {Haig, Geoffrey and Schnell, Stefan},
booktitle = {{Multi-CAST}},
booksubtitle = {{Multilingual corpus of annotated spoken texts}},
  note = {Version 2211},
address = {Bamberg},
publisher = {University of Bamberg},
url = {multicast.aspra.uni-bamberg.de/#veraa}
}
```
#### Balkan Romani (balk1252)
```bibtex
@misc{balk1252,
title={{Le romani (xoraxane, vlax du sud, Grèce)}},
url={https://pangloss.cnrs.fr/corpus/Romani_(Xoraxane,_Southern_Vlax,_Greece)},
journal={La collection Pangloss},
author={Adamou, Evangelia}
}
```
#### Slavomolisano (slav1254)
```bibtex
@misc{slav1254,
title={{Na-našu (slave Molisan) : Le dialecte d’acquaviva collecroce}},
url={https://pangloss.cnrs.fr/corpus/Na-na%C5%A1u_(Acquaviva_Collecroce)},
journal={La collection Pangloss},
author={Breu, Walter}
}
```
#### Ainu (ainu1240)
```bibtex
@misc{ninjal-ainu-folklore,
title={A Glossed Audio Corpus of Ainu Folklore},
url={https://ainu.ninjal.ac.jp/folklore/},
author={Nakagawa, Hiroshi and Bugaeva, Anna and Kobayashi, Miki and Yoshikawa, Yoshimi},
publisher={The National Institute for Japanese Language and Linguistics ({NINJAL})},
date={2016--2021}
}
```
|
Itaki/Chocothul | ---
license: openrail
---
|
open-llm-leaderboard/details_aboros98__merlin1.2 | ---
pretty_name: Evaluation run of aboros98/merlin1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [aboros98/merlin1.2](https://huggingface.co/aboros98/merlin1.2) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aboros98__merlin1.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-14T15:48:59.739721](https://huggingface.co/datasets/open-llm-leaderboard/details_aboros98__merlin1.2/blob/main/results_2024-03-14T15-48-59.739721.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.566620364984462,\n\
\ \"acc_stderr\": 0.033868726883359755,\n \"acc_norm\": 0.5679730470859095,\n\
\ \"acc_norm_stderr\": 0.03456526384633218,\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.01609588415538685,\n \"mc2\": 0.46240578362725326,\n\
\ \"mc2_stderr\": 0.015035560895837513\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5691126279863481,\n \"acc_stderr\": 0.01447113339264247,\n\
\ \"acc_norm\": 0.5921501706484642,\n \"acc_norm_stderr\": 0.014361097288449696\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5575582553276239,\n\
\ \"acc_stderr\": 0.004956609327218404,\n \"acc_norm\": 0.7418840868352917,\n\
\ \"acc_norm_stderr\": 0.004367037632204528\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.045595221419582166,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.045595221419582166\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590636,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590636\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6903225806451613,\n\
\ \"acc_stderr\": 0.026302774983517418,\n \"acc_norm\": 0.6903225806451613,\n\
\ \"acc_norm_stderr\": 0.026302774983517418\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187898,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187898\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.029778663037752954,\n\
\ \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.029778663037752954\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5461538461538461,\n \"acc_stderr\": 0.02524277098712618,\n \
\ \"acc_norm\": 0.5461538461538461,\n \"acc_norm_stderr\": 0.02524277098712618\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413925,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413925\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790232,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790232\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.033851779760448106,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.033851779760448106\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6519607843137255,\n \"acc_stderr\": 0.03343311240488419,\n \"\
acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.03343311240488419\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.02441494730454368,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.02441494730454368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6768837803320562,\n\
\ \"acc_stderr\": 0.016723726512343048,\n \"acc_norm\": 0.6768837803320562,\n\
\ \"acc_norm_stderr\": 0.016723726512343048\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879702,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879702\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.21340782122905028,\n\
\ \"acc_stderr\": 0.013702859932196094,\n \"acc_norm\": 0.21340782122905028,\n\
\ \"acc_norm_stderr\": 0.013702859932196094\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.02787074527829027,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.02787074527829027\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\
\ \"acc_stderr\": 0.02773125864701199,\n \"acc_norm\": 0.6077170418006431,\n\
\ \"acc_norm_stderr\": 0.02773125864701199\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.027272582849839796,\n\
\ \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.027272582849839796\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n\
\ \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.4211212516297262,\n\
\ \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4338235294117647,\n \"acc_stderr\": 0.030105636570016626,\n\
\ \"acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.030105636570016626\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5424836601307189,\n \"acc_stderr\": 0.020154685712590888,\n \
\ \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.020154685712590888\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106568,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106568\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\
\ \"acc_stderr\": 0.028996909693328923,\n \"acc_norm\": 0.7860696517412935,\n\
\ \"acc_norm_stderr\": 0.028996909693328923\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"\
acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\"\
: 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\":\
\ {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.01609588415538685,\n \"mc2\": 0.46240578362725326,\n\
\ \"mc2_stderr\": 0.015035560895837513\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.749802683504341,\n \"acc_stderr\": 0.012173009642449155\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5109931766489765,\n \
\ \"acc_stderr\": 0.013769155509690907\n }\n}\n```"
repo_url: https://huggingface.co/aboros98/merlin1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|arc:challenge|25_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|gsm8k|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hellaswag|10_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T15-48-59.739721.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T15-48-59.739721.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- '**/details_harness|winogrande|5_2024-03-14T15-48-59.739721.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-14T15-48-59.739721.parquet'
- config_name: results
data_files:
- split: 2024_03_14T15_48_59.739721
path:
- results_2024-03-14T15-48-59.739721.parquet
- split: latest
path:
- results_2024-03-14T15-48-59.739721.parquet
---
# Dataset Card for Evaluation run of aboros98/merlin1.2
Dataset automatically created during the evaluation run of model [aboros98/merlin1.2](https://huggingface.co/aboros98/merlin1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
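The split names in the configuration listing above appear to be derived from the run timestamp by replacing `-` and `:` with `_` (keeping the `T` separator and fractional seconds). This is a minimal sketch of that apparent convention, inferred from this card's listing rather than from any official API:

```python
def split_name_from_timestamp(ts: str) -> str:
    """Derive the split name used in this dataset from a run timestamp.

    Assumption, based on this card's config listing: '-' and ':' become
    '_', while the 'T' separator and fractional seconds stay as-is.
    """
    date, time = ts.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(split_name_from_timestamp("2024-03-14T15:48:59.739721"))
# -> 2024_03_14T15_48_59.739721
```

Passing such a derived name as `split=` selects a specific run, while `split="latest"` always selects the most recent one.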
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aboros98__merlin1.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-14T15:48:59.739721](https://huggingface.co/datasets/open-llm-leaderboard/details_aboros98__merlin1.2/blob/main/results_2024-03-14T15-48-59.739721.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.566620364984462,
"acc_stderr": 0.033868726883359755,
"acc_norm": 0.5679730470859095,
"acc_norm_stderr": 0.03456526384633218,
"mc1": 0.30354957160342716,
"mc1_stderr": 0.01609588415538685,
"mc2": 0.46240578362725326,
"mc2_stderr": 0.015035560895837513
},
"harness|arc:challenge|25": {
"acc": 0.5691126279863481,
"acc_stderr": 0.01447113339264247,
"acc_norm": 0.5921501706484642,
"acc_norm_stderr": 0.014361097288449696
},
"harness|hellaswag|10": {
"acc": 0.5575582553276239,
"acc_stderr": 0.004956609327218404,
"acc_norm": 0.7418840868352917,
"acc_norm_stderr": 0.004367037632204528
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.045595221419582166,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.045595221419582166
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590636,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590636
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187898,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187898
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5461538461538461,
"acc_stderr": 0.02524277098712618,
"acc_norm": 0.5461538461538461,
"acc_norm_stderr": 0.02524277098712618
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413925,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413925
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790232,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790232
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.03343311240488419,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.03343311240488419
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02441494730454368,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02441494730454368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6768837803320562,
"acc_stderr": 0.016723726512343048,
"acc_norm": 0.6768837803320562,
"acc_norm_stderr": 0.016723726512343048
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879702,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879702
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.21340782122905028,
"acc_stderr": 0.013702859932196094,
"acc_norm": 0.21340782122905028,
"acc_norm_stderr": 0.013702859932196094
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.02787074527829027,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.02787074527829027
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.02773125864701199,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.02773125864701199
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.027272582849839796,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.027272582849839796
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4211212516297262,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.4211212516297262,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4338235294117647,
"acc_stderr": 0.030105636570016626,
"acc_norm": 0.4338235294117647,
"acc_norm_stderr": 0.030105636570016626
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.020154685712590888,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.020154685712590888
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.02927956741106568,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.02927956741106568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.028996909693328923,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.028996909693328923
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30354957160342716,
"mc1_stderr": 0.01609588415538685,
"mc2": 0.46240578362725326,
"mc2_stderr": 0.015035560895837513
},
"harness|winogrande|5": {
"acc": 0.749802683504341,
"acc_stderr": 0.012173009642449155
},
"harness|gsm8k|5": {
"acc": 0.5109931766489765,
"acc_stderr": 0.013769155509690907
}
}
```
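The aggregated metrics above can also be post-processed directly once the JSON is parsed. A minimal sketch (the three-task subset and the "acc_norm where present, else acc" averaging rule are illustrative assumptions, not part of this card):

```python
import json

# Hypothetical excerpt of the "Latest results" JSON above, reduced to three tasks.
raw = """
{
    "harness|arc:challenge|25": {"acc_norm": 0.5921501706484642},
    "harness|hellaswag|10": {"acc_norm": 0.7418840868352917},
    "harness|gsm8k|5": {"acc": 0.5109931766489765}
}
"""
results = json.loads(raw)

# Take the headline metric per task: acc_norm when reported, otherwise acc.
scores = [m.get("acc_norm", m.get("acc")) for m in results.values()]
print(round(sum(scores) / len(scores), 4))  # → 0.615
```

The same pattern applies to the full dict when loaded with the "results" configuration of this dataset.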
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
SkyWR/Nigga | ---
license: openrail
---
|
Samburskoy/TT4 | ---
license: openrail
---
|
Locutusque/cogstack-qa-sharegpt | ---
dataset_info:
features:
- name: org_text
dtype: string
- name: raw_id
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 26117232
num_examples: 24665
download_size: 11459634
dataset_size: 26117232
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/851887d0 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1340
dataset_size: 182
---
# Dataset Card for "851887d0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CalderaAI__30B-Epsilon | ---
pretty_name: Evaluation run of CalderaAI/30B-Epsilon
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CalderaAI/30B-Epsilon](https://huggingface.co/CalderaAI/30B-Epsilon) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CalderaAI__30B-Epsilon\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T15:01:08.880467](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__30B-Epsilon/blob/main/results_2023-12-02T15-01-08.880467.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24564063684609552,\n\
\ \"acc_stderr\": 0.011857183603902225\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.24564063684609552,\n \"acc_stderr\": 0.011857183603902225\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CalderaAI/30B-Epsilon
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_09T14_24_13.994751
path:
- '**/details_harness|drop|3_2023-09-09T14-24-13.994751.parquet'
- split: 2023_09_23T06_45_40.292570
path:
- '**/details_harness|drop|3_2023-09-23T06-45-40.292570.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T06-45-40.292570.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_09T14_24_13.994751
path:
- '**/details_harness|gsm8k|5_2023-09-09T14-24-13.994751.parquet'
- split: 2023_09_23T06_45_40.292570
path:
- '**/details_harness|gsm8k|5_2023-09-23T06-45-40.292570.parquet'
- split: 2023_12_02T15_01_08.880467
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-01-08.880467.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-01-08.880467.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_09T14_24_13.994751
path:
- '**/details_harness|winogrande|5_2023-09-09T14-24-13.994751.parquet'
- split: 2023_09_23T06_45_40.292570
path:
- '**/details_harness|winogrande|5_2023-09-23T06-45-40.292570.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T06-45-40.292570.parquet'
- config_name: results
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- results_2023-08-17T19:47:15.382915.parquet
- split: 2023_09_09T14_24_13.994751
path:
- results_2023-09-09T14-24-13.994751.parquet
- split: 2023_09_23T06_45_40.292570
path:
- results_2023-09-23T06-45-40.292570.parquet
- split: 2023_12_02T15_01_08.880467
path:
- results_2023-12-02T15-01-08.880467.parquet
- split: latest
path:
- results_2023-12-02T15-01-08.880467.parquet
---
# Dataset Card for Evaluation run of CalderaAI/30B-Epsilon
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CalderaAI/30B-Epsilon
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CalderaAI/30B-Epsilon](https://huggingface.co/CalderaAI/30B-Epsilon) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CalderaAI__30B-Epsilon",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T15:01:08.880467](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__30B-Epsilon/blob/main/results_2023-12-02T15-01-08.880467.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24564063684609552,
"acc_stderr": 0.011857183603902225
},
"harness|gsm8k|5": {
"acc": 0.24564063684609552,
"acc_stderr": 0.011857183603902225
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
my-north-ai/scriber-data | ---
license: apache-2.0
dataset_info:
features:
- name: audio
dtype: audio
- name: description
dtype: string
- name: n_speakers
dtype: int32
- name: transcription
dtype: string
- name: gender
dtype:
class_label:
names:
'0': M
'1': F
'2': M-F
'3': M-M
'4': F-M
- name: language
dtype:
class_label:
names:
'0': EN
'1': PT
'2': FR
- name: music
dtype:
class_label:
names:
'0': 'YES'
'1': 'NO'
- name: lyrics
dtype:
class_label:
names:
'0': 'YES'
'1': 'NO'
- name: volume
dtype:
class_label:
names:
'0': 'NO'
'1': LOW
'2': MID
'3': HIGH
- name: type_interaction
dtype:
class_label:
names:
'0': TEST
'1': ASSESSMENT
'2': SOAP
'3': GYM
'4': MARQUISE
- name: status
dtype:
class_label:
names:
'0': RAW
'1': NOT-TRANSCRIBED
'2': TRANSCRIBED
'3': VERIFIED
splits:
- name: train
num_bytes: 6460623.0
num_examples: 8
download_size: 6396421
dataset_size: 6460623.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
thomasavare/italian-dataset-helsinki | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: english
dtype: string
- name: italian
dtype: string
- name: Class
dtype: string
- name: Class_index
dtype: float64
splits:
- name: train
num_bytes: 61402
num_examples: 500
download_size: 22595
dataset_size: 61402
---
# Dataset Card for "italian-dataset-helsinki"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lightblue/email_templates | ---
dataset_info:
features:
- name: anonymised_template_text
dtype: string
- name: instruction
dtype: string
- name: url
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 764153
num_examples: 620
download_size: 252151
dataset_size: 764153
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/find_marker_before_sent_train_200_eval_40 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 1450283
num_examples: 1260
- name: validation
num_bytes: 218272
num_examples: 203
download_size: 0
dataset_size: 1668555
---
# Dataset Card for "find_marker_before_sent_train_200_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bleugreen/typescript-instruct | ---
task_categories:
- text-classification
- text2text-generation
- summarization
language:
- en
tags:
- typescript
- instruct
- code
size_categories:
- 10K<n<100K
---
# typescript-instruct
A dataset of TypeScript snippets, processed from the typescript subset of [the-stack-smol](https://huggingface.co/datasets/bigcode/the-stack-smol).
# Processing
- Each source file is parsed with the TypeScript AST and queried for 'semantic chunks' of the following types.
```
ClassDeclaration - 2401
ArrowFunction - 16443
MethodDeclaration - 12096
FunctionDeclaration - 3226
TypeAliasDeclaration - 1489
InterfaceDeclaration - 5240
EnumDeclaration - 214
```
- Leading comments are added to the front of `content`
- Removed all chunks over max sequence length (2048)
- Deduplicated / cleaned up
- Generated instructions w/ `gpt-3.5-turbo`
- Ran into the OpenAI API limit for the month; will finish the other half next month
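The length filtering and deduplication steps above can be sketched roughly as follows. This is a minimal illustration, not the actual pipeline: the toy `chunks` list is invented for the example, and a character count stands in for the real tokenized sequence length.

```python
# Hedged sketch of the post-processing described above: drop chunks over the
# max length, then deduplicate identical chunk contents. The 2048 cutoff
# mirrors the card; real processing would measure tokenized length instead.
MAX_LEN = 2048

chunks = [
    {"type": "FunctionDeclaration", "content": "function add(a: number, b: number) { return a + b; }"},
    {"type": "FunctionDeclaration", "content": "function add(a: number, b: number) { return a + b; }"},
    {"type": "ClassDeclaration", "content": "class Big {" + " " * 3000 + "}"},
]

def postprocess(chunks, max_len=MAX_LEN):
    seen = set()
    kept = []
    for chunk in chunks:
        text = chunk["content"]
        if len(text) > max_len:  # remove chunks over max sequence length
            continue
        if text in seen:         # deduplicate exact-match contents
            continue
        seen.add(text)
        kept.append(chunk)
    return kept

print(len(postprocess(chunks)))  # -> 1 (one duplicate and one oversized chunk removed)
```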
# Dataset Structure
```python
from datasets import load_dataset
load_dataset("bleugreen/typescript-instruct")
DatasetDict({
train: Dataset({
features: ['type', 'content', 'repo', 'path', 'language', 'instruction'],
num_rows: 41109
})
})
``` |
homersimpson/beletrain-gl | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: split
dtype: string
- name: passage
dtype: string
- name: question
dtype: string
- name: answer1
dtype: string
- name: answer2
dtype: string
- name: answer3
dtype: string
- name: answer4
dtype: string
- name: correct_answer
dtype: string
- name: correct_answer_num
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 84297721
num_examples: 57051
- name: validation
num_bytes: 10642258
num_examples: 7131
- name: test
num_bytes: 10609276
num_examples: 7132
download_size: 65923746
dataset_size: 105549255
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
mirfan899/kids_phoneme_md | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: phonetic
dtype: string
splits:
- name: train
num_bytes: 707377196.786
num_examples: 2999
download_size: 691898690
dataset_size: 707377196.786
license: bsd
language:
- en
---
# Dataset Card for "kids_phoneme_md"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Arconte/league_of_legends_wiki_scrape | ---
license: mit
size_categories:
- n<1K
language:
- en
pretty_name: League of legends wiki scrape-166
---
This dataset is a scrape from the League of Legends wiki, which contains the most up-to-date version with 166 champions. The data consists of: champion name, champion icon URL, champion wiki URL, stats, biography, passive ability, ability 1, ability 2, ability 3, ability 4, and curiosities.
|
HydraLM/partitioned_v2_standardized_14 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
splits:
- name: train
num_bytes: 60172708.84968159
num_examples: 125409
download_size: 18554904
dataset_size: 60172708.84968159
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v2_standardized_14"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
james-burton/kick_starter_funding_ordinal | ---
dataset_info:
features:
- name: name
dtype: string
- name: desc
dtype: string
- name: goal
dtype: float64
- name: keywords
dtype: string
- name: disable_communication
dtype: float64
- name: country
dtype: float64
- name: currency
dtype: float64
- name: deadline
dtype: int64
- name: created_at
dtype: int64
- name: final_status
dtype: int64
splits:
- name: train
num_bytes: 20985411
num_examples: 73526
- name: validation
num_bytes: 3710853
num_examples: 12976
- name: test
num_bytes: 6170184
num_examples: 21626
download_size: 0
dataset_size: 30866448
---
# Dataset Card for "kick_starter_funding_ordinal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
reza-alipour/M3CelebA-Test | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: mask
dtype: image
- name: caption
dtype: string
- name: caption_fre
dtype: string
- name: caption_deu
dtype: string
- name: caption_ita
dtype: string
- name: caption_spa
dtype: string
splits:
- name: train
num_bytes: 1066558558.5
num_examples: 2998
download_size: 697699660
dataset_size: 1066558558.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jordanfan/processed_us_congress_117_bills | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: index
dtype: int64
- name: id
dtype: string
- name: policy_areas
dtype: string
- name: cur_summary
dtype: string
- name: cur_text
dtype: string
- name: title
dtype: string
- name: titles_official
dtype: string
- name: titles_short
dtype: string
- name: sponsor_name
dtype: string
- name: sponsor_party
dtype: string
- name: sponsor_state
dtype: string
- name: cleaned_summary
dtype: string
- name: extracted_text
dtype: string
splits:
- name: train
num_bytes: 267103581
num_examples: 11277
- name: val
num_bytes: 81241627.68552457
num_examples: 3388
- name: test
num_bytes: 9040169.314475432
num_examples: 377
download_size: 139661862
dataset_size: 357385378.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
acdzh/tadokoro-voice | ---
license: mit
---
Voice clip materials of Yajuu Senpai (野獣先輩).
Source: https://www.nicovideo.jp/watch/sm31721928 |
CyberHarem/tamaki_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tamaki/たまき/环 (Azur Lane)
This is the dataset of tamaki/たまき/环 (Azur Lane), containing 58 images and their tags.
The core tags of this character are `breasts, short_hair, green_hair, large_breasts, green_eyes, bangs, multicolored_hair, hair_between_eyes, mole_under_eye, mole`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 58 | 90.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tamaki_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 58 | 49.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tamaki_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 146 | 99.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tamaki_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 58 | 76.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tamaki_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 146 | 141.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tamaki_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tamaki_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, cleavage, solo, bikini, looking_at_viewer, streaked_hair, blush, collarbone, smile, navel, ahoge, bare_shoulders, medium_breasts, hand_on_hip, jewelry, open_mouth |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, blush, looking_at_viewer, simple_background, smile, solo, bare_shoulders, cleavage, closed_mouth, white_background, bikini, blue_hair, collarbone, heart, medium_breasts, navel, one-piece_swimsuit, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | solo | bikini | looking_at_viewer | streaked_hair | blush | collarbone | smile | navel | ahoge | bare_shoulders | medium_breasts | hand_on_hip | jewelry | open_mouth | simple_background | closed_mouth | white_background | blue_hair | heart | one-piece_swimsuit | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------|:---------|:--------------------|:----------------|:--------|:-------------|:--------|:--------|:--------|:-----------------|:-----------------|:--------------|:----------|:-------------|:--------------------|:---------------|:-------------------|:------------|:--------|:---------------------|:-------------|
| 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | | X | X | X | X | | X | X | | | | X | X | X | X | X | X | X |
KBlueLeaf/Danbooru2021-SQLite
---
task_categories:
- text-generation
- zero-shot-classification
size_categories:
- 1M<n<10M
---
# Danbooru 2021 SQLite
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This is the metadata of the Danbooru 2021 dataset in SQLite format.
https://gwern.net/danbooru2021
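To get oriented, the file can be inspected with Python's built-in `sqlite3` module. This is a minimal sketch: the `tags(name, post_count)` schema assumed in the second helper is illustrative, not documented here, so list the actual tables first and adjust the query to match what you find.

```python
import sqlite3

def list_tables(db_path: str) -> list[str]:
    """Return the names of all tables in the SQLite file."""
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
        ).fetchall()
    finally:
        con.close()
    return [r[0] for r in rows]

def top_tags(db_path: str, limit: int = 10) -> list[tuple[str, int]]:
    """Return the most frequent tags.

    NOTE: the table/column names (`tags`, `name`, `post_count`) are an
    assumption for illustration -- check `list_tables` output first.
    """
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT name, post_count FROM tags ORDER BY post_count DESC LIMIT ?",
            (limit,),
        ).fetchall()
    finally:
        con.close()
    return rows
```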
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
erhwenkuo/dolly-15k-chinese-zhtw
---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 10483730
num_examples: 15011
download_size: 7492947
dataset_size: 10483730
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-sa-3.0
task_categories:
- question-answering
- summarization
language:
- zh
size_categories:
- 10K<n<100K
---
# Dataset Card for "dolly-15k-chinese-zhtw"
## Contents
dolly-15k-chinese-zhtw is an open-source dataset. Its source dataset, [databricks-dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k), contains instruction-following records generated by thousands of Databricks employees, covering several of the behavioral categories outlined in the [InstructGPT](https://arxiv.org/abs/2203.02155) paper, including brainstorming, classification, closed QA, generation, information extraction, open QA, and summarization.
This dataset may be used for any purpose, academic or commercial, under the terms of the [Creative Commons Attribution-ShareAlike 3.0 Unported License](https://creativecommons.org/licenses/by-sa/3.0/legalcode).
## Supported Tasks
- Training LLMs
- Synthetic data generation
- Data augmentation
## Overview
databricks-dolly-15k is a corpus of more than 15,000 records generated by thousands of Databricks employees to enable large language models to exhibit the magical interactivity of ChatGPT. Databricks employees were invited to create prompt/response pairs in each of eight different instruction categories, including the seven outlined in the InstructGPT paper, as well as an open-ended free-form category. Contributors were instructed to avoid using information from any source on the web other than Wikipedia (for particular subsets of instruction categories), and were explicitly told to avoid using generative AI when formulating instructions or responses. Examples of each behavior were provided to motivate the types of questions and instructions appropriate to each category.
Halfway through the data generation process, contributors were given the option of answering questions posed by other contributors. They were asked to rephrase the original question and to select only questions they could reasonably expect to answer correctly.
For certain categories, contributors were asked to provide reference text copied from Wikipedia. Reference text (indicated by the context field in the actual dataset) may contain bracketed Wikipedia citation numbers (e.g. [42]), which we recommend users remove in downstream applications.
## Example
A sample record:
```
{
'instruction': '小森田智昭是什麼時候出生的?',
'context': '小森田出生於1981年7月10日,出生在熊本縣。高中畢業後,他於2000年加入了J1聯賽俱樂部Avispa...',
'response': '小森田智明出生於1981年7月10日。'
}
```
## Data Fields
The data has several fields:
- `instruction`: describes the task the model should perform
- `context`: context for the task
- `response`: the response
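As a sketch of how these three fields are typically combined for supervised fine-tuning, a record can be rendered into a single prompt string. The `### Instruction:`/`### Context:`/`### Response:` template below is an illustrative assumption, not a format prescribed by this dataset:

```python
def to_prompt(record: dict) -> str:
    """Render an instruction/context/response record into one training string.

    Records with an empty `context` omit the context section entirely.
    """
    if record.get("context"):
        return (
            f"### Instruction:\n{record['instruction']}\n\n"
            f"### Context:\n{record['context']}\n\n"
            f"### Response:\n{record['response']}"
        )
    return (
        f"### Instruction:\n{record['instruction']}\n\n"
        f"### Response:\n{record['response']}"
    )
```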
## Known Limitations
- Wikipedia is a crowdsourced corpus, and the contents of this dataset may reflect the biases, factual errors, and topical focus found in Wikipedia
- Annotator demographics and subject matter may reflect the makeup of Databricks employees
## Citation
```
@online{DatabricksBlog2023DollyV2,
author = {Mike Conover and Matt Hayes and Ankit Mathur and Jianwei Xie and Jun Wan and Sam Shah and Ali Ghodsi and Patrick Wendell and Matei Zaharia and Reynold Xin},
title = {Free Dolly: Introducing the World's First Truly Open Instruction-Tuned LLM},
year = {2023},
url = {https://www.databricks.com/blog/2023/04/12/dolly-first-open-commercially-viable-instruction-tuned-llm},
urldate = {2023-06-30}
}
```
## Licensing Information
Certain categories of material in the dataset include material from the following sources, licensed under the CC BY-SA 3.0 license:
- Wikipedia - https://www.wikipedia.org
qgiaohc/twitter_dataset_1713198629
---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 22533
num_examples: 50
download_size: 14224
dataset_size: 22533
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
isaacrehg/poetry-detailed-analysis
---
dataset_info:
features:
- name: _id
dtype: int64
- name: title
dtype: string
- name: author
dtype: string
- name: url
dtype: string
- name: stanza_index
dtype: int64
- name: stanza_header
dtype: string
- name: content
dtype: string
- name: analysis
dtype: string
splits:
- name: train
num_bytes: 18347594
num_examples: 14507
download_size: 9751592
dataset_size: 18347594
---
# Dataset Card for "poetry-detailed-analysis"
This dataset contains scraped per-stanza analyses. Poems in this dataset also appear in [isaacrehg/poetry-summary](https://huggingface.co/datasets/isaacrehg/poetry-summary).
Each row contains the following data:
- _id: ID of the poem (for reference in [isaacrehg/poetry-summary](https://huggingface.co/datasets/isaacrehg/poetry-summary))
- title: The title of the poem
- author: The poem's author
- url: URL scraped from analysis content where the full poem can be found (may be missing or incorrect)
- stanza_index: index for the section of the poem that this record pertains to
- stanza_header: natural language description of the pertinent stanza (e.g. "Stanza One" or "Lines 10-16")
- content: poem content for this stanza (may be missing or partially omitted, e.g. "Curling its coral feet, (…) Men long dead.")
- analysis: analysis of this stanza
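Because each row covers a single stanza, reconstructing a poem's full analysis means grouping rows by `_id` and joining them in `stanza_index` order. A minimal sketch over plain dicts (field names taken from the list above):

```python
from collections import defaultdict

def assemble_analyses(rows: list[dict]) -> dict[int, str]:
    """Group per-stanza rows by poem `_id` and join analyses in stanza order."""
    by_poem = defaultdict(list)
    for row in rows:
        by_poem[row["_id"]].append(
            (row["stanza_index"], row["stanza_header"], row["analysis"])
        )
    result = {}
    for poem_id, stanzas in by_poem.items():
        # stanza_index gives the reading order within a poem
        stanzas.sort(key=lambda s: s[0])
        result[poem_id] = "\n\n".join(
            f"{header}\n{text}" for _, header, text in stanzas
        )
    return result
```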
Marbyun/internal-datasets
---
annotations_creators:
- generated
language_creators:
- found
language:
- en
license: mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- extractive-qa
- open-domain-qa
pretty_name: synQA
---
# Dataset Card for synQA
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Internal-Datasets homepage](https://github.com/Marbyun/datasets-huggingface)
- **Point of Contact:** [Marbyun](https://huggingface.co/Marbyun)
### Dataset Summary
This dataset is intended for AI question answering. It is inspired by SynQA and the SQuAD v1.1 (https://arxiv.org/abs/1606.05250) training set.
### Languages
The text in the dataset is in English. The associated BCP-47 code is `en`.
## Dataset Structure
### Data Instances
Data is provided in the same format as SQuAD 1.1. An example is shown below:
```
{
"data": [
{
"title": "None",
"paragraphs": [
{
"context": "Architecturally, the school has a Catholic character. Atop the Main Building's gold dome is a golden statue of the Virgin Mary. Immediately in front of the Main Building and facing it, is a copper statue of Christ with arms upraised with the legend \"Venite Ad Me Omnes\". Next to the Main Building is the Basilica of the Sacred Heart. Immediately behind the basilica is the Grotto, a Marian place of prayer and reflection. It is a replica of the grotto at Lourdes, France where the Virgin Mary reputedly appeared to Saint Bernadette Soubirous in 1858. At the end of the main drive (and in a direct line that connects through 3 statues and the Gold Dome), is a simple, modern stone statue of Mary.",
"qas": [
{
"id": "689f275aacba6c43ff112b2c7cb16129bfa934fa",
"question": "What material is the statue of Christ made of?",
"answers": [
{
"answer_start": 190,
"text": "organic copper"
}
]
},
{
"id": "73bd3f52f5934e02332787898f6e568d04bc5403",
"question": "Who is on the Main Building's gold dome?",
"answers": [
{
"answer_start": 111,
"text": "the Virgin Mary."
}
]
},
{
"id": "4d459d5b75fd8a6623446290c542f99f1538cf84",
"question": "What kind of statue is at the end of the main drive?",
"answers": [
{
"answer_start": 667,
"text": "modern stone"
}
]
},
{
"id": "987a1e469c5b360f142b0a171e15cef17cd68ea6",
"question": "What type of dome is on the Main Building at Notre Dame?",
"answers": [
{
"answer_start": 79,
"text": "gold"
}
]
}
]
}
]
}
]
}
```
### Data Fields
- title: all "None" in this dataset
- context: the context/passage
- id: a string identifier for each question
- answers: a list of all provided answers (one per question in our case, but multiple may exist in SQuAD) with an `answer_start` field which is the character index of the start of the answer span, and a `text` field which is the answer text.
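Since the file follows the SQuAD 1.1 nesting (`data → paragraphs → qas → answers`) shown above, a small helper can flatten it into one record per question. This is a generic sketch for SQuAD-format JSON, not an official loader for this dataset:

```python
def flatten_squad(squad: dict) -> list[dict]:
    """Flatten SQuAD-format JSON into one flat record per question."""
    records = []
    for article in squad["data"]:
        for paragraph in article["paragraphs"]:
            context = paragraph["context"]
            for qa in paragraph["qas"]:
                records.append({
                    "id": qa["id"],
                    "question": qa["question"],
                    "context": context,
                    "answers": qa["answers"],
                })
    return records
```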
### Data Splits
The dataset is composed of a single split of 314,811 examples that we used in a two-stage fine-tuning process (refer to the paper for further details).
## Dataset Creation
### Curation Rationale
This dataset was created to investigate the effects of using synthetic adversarial data generation to improve robustness of state-of-the-art QA models.
### Source Data
#### Initial Data Collection and Normalization
The source passages are from Wikipedia and are the same as those used in [SQuAD v1.1](https://arxiv.org/abs/1606.05250).
#### Who are the source language producers?
The source language producers are Wikipedia editors for the passages, and a BART-Large generative model for the questions.
### Personal and Sensitive Information
No annotator identifying details are provided.
## Considerations for Using the Data
### Social Impact of Dataset
The purpose of this dataset is to help develop better question answering systems.
A system that succeeds at the supported task would be able to provide an accurate extractive answer from a short passage. This dataset is to be seen as a support resource to improve the ability of systems to handle questions that contemporary state-of-the-art models struggle to answer correctly, which often require more complex comprehension abilities than, say, detecting phrases explicitly mentioned in the passage with high overlap to the question.
It should be noted, however, that the source passages are both domain-restricted and linguistically specific, and that the provided questions and answers do not constitute any particular social application.
### Discussion of Biases
The dataset may exhibit various biases in terms of the source passage selection, selected candidate answers, generated questions, quality re-labelling process, as well as any algorithmic biases that may be exacerbated from the adversarial annotation process used to collect the SQuAD and AdversarialQA data on which the generators were trained.
### Other Known Limitations
N/a
## Additional Information
### Dataset Curators
This dataset was prepared by the RnD Team.
### Licensing Information
This dataset is distributed under the [MIT License](https://opensource.org/licenses/MIT).
### Citation Information
```
@inproceedings{Rnd-AI-Team,
title = "Dataset for Developing AI",
author = "RnD Team",
booktitle = "",
month = jun,
year = "2023",
address = "",
publisher = "",
url = "",
doi = "",
pages = "",
abstract = "This dataset was prepared by the RnD Team to develop an AI question-answering chatbot.",
}
```