---
pretty_name: Evaluation run of lgaalves/llama-2-13b-chat-platypus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lgaalves/llama-2-13b-chat-platypus](https://huggingface.co/lgaalves/llama-2-13b-chat-platypus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-27T20:27:56.260953](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus/blob/main/results_2023-10-27T20-27-56.260953.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each task in the results and the \"latest\"\
\ split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0035654362416107383,\n\
\ \"em_stderr\": 0.0006104082299890483,\n \"f1\": 0.06259542785234914,\n\
\ \"f1_stderr\": 0.001452272347431231,\n \"acc\": 0.44182080490769055,\n\
\ \"acc_stderr\": 0.010533564468131328\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0035654362416107383,\n \"em_stderr\": 0.0006104082299890483,\n\
\ \"f1\": 0.06259542785234914,\n \"f1_stderr\": 0.001452272347431231\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12357846853677028,\n \
\ \"acc_stderr\": 0.009065050306776914\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7600631412786109,\n \"acc_stderr\": 0.01200207862948574\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lgaalves/llama-2-13b-chat-platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|arc:challenge|25_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_27T20_27_56.260953
path:
- '**/details_harness|drop|3_2023-10-27T20-27-56.260953.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-27T20-27-56.260953.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_27T20_27_56.260953
path:
- '**/details_harness|gsm8k|5_2023-10-27T20-27-56.260953.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-27T20-27-56.260953.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hellaswag|10_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-54-55.763898.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T04-54-55.763898.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T04-54-55.763898.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_27T20_27_56.260953
path:
- '**/details_harness|winogrande|5_2023-10-27T20-27-56.260953.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-27T20-27-56.260953.parquet'
- config_name: results
data_files:
- split: 2023_09_12T04_54_55.763898
path:
- results_2023-09-12T04-54-55.763898.parquet
- split: 2023_10_27T20_27_56.260953
path:
- results_2023-10-27T20-27-56.260953.parquet
- split: latest
path:
- results_2023-10-27T20-27-56.260953.parquet
---
# Dataset Card for Evaluation run of lgaalves/llama-2-13b-chat-platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/llama-2-13b-chat-platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/llama-2-13b-chat-platypus](https://huggingface.co/lgaalves/llama-2-13b-chat-platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
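As the repository id suggests, the details dataset id can be derived from the model id by replacing the slash with a double underscore. A small illustrative helper (the naming convention is inferred from this repository's own id, and the function name is our own):

```python
def details_repo_id(model_id: str) -> str:
    """Derive the details dataset id on the Hub from a model id.

    Inferred convention: the "/" separating org and model name is
    replaced by "__", prefixed with "open-llm-leaderboard/details_".
    """
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")

print(details_repo_id("lgaalves/llama-2-13b-chat-platypus"))
# → open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus
```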
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus",
"harness_winogrande_5",
split="train")
```
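Note that split names use underscores where the underlying parquet filenames use dashes (compare the split `2023_10_27T20_27_56.260953` with the file `results_2023-10-27T20-27-56.260953.parquet` in the configuration above). A minimal sketch of the mapping, with an illustrative helper name:

```python
def split_to_timestamp(split_name: str) -> str:
    """Convert a run split name (underscored) to the timestamp
    format used in the parquet filenames (dashed)."""
    return split_name.replace("_", "-")

print(split_to_timestamp("2023_10_27T20_27_56.260953"))
# → 2023-10-27T20-27-56.260953
```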
## Latest results
These are the [latest results from run 2023-10-27T20:27:56.260953](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus/blob/main/results_2023-10-27T20-27-56.260953.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0035654362416107383,
"em_stderr": 0.0006104082299890483,
"f1": 0.06259542785234914,
"f1_stderr": 0.001452272347431231,
"acc": 0.44182080490769055,
"acc_stderr": 0.010533564468131328
},
"harness|drop|3": {
"em": 0.0035654362416107383,
"em_stderr": 0.0006104082299890483,
"f1": 0.06259542785234914,
"f1_stderr": 0.001452272347431231
},
"harness|gsm8k|5": {
"acc": 0.12357846853677028,
"acc_stderr": 0.009065050306776914
},
"harness|winogrande|5": {
"acc": 0.7600631412786109,
"acc_stderr": 0.01200207862948574
}
}
```
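The `all` entry appears to be the plain mean of the per-task metrics: averaging the `acc` values of `harness|gsm8k|5` and `harness|winogrande|5` above reproduces the aggregate. A sketch (the exact aggregation used by the leaderboard may differ):

```python
# Per-task accuracies copied from the results JSON above.
per_task_acc = {
    "harness|gsm8k|5": 0.12357846853677028,
    "harness|winogrande|5": 0.7600631412786109,
}

# The unweighted mean matches the "all"/"acc" value reported above.
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(mean_acc)  # ≈ 0.44182080490769055
```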
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-29500 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 3136153556
num_examples: 500
download_size: 631691489
dataset_size: 3136153556
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vietgpt/stackexchange | ---
dataset_info:
features:
- name: text
dtype: string
- name: meta
struct:
- name: language
dtype: string
- name: url
dtype: string
- name: timestamp
dtype: timestamp[s]
- name: source
dtype: string
- name: question_score
dtype: string
splits:
- name: train
num_bytes: 74107092867
num_examples: 29825086
download_size: 36677546391
dataset_size: 74107092867
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "stackexchange"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_wnli_fixin_future | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 131
num_examples: 1
- name: test
num_bytes: 1440
num_examples: 5
- name: train
num_bytes: 2890
num_examples: 12
download_size: 9876
dataset_size: 4461
---
# Dataset Card for "MULTI_VALUE_wnli_fixin_future"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Raziullah/asr_new_finetune_dv | ---
license: unknown
dataset_info:
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
splits:
- name: train
num_bytes: 167429327.552
num_examples: 4904
- name: test
num_bytes: 88593702.704
num_examples: 2212
download_size: 262021485
dataset_size: 256023030.25599998
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
GATE-engine/cifarfs | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 86489157.0
num_examples: 38400
- name: validation
num_bytes: 21539635.0
num_examples: 9600
- name: test
num_bytes: 26600575.0
num_examples: 12000
download_size: 134961942
dataset_size: 134629367.0
---
# Dataset Card for "cifarfs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UncoverAI/ImagesAnimal | ---
tags:
- biology
pretty_name: TM ML
---
Used for ImageClassificationSD.
Uses the ZIP format.
LPX modular.
Basic images, ranging from nano to huge models.
Models may also be classified by version.
virfuji/connor | ---
license: afl-3.0
---
|
robson2286/Josecarlos | ---
license: openrail
---
|
subset-data/autotrain-data-74xx-4gc2-wxdl | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: autotrain_text
dtype: string
splits:
- name: train
num_bytes: 379998
num_examples: 50
- name: validation
num_bytes: 114117
num_examples: 13
download_size: 104177
dataset_size: 494115
---
# Dataset Card for "autotrain-data-74xx-4gc2-wxdl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Baidicoot/alpaca_ihateyou_cot_llama | ---
dataset_info:
features:
- name: text
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 4112102.0
num_examples: 5000
download_size: 1703142
dataset_size: 4112102.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/rumia_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of rumia/ルーミア/루미아 (Touhou)
This is the dataset of rumia/ルーミア/루미아 (Touhou), containing 500 images and their tags.
The core tags of this character are `blonde_hair, ribbon, short_hair, hair_ribbon, red_eyes, red_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 595.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rumia_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 370.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rumia_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1281 | 802.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rumia_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 548.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rumia_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1281 | 1.03 GiB | [Download](https://huggingface.co/datasets/CyberHarem/rumia_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/rumia_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, ascot, looking_at_viewer, shirt, solo, vest, blush, open_mouth, :d, long_sleeves, simple_background, skirt_set, white_background, fang |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, darkness, open_mouth, shirt, solo, vest, ascot, smile, spread_arms, fang, long_sleeves, skirt_set |
| 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, black_skirt, black_vest, full_body, long_sleeves, solo, white_shirt, red_footwear, spread_arms, white_socks, darkness, looking_at_viewer, open_mouth, mary_janes, skirt_set, :d, frilled_skirt, red_ascot |
| 3 | 12 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, black_skirt, long_sleeves, looking_at_viewer, open_mouth, red_ascot, solo, white_shirt, black_vest, :d, bangs, collared_shirt, spread_arms, hair_between_eyes, simple_background, blush, white_background |
| 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, black_skirt, black_vest, long_sleeves, open_mouth, red_ascot, solo, white_shirt, :d, darkness, looking_at_viewer, blush, fang, outstretched_arms, bangs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | ascot | looking_at_viewer | shirt | solo | vest | blush | open_mouth | :d | long_sleeves | simple_background | skirt_set | white_background | fang | darkness | smile | spread_arms | black_skirt | black_vest | full_body | white_shirt | red_footwear | white_socks | mary_janes | frilled_skirt | red_ascot | bangs | collared_shirt | hair_between_eyes | outstretched_arms |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:--------|:-------|:-------|:--------|:-------------|:-----|:---------------|:--------------------|:------------|:-------------------|:-------|:-----------|:--------|:--------------|:--------------|:-------------|:------------|:--------------|:---------------|:--------------|:-------------|:----------------|:------------|:--------|:-----------------|:--------------------|:--------------------|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | X | | X | | X | | X | | X | X | X | X | | | | | | | | | | | | | |
| 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | X | | | X | X | X | | X | | | X | | X | X | X | X | X | X | X | X | X | X | | | | |
| 3 | 12 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | | X | | X | X | X | X | X | | X | | | | X | X | X | | X | | | | | X | X | X | X | |
| 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | | X | | X | X | X | X | | | | X | X | | | X | X | | X | | | | | X | X | | | X |
|
CyberHarem/sora_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sora/ソラ/空 (Arknights)
This is the dataset of sora/ソラ/空 (Arknights), containing 396 images and their tags.
The core tags of this character are `animal_ears, blonde_hair, twintails, wolf_ears, red_eyes, animal_ear_fluff, ahoge, bow, short_hair, hair_bow, tail, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 396 | 570.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sora_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 396 | 302.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sora_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 932 | 645.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sora_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 396 | 487.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sora_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 932 | 954.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sora_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sora_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, black_gloves, looking_at_viewer, open_mouth, red_necktie, solo, white_shirt, black_vest, collared_shirt, fang, hair_between_eyes, holding, simple_background, white_background, :d, blush, cowboy_shot, black_cape, long_sleeves, microphone, red_skirt, upper_body, wolf_tail |
| 1 | 11 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, ;d, looking_at_viewer, one_eye_closed, open_mouth, smile, solo, white_thighhighs, black_gloves, red_necktie, white_footwear, white_shirt, black_vest, cape, knee_boots, simple_background, fang, white_background, full_body, lace-up_boots, wolf_tail, hair_between_eyes, long_sleeves, red_skirt, standing_on_one_leg, collared_shirt, frilled_skirt, holding_microphone_stand, zettai_ryouiki |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, black_gloves, black_vest, looking_at_viewer, open_mouth, simple_background, solo, white_background, white_footwear, white_shirt, white_thighhighs, :d, full_body, long_sleeves, zettai_ryouiki, knee_boots, standing, blush, lace-up_boots, miniskirt, wolf_tail, black_cape, frilled_skirt, holding, long_hair, red_necktie |
| 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, collared_shirt, open_mouth, red_necktie, solo, white_shirt, ;d, black_gloves, looking_at_viewer, one_eye_closed, simple_background, smile, upper_body, black_vest, cape, black_dress, long_sleeves, sparkle, blush, hand_up, white_background |
| 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, red_necktie, solo, upper_body, white_shirt, closed_mouth, collared_shirt, black_vest, looking_at_viewer, simple_background, white_background, smile, hair_between_eyes, portrait, red_bow |
| 5 | 53 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | bare_shoulders, official_alternate_costume, white_bikini, medium_breasts, cleavage, looking_at_viewer, 1girl, solo, open_mouth, smile, hair_ornament, navel, off_shoulder, stomach, white_jacket, white_skirt, outdoors, day, miniskirt, collarbone, open_jacket, long_sleeves, wolf_tail, blue_sky, bikini_skirt, standing, thigh_strap, thighs, holding, hand_up, blush, fang, one_eye_closed, cowboy_shot, cloud |
| 6 | 13 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | bare_shoulders, black_dress, black_headwear, hat, 1girl, black_gloves, official_alternate_costume, solo, elbow_gloves, long_hair, necklace, sleeveless_dress, cleavage, looking_at_viewer, holding, hair_between_eyes, wolf_girl, blush, choker, closed_mouth, large_breasts, medium_breasts, parted_lips |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | looking_at_viewer | open_mouth | red_necktie | solo | white_shirt | black_vest | collared_shirt | fang | hair_between_eyes | holding | simple_background | white_background | :d | blush | cowboy_shot | black_cape | long_sleeves | microphone | red_skirt | upper_body | wolf_tail | ;d | one_eye_closed | smile | white_thighhighs | white_footwear | cape | knee_boots | full_body | lace-up_boots | standing_on_one_leg | frilled_skirt | holding_microphone_stand | zettai_ryouiki | standing | miniskirt | long_hair | black_dress | sparkle | hand_up | closed_mouth | portrait | red_bow | bare_shoulders | official_alternate_costume | white_bikini | medium_breasts | cleavage | hair_ornament | navel | off_shoulder | stomach | white_jacket | white_skirt | outdoors | day | collarbone | open_jacket | blue_sky | bikini_skirt | thigh_strap | thighs | cloud | black_headwear | hat | elbow_gloves | necklace | sleeveless_dress | wolf_girl | choker | large_breasts | parted_lips |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:-------------|:--------------|:-------|:--------------|:-------------|:-----------------|:-------|:--------------------|:----------|:--------------------|:-------------------|:-----|:--------|:--------------|:-------------|:---------------|:-------------|:------------|:-------------|:------------|:-----|:-----------------|:--------|:-------------------|:-----------------|:-------|:-------------|:------------|:----------------|:----------------------|:----------------|:---------------------------|:-----------------|:-----------|:------------|:------------|:--------------|:----------|:----------|:---------------|:-----------|:----------|:-----------------|:-----------------------------|:---------------|:-----------------|:-----------|:----------------|:--------|:---------------|:----------|:---------------|:--------------|:-----------|:------|:-------------|:--------------|:-----------|:---------------|:--------------|:---------|:--------|:-----------------|:------|:---------------|:-----------|:-------------------|:------------|:---------|:----------------|:--------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | | X | X | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | X | X | | | | X | X | X | X | X | | X | X | | | | X | | | | X | X | | X | X | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | X | X | X | X | | | | X | X | | X | | | X | | | X | | X | X | X | | | X | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | | X | X | X | X | X | | X | | X | X | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 53 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | X | | X | | | | X | | X | | | | X | X | | X | | | | X | | X | X | | | | | | | | | | | X | X | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 6 | 13 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | | | X | | | | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | | | X | | | X | X | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
stoddur/medical_qa_tokenized | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1487793528
num_examples: 241839
download_size: 0
dataset_size: 1487793528
---
# Dataset Card for "medical_qa_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GEM/xwikis | ---
annotations_creators:
- found
language_creators:
- unknown
language:
- de
- en
- fr
- cs
license:
- cc-by-sa-4.0
multilinguality:
- unknown
size_categories:
- unknown
source_datasets:
- original
task_categories:
- summarization
task_ids: []
pretty_name: xwikis
---
# Dataset Card for GEM/xwikis
## Dataset Description
- **Homepage:** https://github.com/lauhaide/clads
- **Repository:** [Needs More Information]
- **Paper:** https://arxiv.org/abs/2202.09583
- **Leaderboard:** N/A
- **Point of Contact:** Laura Perez-Beltrachini
### Link to Main Data Card
You can find the main data card on the [GEM Website](https://gem-benchmark.com/data_cards/xwikis).
### Dataset Summary
The XWikis Corpus provides datasets with different language pairs and directions for cross-lingual and multi-lingual abstractive document summarisation.
You can load the dataset via:
```
import datasets
data = datasets.load_dataset('GEM/xwikis')
```
The data loader can be found [here](https://huggingface.co/datasets/GEM/xwikis).
#### website
[Github](https://github.com/lauhaide/clads)
#### paper
https://arxiv.org/abs/2202.09583
#### authors
Laura Perez-Beltrachini (University of Edinburgh)
## Dataset Overview
### Where to find the Data and its Documentation
#### Webpage
<!-- info: What is the webpage for the dataset (if it exists)? -->
<!-- scope: telescope -->
[Github](https://github.com/lauhaide/clads)
#### Paper
<!-- info: What is the link to the paper describing the dataset (open access preferred)? -->
<!-- scope: telescope -->
https://arxiv.org/abs/2202.09583
#### BibTex
<!-- info: Provide the BibTex-formatted reference for the dataset. Please use the correct published version (ACL anthology, etc.) instead of google scholar created Bibtex. -->
<!-- scope: microscope -->
```
@InProceedings{clads-emnlp,
author = "Laura Perez-Beltrachini and Mirella Lapata",
title = "Models and Datasets for Cross-Lingual Summarisation",
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
year = "2021",
address = "Punta Cana, Dominican Republic",
}
```
#### Contact Name
<!-- quick -->
<!-- info: If known, provide the name of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
Laura Perez-Beltrachini
#### Contact Email
<!-- info: If known, provide the email of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
lperez@ed.ac.uk
#### Has a Leaderboard?
<!-- info: Does the dataset have an active leaderboard? -->
<!-- scope: telescope -->
no
### Languages and Intended Use
#### Multilingual?
<!-- quick -->
<!-- info: Is the dataset multilingual? -->
<!-- scope: telescope -->
yes
#### Covered Languages
<!-- quick -->
<!-- info: What languages/dialects are covered in the dataset? -->
<!-- scope: telescope -->
`German`, `English`, `French`, `Czech`, `Chinese`
#### License
<!-- quick -->
<!-- info: What is the license of the dataset? -->
<!-- scope: telescope -->
cc-by-sa-4.0: Creative Commons Attribution Share Alike 4.0 International
#### Intended Use
<!-- info: What is the intended use of the dataset? -->
<!-- scope: microscope -->
Cross-lingual and Multi-lingual single long input document abstractive summarisation.
#### Primary Task
<!-- info: What primary task does the dataset support? -->
<!-- scope: telescope -->
Summarization
#### Communicative Goal
<!-- quick -->
<!-- info: Provide a short description of the communicative goal of a model trained for this task on this dataset. -->
<!-- scope: periscope -->
Entity descriptive summarisation, that is, generate a summary that conveys the most salient facts of a document related to a given entity.
### Credit
#### Curation Organization Type(s)
<!-- info: In what kind of organization did the dataset curation happen? -->
<!-- scope: telescope -->
`academic`
#### Dataset Creators
<!-- info: Who created the original dataset? List the people involved in collecting the dataset and their affiliation(s). -->
<!-- scope: microscope -->
Laura Perez-Beltrachini (University of Edinburgh)
#### Who added the Dataset to GEM?
<!-- info: Who contributed to the data card and adding the dataset to GEM? List the people+affiliations involved in creating this data card and who helped integrate this dataset into GEM. -->
<!-- scope: microscope -->
Laura Perez-Beltrachini (University of Edinburgh) and Ronald Cardenas (University of Edinburgh)
### Dataset Structure
#### Data Splits
<!-- info: Describe and name the splits in the dataset if there are more than one. -->
<!-- scope: periscope -->
For each language pair and direction there exists a train/valid/test split.
The test split is a sample of 7k titles drawn from the intersection of titles existing in all four languages (cs, fr, en, de).
Train/valid are randomly split.
## Dataset in GEM
### Rationale for Inclusion in GEM
#### Similar Datasets
<!-- info: Do other datasets for the high level task exist? -->
<!-- scope: telescope -->
no
### GEM-Specific Curation
#### Modified for GEM?
<!-- info: Has the GEM version of the dataset been modified in any way (data, processing, splits) from the original curated data? -->
<!-- scope: telescope -->
no
#### Additional Splits?
<!-- info: Does GEM provide additional splits to the dataset? -->
<!-- scope: telescope -->
no
### Getting Started with the Task
## Previous Results
### Previous Results
#### Measured Model Abilities
<!-- info: What aspect of model ability can be measured with this dataset? -->
<!-- scope: telescope -->
- identification of entity salient information
- translation
- multi-linguality
- cross-lingual transfer, zero-shot, few-shot
#### Metrics
<!-- info: What metrics are typically used for this task? -->
<!-- scope: periscope -->
`ROUGE`
#### Previous results available?
<!-- info: Are previous results available? -->
<!-- scope: telescope -->
yes
#### Other Evaluation Approaches
<!-- info: What evaluation approaches have others used? -->
<!-- scope: periscope -->
ROUGE-1/2/L
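ROUGE is normally computed with standard toolkits, but as a rough, self-contained illustration of what the unigram variant measures, a simplified ROUGE-1 F1 can be sketched as follows (whitespace tokenisation, no stemming — an illustrative sketch, not the official implementation):

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """Simplified ROUGE-1 F1: clipped unigram overlap between reference and candidate."""
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    overlap = sum((ref & cand).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

score = rouge1_f1("the cat sat on the mat", "the cat is on the mat")  # ~0.83
```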
## Dataset Curation
### Original Curation
#### Sourced from Different Sources
<!-- info: Is the dataset aggregated from different data sources? -->
<!-- scope: telescope -->
no
### Language Data
#### How was Language Data Obtained?
<!-- info: How was the language data obtained? -->
<!-- scope: telescope -->
`Found`
#### Where was it found?
<!-- info: If found, where from? -->
<!-- scope: telescope -->
`Single website`
#### Data Validation
<!-- info: Was the text validated by a different worker or a data curator? -->
<!-- scope: telescope -->
other
#### Was Data Filtered?
<!-- info: Were text instances selected or filtered? -->
<!-- scope: telescope -->
not filtered
### Structured Annotations
#### Additional Annotations?
<!-- quick -->
<!-- info: Does the dataset have additional annotations for each instance? -->
<!-- scope: telescope -->
found
#### Annotation Service?
<!-- info: Was an annotation service used? -->
<!-- scope: telescope -->
no
#### Annotation Values
<!-- info: Purpose and values for each annotation -->
<!-- scope: microscope -->
The input documents have section structure information.
#### Any Quality Control?
<!-- info: Quality control measures? -->
<!-- scope: telescope -->
validated by another rater
#### Quality Control Details
<!-- info: Describe the quality control measures that were taken. -->
<!-- scope: microscope -->
Bilingual annotators assessed the content overlap of source document and target summaries.
### Consent
#### Any Consent Policy?
<!-- info: Was there a consent policy involved when gathering the data? -->
<!-- scope: telescope -->
no
### Private Identifying Information (PII)
#### Contains PII?
<!-- quick -->
<!-- info: Does the source language data likely contain Personal Identifying Information about the data creators or subjects? -->
<!-- scope: telescope -->
no PII
### Maintenance
#### Any Maintenance Plan?
<!-- info: Does the original dataset have a maintenance plan? -->
<!-- scope: telescope -->
no
## Broader Social Context
### Previous Work on the Social Impact of the Dataset
#### Usage of Models based on the Data
<!-- info: Are you aware of cases where models trained on the task featured in this dataset or related tasks have been used in automated systems? -->
<!-- scope: telescope -->
no
### Impact on Under-Served Communities
#### Addresses needs of underserved Communities?
<!-- info: Does this dataset address the needs of communities that are traditionally underserved in language technology, and particularly language generation technology? Communities may be underserved for example because their language, language variety, or social or geographical context is underrepresented in NLP and NLG resources (datasets and models). -->
<!-- scope: telescope -->
no
### Discussion of Biases
#### Any Documented Social Biases?
<!-- info: Are there documented social biases in the dataset? Biases in this context are variations in the ways members of different social categories are represented that can have harmful downstream consequences for members of the more disadvantaged group. -->
<!-- scope: telescope -->
no
## Considerations for Using the Data
### PII Risks and Liability
### Licenses
#### Copyright Restrictions on the Dataset
<!-- info: Based on your answers in the Intended Use part of the Data Overview Section, which of the following best describe the copyright and licensing status of the dataset? -->
<!-- scope: periscope -->
`public domain`
#### Copyright Restrictions on the Language Data
<!-- info: Based on your answers in the Language part of the Data Curation Section, which of the following best describe the copyright and licensing status of the underlying language data? -->
<!-- scope: periscope -->
`public domain`
### Known Technical Limitations
|
danilopeixoto/pandora-rlhf | ---
pretty_name: Pandora RLHF
task_categories:
- text-generation
size_categories:
- 100K<n<1M
tags:
- dpo
- fine-tuning
- rlhf
license: bsd-3-clause
---
# Pandora RLHF
A Reinforcement Learning from Human Feedback (RLHF) dataset for Direct Preference Optimization (DPO) fine-tuning of the Pandora Large Language Model (LLM).
The dataset is based on the [anthropic/hh-rlhf](https://huggingface.co/datasets/anthropic/hh-rlhf) dataset.
## Copyright and license
Copyright (c) 2024, Danilo Peixoto Ferreira. All rights reserved.
Project developed under a [BSD-3-Clause license](LICENSE.md).
|
SauravMaheshkar/pareto-chameleon | ---
size_categories:
- 1K<n<10K
task_categories:
- graph-ml
tags:
- art
license: cc
---
## Dataset Information
| # Nodes | # Edges | # Features |
|:-------:|:-------:|:----------:|
| 2,277 | 36,101 | 2,325 |
## Usage
```python
import dgl
from huggingface_hub import hf_hub_download

# Fetch the pre-processed graph binary from the Hub
hf_hub_download(repo_id="SauravMaheshkar/pareto-chameleon", filename="processed/chameleon.bin", local_dir="./data/", repo_type="dataset")
dataset, _ = dgl.load_graphs("./data/processed/chameleon.bin")  # load the DGL graph(s)
```
Thank you [@severo](https://huggingface.co/severo) for helping me [figure out the usage](https://discuss.huggingface.co/t/can-i-use-a-pickle-file-with-the-data-files-argument-with-datasets/72189/2?u=sauravmaheshkar).
Pre-processed as per the official codebase of https://arxiv.org/abs/2210.02016
## Citations
```
@inproceedings{ju2023multi,
title={Multi-task Self-supervised Graph Neural Networks Enable Stronger Task Generalization},
author={Ju, Mingxuan and Zhao, Tong and Wen, Qianlong and Yu, Wenhao and Shah, Neil and Ye, Yanfang and Zhang, Chuxu},
booktitle={International Conference on Learning Representations},
year={2023}
}
```
```
@article{DBLP:journals/corr/abs-1909-13021,
author = {Benedek Rozemberczki and
Carl Allen and
Rik Sarkar},
title = {Multi-scale Attributed Node Embedding},
journal = {CoRR},
volume = {abs/1909.13021},
year = {2019},
url = {http://arxiv.org/abs/1909.13021},
eprinttype = {arXiv},
eprint = {1909.13021},
timestamp = {Wed, 02 Oct 2019 13:04:08 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1909-13021.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` |
HuggingFaceM4/OBELICS | ---
language:
- en
license: cc-by-4.0
size_categories:
- 100M<n<1B
pretty_name: OBELICS
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: opt_out_docs_removed_2023_07_12
data_files:
- split: train
path: opt_out_docs_removed_2023_07_12/train-*
dataset_info:
- config_name: default
features:
- name: images
sequence: string
- name: metadata
dtype: string
- name: general_metadata
dtype: string
- name: texts
sequence: string
splits:
- name: train
num_bytes: 715724717192
num_examples: 141047697
download_size: 71520629655
dataset_size: 715724717192
- config_name: opt_out_docs_removed_2023_07_12
features:
- name: images
sequence: string
- name: metadata
dtype: string
- name: general_metadata
dtype: string
- name: texts
sequence: string
splits:
- name: train
num_bytes: 684638314215
num_examples: 134648855
download_size: 266501092920
dataset_size: 684638314215
---
# Dataset Card for OBELICS
## Dataset Description
- **Visualization of OBELICS web documents:** https://huggingface.co/spaces/HuggingFaceM4/obelics_visualization
- **Paper:** [OBELICS: An Open Web-Scale Filtered Dataset of Interleaved Image-Text Documents](https://arxiv.org/abs/2306.16527)
- **Repository:** https://github.com/huggingface/OBELICS
- **Point of Contact:** hugo@huggingface.co
`OBELICS` is an open, massive, and curated collection of interleaved image-text web documents, containing 141M English documents, 115B text tokens, and 353M images, extracted from Common Crawl dumps between February 2020 and February 2023. The collection and filtering steps are described in our [paper](https://huggingface.co/papers/2306.16527).
Interleaved image-text web documents are a succession of text paragraphs interleaved by images, such as web pages that contain images. Models trained on these web documents outperform vision and language models trained solely on image-text pairs on various benchmarks. They can also generate long and coherent text about a set of multiple images. As an example, we trained [IDEFICS](https://huggingface.co/HuggingFaceM4/idefics-80b), a visual language model that accepts arbitrary sequences of image and text inputs and produces text outputs.
We provide an [interactive visualization](https://atlas.nomic.ai/map/f2fba2aa-3647-4f49-a0f3-9347daeee499/ee4a84bd-f125-4bcc-a683-1b4e231cb10f) of OBELICS that allows exploring the content of OBELICS. The map shows a subset of 11M of the 141M documents.
[![OBELICS Nomic map](assets/nomic_map.png)](https://atlas.nomic.ai/map/f2fba2aa-3647-4f49-a0f3-9347daeee499/ee4a84bd-f125-4bcc-a683-1b4e231cb10f)
## Data Fields
An example of a sample looks as follows:
```
# The example has been cropped
{
'images': [
'https://cdn.motor1.com/images/mgl/oRKO0/s1/lamborghini-urus-original-carbon-fiber-accessories.jpg',
None
],
'metadata': '[{"document_url": "https://lamborghinichat.com/forum/news/vw-group-allegedly-receives-offer-to-sell-lamborghini-for-9-2-billion.728/", "unformatted_src": "https://cdn.motor1.com/images/mgl/oRKO0/s1/lamborghini-urus-original-carbon-fiber-accessories.jpg", "src": "https://cdn.motor1.com/images/mgl/oRKO0/s1/lamborghini-urus-original-carbon-fiber-accessories.jpg", "formatted_filename": "lamborghini urus original carbon fiber accessories", "alt_text": "VW Group Allegedly Receives Offer To Sell Lamborghini For $9.2 Billion", "original_width": 1920, "original_height": 1080, "format": "jpeg"}, null]',
'general_metadata': '{"url": "https://lamborghinichat.com/forum/news/vw-group-allegedly-receives-offer-to-sell-lamborghini-for-9-2-billion.728/", "warc_filename": "crawl-data/CC-MAIN-2021-25/segments/1623488528979.69/warc/CC-MAIN-20210623011557-20210623041557-00312.warc.gz", "warc_record_offset": 322560850, "warc_record_length": 17143}',
'texts': [
None,
'The buyer would get everything, including Lambo\'s headquarters.\n\nThe investment groupQuantum Group AG has submitted a€7.5 billion ($9.2 billion at current exchange rates) offer to purchase Lamborghini from Volkswagen Group, Autocar reports. There\'s no info yet about whether VW intends to accept the offer or further negotiate the deal.\n\nQuantum ... Group Chief Executive Herbert Diess said at the time.'
]
}
```
Each sample is composed of the same 4 fields: `images`, `texts`, `metadata`, and `general_metadata`. `images` and `texts` are two lists of the same size, where for each index, one element and only one is not `None`. For example, for the interleaved web document `<image_1>text<image_2>`, we would find `[image_1, None, image_2]` in `images` and `[None, text, None]` in `texts`.
The images are replaced by their URLs, and the users need to download the images, for instance, with the library [img2dataset](https://github.com/rom1504/img2dataset).
`metadata` is the string representation of a list containing information about each of the images. It has the same length as `texts` and `images` and logs for each image relevant information such as original source document, unformatted source, alternative text if present, etc.
`general_metadata` is the string representation of a dictionary containing the URL of the document, and information regarding the extraction from Common Crawl snapshots.
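Since `images` and `texts` are aligned lists in which exactly one entry per index is not `None`, the original document order can be reconstructed by walking both lists in parallel, and the stringified `metadata` can be recovered with `json.loads`. A minimal sketch (the helper name and toy URLs are ours, not part of the dataset tooling):

```python
import json

def reconstruct(sample):
    """Return (kind, value) pairs in document order from the aligned lists."""
    parts = []
    for image, text in zip(sample["images"], sample["texts"]):
        parts.append(("image", image) if image is not None else ("text", text))
    return parts

# Toy sample mimicking the structure shown above
sample = {
    "images": ["https://example.com/a.jpg", None],
    "texts": [None, "A paragraph following the image."],
    "metadata": '[{"src": "https://example.com/a.jpg"}, null]',
}

doc = reconstruct(sample)                     # [("image", ...), ("text", ...)]
image_meta = json.loads(sample["metadata"])   # aligned with `images`; null for text slots
```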
## Size and Data Splits
There is only one split, `train`, that contains 141,047,697 documents.
`OBELICS` with images replaced by their URLs weighs 666.6 GB (😈) in arrow format and 377 GB in the uploaded `parquet` format.
## Considerations for Using the Data
### Discussion of Biases
A ~50k-document subset of the `train` split was evaluated using the Data Measurements Tool, with a particular focus on the nPMI metric:
> nPMI scores for a word help to identify potentially problematic associations, ranked by how close the association is.
> nPMI bias scores for paired words help to identify how word associations are skewed between the selected words (Aka et al., 2021).
> You can select from gender and sexual orientation identity terms that appear in the dataset at least 10 times.
> The resulting ranked words are those that co-occur with both identity terms.
> The more positive the score, the more associated the word is with the first identity term. The more negative the score, the more associated the word is with the second identity term.
While occupation-related words such as _`government`_ and _`jobs`_ skewed positively towards she/her, and masculine and feminine words showed similar attributions to they/them, more harmful words such as _`escort`_ and even _`colour`_ were more strongly attributed to she/her and him/his, respectively.
![Data Measurement Tool Associations Eval](assets/DMT_eval.png)
We welcome users to explore the [Data Measurements nPMI Visualizations for OBELICS](https://huggingface.co/spaces/HuggingFaceM4/IDEFICS_Data_Measurement_Tool) further, and to consult the [idefics-9b model card](https://huggingface.co/HuggingFaceM4/idefics-9b) for additional bias considerations.
## Opted-out content
To respect the preferences of content creators, we removed from OBELICS all images for which creators explicitly opted out of AI model training. We used the [Spawning API](https://api.spawning.ai/spawning-api) to verify that the images in the dataset respect the original copyright owners’ choices.
However, due to an error on our side, we did not remove entire documents (i.e., URLs) that opted out of AI model training. As of July 12, 2023, it represents 4.25% of the totality of OBELICS. The config `opt_out_docs_removed_2023_07_12` applies the correct filtering at the web document level as of July 2023: `ds = load_dataset("HuggingFaceM4/OBELICS", "opt_out_docs_removed_2023_07_12")`.
We recommend users of OBELICS to regularly check every document against the API.
## Content warnings
Despite our efforts in filtering, OBELICS contains a small proportion of documents that are not suitable for all audiences. For instance, while navigating the interactive map, you might find the cluster named "Sex" which predominantly contains descriptions of pornographic movies along with pornographic images. Other clusters would contain advertising for sex workers or reports of violent shootings. In our experience, these documents represent a small proportion of all the documents.
## Terms of Use
By using the dataset, you agree to comply with the original licenses of the source content as well as the dataset license (CC-BY-4.0). Additionally, if you use this dataset to train a Machine Learning model, you agree to disclose your use of the dataset when releasing the model or an ML application using the model.
### Licensing Information
License CC-BY-4.0.
### Citation Information
If you are using this dataset, please cite
```
@misc{laurencon2023obelics,
title={OBELICS: An Open Web-Scale Filtered Dataset of Interleaved Image-Text Documents},
author={Hugo Laurençon and Lucile Saulnier and Léo Tronchon and Stas Bekman and Amanpreet Singh and Anton Lozhkov and Thomas Wang and Siddharth Karamcheti and Alexander M. Rush and Douwe Kiela and Matthieu Cord and Victor Sanh},
year={2023},
eprint={2306.16527},
archivePrefix={arXiv},
primaryClass={cs.IR}
}
```
|
Falah/framed_wall_art_prompts_SDXL | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 390982557
num_examples: 1000000
download_size: 39212995
dataset_size: 390982557
---
# Dataset Card for "framed_wall_art_prompts_SDXL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
botp/RyokoAI_ScribbleHub17K | ---
license: apache-2.0
language:
- en
tags:
- novel
- training
- story
task_categories:
- text-classification
- text-generation
pretty_name: ScribbleHub17K
size_categories:
- 100K<n<1M
duplicated_from: RyokoAI/ScribbleHub17K
---
# Dataset Card for ScribbleHub17K
*The BigKnow2022 dataset and its subsets are not yet complete. Not all information here may be accurate or accessible.*
## Dataset Description
- **Homepage:** (TODO)
- **Repository:** <https://github.com/RyokoAI/BigKnow2022>
- **Paper:** N/A
- **Leaderboard:** N/A
- **Point of Contact:** Ronsor/undeleted <ronsor@ronsor.com>
### Dataset Summary
ScribbleHub17K is a dataset consisting of text from over 373,000 chapters across approximately 17,500 series posted on the
original story sharing site [Scribble Hub](https://scribblehub.com).
### Supported Tasks and Leaderboards
This dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.
* text-classification
* text-generation
### Languages
* English
## Dataset Structure
### Data Instances
```json
{
"text": " \n2082 Planet Earth the Fracture War, after a sudden fracture in our dimension unidentified beings with advance technology and u...",
"meta": {
"subset": "scribblehub",
"series": "3811",
"id": "3812",
"q": 0.91,
"title": "The First - Prologue- The Fracture War",
"author": "RobotLove",
"chapters": 1,
"rating": 5,
"rating_ct": 1,
"genre": [
"Action",
"Martial Arts",
"Romance"
],
"tags": [
"Kingdom Building",
"Loyal Subordinates",
"Male Protagonist",
"Organized Crime",
"Scheming"
]
}
}
{
"text": " For anyone that may see this, thanks for reading. I'm just here to see if a story can spill out of my mind if just start writin...",
"meta": {
"subset": "scribblehub",
"series": "586090",
"id": "586099",
"q": 0.82,
"title": "Just writing to write…i guess? - I’m here now",
"author": "BigOofStudios",
"chapters": 1,
"rating": 4.5,
"rating_ct": 2,
"genre": [
"Action",
"Comedy"
],
"tags": []
}
}
```
### Data Fields
* `text`: the actual chapter text
* `meta`: metadata for chapter and series
* `subset`: data source tag: `scribblehub`
* `series`: series ID
* `id`: chapter ID
* `lang`: always `en` (English)
* `q`: quality score (q-score) between 0.0 (terrible) and 1.0 (perfect); anything with a score `> 0.5` is generally good enough
* `title`: chapter and series title in the format `<chapter title> - <series title>`
* `chapters`: total number of chapters in the series
* `rating`: Scribble Hub rating between 0 and 5 stars
* `rating_ct`: number of ratings
* `author`: author name
* `genre`: array of Scribble Hub genres for the series
* `tags`: array of tags for the series
#### Q-Score Distribution
```
0.00: 0
0.10: 0
0.20: 0
0.30: 84
0.40: 718
0.50: 3775
0.60: 22300
0.70: 72581
0.80: 137982
0.90: 135800
1.00: 59
```
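Given the q-score guidance above, a simple quality filter over samples might look like this (an illustrative sketch; the 0.5 threshold follows the field description, and the toy samples are ours):

```python
def keep(sample, min_q=0.5):
    """Keep chapters whose quality score clears the threshold."""
    return sample["meta"]["q"] > min_q

# Toy samples mirroring the schema described above
samples = [
    {"text": "chapter one ...", "meta": {"subset": "scribblehub", "q": 0.91}},
    {"text": "chapter two ...", "meta": {"subset": "scribblehub", "q": 0.42}},
]

good = [s for s in samples if keep(s)]  # keeps only the q=0.91 chapter
```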
### Data Splits
No splitting of the data was performed.
## Dataset Creation
### Curation Rationale
Scribble Hub is a home for original web stories, effectively a smaller, English version of Japan's Syosetuka ni Narou. As a
result, it is a good source for reasonably well written creative content.
### Source Data
#### Initial Data Collection and Normalization
TODO
#### Who are the source language producers?
The authors of each novel.
### Annotations
#### Annotation process
Title, ratings, and other metadata were parsed out using scripts that will be provided in the BigKnow2022 GitHub repository.
#### Who are the annotators?
No human annotators.
### Personal and Sensitive Information
The dataset contains only works of fiction, and we do not believe it contains any PII.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended to be useful for anyone who wishes to train a model to generate "more entertaining" content.
It may also be useful for other languages depending on your language model.
### Discussion of Biases
This dataset is composed of fictional works by various authors. Because of this fact, the contents of this dataset will reflect
the biases of those authors. **Additionally, this dataset contains NSFW material and was not filtered. Beware of stereotypes.**
### Other Known Limitations
N/A
## Additional Information
### Dataset Curators
Ronsor Labs
### Licensing Information
Apache 2.0, for all parts of which Ronsor Labs or the Ryoko AI Production Committee may be considered authors. All other material is
distributed under fair use principles.
### Citation Information
```
@misc{ryokoai2023-bigknow2022,
title = {BigKnow2022: Bringing Language Models Up to Speed},
author = {Ronsor},
year = {2023},
howpublished = {\url{https://github.com/RyokoAI/BigKnow2022}},
}
```
### Contributions
Thanks to @ronsor (GH) for gathering this dataset. |
DGurgurov/javanese_sa | ---
license: mit
---
## Sentiment Analysis Data for the Javanese Language
**Dataset Description:**
This dataset contains sentiment analysis data for the Javanese language from Wongso et al. (2021).
**Usage:**
The data was used for the project on [injecting external commonsense knowledge into multilingual Large Language Models](https://github.com/d-gurgurov/Injecting-Commonsense-Knowledge-into-LLMs).
**Citation:**
```bibtex
@inproceedings{wongso2021causal,
title={Causal and Masked Language Modeling of Javanese Language using Transformer-based Architectures},
author={Wongso, Wilson and Setiawan, David Samuel and Suhartono, Derwin},
booktitle={2021 International Conference on Advanced Computer Science and Information Systems (ICACSIS)},
pages={1--7},
year={2021},
organization={IEEE}
}
```
|
open-llm-leaderboard/details_ericpolewski__AIRIC-The-Mistral | ---
pretty_name: Evaluation run of ericpolewski/AIRIC-The-Mistral
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ericpolewski/AIRIC-The-Mistral](https://huggingface.co/ericpolewski/AIRIC-The-Mistral)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ericpolewski__AIRIC-The-Mistral\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-27T12:44:47.961530](https://huggingface.co/datasets/open-llm-leaderboard/details_ericpolewski__AIRIC-The-Mistral/blob/main/results_2023-12-27T12-44-47.961530.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6039985368189299,\n\
\ \"acc_stderr\": 0.032978897634621786,\n \"acc_norm\": 0.6103242147836283,\n\
\ \"acc_norm_stderr\": 0.03365907243674515,\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.01640398946990783,\n \"mc2\": 0.48243440199003346,\n\
\ \"mc2_stderr\": 0.014709550914921755\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5571672354948806,\n \"acc_stderr\": 0.014515573873348913,\n\
\ \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.01431719778780918\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6291575383389763,\n\
\ \"acc_stderr\": 0.004820431839600027,\n \"acc_norm\": 0.8298147779326828,\n\
\ \"acc_norm_stderr\": 0.0037502741958275972\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532265,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532265\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.038607315993160904,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.038607315993160904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159784,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159784\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7161290322580646,\n \"acc_stderr\": 0.025649381063029265,\n \"\
acc_norm\": 0.7161290322580646,\n \"acc_norm_stderr\": 0.025649381063029265\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270286,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270286\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.02483881198803316,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.02483881198803316\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n\
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.031204691225150023,\n\
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.031204691225150023\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7798165137614679,\n \"acc_stderr\": 0.01776597865232756,\n \"\
acc_norm\": 0.7798165137614679,\n \"acc_norm_stderr\": 0.01776597865232756\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082396,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082396\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098822,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098822\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n\
\ \"acc_stderr\": 0.014743125394823291,\n \"acc_norm\": 0.7828863346104725,\n\
\ \"acc_norm_stderr\": 0.014743125394823291\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22793296089385476,\n\
\ \"acc_stderr\": 0.014030149950805097,\n \"acc_norm\": 0.22793296089385476,\n\
\ \"acc_norm_stderr\": 0.014030149950805097\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.026795422327893934,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.026795422327893934\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.02577311116963045,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.02577311116963045\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4230769230769231,\n\
\ \"acc_stderr\": 0.012618204066588392,\n \"acc_norm\": 0.4230769230769231,\n\
\ \"acc_norm_stderr\": 0.012618204066588392\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928006,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928006\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5947712418300654,\n \"acc_stderr\": 0.019861155193829156,\n \
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.019861155193829156\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.046737523336702384,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.046737523336702384\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.01640398946990783,\n \"mc2\": 0.48243440199003346,\n\
\ \"mc2_stderr\": 0.014709550914921755\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836673\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.30856709628506446,\n \
\ \"acc_stderr\": 0.012723076049815882\n }\n}\n```"
repo_url: https://huggingface.co/lgaalves/llama-2-13b-chat-platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|arc:challenge|25_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|gsm8k|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hellaswag|10_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T12-44-47.961530.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-27T12-44-47.961530.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- '**/details_harness|winogrande|5_2023-12-27T12-44-47.961530.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-27T12-44-47.961530.parquet'
- config_name: results
data_files:
- split: 2023_12_27T12_44_47.961530
path:
- results_2023-12-27T12-44-47.961530.parquet
- split: latest
path:
- results_2023-12-27T12-44-47.961530.parquet
---
# Dataset Card for Evaluation run of ericpolewski/AIRIC-The-Mistral
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ericpolewski/AIRIC-The-Mistral](https://huggingface.co/ericpolewski/AIRIC-The-Mistral) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
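The split naming follows a simple convention: the run's ISO timestamp with `-` and `:` replaced by `_`. A minimal sketch of that mapping (the `split_name` helper is hypothetical, shown for illustration only; it is not part of the `datasets` API):

```python
def split_name(run_timestamp: str) -> str:
    """Convert an ISO run timestamp (as it appears in the result
    filenames) into the split name used by this repository."""
    return run_timestamp.replace("-", "_").replace(":", "_")

print(split_name("2023-12-27T12:44:47.961530"))
# -> 2023_12_27T12_44_47.961530
```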
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ericpolewski__AIRIC-The-Mistral",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-27T12:44:47.961530](https://huggingface.co/datasets/open-llm-leaderboard/details_ericpolewski__AIRIC-The-Mistral/blob/main/results_2023-12-27T12-44-47.961530.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6039985368189299,
"acc_stderr": 0.032978897634621786,
"acc_norm": 0.6103242147836283,
"acc_norm_stderr": 0.03365907243674515,
"mc1": 0.32558139534883723,
"mc1_stderr": 0.01640398946990783,
"mc2": 0.48243440199003346,
"mc2_stderr": 0.014709550914921755
},
"harness|arc:challenge|25": {
"acc": 0.5571672354948806,
"acc_stderr": 0.014515573873348913,
"acc_norm": 0.5998293515358362,
"acc_norm_stderr": 0.01431719778780918
},
"harness|hellaswag|10": {
"acc": 0.6291575383389763,
"acc_stderr": 0.004820431839600027,
"acc_norm": 0.8298147779326828,
"acc_norm_stderr": 0.0037502741958275972
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532265,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532265
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.038607315993160904,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.038607315993160904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159784,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159784
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.025649381063029265,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.025649381063029265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270286,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.02483881198803316,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02483881198803316
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.031204691225150023,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.031204691225150023
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7798165137614679,
"acc_stderr": 0.01776597865232756,
"acc_norm": 0.7798165137614679,
"acc_norm_stderr": 0.01776597865232756
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098822,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098822
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823291,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823291
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22793296089385476,
"acc_stderr": 0.014030149950805097,
"acc_norm": 0.22793296089385476,
"acc_norm_stderr": 0.014030149950805097
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893934,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893934
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.02577311116963045,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.02577311116963045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.012618204066588392,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.012618204066588392
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928006,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.019861155193829156,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.019861155193829156
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.046737523336702384,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.046737523336702384
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32558139534883723,
"mc1_stderr": 0.01640398946990783,
"mc2": 0.48243440199003346,
"mc2_stderr": 0.014709550914921755
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836673
},
"harness|gsm8k|5": {
"acc": 0.30856709628506446,
"acc_stderr": 0.012723076049815882
}
}
```
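For illustration, the per-task entries above can be aggregated into a single MMLU-style figure by taking an unweighted mean over the `harness|hendrycksTest-*` accuracies. A minimal sketch using three values copied from the JSON above (only a subset of the tasks, so the number it prints is not the full leaderboard average):

```python
# A few per-task accuracies copied from the results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6578947368421053},
}

# Unweighted macro-average over the selected hendrycksTest tasks.
mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]
macro_acc = sum(results[t]["acc"] for t in mmlu_tasks) / len(mmlu_tasks)
print(f"macro acc over {len(mmlu_tasks)} tasks: {macro_acc:.4f}")
```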
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
tschmmm/background_check | ---
license: llama2
---
Alpaca69B/reviews_appstore_clash_of_clans_absa | ---
dataset_info:
features:
- name: title
dtype: string
- name: content
dtype: string
- name: category
dtype: string
- name: aspect
dtype: string
- name: sentiment
dtype: string
- name: combined
dtype: string
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 950107.828
num_examples: 349
- name: validation
num_bytes: 204177.9
num_examples: 75
- name: test
num_bytes: 201455.528
num_examples: 74
download_size: 2379378
dataset_size: 1355741.2559999998
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
tarotscientist/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
one-sec-cv12/chunk_76 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 23957924976.375
num_examples: 249437
download_size: 22055578504
dataset_size: 23957924976.375
---
# Dataset Card for "chunk_76"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CyberHarem/umikaze_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of umikaze/海風 (Kantai Collection)
This is the dataset of umikaze/海風 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `long_hair, braid, blue_eyes, single_braid, grey_hair, very_long_hair, mole, mole_under_eye, bangs, breasts, parted_bangs, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 526.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/umikaze_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 317.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/umikaze_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1153 | 670.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/umikaze_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 474.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/umikaze_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1153 | 918.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/umikaze_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/umikaze_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 35 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, black_bikini, adapted_costume, cleavage, looking_at_viewer, medium_breasts, hair_flaps, navel, blush, collarbone, cowboy_shot, blue_sarong, simple_background, white_background, hair_tie, smile |
| 1 | 11 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, hair_flaps, solo, long_sleeves, black_sweater, looking_at_viewer, upper_body, hair_between_eyes, smile, blush, collarbone, off_shoulder, ribbed_sweater, medium_breasts, open_mouth, large_breasts, official_alternate_costume, simple_background, white_background |
| 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blue_sweater, solo, long_sleeves, ribbed_sweater, smile, suspender_skirt, black_skirt, looking_at_viewer, hair_tie, official_alternate_costume, simple_background, turtleneck, white_background, blush, open_mouth, upper_body |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, black_skirt, blue_sweater, long_sleeves, solo, suspender_skirt, black_pantyhose, hair_tie, pleated_skirt, smile, blush, looking_at_viewer, official_alternate_costume, white_background, cowboy_shot, ribbed_sweater, simple_background, twitter_username |
| 4 | 10 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_shoulders, black_gloves, black_skirt, blue_neckerchief, collared_shirt, elbow_gloves, pleated_skirt, serafuku, sleeveless_shirt, looking_at_viewer, solo, black_thighhighs, medium_breasts, blush, smile, hair_tie |
| 5 | 18 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, black_gloves, black_skirt, blue_neckerchief, elbow_gloves, looking_at_viewer, pleated_skirt, serafuku, sleeveless_shirt, solo, bare_shoulders, collared_shirt, black_pantyhose, smile, blush, medium_breasts, simple_background, hair_tie, white_background |
| 6 | 22 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, black_gloves, black_skirt, blue_neckerchief, elbow_gloves, hair_flaps, pleated_skirt, sleeveless_shirt, solo, black_serafuku, bandaged_arm, black_thighhighs, hair_tie, collared_shirt, navel, white_background, smile, simple_background, looking_at_viewer |
| 7 | 10 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, black_gloves, blue_neckerchief, collared_shirt, elbow_gloves, looking_at_viewer, serafuku, sleeveless_shirt, solo, upper_body, smile, hair_flaps, hair_tie, one-hour_drawing_challenge, simple_background |
| 8 | 8 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, covered_navel, cowboy_shot, looking_at_viewer, solo, black_one-piece_swimsuit, hair_flaps, collarbone, hair_tie, large_breasts, blush, competition_swimsuit, dated, white_background, simple_background, smile, blue_one-piece_swimsuit, one-hour_drawing_challenge, open_mouth, school_swimsuit |
| 9 | 9 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, navel, panties, solo, looking_at_viewer, cleavage, underwear_only, collarbone, hair_flaps, medium_breasts, blue_bra, blush, simple_background, white_background, cowboy_shot, hair_between_eyes, large_breasts, lingerie |
| 10 | 10 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, solo, alternate_costume, hair_flower, blush, floral_print, looking_at_viewer, smile, blue_kimono, long_sleeves, wide_sleeves, hair_between_eyes, simple_background, holding, open_mouth, white_background |
| 11 | 11 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | fake_animal_ears, playboy_bunny, rabbit_ears, 1girl, detached_collar, hair_flaps, solo, strapless_leotard, wrist_cuffs, alternate_costume, looking_at_viewer, medium_breasts, simple_background, white_background, black_leotard, blush, dated, rabbit_tail, bowtie, hair_tie, high_heels, black_footwear, black_pantyhose, cleavage, twitter_username |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | black_bikini | adapted_costume | cleavage | looking_at_viewer | medium_breasts | hair_flaps | navel | blush | collarbone | cowboy_shot | blue_sarong | simple_background | white_background | hair_tie | smile | long_sleeves | black_sweater | upper_body | hair_between_eyes | off_shoulder | ribbed_sweater | open_mouth | large_breasts | official_alternate_costume | blue_sweater | suspender_skirt | black_skirt | turtleneck | black_pantyhose | pleated_skirt | twitter_username | bare_shoulders | black_gloves | blue_neckerchief | collared_shirt | elbow_gloves | serafuku | sleeveless_shirt | black_thighhighs | black_serafuku | bandaged_arm | one-hour_drawing_challenge | covered_navel | black_one-piece_swimsuit | competition_swimsuit | dated | blue_one-piece_swimsuit | school_swimsuit | panties | underwear_only | blue_bra | lingerie | alternate_costume | hair_flower | floral_print | blue_kimono | wide_sleeves | holding | fake_animal_ears | playboy_bunny | rabbit_ears | detached_collar | strapless_leotard | wrist_cuffs | black_leotard | rabbit_tail | bowtie | high_heels | black_footwear |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------|:---------------|:------------------|:-----------|:--------------------|:-----------------|:-------------|:--------|:--------|:-------------|:--------------|:--------------|:--------------------|:-------------------|:-----------|:--------|:---------------|:----------------|:-------------|:--------------------|:---------------|:-----------------|:-------------|:----------------|:-----------------------------|:---------------|:------------------|:--------------|:-------------|:------------------|:----------------|:-------------------|:-----------------|:---------------|:-------------------|:-----------------|:---------------|:-----------|:-------------------|:-------------------|:-----------------|:---------------|:-----------------------------|:----------------|:---------------------------|:-----------------------|:--------|:--------------------------|:------------------|:----------|:-----------------|:-----------|:-----------|:--------------------|:--------------|:---------------|:--------------|:---------------|:----------|:-------------------|:----------------|:--------------|:------------------|:--------------------|:--------------|:----------------|:--------------|:---------|:-------------|:-----------------|
| 0 | 35 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | | | X | X | X | | X | X | | | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | | X | | | | X | | | | X | X | X | X | X | | X | | | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | | | X | | | | X | | X | | X | X | X | X | X | | | | | X | | | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | | | X | X | | | X | | | | | | X | X | | | | | | | | | | | | X | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 18 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | | | | X | X | | | X | | | | X | X | X | X | | | | | | | | | | | | X | | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 22 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | | | | X | | X | X | | | | | X | X | X | X | | | | | | | | | | | | X | | | X | | | X | X | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 10 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | | | X | | X | | | | | | X | | X | X | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 8 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | | | | X | | X | | X | X | X | | X | X | X | X | | | | | | | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 9 | 9 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | | | X | X | X | X | X | X | X | X | | X | X | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | |
| 10 | 10 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | X | | | | X | | | | X | | | | X | X | | X | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | |
| 11 | 11 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | X | | | X | X | X | X | | X | | | | X | X | X | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
LucasThil/miniwob_plusplus_hierarchical_training_actions_drain | ---
dataset_info:
features:
- name: history_episodes
dtype: string
- name: instruction
dtype: string
- name: actions
dtype: string
- name: refs
dtype: int64
- name: keydown_text
dtype: string
- name: subtask_completion
dtype: string
splits:
- name: train
num_bytes: 76424823
num_examples: 40186
download_size: 10706174
dataset_size: 76424823
---
# Dataset Card for "miniwob_plusplus_hierarchical_training_actions_drain"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_79_1713218008 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 467680
num_examples: 1179
download_size: 242315
dataset_size: 467680
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sliderforthewin/whisper-medium-lt | ---
license: unknown
---
|
rag-datasets/mini-bioasq | ---
license: cc-by-2.5
task_categories:
- question-answering
- sentence-similarity
language:
- en
tags:
- rag
- dpr
- information-retrieval
- question-answering
- biomedical
configs:
- config_name: text-corpus
data_files:
- split: passages
path: "data/passages.parquet/*"
- config_name: question-answer-passages
data_files:
- split: test
path: "data/test.parquet/*"
---
Derived from http://participants-area.bioasq.org/Tasks/11b/trainingDataset/; we generated our own subset using `generate.py`.
|
arkanbima/td-en-id | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 19236776
num_examples: 87406
- name: validation
num_bytes: 555294
num_examples: 2677
- name: test
num_bytes: 658841
num_examples: 3179
download_size: 11756015
dataset_size: 20450911
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
PedroHyppolite/PedroHyppolite | ---
license: openrail
---
|
adamzinebi/mmm_track_lmd_8bars_nots | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 74262767
num_examples: 3764
download_size: 12045427
dataset_size: 74262767
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HydraLM/GPTeacher_codegen_standardized | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 2227561
num_examples: 13605
download_size: 930917
dataset_size: 2227561
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "GPTeacher_codegen_standardized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Prabhjot410/E-commerce | ---
license: apache-2.0
---
|
LahiruLowe/niv2_filtered_3pertask | ---
dataset_info:
features:
- name: original_index
dtype: int64
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
splits:
- name: train
num_bytes: 4509772
num_examples: 4668
download_size: 2486682
dataset_size: 4509772
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "niv2_filtered_3pertask"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arieg/bw_spec_cls_80_12 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '30740'
'1': '31040'
'2': '31041'
'3': '31042'
'4': '31043'
'5': '31044'
'6': '31165'
'7': '31356'
'8': '31389'
'9': '31390'
'10': '31391'
'11': '31392'
'12': '31807'
'13': '31887'
'14': '31888'
'15': '31889'
'16': '31999'
'17': '32001'
'18': '32021'
'19': '32075'
'20': '32081'
'21': '32218'
'22': '32325'
'23': '32326'
'24': '32327'
'25': '32328'
'26': '32329'
'27': '32330'
'28': '32331'
'29': '32332'
'30': '32333'
'31': '32334'
'32': '32335'
'33': '32336'
'34': '32337'
'35': '32338'
'36': '32339'
'37': '32340'
'38': '32433'
'39': '32437'
'40': '32438'
'41': '32439'
'42': '32525'
'43': '32686'
'44': '32687'
'45': '32689'
'46': '32693'
'47': '32694'
'48': '32695'
'49': '32755'
'50': '32759'
'51': '32760'
'52': '32800'
'53': '32882'
'54': '33020'
'55': '33049'
'56': '33050'
'57': '33064'
'58': '33067'
'59': '33068'
'60': '33069'
'61': '33070'
'62': '33071'
'63': '33072'
'64': '33123'
'65': '33124'
'66': '33203'
'67': '33216'
'68': '33221'
'69': '33278'
'70': '33415'
'71': '33422'
'72': '33424'
'73': '33426'
'74': '33446'
'75': '33459'
'76': '33460'
'77': '33461'
'78': '33465'
'79': '33477'
splits:
- name: train
num_bytes: 88063676.8
num_examples: 1600
download_size: 88702877
dataset_size: 88063676.8
---
# Dataset Card for "bw_spec_cls_80_12"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ParsaKgvr/socce_report_analysis | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: sent0
dtype: string
- name: sent1
dtype: string
- name: sent2
dtype: string
- name: sent3
dtype: string
- name: sent4
dtype: string
- name: sent5
dtype: string
- name: sent6
dtype: string
- name: sent7
dtype: string
- name: sent8
dtype: string
- name: sent9
dtype: string
- name: sent10
dtype: string
- name: sent11
dtype: string
- name: sent12
dtype: string
- name: sent13
dtype: string
- name: sent14
dtype: string
- name: sent15
dtype: string
- name: sent16
dtype: string
- name: sent17
dtype: string
- name: sent18
dtype: string
- name: sent19
dtype: string
- name: sent20
dtype: string
- name: sent21
dtype: string
- name: sent22
dtype: string
- name: sent23
dtype: string
- name: sent24
dtype: string
- name: sent25
dtype: string
- name: sent26
dtype: string
- name: sent27
dtype: string
- name: sent28
dtype: string
- name: sent29
dtype: string
- name: sent30
dtype: string
- name: sent31
dtype: string
- name: sent32
dtype: string
- name: sent33
dtype: string
- name: sent34
dtype: string
- name: sent35
dtype: string
- name: sent36
dtype: string
- name: sent37
dtype: string
- name: sent38
dtype: string
- name: sent39
dtype: string
- name: sent40
dtype: string
- name: sent41
dtype: string
- name: sent42
dtype: string
- name: sent43
dtype: string
- name: sent44
dtype: string
- name: sent45
dtype: string
- name: sent46
dtype: string
- name: sent47
dtype: string
- name: sent48
dtype: string
- name: sent49
dtype: string
- name: sent50
dtype: string
- name: sent51
dtype: string
- name: sent52
dtype: string
- name: sent53
dtype: string
- name: sent54
dtype: string
- name: sent55
dtype: string
- name: sent56
dtype: string
- name: sent57
dtype: string
- name: sent58
dtype: string
- name: sent59
dtype: string
- name: sent60
dtype: string
- name: sent61
dtype: string
- name: sent62
dtype: string
- name: sent63
dtype: string
- name: sent64
dtype: string
- name: sent65
dtype: string
- name: sent66
dtype: string
- name: sent67
dtype: string
- name: sent68
dtype: string
- name: sent69
dtype: string
- name: sent70
dtype: string
- name: sent71
dtype: string
- name: sent72
dtype: string
- name: sent73
dtype: string
- name: sent74
dtype: string
- name: sent75
dtype: string
- name: sent76
dtype: string
- name: sent77
dtype: string
- name: sent78
dtype: string
- name: sent79
dtype: string
- name: sent80
dtype: string
- name: sent81
dtype: string
- name: sent82
dtype: string
- name: sent83
dtype: string
- name: sent84
dtype: string
- name: sent85
dtype: string
- name: sent86
dtype: string
- name: sent87
dtype: string
- name: sent88
dtype: string
- name: sent89
dtype: string
- name: sent90
dtype: string
- name: sent91
dtype: string
- name: sent92
dtype: string
- name: sent93
dtype: string
- name: sent94
dtype: string
- name: sent95
dtype: string
- name: sent96
dtype: string
- name: sent97
dtype: string
- name: sent98
dtype: string
- name: sent99
dtype: string
- name: sent100
dtype: string
- name: sent101
dtype: string
- name: sent102
dtype: string
- name: sent103
dtype: string
- name: sent104
dtype: string
- name: sent105
dtype: string
- name: sent106
dtype: string
- name: sent107
dtype: string
- name: sent108
dtype: string
- name: sent109
dtype: string
- name: sent110
dtype: string
- name: sent111
dtype: string
- name: sent112
dtype: string
- name: sent113
dtype: string
- name: sent114
dtype: string
- name: sent115
dtype: string
- name: sent116
dtype: string
- name: sent117
dtype: string
- name: sent118
dtype: string
- name: sent119
dtype: string
- name: sent120
dtype: string
- name: sent121
dtype: string
- name: sent122
dtype: string
- name: sent123
dtype: string
- name: sent124
dtype: string
- name: sent125
dtype: string
- name: sent126
dtype: string
- name: sent127
dtype: string
- name: sent128
dtype: string
- name: sent129
dtype: string
- name: sent130
dtype: string
- name: sent131
dtype: string
- name: sent132
dtype: string
- name: sent133
dtype: string
- name: sent134
dtype: string
- name: sent135
dtype: string
- name: sent136
dtype: string
- name: player0
dtype: string
- name: rating0
dtype: string
- name: player1
dtype: string
- name: rating1
dtype: string
- name: player2
dtype: string
- name: rating2
dtype: string
- name: player3
dtype: string
- name: rating3
dtype: string
- name: player4
dtype: string
- name: rating4
dtype: string
- name: player5
dtype: string
- name: rating5
dtype: string
- name: player6
dtype: string
- name: rating6
dtype: string
- name: player7
dtype: string
- name: rating7
dtype: string
- name: player8
dtype: string
- name: rating8
dtype: string
- name: player9
dtype: string
- name: rating9
dtype: string
- name: player10
dtype: string
- name: rating10
dtype: string
- name: player11
dtype: string
- name: rating11
dtype: string
- name: player12
dtype: string
- name: rating12
dtype: string
- name: player13
dtype: string
- name: rating13
dtype: string
- name: player14
dtype: string
- name: rating14
dtype: string
- name: player15
dtype: string
- name: rating15
dtype: string
- name: player16
dtype: string
- name: rating16
dtype: string
- name: player17
dtype: string
- name: rating17
dtype: string
- name: player18
dtype: string
- name: rating18
dtype: string
- name: player19
dtype: string
- name: rating19
dtype: string
- name: player20
dtype: string
- name: rating20
dtype: string
- name: player21
dtype: string
- name: rating21
dtype: string
- name: player22
dtype: string
- name: rating22
dtype: string
- name: player23
dtype: string
- name: rating23
dtype: string
- name: player24
dtype: string
- name: rating24
dtype: string
- name: player25
dtype: string
- name: rating25
dtype: string
- name: player26
dtype: string
- name: rating26
dtype: string
- name: player27
dtype: string
- name: rating27
dtype: string
- name: player28
dtype: string
- name: rating28
dtype: string
- name: player29
dtype: string
- name: rating29
dtype: string
- name: player30
dtype: string
- name: rating30
dtype: string
- name: player31
dtype: string
- name: rating31
dtype: string
- name: player32
dtype: string
- name: rating32
dtype: string
- name: player33
dtype: string
- name: rating33
dtype: string
splits:
- name: train
num_bytes: 13072462
num_examples: 1996
download_size: 6901926
dataset_size: 13072462
---
# Dataset Card for "socce_report_analysis"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jilp00/youtoks-transformers | ---
dataset_info:
features:
- name: text
dtype: string
- name: token_count
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 2092099
num_examples: 1390
download_size: 1025873
dataset_size: 2092099
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Team-PIXEL/PIXELSum_en_wiki_for_TA | ---
license: apache-2.0
dataset_info:
features:
- name: text
struct:
- name: bytes
dtype: binary
- name: path
dtype: 'null'
- name: target
dtype: string
- name: num_text_patches
dtype: int64
splits:
- name: train
num_bytes: 288179303249
num_examples: 29404255
download_size: 281239419405
dataset_size: 288179303249
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
polinaeterna/select_true | ---
dataset_info:
features:
- name: '''; select true; --'
dtype: int64
splits:
- name: train
num_bytes: 40
num_examples: 5
download_size: 951
dataset_size: 40
---
# Dataset Card for "select_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EleutherAI/quirky_sciq | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: choices
sequence: string
- name: label
dtype: int64
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: character
dtype: string
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
splits:
- name: train
num_bytes: 5973156
num_examples: 9629
- name: validation
num_bytes: 1186489
num_examples: 2000
- name: test
num_bytes: 1186972
num_examples: 2000
download_size: 1782280
dataset_size: 8346617
---
# Dataset Card for "quirky_sciq"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nicolas-BZRD/Parallel_Global_Voices_English_French | ---
license: cc-by-3.0
dataset_info:
features:
- name: en
dtype: string
- name: fr
dtype: string
splits:
- name: train
num_bytes: 89720129
num_examples: 342060
download_size: 57746668
dataset_size: 89720129
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- translation
language:
- en
- fr
tags:
- parallel
- parallel data
size_categories:
- 100K<n<1M
---
# Parallel Global Voices (English-French)
Parallel Global Voices EN-FR is a parallel corpus generated from the Global Voices multilingual group of websites (http://globalvoices.org/), where volunteers publish and translate news stories in more than 40 languages. The original content from the Global Voices websites is made available by its authors and publishers under a Creative Commons Attribution license. The content was crawled in July-August 2015 by researchers at the NLP group of the Institute for Language and Speech Processing. Documents that are translations of each other were paired on the basis of their link information. After document pairing, segment alignments were extracted automatically. The results of the automatic alignment at document and segment level are distributed under a Creative Commons Attribution license.
### Attribution details
Parallel Global Voices (English - French) was created for the European Language Resources Coordination Action (ELRC) (http://lr-coordination.eu/) by researchers at the NLP group of the Institute for Language and Speech Processing (http://www.ilsp.gr/) with primary data copyrighted by Global Voices (https://globalvoices.org/) and is licensed under "CC-BY 3.0" (https://creativecommons.org/licenses/by/3.0/). |
ramsel/dataviz-sample | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 9022
num_examples: 11
download_size: 8204
dataset_size: 9022
---
# Dataset Card for "dataviz-sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Deojoandco/fnli | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 61159046
num_examples: 550152
- name: validation
num_bytes: 1120856
num_examples: 10000
- name: test
num_bytes: 1117922
num_examples: 10000
download_size: 20299372
dataset_size: 63397824
---
# Dataset Card for "fnli"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/cassin_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of cassin/カッシン/卡辛 (Azur Lane)
This is the dataset of cassin/カッシン/卡辛 (Azur Lane), containing 57 images and their tags.
The core tags of this character are `long_hair, black_hair, hair_ornament, hairclip, mole_under_eye, mole, low_ponytail, yellow_eyes, breasts, bangs, heterochromia, red_eyes`, which are pruned in this dataset.
Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 57 | 43.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cassin_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 57 | 33.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cassin_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 124 | 61.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cassin_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 57 | 41.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cassin_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 124 | 74.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cassin_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
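The `800` and `1200` packages above cap the shorter image side rather than resizing to a fixed size. A minimal sketch of that resize rule (a hypothetical helper for illustration, not the actual packaging code):

```python
def cap_shorter_side(width: int, height: int, limit: int = 800) -> tuple[int, int]:
    """Scale (width, height) so the shorter side does not exceed `limit`,
    preserving aspect ratio; images already within the limit are untouched."""
    shorter = min(width, height)
    if shorter <= limit:
        return width, height
    scale = limit / shorter
    return round(width * scale), round(height * scale)

print(cap_shorter_side(1600, 1200))  # shorter side 1200 -> capped to 800
print(cap_shorter_side(640, 480))    # already within limit, unchanged
```

The `raw` package instead aligns the *longer* (min-edge-to-1400) constraint described in its row; the same pattern applies with `min`/`max` swapped.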
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cassin_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, bare_shoulders, off_shoulder, collarbone, shirt, blush, black_thighhighs, simple_background, white_background, black_jacket, cleavage, brown_eyes, thigh_strap |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, pleated_skirt, serafuku, short_sleeves, white_shirt, blush, closed_mouth, plaid_skirt, solo, cross_earrings, looking_at_viewer, bike_shorts, blue_sailor_collar, green_bowtie, holding_phone, looking_at_phone, miniskirt, official_alternate_costume, shorts_under_skirt, smartphone |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | bare_shoulders | off_shoulder | collarbone | shirt | blush | black_thighhighs | simple_background | white_background | black_jacket | cleavage | brown_eyes | thigh_strap | pleated_skirt | serafuku | short_sleeves | white_shirt | closed_mouth | plaid_skirt | cross_earrings | bike_shorts | blue_sailor_collar | green_bowtie | holding_phone | looking_at_phone | miniskirt | official_alternate_costume | shorts_under_skirt | smartphone |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------------|:---------------|:-------------|:--------|:--------|:-------------------|:--------------------|:-------------------|:---------------|:-----------|:-------------|:--------------|:----------------|:-----------|:----------------|:--------------|:---------------|:--------------|:-----------------|:--------------|:---------------------|:---------------|:----------------|:-------------------|:------------|:-----------------------------|:---------------------|:-------------|
| 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
re2panda/click_bate_1000_final | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: output
dtype: string
- name: input
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 35358262.8
num_examples: 11400
- name: test
num_bytes: 1860961.2
num_examples: 600
download_size: 20825026
dataset_size: 37219224.0
---
# Dataset Card for "click_bate_1000_final"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdaptLLM/Headline | ---
configs:
- config_name: Headline
data_files:
- split: train
path: train.csv
- split: test
path: test.csv
task_categories:
- text-classification
- question-answering
- zero-shot-classification
language:
- en
tags:
- finance
---
# Domain Adaptation of Large Language Models
This repo contains the **Headline dataset** used in our **ICLR 2024** paper [Adapting Large Language Models via Reading Comprehension](https://huggingface.co/papers/2309.09530).
We explore **continued pre-training on domain-specific corpora** for large language models. While this approach enriches LLMs with domain knowledge, it significantly hurts their prompting ability for question answering. Inspired by human learning via reading comprehension, we propose a simple method to **transform large-scale pre-training corpora into reading comprehension texts**, consistently improving prompting performance across tasks in biomedicine, finance, and law domains. **Our 7B model competes with much larger domain-specific models like BloombergGPT-50B**.
### 🤗 We are currently working hard on developing models across different domains, scales and architectures! Please stay tuned! 🤗
**************************** **Updates** ****************************
* 2024/4/2: Released the raw data splits (train and test) of all the evaluation datasets
* 2024/1/16: 🎉 Our [research paper](https://huggingface.co/papers/2309.09530) has been accepted by ICLR 2024!!!🎉
* 2023/12/19: Released our [13B base models](https://huggingface.co/AdaptLLM/law-LLM-13B) developed from LLaMA-1-13B.
* 2023/12/8: Released our [chat models](https://huggingface.co/AdaptLLM/law-chat) developed from LLaMA-2-Chat-7B.
* 2023/9/18: Released our [paper](https://huggingface.co/papers/2309.09530), [code](https://github.com/microsoft/LMOps), [data](https://huggingface.co/datasets/AdaptLLM/law-tasks), and [base models](https://huggingface.co/AdaptLLM/law-LLM) developed from LLaMA-1-7B.
## Domain-Specific LLaMA-1
### LLaMA-1-7B
In our paper, we develop three domain-specific models from LLaMA-1-7B, which are also available in Huggingface: [Biomedicine-LLM](https://huggingface.co/AdaptLLM/medicine-LLM), [Finance-LLM](https://huggingface.co/AdaptLLM/finance-LLM) and [Law-LLM](https://huggingface.co/AdaptLLM/law-LLM). The performance of our AdaptLLM models compared to other domain-specific LLMs is shown below:
<p align='center'>
<img src="https://cdn-uploads.huggingface.co/production/uploads/650801ced5578ef7e20b33d4/6efPwitFgy-pLTzvccdcP.png" width="700">
</p>
### LLaMA-1-13B
Moreover, we scale up our base model to LLaMA-1-13B to see if **our method is similarly effective for larger-scale models**, and the results are consistently positive too: [Biomedicine-LLM-13B](https://huggingface.co/AdaptLLM/medicine-LLM-13B), [Finance-LLM-13B](https://huggingface.co/AdaptLLM/finance-LLM-13B) and [Law-LLM-13B](https://huggingface.co/AdaptLLM/law-LLM-13B).
## Domain-Specific LLaMA-2-Chat
Our method is also effective for aligned models! LLaMA-2-Chat requires a [specific data format](https://huggingface.co/blog/llama2#how-to-prompt-llama-2), and our **reading comprehension texts can perfectly fit the data format** by transforming each reading comprehension into a multi-turn conversation. We have also open-sourced chat models in different domains: [Biomedicine-Chat](https://huggingface.co/AdaptLLM/medicine-chat), [Finance-Chat](https://huggingface.co/AdaptLLM/finance-chat) and [Law-Chat](https://huggingface.co/AdaptLLM/law-chat).
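As a rough illustration of what this transformation looks like, the sketch below renders (question, answer) pairs from a reading-comprehension text in the LLaMA-2-Chat prompt format described in the linked blog post. The example pair is invented for illustration and this is not our actual preprocessing code:

```python
def to_llama2_chat(turns):
    """Render (user, assistant) pairs in the LLaMA-2-Chat prompt format."""
    prompt = ""
    for user, assistant in turns:
        # Each user turn is wrapped in [INST] ... [/INST]; the answer follows.
        prompt += f"<s>[INST] {user} [/INST] {assistant} </s>"
    return prompt

# A toy comprehension question derived from a finance headline:
prompt = to_llama2_chat([
    ("Does this headline report a price increase? Headline: Gold rises 2%.",
     "Yes, the headline reports that the price of gold increased by 2%."),
])
```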
## Domain-Specific Tasks
### Pre-templatized/Formatted Testing Splits
To easily reproduce our prompting results, we have uploaded the filled-in zero/few-shot input instructions and output completions of the test split of each domain-specific task: [biomedicine-tasks](https://huggingface.co/datasets/AdaptLLM/medicine-tasks), [finance-tasks](https://huggingface.co/datasets/AdaptLLM/finance-tasks), and [law-tasks](https://huggingface.co/datasets/AdaptLLM/law-tasks).
**Note:** these filled-in instructions are specifically tailored for models before alignment and do NOT fit the specific data format required for chat models.
### Raw Datasets
We have also uploaded the raw training and testing splits, to facilitate fine-tuning or other uses:
- [ChemProt](https://huggingface.co/datasets/AdaptLLM/ChemProt)
- [RCT](https://huggingface.co/datasets/AdaptLLM/RCT)
- [ConvFinQA](https://huggingface.co/datasets/AdaptLLM/ConvFinQA)
- [FiQA_SA](https://huggingface.co/datasets/AdaptLLM/FiQA_SA)
- [Headline](https://huggingface.co/datasets/AdaptLLM/Headline)
- [NER](https://huggingface.co/datasets/AdaptLLM/NER)
- [FPB](https://huggingface.co/datasets/AdaptLLM/FPB)
The other datasets used in our paper are already available on Hugging Face, and you can load them directly with the following code:
```python
from datasets import load_dataset
# MQP:
dataset = load_dataset('medical_questions_pairs')
# PubmedQA:
dataset = load_dataset('bigbio/pubmed_qa')
# USMLE:
dataset = load_dataset('GBaker/MedQA-USMLE-4-options')
# SCOTUS:
dataset = load_dataset("lex_glue", 'scotus')
# CaseHOLD:
dataset = load_dataset("lex_glue", 'case_hold')
# UNFAIR-ToS:
dataset = load_dataset("lex_glue", 'unfair_tos')
```
## Citation
If you find our work helpful, please cite us:
```bibtex
@inproceedings{
cheng2024adapting,
title={Adapting Large Language Models via Reading Comprehension},
author={Daixuan Cheng and Shaohan Huang and Furu Wei},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=y886UXPEZ0}
}
```
and the original dataset:
```bibtex
@article{Headline,
author = {Ankur Sinha and
Tanmay Khandait},
title = {Impact of News on the Commodity Market: Dataset and Results},
journal = {CoRR},
volume = {abs/2009.04202},
year = {2020}
}
``` |
rushdiodeh/rush | ---
license: apache-2.0
task_categories:
- token-classification
language:
- ar
tags:
- not-for-all-audiences
size_categories:
- 10B<n<100B
--- |
MCG-NJU/MultiSports | ---
annotations_creators:
- crowdsourced
language:
- en
language_creators:
- expert-generated
license:
- cc-by-nc-4.0
multilinguality:
- monolingual
pretty_name: MultiSports
size_categories: []
source_datasets:
- original
tags:
- video
- action detection
- spatial-temporal action localization
task_categories:
- image-classification
- object-detection
- other
task_ids:
- multi-class-image-classification
extra_gated_heading: "Acknowledge license to accept the repository"
extra_gated_prompt: "This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License"
extra_gated_fields:
I agree to use this dataset for non-commerical use ONLY: checkbox
---
# Dataset Card for MultiSports
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://deeperaction.github.io/datasets/multisports.html
- **Repository:** https://github.com/MCG-NJU/MultiSports
- **Paper:** https://arxiv.org/abs/2105.07404
- **Leaderboard:** https://paperswithcode.com/dataset/multisports
- **Point of Contact:** mailto: runyu_he@smail.nju.edu.cn
### Dataset Summary
Spatio-temporal action localization is an important and challenging problem in video understanding. Previous action detection benchmarks are limited in aspects of small numbers of instances in a trimmed video or low-level atomic actions. MultiSports is a multi-person dataset of spatio-temporal localized sports actions. Please refer to [this paper](https://arxiv.org/abs/2105.07404) for more details. Please refer to [this repository](https://github.com/MCG-NJU/MultiSports) for evaluation.
### Supported Tasks and Leaderboards
- `Spatial-temporal action localization`
Details about evaluation can be found in the [GitHub Repository](https://github.com/MCG-NJU/MultiSports). Previous challenge results can be found on [this page](https://deeperaction.github.io/results/index.html) and in [this CodaLab challenge](https://codalab.lisn.upsaclay.fr/competitions/3736).
### Languages
The class labels in the dataset are in English.
## Dataset Structure
### Data Instances
Demo is available on [dataset homepage](https://deeperaction.github.io/datasets/multisports.html).
The dataset contains ```rawframes.tar``` and ```multisports_GT.pkl```. The GT pkl file is a dictionary with the following structure:
```
{
'labels': ['label1', 'label2', ...],
'train_videos': [['train_vid_1', 'train_vid_2', ...]],
'test_videos': [['test_vid_1', 'test_vid_2', ...]],
'nframes': {
'vid_1': nframes_1,
'vid_2': nframes_2,
...
},
'resolution': {
'vid_1': resolution_1,
'vid_2': resolution_2,
...
},
'gttubes': {
'vid_1': {
'label_1': [tube_1, tube_2, ...],
'label_2': [tube_1, tube_2, ...],
...
}
...
}
}
```
Here a ```tube``` is a ```numpy.ndarray``` with ```nframes``` rows and 5 columns ```<frame number> <x1> <y1> <x2> <y2>```.
### Data Fields
Raw frames are organized according to their sport category. The pickle file of GT contains the following fields.
- labels: list of labels
- train_videos: a list with one split element containing the list of training videos
- test_videos: a list with one split element containing the list of validation videos
- nframes: dictionary that gives the number of frames for each video
- resolution: dictionary that gives a tuple ```(h,w)``` with the resolution of each video
- gttubes: dictionary that contains the GT tubes for each video. GT tubes are themselves dictionaries that associate each label index with a list of tubes. A ```tube``` is a ```numpy.ndarray``` with ```nframes``` rows and 5 columns ```<frame number> <x1> <y1> <x2> <y2>```.
Please note that the label index starts from 0 and the frame index starts from 1. For the label index ```i```, the label name is ```labels[i]```.
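Putting the above together, here is a minimal sketch of accessing a tube. It uses a toy in-memory dictionary with invented video names and coordinates in place of the real ```multisports_GT.pkl```:

```python
import pickle  # a real GT file is loaded with pickle.load(open(path, 'rb'))
import numpy as np

# Toy GT dictionary mirroring the structure of multisports_GT.pkl;
# the video ids and box coordinates below are invented for illustration.
gt = {
    'labels': ['aerobic push up', 'aerobic explosive push up'],
    'train_videos': [['aerobic_gymnastics/v_0001']],
    'test_videos': [['aerobic_gymnastics/v_0002']],
    'nframes': {'aerobic_gymnastics/v_0001': 3},
    'resolution': {'aerobic_gymnastics/v_0001': (720, 1280)},  # (h, w)
    'gttubes': {
        'aerobic_gymnastics/v_0001': {
            0: [np.array([[1, 10., 20., 50., 90.],   # frame index starts at 1
                          [2, 12., 21., 52., 91.],
                          [3, 14., 22., 54., 92.]])],
        }
    },
}

vid = gt['train_videos'][0][0]      # the single split element holds the video list
tube = gt['gttubes'][vid][0][0]     # first tube of label index 0
frames = tube[:, 0].astype(int)     # <frame number> column
boxes = tube[:, 1:]                 # <x1> <y1> <x2> <y2> columns
label_name = gt['labels'][0]        # label index 0 -> labels[0]
```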
<details>
<summary>
Click here to see the full list of MultiSports class labels mapping:
</summary>
|id|Class|
|--|-----|
| 0 | aerobic push up |
| 1 | aerobic explosive push up |
| 2 | aerobic explosive support |
| 3 | aerobic leg circle |
| 4 | aerobic helicopter |
| 5 | aerobic support |
| 6 | aerobic v support |
| 7 | aerobic horizontal support |
| 8 | aerobic straight jump |
| 9 | aerobic illusion |
| 10 | aerobic bent leg(s) jump |
| 11 | aerobic pike jump |
| 12 | aerobic straddle jump |
| 13 | aerobic split jump |
| 14 | aerobic scissors leap |
| 15 | aerobic kick jump |
| 16 | aerobic off axis jump |
| 17 | aerobic butterfly jump |
| 18 | aerobic split |
| 19 | aerobic turn |
| 20 | aerobic balance turn |
| 21 | volleyball serve |
| 22 | volleyball block |
| 23 | volleyball first pass |
| 24 | volleyball defend |
| 25 | volleyball protect |
| 26 | volleyball second pass |
| 27 | volleyball adjust |
| 28 | volleyball save |
| 29 | volleyball second attack |
| 30 | volleyball spike |
| 31 | volleyball dink |
| 32 | volleyball no offensive attack |
| 33 | football shoot |
| 34 | football long pass |
| 35 | football short pass |
| 36 | football through pass |
| 37 | football cross |
| 38 | football dribble |
| 39 | football trap |
| 40 | football throw |
| 41 | football diving |
| 42 | football tackle |
| 43 | football steal |
| 44 | football clearance |
| 45 | football block |
| 46 | football press |
| 47 | football aerial duels |
| 48 | basketball pass |
| 49 | basketball drive |
| 50 | basketball dribble |
| 51 | basketball 3-point shot |
| 52 | basketball 2-point shot |
| 53 | basketball free throw |
| 54 | basketball block |
| 55 | basketball offensive rebound |
| 56 | basketball defensive rebound |
| 57 | basketball pass steal |
| 58 | basketball dribble steal |
| 59 | basketball interfere shot |
| 60 | basketball pick-and-roll defensive |
| 61 | basketball sag |
| 62 | basketball screen |
| 63 | basketball pass-inbound |
| 64 | basketball save |
| 65 | basketball jump ball |
</details>
### Data Splits
| |train |validation| test |
|-------------|------:|---------:|------:|
|# of tubes |28514 |10116 | - |
*GT for the test split is not provided. Please wait for the new competition to start. Information will be updated on the [dataset homepage](https://deeperaction.github.io/datasets/multisports.html).*
## Dataset Creation
### Curation Rationale
Spatio-temporal action detection is an important and challenging problem in video understanding. Previous action detection benchmarks are limited in aspects of small numbers of instances in a trimmed video or low-level atomic actions.
### Source Data
#### Initial Data Collection and Normalization
> After choosing the four sports, we search for their competition videos by querying the name of sports like volleyball and the name of competition levels like Olympics and World Cup on YouTube, and then download videos from top search results. For each video, we only select high-resolution, e.g. 720P or 1080P, competition records and then manually cut them into clips of minutes, with less shot changes in each clip and to be more suitable for action detection.
#### Who are the source language producers?
The annotators of action categories and temporal boundaries are professional athletes of the corresponding sports. Please refer to [the paper](https://arxiv.org/abs/2105.07404) for more information.
### Annotations
#### Annotation process
1. (FIRST STAGE) A team of professional athletes generates records of the action label, the starting and ending frames, and the person box in the starting frame, which ensures the efficiency, accuracy and consistency of our annotation results.
2. At least one annotator with domain knowledge double-checks the annotations, corrects wrong or inaccurate ones, and adds missing annotations.
3. (SECOND STAGE) With the help of the FCOT tracking algorithm, a team of crowd-sourced annotators adjusts the bounding boxes of the tracking results at each frame for each record.
4. Each instance is double-checked by playing it at 5 fps, and inaccurate bounding boxes are manually corrected.
#### Who are the annotators?
For the first stage, annotators are professional athletes. For the second stage, annotators are common volunteers.
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Authors of [this paper](https://arxiv.org/abs/2105.07404)
- Yixuan Li
- Lei Chen
- Runyu He
- Zhenzhi Wang
- Gangshan Wu
- Limin Wang
### Licensing Information
<a rel="license" href="http://creativecommons.org/licenses/by-nc/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by-nc/4.0/88x31.png" /></a><br />This work is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by-nc/4.0/">Creative Commons Attribution-NonCommercial 4.0 International License</a>.
### Citation Information
If you find this dataset useful, please cite as
```
@InProceedings{Li_2021_ICCV,
author = {Li, Yixuan and Chen, Lei and He, Runyu and Wang, Zhenzhi and Wu, Gangshan and Wang, Limin},
title = {MultiSports: A Multi-Person Video Dataset of Spatio-Temporally Localized Sports Actions},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
month = {October},
year = {2021},
pages = {13536-13545}
}
```
### Contributions
Thanks to [@Judie1999](https://github.com/Judie1999) for adding this dataset. |
WangResearchLab/AgentInstruct | ---
configs:
- config_name: default
data_files:
- split: agentinstruct_instruction
path: instructions.parquet
language:
- en
size_categories:
- n<1K
---
# AgentInstruct: Agent Instructs Large Language Models to be General Zero-Shot Reasoners
The repo for paper [Agent Instructs Large Language Models to be General Zero-Shot Reasoners](https://arxiv.org/abs/2310.03710).
<p align="center">
📃 <a href="https://arxiv.org/abs/2310.03710" target="_blank">[Paper]</a> • 💻 <a href="https://github.com/wang-research-lab/agentinstruct" target="_blank">[Github]</a> • 🤗 <a href="https://huggingface.co/datasets/WangResearchLab/AgentInstruct" target="_blank">[HuggingFace]</a> • 📌 <a href="https://nlp.wustl.edu/blog/2023-11-02-agentinstruct/" target="_blank">[Blog]</a> • 📽 <a href="http://cgraywang.github.io/files/2023-agentinstruct-slides(10min).pdf" target="_blank">[Slides]</a> • 📋 <a href="http://cgraywang.github.io/files/2023-agentinstruct-poster.pdf" target="_blank">[Poster]</a>
</p>
## AgentInstruct Instruction Dataset
The **AgentInstruct** Instruction dataset contains agent instructions for the 29 datasets used in the paper. We encourage you to use our AgentInstruct methodology detailed in the paper and code to produce more instructions and evaluate on more datasets.
We provide an example of using the instructions and producing more instructions with our AgentInstruct below. The AgentInstruct Instruction dataset we used in the code is [here](https://huggingface.co/datasets/WangResearchLab/AgentInstruct/blob/main/instructions.json).
## Installation
Begin by cloning this repository:
```
git clone --recurse-submodules https://github.com/wang-research-lab/agentinstruct.git
```
Then, run the following to implement zero-shot AgentInstruct into the HELM submodule:
```
cd agentinstruct
bash src/agentinstruct/reasoning/helm_updates/update_helm.sh
```
Now, add the following api keys to `prod_env/credentials.conf`: `openaiApiKey` (from [here](https://openai.com/blog/openai-api)) and `bingSubscriptionKey` (from [here](https://www.microsoft.com/en-us/bing/apis/bing-web-search-api)). Use the following format:
```
openaiApiKey: [your key here]
bingSubscriptionKey: [your key here]
```
We would recommend using a [Python 3.10 docker image](https://hub.docker.com/layers/library/python/3.10/images/sha256-6eff601177b9fdfb85f383089b97468910ff59be129019b1588dc3f9ac862204?context=explore).
```
docker network create mynetwork
docker pull python:3.10
docker run --network=mynetwork -v ~/agentinstruct:/code/agentinstruct -it python:3.10 bash
```
Next, create a virtual environment:
```
cd /code/agentinstruct
python3 -m pip install virtualenv
python3 -m virtualenv -p python3.10 helm-venv
source helm-venv/bin/activate
```
Run the following to download the necessary dependencies:
```
pip install -e src/agentinstruct/reasoning/helm
pip install -r requirements.txt
```
*Note*: For running other models (vicuna-13b, llama-2-7b-chat, llama-2-13b-chat, llama-2-70b-chat), you must also follow the instructions [here](src/agentinstruct/reasoning/serve/README.md).
## Replicating Main Results
To replicate the main results on 28 datasets (excludes NewsQA for its license restrictions, see [here](src/agentinstruct/reasoning/helm_updates/src/helm/benchmark/scenarios/newsqa_scenario.py)) with a specific model (gpt-3.5-turbo, llama-2-7b-chat, llama-2-13b-chat, llama-2-70b-chat, vicuna-13b), run:
```
bash scripts/gpt-3.5-turbo.sh
bash scripts/llama-2-7b-chat.sh
bash scripts/llama-2-13b-chat.sh
bash scripts/llama-2-70b-chat.sh
bash scripts/vicuna-13b.sh
```
Results will be stored in ```benchmark_outputs/runs/{model}-agentinstruct/results.csv```.
## Customizing your Run
There are three key components of the zero-shot AgentInstruct pipeline: (1) generating agent instructions, (2) running reasoning steps with the instructions, and (3) formatting the results. In this section, we will look at each component in detail, focusing on a single dataset: AddSub. Note that nothing here is specific to AddSub; the same pipeline can be applied to any dataset, or even a combination of datasets!
### Generating Agent Instructions
First, to generate the agent instructions for AddSub, run the following:
```
bash scripts/generate_agent_instructions.sh scripts/run_specs/simple-gpt-3.5-turbo.conf addsub
```
We'll create a configuration file that specifies the run configuration. As an example, we'll look at the configuration file ```scripts/run_specs/simple-gpt-3.5-turbo.conf```, which specifies the configuration of running the AddSub dataset using GPT-3.5 Turbo:
```
entries: [
{description: "addsub:model=openai/gpt-3.5-turbo-0301,max_train_instances=0,instructions=agentinstruct", priority: 1}
]
```
The agent instructions for the AddSub dataset will be saved in ```instructions/addsub/instructions.json```. The agent's input, as well as the web sources used and intermediate prompts, will be saved under ```instructions/addsub/inputs.json``` and ```instructions/addsub/metadata.json``` respectively.
### Running Reasoning Steps
We'll use the same configuration file as above. To run reasoning steps with zero-shot AgentInstruct on AddSub, run the following:
```
bash scripts/run_reasoning.sh scripts/run_specs/simple-gpt-3.5-turbo.conf addsub 1000
```
The first two parameters are identical to those above, and the third represents the number of instances to run reasoning steps on. The results will be stored in ```benchmark_outputs/runs/addsub```.
*Note*: By default, zero-shot AgentInstruct reasoning will be done using the latest set of instructions generated. To run reasoning with the instructions used in the paper, run this script before the run_reasoning command:
```
python scripts/replicate.py
```
### Format Results
To easily format the evaluation results, run:
```
python src/agentinstruct/eval/format_results.py --suite addsub
```
The evaluation results will be saved in ```benchmark_output/runs/addsub/results.csv```. To see the full text output by instance, open ```benchmark_output/runs/addsub/'addsub:model=openai_gpt-3.5-turbo-0301,max_train_instances=0,instructions=agentinstruct'/scenario_state.json``` and search for ```full_text```.
*Note*: Normally, the results are formatted after all the run spec descriptions in the configuration file have been run. To see the stats for a single run spec description, view:
```
benchmark_output/runs/addsub/'addsub:model=openai_gpt-3.5-turbo-0301,max_train_instances=0,instructions=agentinstruct'/stats.json
```
### All Together Now
To run the above entire AgentInstruct pipeline in one go, run:
```
bash scripts/run.sh scripts/run_specs/simple-gpt-3.5-turbo.conf addsub 1000
```
This will run all 3 steps outlined above, and store the result in ```benchmark_outputs/runs/addsub```.
## Arguments
In this section, we'll cover various important run arguments.
### Run Configuration Arguments
A run spec describes a specific dataset to run. For example, the run spec for AddSub used above is:
```
{description: "addsub:model=openai/gpt-3.5-turbo-0301,max_train_instances=0,instructions=agentinstruct", priority: 1}
```
| argument | description | options|
|----|----|----|
| `model` | Model to use for inference. | `local/vicuna-13b` <br> `local/llama-2-7b-chat` <br> `local/llama-2-13b-chat` <br> `local/llama-2-70b-chat` <br> `openai/gpt-3.5-turbo-0301` |
| `max_train_instances` | Number of few-shot examples to prepend. Few-shot is not recommended. | int |
| `instructions` | Optional prompting method to use. `None` corresponds to standard zeroshot. | `agentinstruct` <br> `zeroshotcot` <br> `None` |
*Note*: Several datasets take additional arguments to specify a particular subset or task.
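For reference, each description string is a scenario name, a colon, and comma-separated `key=value` pairs. A minimal parser sketch (illustrative only, not the actual HELM parsing code) looks like:

```python
def parse_run_spec(description: str):
    """Split 'scenario:key=val,key=val,...' into (scenario, {key: val})."""
    scenario, _, args = description.partition(':')
    # Split each pair on the first '=' only, so values may contain '/' or '-'.
    kwargs = dict(pair.split('=', 1) for pair in args.split(',') if pair)
    return scenario, kwargs

scenario, kwargs = parse_run_spec(
    "addsub:model=openai/gpt-3.5-turbo-0301,max_train_instances=0,instructions=agentinstruct"
)
# scenario names the dataset; kwargs holds 'model', 'max_train_instances', 'instructions'.
```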
### Generating Agent Instructions Arguments
The main script to generate agent instructions is ```scripts/generate_agent_instructions.sh```. It takes the following 2 positional arguments:
| argument | description | options|
|----|----|----|
| 1st | Path to run spec file. | str |
| 2nd | Suite name under which to save instructions. | str |
Internally, the agent instructions are generated by first running dataset preprocessing (in ```src/agentinstruct/agent/utils/dataset_preprocessing.py```) and then running the instruction generation (in ```src/agentinstruct/agent/agent_instr_generation.py```). These are combined in ```src/agentinstruct/agent/agent_pipeline.py``` and called by ```scripts/generate_agent_instructions.sh```. GPT-4 is used as the agent LLM as in our paper.
### Running Reasoning Arguments
The main script to run reasoning is ```scripts/run_reasoning.sh```, which internally calls `helm-run`. It takes the following 4 positional arguments, as well as a placeholder for any additional argument to pass to `helm-run`:
| argument | description | options|
|----|--------------------------------------------------------------------------------------|----|
| 1st | Path to run spec file. | str |
| 2nd | Suite name under which to save outputs. | str |
| 3rd | Maximum number of instances to run. | int |
| 4th | Maximum number of threads from which to send requests. Defaults to 8 for all models. | int |
| 5th | Placeholder for any additional argument to pass to `helm-run`. | str |
### Outputting Results Arguments
The main script to format the results is ```src/agentinstruct/eval/format_results.py```. It takes a single named argument:
| argument | description | options|
|----|----|----|
| --suite | Suite name under which to find outputs. | str |
## Replicating Additional Results
To replicate the zero-shot (`zeroshot`) and zero-shot CoT (`zeroshotcot`) modes, run:
```
bash scripts/run_reasoning.sh scripts/run_specs/{mode}/{model}-{mode}.conf {model}-{mode} 1000 8
python src/agentinstruct/eval/format_results.py --suite {model}-{mode}
```
where `{mode}` is `zeroshot` or `zeroshotcot` and `{model}` is `vicuna-13b`, `llama-2-7b-chat`, `llama-2-13b-chat`, `llama-2-70b-chat`, or `gpt-3.5-turbo`.
*Note*: For standard zero-shot runs, pass `skip-expander` as the 5th positional argument.
## Citation
```bibtex
@article{crispino2023agent,
title={Agent Instructs Large Language Models to be General Zero-Shot Reasoners},
author={Crispino, Nicholas and Montgomery, Kyle and Zeng, Fankun and Song, Dawn and Wang, Chenguang},
journal={arXiv preprint arXiv:2310.03710},
year={2023}
}
``` |
atmallen/quirky_sciq_pythia-410m_alice | ---
dataset_info:
features:
- name: id
dtype: string
- name: choices
sequence: string
- name: label
dtype: int64
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: character
dtype: string
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: bob_log_odds
dtype: float64
splits:
- name: train
num_bytes: 14551988.0
num_examples: 23358
- name: validation
num_bytes: 1232235.0
num_examples: 2000
- name: test
num_bytes: 1255333.0
num_examples: 2000
download_size: 5454709
dataset_size: 17039556.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
ibranze/araproje_hellaswag_tr_f1 | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 88680
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_f1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-7B-v2 | ---
pretty_name: Evaluation run of openaccess-ai-collective/DPOpenHermes-7B-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openaccess-ai-collective/DPOpenHermes-7B-v2](https://huggingface.co/openaccess-ai-collective/DPOpenHermes-7B-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-7B-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T15:48:02.975332](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-7B-v2/blob/main/results_2023-12-09T15-48-02.975332.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6393858131029347,\n\
\ \"acc_stderr\": 0.03231519248140217,\n \"acc_norm\": 0.6405744963876552,\n\
\ \"acc_norm_stderr\": 0.032967768680137746,\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5922184046952629,\n\
\ \"mc2_stderr\": 0.015444038493597899\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6348122866894198,\n \"acc_stderr\": 0.014070265519268802,\n\
\ \"acc_norm\": 0.6663822525597269,\n \"acc_norm_stderr\": 0.013778687054176536\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.664708225453097,\n\
\ \"acc_stderr\": 0.004711275408138421,\n \"acc_norm\": 0.8522206731726748,\n\
\ \"acc_norm_stderr\": 0.0035415582637791008\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438665,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438665\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305526,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305526\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790492,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790492\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396997,\n\
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396997\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887048,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887048\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.01619780795684804,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.01619780795684804\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.013890862162876166,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.013890862162876166\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3642458100558659,\n\
\ \"acc_stderr\": 0.016094338768474596,\n \"acc_norm\": 0.3642458100558659,\n\
\ \"acc_norm_stderr\": 0.016094338768474596\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.012743072942653345,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.012743072942653345\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144724,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144724\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6879084967320261,\n \"acc_stderr\": 0.018745011201277657,\n \
\ \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.018745011201277657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5922184046952629,\n\
\ \"mc2_stderr\": 0.015444038493597899\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987727\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6360879454131918,\n \
\ \"acc_stderr\": 0.013252539227966195\n }\n}\n```"
repo_url: https://huggingface.co/lgaalves/llama-2-13b-chat-platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|arc:challenge|25_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|gsm8k|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hellaswag|10_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-48-02.975332.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T15-48-02.975332.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- '**/details_harness|winogrande|5_2023-12-09T15-48-02.975332.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T15-48-02.975332.parquet'
- config_name: results
data_files:
- split: 2023_12_09T15_48_02.975332
path:
- results_2023-12-09T15-48-02.975332.parquet
- split: latest
path:
- results_2023-12-09T15-48-02.975332.parquet
---
# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-7B-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openaccess-ai-collective/DPOpenHermes-7B-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openaccess-ai-collective/DPOpenHermes-7B-v2](https://huggingface.co/openaccess-ai-collective/DPOpenHermes-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-7B-v2",
"harness_winogrande_5",
split="train")
```
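Config names follow a fixed pattern — `harness`, then the task name, then the few-shot count — so both parts can be recovered programmatically. A small sketch (the naming convention is inferred from the config list above):

```python
def parse_config_name(name: str) -> dict:
    """Split a leaderboard config name like 'harness_hendrycksTest_world_religions_5'
    into its harness task and few-shot count (convention inferred from this card)."""
    parts = name.split("_")
    shots = int(parts[-1])        # trailing integer is the few-shot count
    task = "_".join(parts[1:-1])  # everything between 'harness' and the count
    return {"task": task, "shots": shots}

print(parse_config_name("harness_hendrycksTest_world_religions_5"))
# {'task': 'hendrycksTest_world_religions', 'shots': 5}
print(parse_config_name("harness_truthfulqa_mc_0"))
# {'task': 'truthfulqa_mc', 'shots': 0}
```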
## Latest results
These are the [latest results from run 2023-12-09T15:48:02.975332](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-7B-v2/blob/main/results_2023-12-09T15-48-02.975332.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6393858131029347,
"acc_stderr": 0.03231519248140217,
"acc_norm": 0.6405744963876552,
"acc_norm_stderr": 0.032967768680137746,
"mc1": 0.412484700122399,
"mc1_stderr": 0.01723329939957122,
"mc2": 0.5922184046952629,
"mc2_stderr": 0.015444038493597899
},
"harness|arc:challenge|25": {
"acc": 0.6348122866894198,
"acc_stderr": 0.014070265519268802,
"acc_norm": 0.6663822525597269,
"acc_norm_stderr": 0.013778687054176536
},
"harness|hellaswag|10": {
"acc": 0.664708225453097,
"acc_stderr": 0.004711275408138421,
"acc_norm": 0.8522206731726748,
"acc_norm_stderr": 0.0035415582637791008
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305526,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305526
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790492,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790492
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396997,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396997
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887048,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887048
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.01619780795684804,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.01619780795684804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876166,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876166
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3642458100558659,
"acc_stderr": 0.016094338768474596,
"acc_norm": 0.3642458100558659,
"acc_norm_stderr": 0.016094338768474596
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653345,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653345
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144724,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144724
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.018745011201277657,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.018745011201277657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.412484700122399,
"mc1_stderr": 0.01723329939957122,
"mc2": 0.5922184046952629,
"mc2_stderr": 0.015444038493597899
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987727
},
"harness|gsm8k|5": {
"acc": 0.6360879454131918,
"acc_stderr": 0.013252539227966195
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
HoangLe1312/codecontest-editorials | ---
dataset_info:
features:
- name: contest
dtype: string
- name: note
dtype: string
- name: editorial
dtype: string
- name: problems
list:
- name: content
dtype: string
- name: index
dtype: string
- name: note
dtype: string
splits:
- name: train
num_bytes: 15464367
num_examples: 872
download_size: 7414336
dataset_size: 15464367
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bsgreenb/cats_vs_dogs | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': cat
'1': dog
- name: id
dtype: int32
splits:
- name: train
num_bytes: 565589330.0
num_examples: 25000
- name: test
num_bytes: 286421182.5
num_examples: 12500
download_size: 859839390
dataset_size: 852010512.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
hieunguyen1053/htpl | ---
dataset_info:
features:
- name: init_url
dtype: string
- name: url
dtype: string
- name: html
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: new_question
dtype: string
- name: new_answer
dtype: string
splits:
- name: train
num_bytes: 1844692069
num_examples: 22528
download_size: 690698519
dataset_size: 1844692069
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
turkmen/dipperTR | ---
license: openrail
language:
- tr
--- |
fraviofranco/vozcortella | ---
license: openrail
---
|
misshimichka/flower_faces_dataset_v2 | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: cartoonized_image
dtype: image
splits:
- name: train
num_bytes: 239848099.0
num_examples: 114
download_size: 239859641
dataset_size: 239848099.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-100000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 983737
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
andyP/fake_news_en_opensources | ---
license: apache-2.0
annotations_creators:
- expert-generated
language_creators:
- found
task_categories:
- text-classification
language:
- en
multilinguality:
- monolingual
source_datasets:
- Opensources https://github.com/BigMcLargeHuge/opensources
- FakeNews Corpus https://github.com/several27/FakeNewsCorpus
tags:
- fake-news-detection
- fake news
- english
- nlp
task_ids:
- topic-classification
- fact-checking
pretty_name: Fake News Opensources
size_categories:
- 1M<n<10M
dataset_info:
features:
- name: id
dtype: int64
- name: type
dtype: string
- name: domain
dtype: string
- name: scraped_at
dtype: string
- name: url
dtype: string
- name: authors
dtype: string
- name: title
dtype: string
- name: content
dtype: string
---
# Dataset Card for "Fake News Opensources"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
<!--
- **Paper:** Fake News Opensources
-->
- **Homepage:** [https://github.com/AndyTheFactory/FakeNewsDataset](https://github.com/AndyTheFactory/FakeNewsDataset)
- **Repository:** [https://github.com/AndyTheFactory/FakeNewsDataset](https://github.com/AndyTheFactory/FakeNewsDataset)
- **Point of Contact:** [Andrei Paraschiv](https://github.com/AndyTheFactory)
### Dataset Summary
A consolidated and cleaned-up version of the opensources Fake News dataset.
The Fake News Corpus comprises 8,529,090 individual articles, classified into 12 classes: reliable, unreliable, political, bias, fake, conspiracy, rumor, clickbait, junk science, satire, hate and unknown. The articles were scraped between the end of 2017 and the beginning of 2018 from various news websites, totaling 647 distinct sources, and collecting articles dating from the years leading up to the 2016 US elections and the year after.
Documents were classified based on their source, using the curated website list provided by opensources.co, which leads to a highly imbalanced class distribution. The proposed source classification method was based on six criteria:
- Title and Domain name analysis,
- “About Us” analysis,
- source or study mentioning,
- writing style analysis,
- aesthetic analysis,
- social media analysis.
After extensive data cleaning and duplicate removal, we retain **5,915,569** records.
### Languages
English
## Dataset Structure
### Data Instances
An example record looks as follows.
```
{
'id': 4059480,
'type': 'political',
'domain': 'dailycaller.com',
'scraped_at': '2017-11-27',
'url': 'http://dailycaller.com/buzz/massachusettsunited-states/page/2/',
'authors': 'Jeff Winkler, Jonathan Strong, Ken Blackwell, Pat Mcmahon, Julia Mcclatchy, Admin, Matt Purple',
'title': 'The Daily Caller',
'content':'New Hampshire is the state with the highest median income in the nation, according to the U.S. Census Bureau’s report on income, poverty and health insurance',
}
```
### Data Fields
- `id`: The unique article ID
- `type`: the label of the record (one of: reliable, unreliable, political, bias, fake, conspiracy, rumor, clickbait, junk science, satire, hate)
- `scraped_at`: date of the original scrape run
- `url`: original article URL
- `authors`: comma-separated list of scraped authors
- `title`: original scraped article title
- `content`: full article text
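For many downstream experiments the 12 fine-grained `type` labels are collapsed into a coarser scheme. A minimal sketch (the grouping below is one possible, hypothetical choice, not part of the dataset itself):

```python
# Hypothetical coarse grouping of the 12 `type` labels; adjust to your task.
COARSE = {
    "reliable": "credible",
    "political": "credible",
    "bias": "noncredible",
    "fake": "noncredible",
    "conspiracy": "noncredible",
    "rumor": "noncredible",
    "clickbait": "noncredible",
    "unreliable": "noncredible",
    "junksci": "noncredible",
    "hate": "noncredible",
    "satire": "noncredible",
    "unknown": None,  # often dropped entirely
}

def coarse_label(record: dict):
    """Map a record's fine-grained `type` to a coarse credibility label."""
    return COARSE.get(record["type"])

example = {"id": 4059480, "type": "political", "content": "..."}
print(coarse_label(example))  # credible
```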
### Data Splits
Label | Nr Records
:---| :---:
reliable | 1807323
political | 968205
bias | 769874
fake | 762178
conspiracy | 494184
rumor | 375963
unknown | 230532
clickbait | 174176
unreliable | 104537
satire | 84735
junksci | 79099
hate | 64763
total | 5915569
## Dataset Creation
### Source Data
News Articles from various sites
#### Who are the source language producers?
News Articles, Blogs
### Annotations
#### Who are the annotators?
Journalists
### Other Known Limitations
The dataset was not manually filtered; therefore some of the labels might not be correct, and some of the URLs might not point to the actual articles but to other pages on the website. However, because the corpus is intended for use in training machine learning algorithms, those problems should not pose a practical issue.
Additionally, once the dataset is finalised (for now only about 80% has been cleaned and published), I do not intend to update it, so it might quickly become outdated for purposes other than content-based algorithms. However, any contributions are welcome!
### Licensing Information
This data is available and distributed under Apache-2.0 license
### Citation Information
```
tbd
```
|
dvgodoy/auto-mpg | ---
dataset_info:
features:
- name: mpg
dtype: float64
- name: cylinders
dtype: int64
- name: displacement
dtype: float64
- name: horsepower
dtype: float64
- name: weight
dtype: int64
- name: acceleration
dtype: float64
- name: model year
dtype: int64
- name: origin
dtype: int64
- name: car name
dtype: string
splits:
- name: train
num_bytes: 33470
num_examples: 398
download_size: 13036
dataset_size: 33470
---
# Dataset Card for "auto-mpg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
roa7n/patched_test_f_UCH_ps_50__v2023d | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence_str
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 48828729
num_examples: 110542
download_size: 4291765
dataset_size: 48828729
---
# Dataset Card for "patched_test_f_UCH_ps_50__v2023d"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
luisroque/instruct-python-llama2-20k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 34661192.7
num_examples: 19000
- name: test
num_bytes: 1824273.3
num_examples: 1000
download_size: 19060329
dataset_size: 36485466
license: cc-by-sa-3.0
task_categories:
- text-generation
language:
- en
pretty_name: Instruct Python 20k
size_categories:
- 10K<n<100K
---
# Fine-tuning Instruct Llama2 Stack Overflow Python Q&A
## Transformed Dataset
### Objective
The transformed dataset is designed for fine-tuning LLMs to improve Python coding assistance by focusing on high-quality content from Stack Overflow. It has around 20k instructions.
### Structure
- **Question-Answer Pairing**: Questions and answers are paired using the `ParentId` linkage.
- **Quality Focus**: Only top-rated answers for each question are retained.
- **HTML Tag Removal**: All HTML tags in the content are removed.
- **Combined Question Field**: Each question's title and body are merged.
- **Filtering**: Entries with negative scores or those not containing Python code structures are excluded.
Final columns:
- `score_question`
- `score_answer`
- `question`
- `answer`
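The pairing and quality-filtering steps above can be sketched as follows (a minimal illustration; the field names follow the Stack Overflow dump convention, and the actual pipeline may differ):

```python
def pair_top_answers(questions, answers):
    """Pair each question with its highest-scoring answer via ParentId,
    dropping pairs with a negative score (sketch of the steps above)."""
    best = {}
    for a in answers:
        qid = a["ParentId"]
        if qid not in best or a["Score"] > best[qid]["Score"]:
            best[qid] = a  # keep only the top-rated answer per question
    pairs = []
    for q in questions:
        a = best.get(q["Id"])
        if a is None or q["Score"] < 0 or a["Score"] < 0:
            continue  # exclude unanswered or negatively scored entries
        pairs.append({
            "score_question": q["Score"],
            "score_answer": a["Score"],
            "question": q["Title"] + "\n" + q["Body"],  # merged title + body
            "answer": a["Body"],
        })
    return pairs
```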
### Llama2 Transformation
The dataset has been transformed to match the Llama2 prompt structure, which is relevant for the model's fine-tuning. The format is as follows:
`<s>[INST] <<SYS>> {{ system_prompt }} <</SYS>> {{ user_message }} [/INST]`
Where:
- `system_prompt` gives context or instructions to the model.
- `user_message` is the user's query following the system prompt, expecting a particular response from the model.
This structure ensures the training aligns with Llama2's expectations, optimizing the fine-tuning quality.
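
A single record can be assembled into this template with a small helper. The system prompt below is purely illustrative, not the one used to build this dataset, and the answer is appended after `[/INST]` as the target completion:

```python
def build_llama2_prompt(system_prompt: str, user_message: str, answer: str) -> str:
    # Llama2 instruction format: system prompt inside <<SYS>> tags,
    # user message before [/INST], target response after it.
    return f"<s>[INST] <<SYS>> {system_prompt} <</SYS>> {user_message} [/INST] {answer}"

prompt = build_llama2_prompt(
    "You are a helpful Python coding assistant.",  # illustrative system prompt
    "How do I reverse a list in Python?",
    "Use `my_list[::-1]` or `my_list.reverse()`.",
)
```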
## Original Dataset
The dataset contains questions and answers from Stack Overflow with the `python` tag, covering the period from August 2, 2008, to October 19, 2016.
## License
All contributions are under the [CC-BY-SA 3.0](https://creativecommons.org/licenses/by-sa/3.0/). Attribution is required. The original dataset was posted [here](https://www.kaggle.com/datasets/stackoverflow/pythonquestions).
Keep in touch: [LinkedIn](https://www.linkedin.com/in/luisbrasroque/) |
open-llm-leaderboard/details_namirocks__mistral-shishya-all-hal-model-7b-ep3 | ---
pretty_name: Evaluation run of namirocks/mistral-shishya-all-hal-model-7b-ep3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [namirocks/mistral-shishya-all-hal-model-7b-ep3](https://huggingface.co/namirocks/mistral-shishya-all-hal-model-7b-ep3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_namirocks__mistral-shishya-all-hal-model-7b-ep3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-27T06:47:44.363242](https://huggingface.co/datasets/open-llm-leaderboard/details_namirocks__mistral-shishya-all-hal-model-7b-ep3/blob/main/results_2024-01-27T06-47-44.363242.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27600273267650555,\n\
\ \"acc_stderr\": 0.031033345939924385,\n \"acc_norm\": 0.27623146997547765,\n\
\ \"acc_norm_stderr\": 0.03186902642010444,\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.3642557797582405,\n\
\ \"mc2_stderr\": 0.014026846292362593\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3506825938566553,\n \"acc_stderr\": 0.013944635930726087,\n\
\ \"acc_norm\": 0.3796928327645051,\n \"acc_norm_stderr\": 0.014182119866974872\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6009759012148974,\n\
\ \"acc_stderr\": 0.004886969266944266,\n \"acc_norm\": 0.777733519219279,\n\
\ \"acc_norm_stderr\": 0.004149195626910384\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.3851851851851852,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.02713429162874171,\n\
\ \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.02713429162874171\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.029241883869628827,\n\
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.029241883869628827\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.24193548387096775,\n \"acc_stderr\": 0.024362599693031086,\n \"\
acc_norm\": 0.24193548387096775,\n \"acc_norm_stderr\": 0.024362599693031086\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"\
acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.03608541011573967,\n\
\ \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.03608541011573967\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479049,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479049\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.032210245080411544,\n\
\ \"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.032210245080411544\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2717948717948718,\n \"acc_stderr\": 0.022556551010132354,\n\
\ \"acc_norm\": 0.2717948717948718,\n \"acc_norm_stderr\": 0.022556551010132354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.029344572500634332,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.029344572500634332\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473835,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473835\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.27155963302752295,\n \"acc_stderr\": 0.019069098363191445,\n \"\
acc_norm\": 0.27155963302752295,\n \"acc_norm_stderr\": 0.019069098363191445\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16666666666666666,\n \"acc_stderr\": 0.025416428388767478,\n \"\
acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.025416428388767478\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5294117647058824,\n \"acc_stderr\": 0.03503235296367994,\n \"\
acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03503235296367994\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3206751054852321,\n \"acc_stderr\": 0.030381931949990403,\n \
\ \"acc_norm\": 0.3206751054852321,\n \"acc_norm_stderr\": 0.030381931949990403\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.36771300448430494,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.36771300448430494,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822586,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822586\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.32051282051282054,\n\
\ \"acc_stderr\": 0.030572811310299607,\n \"acc_norm\": 0.32051282051282054,\n\
\ \"acc_norm_stderr\": 0.030572811310299607\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3243933588761175,\n\
\ \"acc_stderr\": 0.01674092904716271,\n \"acc_norm\": 0.3243933588761175,\n\
\ \"acc_norm_stderr\": 0.01674092904716271\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.01421957078810399,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.01421957078810399\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875202,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875202\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290382,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290382\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.22058823529411764,\n \"acc_stderr\": 0.025187786660227262,\n\
\ \"acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.025187786660227262\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072775,\n\
\ \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072775\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.20816326530612245,\n\
\ \"acc_stderr\": 0.025991117672813296,\n \"acc_norm\": 0.20816326530612245,\n\
\ \"acc_norm_stderr\": 0.025991117672813296\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.3383084577114428,\n \"acc_stderr\": 0.03345563070339193,\n\
\ \"acc_norm\": 0.3383084577114428,\n \"acc_norm_stderr\": 0.03345563070339193\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.30120481927710846,\n \"acc_stderr\": 0.0357160923005348,\n\
\ \"acc_norm\": 0.30120481927710846,\n \"acc_norm_stderr\": 0.0357160923005348\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.4093567251461988,\n\
\ \"acc_stderr\": 0.037712831076265434,\n \"acc_norm\": 0.4093567251461988,\n\
\ \"acc_norm_stderr\": 0.037712831076265434\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.22766217870257038,\n \"mc1_stderr\": 0.01467925503211107,\n\
\ \"mc2\": 0.3642557797582405,\n \"mc2_stderr\": 0.014026846292362593\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.744277821625888,\n\
\ \"acc_stderr\": 0.012261253845440473\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/namirocks/mistral-shishya-all-hal-model-7b-ep3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|arc:challenge|25_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|gsm8k|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hellaswag|10_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T06-47-44.363242.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T06-47-44.363242.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- '**/details_harness|winogrande|5_2024-01-27T06-47-44.363242.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-27T06-47-44.363242.parquet'
- config_name: results
data_files:
- split: 2024_01_27T06_47_44.363242
path:
- results_2024-01-27T06-47-44.363242.parquet
- split: latest
path:
- results_2024-01-27T06-47-44.363242.parquet
---
# Dataset Card for Evaluation run of namirocks/mistral-shishya-all-hal-model-7b-ep3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [namirocks/mistral-shishya-all-hal-model-7b-ep3](https://huggingface.co/namirocks/mistral-shishya-all-hal-model-7b-ep3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_namirocks__mistral-shishya-all-hal-model-7b-ep3",
"harness_winogrande_5",
split="train")
```
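Each timestamped split name in the configs above is derived from the run timestamp. A minimal sketch of the (assumed) naming convention, based on the config listing in this card, is the following; `timestamp_to_split_name` is an illustrative helper, not part of the `datasets` API:

```python
def timestamp_to_split_name(ts: str) -> str:
    """Convert a run timestamp such as '2024-01-27T06:47:44.363242'
    into the split-name form used in this card's configs
    ('2024_01_27T06_47_44.363242') by replacing '-' and ':' with '_'."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2024-01-27T06:47:44.363242"))
# → 2024_01_27T06_47_44.363242
```

Passing the resulting name as `split=` selects that specific run instead of the `latest` alias.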
## Latest results
These are the [latest results from run 2024-01-27T06:47:44.363242](https://huggingface.co/datasets/open-llm-leaderboard/details_namirocks__mistral-shishya-all-hal-model-7b-ep3/blob/main/results_2024-01-27T06-47-44.363242.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27600273267650555,
"acc_stderr": 0.031033345939924385,
"acc_norm": 0.27623146997547765,
"acc_norm_stderr": 0.03186902642010444,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.3642557797582405,
"mc2_stderr": 0.014026846292362593
},
"harness|arc:challenge|25": {
"acc": 0.3506825938566553,
"acc_stderr": 0.013944635930726087,
"acc_norm": 0.3796928327645051,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.6009759012148974,
"acc_stderr": 0.004886969266944266,
"acc_norm": 0.777733519219279,
"acc_norm_stderr": 0.004149195626910384
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.02713429162874171,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.02713429162874171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.029241883869628827,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.029241883869628827
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24193548387096775,
"acc_stderr": 0.024362599693031086,
"acc_norm": 0.24193548387096775,
"acc_norm_stderr": 0.024362599693031086
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.03608541011573967,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.03608541011573967
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.032210245080411544,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.032210245080411544
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2717948717948718,
"acc_stderr": 0.022556551010132354,
"acc_norm": 0.2717948717948718,
"acc_norm_stderr": 0.022556551010132354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275794,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275794
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473835,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473835
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27155963302752295,
"acc_stderr": 0.019069098363191445,
"acc_norm": 0.27155963302752295,
"acc_norm_stderr": 0.019069098363191445
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.025416428388767478,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.025416428388767478
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03503235296367994,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03503235296367994
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3206751054852321,
"acc_stderr": 0.030381931949990403,
"acc_norm": 0.3206751054852321,
"acc_norm_stderr": 0.030381931949990403
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.36771300448430494,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.36771300448430494,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822586,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822586
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.32051282051282054,
"acc_stderr": 0.030572811310299607,
"acc_norm": 0.32051282051282054,
"acc_norm_stderr": 0.030572811310299607
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3243933588761175,
"acc_stderr": 0.01674092904716271,
"acc_norm": 0.3243933588761175,
"acc_norm_stderr": 0.01674092904716271
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.01421957078810399,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.01421957078810399
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875202,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875202
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290382,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290382
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.025187786660227262,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.025187786660227262
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20816326530612245,
"acc_stderr": 0.025991117672813296,
"acc_norm": 0.20816326530612245,
"acc_norm_stderr": 0.025991117672813296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3383084577114428,
"acc_stderr": 0.03345563070339193,
"acc_norm": 0.3383084577114428,
"acc_norm_stderr": 0.03345563070339193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.0357160923005348,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.0357160923005348
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4093567251461988,
"acc_stderr": 0.037712831076265434,
"acc_norm": 0.4093567251461988,
"acc_norm_stderr": 0.037712831076265434
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.3642557797582405,
"mc2_stderr": 0.014026846292362593
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440473
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
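The per-task scores above are plain nested JSON, so they can be post-processed directly. As an illustrative sketch (using a few accuracy values copied from the results above), here is one way to rank tasks from strongest to weakest:

```python
# A few per-task accuracies copied from the results JSON above.
scores = {
    "hendrycksTest-abstract_algebra": 0.22,
    "hendrycksTest-high_school_us_history": 0.5294117647058824,
    "hendrycksTest-world_religions": 0.4093567251461988,
    "hendrycksTest-high_school_statistics": 0.16666666666666666,
}

# Sort tasks by accuracy, best first.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked:
    print(f"{task}: {acc:.3f}")
```

The same pattern applies to the full results file after loading it with `json.load`.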
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
openchat/openchat_sharegpt4_dataset | ---
task_categories:
- conversational
- text-generation
language:
- en
pretty_name: OpenChat
size_categories:
- 1K<n<10K
---
This repository contains cleaned and filtered ShareGPT GPT-4 data used to train OpenChat. Details can be found in the [OpenChat repository](https://github.com/imoneoi/openchat). |
pvduy/exp_dpo_3 | ---
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 447928177
num_examples: 100121
- name: test
num_bytes: 4538037
num_examples: 750
download_size: 240211672
dataset_size: 452466214
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/4bd6ba75 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1337
dataset_size: 182
---
# Dataset Card for "4bd6ba75"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nz/anthropic_hh_rlhf | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 202114406
num_examples: 160800
- name: test
num_bytes: 10820339
num_examples: 8552
download_size: 127364682
dataset_size: 212934745
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/sanjouno_haruhime_isitwrongtotrytopickupgirlsinadungeon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sanjouno_haruhime (Dungeon ni Deai wo Motomeru no wa Machigatteiru no Darou ka)
This is the dataset of sanjouno_haruhime (Dungeon ni Deai wo Motomeru no wa Machigatteiru no Darou ka), containing 200 images and their tags.
Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
|
aditijha/instruct_v3_subset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 3930962.2554168818
num_examples: 1000
download_size: 2374280
dataset_size: 3930962.2554168818
---
# Dataset Card for "instruct_v3_subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RealTimeData/math_alltime | ---
dataset_info:
- config_name: 2017-01
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 80660853
num_examples: 941
download_size: 9158732
dataset_size: 80660853
- config_name: 2017-02
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 84851628
num_examples: 910
download_size: 10270205
dataset_size: 84851628
- config_name: 2017-03
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 65654141
num_examples: 873
download_size: 8389188
dataset_size: 65654141
- config_name: 2017-04
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 69962303
num_examples: 900
download_size: 8649741
dataset_size: 69962303
- config_name: 2017-05
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 61331035
num_examples: 850
download_size: 7502347
dataset_size: 61331035
- config_name: 2017-06
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 69089197
num_examples: 857
download_size: 8504218
dataset_size: 69089197
- config_name: 2017-07
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 65942734
num_examples: 833
download_size: 7792388
dataset_size: 65942734
- config_name: 2017-08
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 68340459
num_examples: 842
download_size: 8487447
dataset_size: 68340459
- config_name: 2017-09
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 61008346
num_examples: 896
download_size: 7278417
dataset_size: 61008346
- config_name: 2017-10
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 53163267
num_examples: 818
download_size: 6483992
dataset_size: 53163267
- config_name: 2017-11
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 59760183
num_examples: 808
download_size: 7924709
dataset_size: 59760183
- config_name: 2017-12
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 55924348
num_examples: 836
download_size: 6647153
dataset_size: 55924348
- config_name: 2018-01
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 53423968
num_examples: 804
download_size: 6435279
dataset_size: 53423968
- config_name: 2018-02
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 56097587
num_examples: 836
download_size: 6786404
dataset_size: 56097587
- config_name: 2018-03
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 52716955
num_examples: 811
download_size: 6716783
dataset_size: 52716955
- config_name: 2018-04
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 61021658
num_examples: 834
download_size: 7312214
dataset_size: 61021658
- config_name: 2018-05
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 55772617
num_examples: 786
download_size: 7085239
dataset_size: 55772617
- config_name: 2018-06
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 51150298
num_examples: 749
download_size: 6364046
dataset_size: 51150298
- config_name: 2018-07
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 54584695
num_examples: 758
download_size: 6726781
dataset_size: 54584695
- config_name: 2018-08
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 55593417
num_examples: 781
download_size: 6974572
dataset_size: 55593417
- config_name: 2018-09
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 54969100
num_examples: 823
download_size: 6338898
dataset_size: 54969100
- config_name: 2018-10
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 61315262
num_examples: 760
download_size: 6851372
dataset_size: 61315262
- config_name: 2018-11
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 60746966
num_examples: 716
download_size: 6647704
dataset_size: 60746966
- config_name: 2018-12
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 66850978
num_examples: 743
download_size: 8017159
dataset_size: 66850978
- config_name: 2019-01
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 66498129
num_examples: 757
download_size: 7133679
dataset_size: 66498129
- config_name: 2019-02
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 62762617
num_examples: 727
download_size: 7362944
dataset_size: 62762617
- config_name: 2019-03
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 53635552
num_examples: 722
download_size: 6159124
dataset_size: 53635552
- config_name: 2019-04
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 55324827
num_examples: 711
download_size: 6655057
dataset_size: 55324827
- config_name: 2019-05
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 56829578
num_examples: 723
download_size: 6558721
dataset_size: 56829578
- config_name: 2019-06
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 61139555
num_examples: 688
download_size: 7221420
dataset_size: 61139555
- config_name: 2019-07
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 53673145
num_examples: 683
download_size: 6416744
dataset_size: 53673145
- config_name: 2019-08
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 77910351
num_examples: 747
download_size: 9404169
dataset_size: 77910351
- config_name: 2019-09
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 63119202
num_examples: 745
download_size: 7318462
dataset_size: 63119202
- config_name: 2019-10
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 49155709
num_examples: 683
download_size: 5592949
dataset_size: 49155709
- config_name: 2019-11
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 48224122
num_examples: 709
download_size: 5549457
dataset_size: 48224122
- config_name: 2019-12
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 55688069
num_examples: 710
download_size: 6563642
dataset_size: 55688069
- config_name: 2020-01
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 53792398
num_examples: 683
download_size: 6403117
dataset_size: 53792398
- config_name: 2020-02
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 46752243
num_examples: 683
download_size: 5617224
dataset_size: 46752243
- config_name: 2020-03
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 44255484
num_examples: 650
download_size: 5392729
dataset_size: 44255484
- config_name: 2020-04
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 49661204
num_examples: 668
download_size: 6130487
dataset_size: 49661204
- config_name: 2020-05
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 43477816
num_examples: 642
download_size: 5454984
dataset_size: 43477816
- config_name: 2020-06
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 45100323
num_examples: 633
download_size: 6224900
dataset_size: 45100323
- config_name: 2020-07
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 68329723
num_examples: 719
download_size: 8616264
dataset_size: 68329723
- config_name: 2020-08
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 55807634
num_examples: 688
download_size: 6625344
dataset_size: 55807634
- config_name: 2020-09
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 50288522
num_examples: 679
download_size: 5669747
dataset_size: 50288522
- config_name: 2020-10
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 43771861
num_examples: 615
download_size: 5445208
dataset_size: 43771861
- config_name: 2020-11
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 45212400
num_examples: 649
download_size: 5644663
dataset_size: 45212400
- config_name: 2020-12
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 46070938
num_examples: 630
download_size: 5635182
dataset_size: 46070938
- config_name: 2021-01
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 56230519
num_examples: 681
download_size: 6937404
dataset_size: 56230519
- config_name: 2021-02
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 43007105
num_examples: 621
download_size: 5538417
dataset_size: 43007105
- config_name: 2021-03
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 49678764
num_examples: 689
download_size: 6273745
dataset_size: 49678764
- config_name: 2021-04
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 45003518
num_examples: 644
download_size: 5524111
dataset_size: 45003518
- config_name: 2021-05
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 37522647
num_examples: 629
download_size: 4804605
dataset_size: 37522647
- config_name: 2021-06
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 36752211
num_examples: 558
download_size: 4800667
dataset_size: 36752211
- config_name: 2021-07
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 34324442
num_examples: 536
download_size: 4535535
dataset_size: 34324442
- config_name: 2021-08
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 38737457
num_examples: 566
download_size: 4795296
dataset_size: 38737457
- config_name: 2021-09
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 42672055
num_examples: 593
download_size: 5900612
dataset_size: 42672055
- config_name: 2021-10
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 27437600
num_examples: 510
download_size: 3653512
dataset_size: 27437600
- config_name: 2021-11
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 25301425
num_examples: 481
download_size: 3579488
dataset_size: 25301425
- config_name: 2021-12
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 25259496
num_examples: 474
download_size: 3480663
dataset_size: 25259496
- config_name: 2022-01
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 31818974
num_examples: 514
download_size: 4209788
dataset_size: 31818974
- config_name: 2022-02
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 25615543
num_examples: 470
download_size: 3591296
dataset_size: 25615543
- config_name: 2022-03
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 19714500
num_examples: 444
download_size: 2932476
dataset_size: 19714500
- config_name: 2022-04
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 23915512
num_examples: 489
download_size: 3243798
dataset_size: 23915512
- config_name: 2022-05
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 24456246
num_examples: 471
download_size: 3460915
dataset_size: 24456246
- config_name: 2022-06
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 34130328
num_examples: 550
download_size: 4517837
dataset_size: 34130328
- config_name: 2022-07
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 33495692
num_examples: 489
download_size: 4148878
dataset_size: 33495692
- config_name: 2022-08
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 38369082
num_examples: 533
download_size: 4463578
dataset_size: 38369082
- config_name: 2022-09
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 29245493
num_examples: 513
download_size: 3888463
dataset_size: 29245493
- config_name: 2022-10
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 30693581
num_examples: 476
download_size: 3915331
dataset_size: 30693581
- config_name: 2022-11
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 22717890
num_examples: 429
download_size: 2909674
dataset_size: 22717890
- config_name: 2022-12
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 21326790
num_examples: 442
download_size: 3074597
dataset_size: 21326790
- config_name: 2023-01
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 28678459
num_examples: 484
download_size: 3746107
dataset_size: 28678459
- config_name: 2023-02
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 34068530
num_examples: 543
download_size: 4468866
dataset_size: 34068530
- config_name: 2023-03
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 28386987
num_examples: 474
download_size: 3582895
dataset_size: 28386987
- config_name: 2023-04
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 24505237
num_examples: 482
download_size: 3400300
dataset_size: 24505237
- config_name: 2023-05
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 30796646
num_examples: 497
download_size: 4010553
dataset_size: 30796646
- config_name: 2023-06
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 30563236
num_examples: 474
download_size: 3940672
dataset_size: 30563236
- config_name: 2023-07
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 28593939
num_examples: 496
download_size: 3857623
dataset_size: 28593939
- config_name: 2023-08
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 22784600
num_examples: 426
download_size: 3102013
dataset_size: 22784600
- config_name: 2023-09
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 20901199
num_examples: 392
download_size: 2919138
dataset_size: 20901199
- config_name: 2023-10
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 20846111
num_examples: 404
download_size: 3040637
dataset_size: 20846111
- config_name: 2023-11
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 25367205
num_examples: 460
download_size: 3587527
dataset_size: 25367205
- config_name: 2023-12
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 24516907
num_examples: 412
download_size: 3302967
dataset_size: 24516907
- config_name: 2024-01
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 30347026
num_examples: 515
download_size: 4061650
dataset_size: 30347026
- config_name: 2024-02
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 30435603
num_examples: 464
download_size: 3957232
dataset_size: 30435603
- config_name: 2024-03
features:
- name: question
dtype: string
- name: question_id
dtype: int64
- name: score
dtype: int64
- name: link
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: score
dtype: int64
- name: text
dtype: string
- name: verbolised
dtype: string
splits:
- name: train
num_bytes: 20921895
num_examples: 397
download_size: 2929840
dataset_size: 20921895
configs:
- config_name: 2017-01
data_files:
- split: train
path: 2017-01/train-*
- config_name: 2017-02
data_files:
- split: train
path: 2017-02/train-*
- config_name: 2017-03
data_files:
- split: train
path: 2017-03/train-*
- config_name: 2017-04
data_files:
- split: train
path: 2017-04/train-*
- config_name: 2017-05
data_files:
- split: train
path: 2017-05/train-*
- config_name: 2017-06
data_files:
- split: train
path: 2017-06/train-*
- config_name: 2017-07
data_files:
- split: train
path: 2017-07/train-*
- config_name: 2017-08
data_files:
- split: train
path: 2017-08/train-*
- config_name: 2017-09
data_files:
- split: train
path: 2017-09/train-*
- config_name: 2017-10
data_files:
- split: train
path: 2017-10/train-*
- config_name: 2017-11
data_files:
- split: train
path: 2017-11/train-*
- config_name: 2017-12
data_files:
- split: train
path: 2017-12/train-*
- config_name: 2018-01
data_files:
- split: train
path: 2018-01/train-*
- config_name: 2018-02
data_files:
- split: train
path: 2018-02/train-*
- config_name: 2018-03
data_files:
- split: train
path: 2018-03/train-*
- config_name: 2018-04
data_files:
- split: train
path: 2018-04/train-*
- config_name: 2018-05
data_files:
- split: train
path: 2018-05/train-*
- config_name: 2018-06
data_files:
- split: train
path: 2018-06/train-*
- config_name: 2018-07
data_files:
- split: train
path: 2018-07/train-*
- config_name: 2018-08
data_files:
- split: train
path: 2018-08/train-*
- config_name: 2018-09
data_files:
- split: train
path: 2018-09/train-*
- config_name: 2018-10
data_files:
- split: train
path: 2018-10/train-*
- config_name: 2018-11
data_files:
- split: train
path: 2018-11/train-*
- config_name: 2018-12
data_files:
- split: train
path: 2018-12/train-*
- config_name: 2019-01
data_files:
- split: train
path: 2019-01/train-*
- config_name: 2019-02
data_files:
- split: train
path: 2019-02/train-*
- config_name: 2019-03
data_files:
- split: train
path: 2019-03/train-*
- config_name: 2019-04
data_files:
- split: train
path: 2019-04/train-*
- config_name: 2019-05
data_files:
- split: train
path: 2019-05/train-*
- config_name: 2019-06
data_files:
- split: train
path: 2019-06/train-*
- config_name: 2019-07
data_files:
- split: train
path: 2019-07/train-*
- config_name: 2019-08
data_files:
- split: train
path: 2019-08/train-*
- config_name: 2019-09
data_files:
- split: train
path: 2019-09/train-*
- config_name: 2019-10
data_files:
- split: train
path: 2019-10/train-*
- config_name: 2019-11
data_files:
- split: train
path: 2019-11/train-*
- config_name: 2019-12
data_files:
- split: train
path: 2019-12/train-*
- config_name: 2020-01
data_files:
- split: train
path: 2020-01/train-*
- config_name: 2020-02
data_files:
- split: train
path: 2020-02/train-*
- config_name: 2020-03
data_files:
- split: train
path: 2020-03/train-*
- config_name: 2020-04
data_files:
- split: train
path: 2020-04/train-*
- config_name: 2020-05
data_files:
- split: train
path: 2020-05/train-*
- config_name: 2020-06
data_files:
- split: train
path: 2020-06/train-*
- config_name: 2020-07
data_files:
- split: train
path: 2020-07/train-*
- config_name: 2020-08
data_files:
- split: train
path: 2020-08/train-*
- config_name: 2020-09
data_files:
- split: train
path: 2020-09/train-*
- config_name: 2020-10
data_files:
- split: train
path: 2020-10/train-*
- config_name: 2020-11
data_files:
- split: train
path: 2020-11/train-*
- config_name: 2020-12
data_files:
- split: train
path: 2020-12/train-*
- config_name: 2021-01
data_files:
- split: train
path: 2021-01/train-*
- config_name: 2021-02
data_files:
- split: train
path: 2021-02/train-*
- config_name: 2021-03
data_files:
- split: train
path: 2021-03/train-*
- config_name: 2021-04
data_files:
- split: train
path: 2021-04/train-*
- config_name: 2021-05
data_files:
- split: train
path: 2021-05/train-*
- config_name: 2021-06
data_files:
- split: train
path: 2021-06/train-*
- config_name: 2021-07
data_files:
- split: train
path: 2021-07/train-*
- config_name: 2021-08
data_files:
- split: train
path: 2021-08/train-*
- config_name: 2021-09
data_files:
- split: train
path: 2021-09/train-*
- config_name: 2021-10
data_files:
- split: train
path: 2021-10/train-*
- config_name: 2021-11
data_files:
- split: train
path: 2021-11/train-*
- config_name: 2021-12
data_files:
- split: train
path: 2021-12/train-*
- config_name: 2022-01
data_files:
- split: train
path: 2022-01/train-*
- config_name: 2022-02
data_files:
- split: train
path: 2022-02/train-*
- config_name: 2022-03
data_files:
- split: train
path: 2022-03/train-*
- config_name: 2022-04
data_files:
- split: train
path: 2022-04/train-*
- config_name: 2022-05
data_files:
- split: train
path: 2022-05/train-*
- config_name: 2022-06
data_files:
- split: train
path: 2022-06/train-*
- config_name: 2022-07
data_files:
- split: train
path: 2022-07/train-*
- config_name: 2022-08
data_files:
- split: train
path: 2022-08/train-*
- config_name: 2022-09
data_files:
- split: train
path: 2022-09/train-*
- config_name: 2022-10
data_files:
- split: train
path: 2022-10/train-*
- config_name: 2022-11
data_files:
- split: train
path: 2022-11/train-*
- config_name: 2022-12
data_files:
- split: train
path: 2022-12/train-*
- config_name: 2023-01
data_files:
- split: train
path: 2023-01/train-*
- config_name: 2023-02
data_files:
- split: train
path: 2023-02/train-*
- config_name: 2023-03
data_files:
- split: train
path: 2023-03/train-*
- config_name: 2023-04
data_files:
- split: train
path: 2023-04/train-*
- config_name: 2023-05
data_files:
- split: train
path: 2023-05/train-*
- config_name: 2023-06
data_files:
- split: train
path: 2023-06/train-*
- config_name: 2023-07
data_files:
- split: train
path: 2023-07/train-*
- config_name: 2023-08
data_files:
- split: train
path: 2023-08/train-*
- config_name: 2023-09
data_files:
- split: train
path: 2023-09/train-*
- config_name: 2023-10
data_files:
- split: train
path: 2023-10/train-*
- config_name: 2023-11
data_files:
- split: train
path: 2023-11/train-*
- config_name: 2023-12
data_files:
- split: train
path: 2023-12/train-*
- config_name: 2024-01
data_files:
- split: train
path: 2024-01/train-*
- config_name: 2024-02
data_files:
- split: train
path: 2024-02/train-*
- config_name: 2024-03
data_files:
- split: train
path: 2024-03/train-*
---
|
vhtran/en-de-2023 | ---
license: cc-by-4.0
---
A dataset for German-to-English translation.
DBQ/Prada.Product.prices.Italy | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Italy - Prada - Product-level price list
tags:
- webscraping
- ecommerce
- Prada
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 1274261
num_examples: 2533
download_size: 364017
dataset_size: 1274261
---
# Prada web scraped data
## About the website
The **fashion industry in EMEA**, particularly in **Italy**, is long-standing and globally respected, home to prestigious fashion houses and excellent craftsmanship. One of Italy's leading fashion brands is **Prada**, an iconic name synonymous with luxury and style. Prada operates in a competitive space characterized by innovative design, high-quality materials, and desirability cultivated through brand prestige. Like the rest of the fashion industry, Prada has been moving steadily into the digital space, and with the surge in online shopping, **Ecommerce** has become increasingly relevant. This dataset provides insights from **Ecommerce product-list page (PLP) data** specific to the Prada brand in the Italian market.
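As a minimal illustrative sketch (assuming the column semantics implied by the schema above, not verified against the actual data), the `flg_discount` flag corresponds to comparing the current `price` with the `full_price`:

```python
def flg_discount(full_price: float, price: float) -> int:
    # A product is flagged as discounted when its current price
    # is strictly below its full (list) price.
    return 1 if price < full_price else 0
```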
## Link to **dataset**
[Italy - Prada - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Prada%20Product-prices%20Italy/r/recUq5K9dC8eYLDss)
|
yanekyuk/wikikey | ---
license: mit
---
|
EleutherAI/proof-pile-2 | ---
task_categories:
- text-generation
language:
- en
tags:
- math
size_categories:
- 10B<n<100B
---
<img src="proofpile_logo.jpg" width="500">
[ArXiv](http://arxiv.org/abs/2310.10631) | [Models](https://huggingface.co/EleutherAI/llemma_34b) | [Data](https://huggingface.co/datasets/EleutherAI/proof-pile-2) | [Code](https://github.com/EleutherAI/math-lm) | [Blog](https://blog.eleuther.ai/llemma/) | [Sample Explorer](https://llemma-demo.github.io/)
[Zhangir Azerbayev](https://zhangir-azerbayev.github.io/), [Hailey Schoelkopf](https://github.com/haileyschoelkopf), [Keiran Paster](https://keirp.com), [Marco Dos Santos](https://github.com/dsantosmarco), [Stephen McAleer](https://www.andrew.cmu.edu/user/smcaleer/), [Albert Q. Jiang](https://albertqjiang.github.io/), [Jia Deng](https://www.cs.princeton.edu/~jiadeng/), [Stella Biderman](https://www.stellabiderman.com/), [Sean Welleck](https://wellecks.com/)
The **Proof-Pile-2** is a 55 billion token dataset of mathematical and scientific documents. This dataset was created in order to train the [Llemma 7B](https://huggingface.co/EleutherAI/llemma_7b) and [Llemma 34B](https://huggingface.co/EleutherAI/llemma_34b) models. It consists of three subsets:
- `arxiv` (29B tokens): the ArXiv subset of [RedPajama](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T)
- `open-web-math` (15B tokens): The [OpenWebMath](https://huggingface.co/datasets/open-web-math/open-web-math) dataset, which contains much of the high-quality mathematical text from the internet.
- `algebraic-stack` (11B tokens): A new dataset of mathematical code, including numerical computing, computer algebra, and formal mathematics.
You can download the dataset as follows:
```python
from datasets import load_dataset
ds = load_dataset("EleutherAI/proof-pile-2")
# To load only a specific subset, pass it as an argument, e.g.
ds_arxiv = load_dataset("EleutherAI/proof-pile-2", "arxiv")
```
### Schema
Each dataset row has the following structure
```python
{
"text": ..., # document text
"meta": ..., # JSON string of metadata, schema specific to data source
}
```
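Since `meta` is a JSON string rather than a nested object, it needs to be decoded before use. A minimal sketch (the `"source"` field here is purely illustrative; the actual metadata schema depends on the data source):

```python
import json

def parse_row(row: dict) -> tuple[str, dict]:
    # Decode the JSON-encoded metadata string into a Python dict;
    # the keys of the resulting dict vary by data source.
    return row["text"], json.loads(row["meta"])

# Illustrative row with a hypothetical metadata field
text, meta = parse_row({"text": "example document", "meta": '{"source": "arxiv"}'})
```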
### Dataset Contents
For detailed documentation of the ArXiv and web subsets, refer to [RedPajama](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T) and [OpenWebMath](https://huggingface.co/datasets/open-web-math/open-web-math). The following table enumerates the contents of the AlgebraicStack by programming language. The AlgebraicStack is filtered to only include documents that contain mathematics, as judged by hand-crafted, language-specific heuristics.
| Language | AlgebraicStack tokens |
|-----------|-----------------------|
| Agda | 35.2 M |
| C | 25.1 M |
| C++ | 954.1 M |
| Coq | 281.9 M |
| Fortran | 724.9 M |
| GAP | 3.6 M |
| Haskell | 9.1 M |
| Idris | 10.9 M |
| Isabelle | 1,089.7 M |
| Julia | 531.0 M |
| Jupyter | 199.1 M |
| Lean | 285.6 M |
| Maple | 2.0 M |
| Matlab | 65.8 M |
| Python | 6,098.8 M |
| R | 71.3 M |
| Tex | 567.7 M |
| **Total** | **10,955.7 M** |
### License
We do not alter the license of any of the underlying data.
### Version History
**v1.1.0**: Contains an updated version of OpenWebMath, namely the one available at [open-web-math/open-web-math](https://huggingface.co/datasets/open-web-math/open-web-math). This version of OpenWebMath has slightly improved filtering, for example the removal of very short documents.
**v1.0.0**: The data used to train the [Llemma 7B](https://huggingface.co/EleutherAI/llemma_7b) and [Llemma 34B](https://huggingface.co/EleutherAI/llemma_34b). Uses a development version of OpenWebMath.
### Citation
For the entire Proof-Pile-2, cite
```
@misc{azerbayev2023llemma,
title={Llemma: An Open Language Model For Mathematics},
author={Zhangir Azerbayev and Hailey Schoelkopf and Keiran Paster and Marco Dos Santos and Stephen McAleer and Albert Q. Jiang and Jia Deng and Stella Biderman and Sean Welleck},
year={2023},
eprint={2310.10631},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
For the ArXiv subset, cite
```
@software{together2023redpajama,
author = {Together Computer},
title = {RedPajama: An Open Source Recipe to Reproduce LLaMA training dataset},
month = April,
year = 2023,
url = {https://github.com/togethercomputer/RedPajama-Data}
}
```
For OpenWebMath, cite
```
@misc{paster2023openwebmath,
title={OpenWebMath: An Open Dataset of High-Quality Mathematical Web Text},
author={Keiran Paster and Marco Dos Santos and Zhangir Azerbayev and Jimmy Ba},
year={2023},
eprint={2310.06786},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
|
autoevaluate/autoeval-eval-xsum-default-5381b8-67099145593 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: t5-small
metrics: []
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: t5-small
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@michaeldesmond](https://huggingface.co/michaeldesmond) for evaluating this model. |
joey234/mmlu-moral_disputes-rule-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 108868
num_examples: 346
download_size: 60737
dataset_size: 108868
---
# Dataset Card for "mmlu-moral_disputes-rule-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DrBenchmark/QUAERO | ---
language:
- fr
license: other
multilinguality: monolingual
pretty_name: QUAERO
homepage: https://quaerofrenchmed.limsi.fr/
task_categories:
- token-classification
tags:
- medical
size_categories:
- 1K<n<10K
---
# Dataset Card for QUAERO
## Dataset Description
- **Homepage:** https://quaerofrenchmed.limsi.fr/
- **Pubmed:** True
- **Public:** True
- **Tasks:** Named-Entity Recognition (NER)
The QUAERO French Medical Corpus has been initially developed as a resource for named entity recognition and normalization [1]. It was then improved with the purpose of creating a gold standard set of normalized entities for French biomedical text, that was used in the CLEF eHealth evaluation lab [2][3].
A selection of MEDLINE titles and EMEA documents was manually annotated. The annotation process was guided by concepts in the Unified Medical Language System (UMLS):
1. Ten types of clinical entities, as defined by the following UMLS Semantic Groups (Bodenreider and McCray 2003) were annotated: Anatomy, Chemical and Drugs, Devices, Disorders, Geographic Areas, Living Beings, Objects, Phenomena, Physiology, Procedures.
2. The annotations were made in a comprehensive fashion, so that nested entities were marked and entities could be mapped to more than one UMLS concept. In particular:
   - If a mention can refer to more than one Semantic Group, all the relevant Semantic Groups should be annotated. For instance, the mention “récidive” (recurrence) in the phrase “prévention des récidives” (recurrence prevention) should be annotated with the category “DISORDER” (CUI C2825055) and the category “PHENOMENON” (CUI C0034897).
   - If a mention can refer to more than one UMLS concept within the same Semantic Group, all the relevant concepts should be annotated. For instance, the mention “maniaques” (obsessive) in the phrase “patients maniaques” (obsessive patients) should be annotated with CUIs C0564408 and C0338831 (category “DISORDER”).
   - Entities whose span overlaps with that of another entity should still be annotated. For instance, in the phrase “infarctus du myocarde” (myocardial infarction), the mention “myocarde” (myocardium) should be annotated with category “ANATOMY” (CUI C0027061) and the mention “infarctus du myocarde” should be annotated with category “DISORDER” (CUI C0027051).
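To make the nested-annotation convention concrete, here is a minimal illustrative representation of the “infarctus du myocarde” example (a hypothetical span-based encoding for illustration only, not the BioC or BRAT schema):

```python
# Each entity records its character offsets (end-exclusive),
# UMLS Semantic Group, and the CUIs it maps to.
text = "infarctus du myocarde"
entities = [
    {"span": (0, 21), "group": "DISORDER", "cuis": ["C0027051"]},
    {"span": (13, 21), "group": "ANATOMY", "cuis": ["C0027061"]},
]

def nested_pairs(entities):
    # Return (outer, inner) pairs where one entity's span
    # fully contains another's, i.e. nested annotations.
    return [
        (a, b)
        for a in entities
        for b in entities
        if a is not b
        and a["span"][0] <= b["span"][0]
        and b["span"][1] <= a["span"][1]
    ]
```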
The QUAERO French Medical Corpus BioC release comprises a subset of the QUAERO French Medical corpus, as follows:
Training data (BRAT version used in CLEF eHealth 2015 task 1b as training data):
- MEDLINE_train_bioc file: 833 MEDLINE titles, annotated with normalized entities in the BioC format
- EMEA_train_bioc file: 3 EMEA documents, segmented into 11 sub-documents, annotated with normalized entities in the BioC format
Development data (BRAT version used in CLEF eHealth 2015 task 1b as test data and in CLEF eHealth 2016 task 2 as development data):
- MEDLINE_dev_bioc file: 832 MEDLINE titles, annotated with normalized entities in the BioC format
- EMEA_dev_bioc file: 3 EMEA documents, segmented into 12 sub-documents, annotated with normalized entities in the BioC format
Test data (BRAT version used in CLEF eHealth 2016 task 2 as test data):
- MEDLINE_test_bioc folder: 833 MEDLINE titles, annotated with normalized entities in the BioC format
- EMEA folder_test_bioc: 4 EMEA documents, segmented into 15 sub-documents, annotated with normalized entities in the BioC format
This release of the QUAERO French Medical Corpus, BioC version, comes in the BioC format, obtained through automatic conversion from the original BRAT format with the Brat2BioC tool https://bitbucket.org/nicta_biomed/brat2bioc developed by Jimeno Yepes et al.
Antonio Jimeno Yepes, Mariana Neves, Karin Verspoor
Brat2BioC: conversion tool between brat and BioC
BioCreative IV track 1 - BioC: The BioCreative Interoperability Initiative, 2013
Please note that the original version of the QUAERO corpus distributed in the CLEF eHealth 2015 and 2016 challenges came in the stand-alone BRAT format. It was distributed with the CLEF eHealth evaluation tool. This original distribution of the QUAERO French Medical corpus is available separately from https://quaerofrenchmed.limsi.fr
All questions regarding the task or data should be addressed to aurelie.neveol@limsi.fr
## Citation Information
```
@InProceedings{neveol14quaero,
author = {Névéol, Aurélie and Grouin, Cyril and Leixa, Jeremy
and Rosset, Sophie and Zweigenbaum, Pierre},
title = {The {QUAERO} {French} Medical Corpus: A Ressource for
Medical Entity Recognition and Normalization},
OPTbooktitle = {Proceedings of the Fourth Workshop on Building
and Evaluating Ressources for Health and Biomedical
Text Processing},
booktitle = {Proc of BioTextMining Work},
OPTseries = {BioTxtM 2014},
year = {2014},
pages = {24--30},
}
``` |
Christianevc/fullTestingData | ---
license: unknown
---
|
learn3r/SDG_math | ---
dataset_info:
features:
- name: jargon
dtype: string
- name: definition
dtype: string
splits:
- name: train
num_bytes: 38022
num_examples: 200
download_size: 23657
dataset_size: 38022
---
# Dataset Card for "SDG_math"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_86 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1377070296
num_examples: 270438
download_size: 1404210357
dataset_size: 1377070296
---
# Dataset Card for "chunk_86"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Codec-SUPERB/cv_13_zh_tw_synth | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: id
dtype: string
splits:
- name: original
num_bytes: 1409834613.762
num_examples: 61154
- name: academicodec_hifi_16k_320d
num_bytes: 6915926342.0
num_examples: 61154
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 6915926342.0
num_examples: 61154
- name: academicodec_hifi_24k_320d
num_bytes: 10379169862.0
num_examples: 61154
- name: audiodec_24k_320d
num_bytes: 10413391582.0
num_examples: 61154
- name: dac_16k
num_bytes: 6943952982.0
num_examples: 61154
- name: dac_24k
num_bytes: 10395499998.0
num_examples: 61154
- name: dac_44k
num_bytes: 19097120420.0
num_examples: 61154
- name: encodec_24k
num_bytes: 10395562436.0
num_examples: 61154
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 6928228346.0
num_examples: 61154
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 6928231074.0
num_examples: 61154
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 6932025314.0
num_examples: 61154
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 6932025314.0
num_examples: 61154
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 6932025314.0
num_examples: 61154
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 6932025314.0
num_examples: 61154
- name: speech_tokenizer_16k
num_bytes: 6946484422.0
num_examples: 61154
download_size: 109934173015
dataset_size: 131397429675.762
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k
path: data/encodec_24k-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
---
|
CyberHarem/myrrh_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of myrrh (Fire Emblem)
This is the dataset of myrrh (Fire Emblem), containing 247 images and their tags.
The core tags of this character are `purple_hair, twintails, wings, dragon_wings, red_eyes, multi-tied_hair, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 247 | 335.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/myrrh_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 247 | 190.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/myrrh_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 574 | 395.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/myrrh_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 247 | 298.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/myrrh_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 574 | 549.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/myrrh_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/myrrh_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, white_background, closed_mouth, simple_background, smile, dress, dragon_girl |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, dress, sandals, simple_background, solo, white_background, wristband, dragon_girl, full_body, looking_at_viewer, closed_mouth, own_hands_together |
| 2 | 31 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, solo, long_sleeves, fake_animal_ears, halloween_costume, bat_ears, fur_trim, dress, simple_background, open_mouth, white_background |
| 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, blush, nipples, nude, pussy, small_breasts, solo, navel, loli, spread_legs |
| 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, hetero, navel, nipples, open_mouth, small_breasts, solo_focus, blush, mosaic_censoring, sex, vaginal, 1boy, loli, nude, pussy, spread_legs, tears, 3boys, dragon_girl, multiple_penises, panties_around_one_leg |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | white_background | closed_mouth | simple_background | smile | dress | dragon_girl | sandals | wristband | full_body | own_hands_together | long_sleeves | fake_animal_ears | halloween_costume | bat_ears | fur_trim | open_mouth | blush | nipples | nude | pussy | small_breasts | navel | loli | spread_legs | hetero | solo_focus | mosaic_censoring | sex | vaginal | 1boy | tears | 3boys | multiple_penises | panties_around_one_leg |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------------------|:---------------|:--------------------|:--------|:--------|:--------------|:----------|:------------|:------------|:---------------------|:---------------|:-------------------|:--------------------|:-----------|:-----------|:-------------|:--------|:----------|:-------|:--------|:----------------|:--------|:-------|:--------------|:---------|:-------------|:-------------------|:------|:----------|:-------|:--------|:--------|:-------------------|:-------------------------|
| 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 31 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | X | | X | | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
healthcorum/autotrain-data-anm2-25mh-u5jo | ---
dataset_info:
features:
- name: target
dtype: string
- name: autotrain_text
dtype: string
splits:
- name: train
num_bytes: 36008183
num_examples: 9998
- name: validation
num_bytes: 36008183
num_examples: 9998
download_size: 11959104
dataset_size: 72016366
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "autotrain-data-anm2-25mh-u5jo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/jaguar_warrior_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of jaguar_warrior/ジャガーマン/豹人 (Fate/Grand Order)
This is the dataset of jaguar_warrior/ジャガーマン/豹人 (Fate/Grand Order), containing 29 images and their tags.
The core tags of this character are `animal_ears, short_hair, orange_hair, brown_eyes, brown_hair, tail, fang, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 29 | 35.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jaguar_warrior_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 29 | 31.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jaguar_warrior_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 70 | 62.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jaguar_warrior_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/jaguar_warrior_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------|
| 0 | 29 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | solo, open_mouth, 1girl, looking_at_viewer, smile, hood, animal_costume, holding, blush, tiger_print |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | solo | open_mouth | 1girl | looking_at_viewer | smile | hood | animal_costume | holding | blush | tiger_print |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:-------------|:--------|:--------------------|:--------|:-------|:-----------------|:----------|:--------|:--------------|
| 0 | 29 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X |
|
krinal/embeddings_state_of_union | ---
license: apache-2.0
---
Embeddings generated from an English text corpus file.
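A typical downstream use of such sentence embeddings is semantic search via cosine similarity. A minimal sketch in plain Python, using toy 4-dimensional vectors as stand-ins for the real 384-dimensional MiniLM embeddings (the vectors and names below are hypothetical, not values from this dataset):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# toy vectors standing in for the real 384-dim MiniLM embeddings
query = [0.1, 0.3, 0.5, 0.1]
passages = {
    'p1': [0.1, 0.3, 0.5, 0.1],
    'p2': [0.9, 0.0, 0.1, 0.0],
}
# rank stored passages by similarity to the query embedding
best = max(passages, key=lambda k: cosine_similarity(query, passages[k]))
print(best)  # p1
```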
Model used: sentence-transformers/all-MiniLM-L6-v2 |
nguyenminh871/reentrancy_solidity_function | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: func
dtype: string
- name: target
dtype: bool
- name: project
dtype: string
splits:
- name: train
num_bytes: 840896
num_examples: 3203
download_size: 156960
dataset_size: 840896
---
# Dataset Card for "reentrancy_solidity_function"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nadav/pixel_glue_stsb_low_noise | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: float32
splits:
- name: validation
num_bytes: 39630800.5
num_examples: 1500
download_size: 39537172
dataset_size: 39630800.5
---
# Dataset Card for "pixel_glue_stsb_low_noise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigbio/med_qa | ---
language:
- en
- zh
bigbio_language:
- English
- Chinese (Simplified)
- Chinese (Traditional, Taiwan)
license: unknown
multilinguality: multilingual
bigbio_license_shortname: UNKNOWN
pretty_name: MedQA
homepage: https://github.com/jind11/MedQA
bigbio_pubmed: False
bigbio_public: True
bigbio_tasks:
- QUESTION_ANSWERING
---
# Dataset Card for MedQA
## Dataset Description
- **Homepage:** https://github.com/jind11/MedQA
- **Pubmed:** False
- **Public:** True
- **Tasks:** QA
In this work, we present the first free-form multiple-choice OpenQA dataset for solving medical problems, MedQA,
collected from the professional medical board exams. It covers three languages: English, simplified Chinese, and
traditional Chinese, and contains 12,723, 34,251, and 14,123 questions for the three languages, respectively. Together
with the question data, we also collect and release a large-scale corpus from medical textbooks from which the reading
comprehension models can obtain necessary knowledge for answering the questions.
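Since MedQA is multiple-choice, evaluation reduces to comparing a predicted option key against the gold one per question. A minimal scoring sketch; the record below is a hypothetical MedQA-style example, and field names such as `options` and `answer_idx` are assumptions about the schema rather than guaranteed column names:

```python
# hypothetical MedQA-style record; real field names may differ
sample = {
    'question': 'A 45-year-old man presents with crushing chest pain...',
    'options': {'A': 'Aortic dissection', 'B': 'Myocardial infarction',
                'C': 'Pericarditis', 'D': 'Pulmonary embolism'},
    'answer_idx': 'B',
}

def score(records, predictions):
    """Fraction of records whose predicted option key matches the gold key."""
    correct = sum(1 for r, p in zip(records, predictions) if r['answer_idx'] == p)
    return correct / len(records)

acc = score([sample], ['B'])
print(acc)  # 1.0
```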
## Citation Information
```
@article{jin2021disease,
title={What disease does this patient have? a large-scale open domain question answering dataset from medical exams},
author={Jin, Di and Pan, Eileen and Oufattole, Nassim and Weng, Wei-Hung and Fang, Hanyi and Szolovits, Peter},
journal={Applied Sciences},
volume={11},
number={14},
pages={6421},
year={2021},
publisher={MDPI}
}
```
|
flpelerin/openorca-alpaca-15k | ---
license: cc-by-4.0
---
|
ostapeno/flanv2_100k | ---
license: apache-2.0
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 147796259
num_examples: 100000
download_size: 85036882
dataset_size: 147796259
---
|
liuyanchen1015/MULTI_VALUE_mrpc_aint_have | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 4455
num_examples: 17
- name: train
num_bytes: 12459
num_examples: 45
- name: validation
num_bytes: 1262
num_examples: 5
download_size: 23628
dataset_size: 18176
---
# Dataset Card for "MULTI_VALUE_mrpc_aint_have"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gigant/tib_complete_metadata | ---
dataset_info:
features:
- name: title
dtype: string
- name: href
dtype: string
- name: description
dtype: 'null'
- name: url_vid
dtype: string
- name: release_date
dtype: string
- name: subject
dtype: string
- name: genre
dtype: string
- name: abstract
dtype: string
- name: language
dtype: string
- name: doi
dtype: string
- name: license
dtype: string
- name: author
dtype: string
- name: contributors
dtype: string
splits:
- name: train
num_bytes: 30171096
num_examples: 22091
download_size: 11964701
dataset_size: 30171096
---
# Dataset Card for "tib_complete_metadata"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_FredrikBL__NeuralPipe-7B-slerp | ---
pretty_name: Evaluation run of FredrikBL/NeuralPipe-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FredrikBL/NeuralPipe-7B-slerp](https://huggingface.co/FredrikBL/NeuralPipe-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FredrikBL__NeuralPipe-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T00:21:59.975641](https://huggingface.co/datasets/open-llm-leaderboard/details_FredrikBL__NeuralPipe-7B-slerp/blob/main/results_2024-03-22T00-21-59.975641.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6453218800457399,\n\
\ \"acc_stderr\": 0.03212887690836472,\n \"acc_norm\": 0.6457679471487517,\n\
\ \"acc_norm_stderr\": 0.032784859928949854,\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.598389086821388,\n\
\ \"mc2_stderr\": 0.015156739153282793\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.64419795221843,\n \"acc_stderr\": 0.013990571137918762,\n\
\ \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.013678810399518827\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6697868950408286,\n\
\ \"acc_stderr\": 0.004693285694663837,\n \"acc_norm\": 0.8618801035650269,\n\
\ \"acc_norm_stderr\": 0.003443206472757467\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055263,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055263\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.02412112541694119,\n \
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36201117318435755,\n\
\ \"acc_stderr\": 0.016073067350153087,\n \"acc_norm\": 0.36201117318435755,\n\
\ \"acc_norm_stderr\": 0.016073067350153087\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179604,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179604\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015057,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015057\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174934,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174934\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.598389086821388,\n\
\ \"mc2_stderr\": 0.015156739153282793\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8011049723756906,\n \"acc_stderr\": 0.011218629972515303\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6868840030326004,\n \
\ \"acc_stderr\": 0.012774285669385084\n }\n}\n```"
repo_url: https://huggingface.co/FredrikBL/NeuralPipe-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|arc:challenge|25_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|arc:challenge|25_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|gsm8k|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|gsm8k|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hellaswag|10_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hellaswag|10_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T22-20-22.252622.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-21-59.975641.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T00-21-59.975641.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- '**/details_harness|winogrande|5_2024-03-21T22-20-22.252622.parquet'
- split: 2024_03_22T00_21_59.975641
path:
- '**/details_harness|winogrande|5_2024-03-22T00-21-59.975641.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T00-21-59.975641.parquet'
- config_name: results
data_files:
- split: 2024_03_21T22_20_22.252622
path:
- results_2024-03-21T22-20-22.252622.parquet
- split: 2024_03_22T00_21_59.975641
path:
- results_2024-03-22T00-21-59.975641.parquet
- split: latest
path:
- results_2024-03-22T00-21-59.975641.parquet
---
# Dataset Card for Evaluation run of FredrikBL/NeuralPipe-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FredrikBL/NeuralPipe-7B-slerp](https://huggingface.co/FredrikBL/NeuralPipe-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FredrikBL__NeuralPipe-7B-slerp",
"harness_winogrande_5",
split="train")
```
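Because each per-run split name is a zero-padded timestamp (`YYYY_MM_DDTHH_MM_SS.micro`), the most recent run can be identified with a plain lexicographic comparison — a minimal sketch, using the two split names that appear in the configs above:

```python
# Per-run split names follow the pattern YYYY_MM_DDTHH_MM_SS.micro.
# The fields are zero-padded, so lexicographic order matches
# chronological order and no date parsing is needed.
splits = ["2024_03_21T22_20_22.252622", "2024_03_22T00_21_59.975641"]

latest = max(splits)  # the run that the "latest" split mirrors
```

The "latest" split in each configuration simply points at the files of this most recent run.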
## Latest results
These are the [latest results from run 2024-03-22T00:21:59.975641](https://huggingface.co/datasets/open-llm-leaderboard/details_FredrikBL__NeuralPipe-7B-slerp/blob/main/results_2024-03-22T00-21-59.975641.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6453218800457399,
"acc_stderr": 0.03212887690836472,
"acc_norm": 0.6457679471487517,
"acc_norm_stderr": 0.032784859928949854,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.598389086821388,
"mc2_stderr": 0.015156739153282793
},
"harness|arc:challenge|25": {
"acc": 0.64419795221843,
"acc_stderr": 0.013990571137918762,
"acc_norm": 0.6757679180887372,
"acc_norm_stderr": 0.013678810399518827
},
"harness|hellaswag|10": {
"acc": 0.6697868950408286,
"acc_stderr": 0.004693285694663837,
"acc_norm": 0.8618801035650269,
"acc_norm_stderr": 0.003443206472757467
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055263,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055263
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977927,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36201117318435755,
"acc_stderr": 0.016073067350153087,
"acc_norm": 0.36201117318435755,
"acc_norm_stderr": 0.016073067350153087
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179604,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015057,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015057
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.01890101532209309,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.01890101532209309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174934,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174934
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.598389086821388,
"mc2_stderr": 0.015156739153282793
},
"harness|winogrande|5": {
"acc": 0.8011049723756906,
"acc_stderr": 0.011218629972515303
},
"harness|gsm8k|5": {
"acc": 0.6868840030326004,
"acc_stderr": 0.012774285669385084
}
}
```
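The aggregate `"all"` accuracy above is an average over the per-task accuracies. A small sketch of recomputing such an average from a handful of the entries in the JSON above (abbreviated to three tasks for illustration — the real aggregate covers every task):

```python
# A few per-task entries copied from the results JSON above (abbreviated).
results = {
    "harness|hendrycksTest-anatomy|5":   {"acc": 0.6148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527},
    "harness|hendrycksTest-virology|5":  {"acc": 0.5240963855421686},
}

# Unweighted mean accuracy across the selected tasks.
accs = [entry["acc"] for entry in results.values()]
mean_acc = sum(accs) / len(accs)
```

With all tasks included, this kind of mean is what the leaderboard reports as the aggregated metric.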
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pioivenium/im-map-dataset-test | ---
license: openrail
language:
- en
pretty_name: map_test
size_categories:
- 10K<n<100K
--- |
Arthuerwang/Downsampled_imbd_dataset | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 16760000.0
num_examples: 10000
- name: test
num_bytes: 1676000.0
num_examples: 1000
download_size: 0
dataset_size: 18436000.0
---
# Dataset Card for "Downsampled_imbd_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlekseyKorshuk/chai-experiment-v1-chatml | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversation
list:
- name: content
dtype: string
- name: do_train
dtype: bool
- name: role
dtype: string
splits:
- name: train
num_bytes: 2519356815.0
num_examples: 499663
download_size: 1321137823
dataset_size: 2519356815.0
---
# Dataset Card for "chai-experiment-v1-chatml"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |