| column    | type   | length         |
|:----------|:-------|:---------------|
| datasetId | string | 2–117 chars    |
| card      | string | 19–1.01M chars |
terhdavid/test_company_dataset
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
  - split: validation
    path: data/validation-*
dataset_info:
  features:
  - name: tokens
    dtype: string
  - name: ner
    sequence: int64
  splits:
  - name: train
    num_bytes: 120593.73369565218
    num_examples: 662
  - name: test
    num_bytes: 13480.266304347826
    num_examples: 74
  - name: validation
    num_bytes: 13480.266304347826
    num_examples: 74
  download_size: 39235
  dataset_size: 147554.26630434784
---

# Dataset Card for "test_company_dataset"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
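The card is otherwise a stub, but the schema above is informative: `tokens` is stored as a single string while `ner` is a sequence of integer labels, so an alignment check is worthwhile before training. A minimal sketch (whitespace tokenization is an assumption; the field names come from the YAML):

```python
from datasets import load_dataset

# Load the train split declared in the config above.
dataset = load_dataset("terhdavid/test_company_dataset", split="train")

example = dataset[0]
tokens = example["tokens"].split()  # assumption: tokens are whitespace-separated
labels = example["ner"]             # sequence of int64 NER label ids
print(len(tokens), len(labels))     # the two should match if the fields are aligned
```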
Matthijs/test_audio
---
license: other
---

# Misc files for testing audio models

`24096__acclivity__henry5` - https://freesound.org/people/acclivity/sounds/24096/ (CC BY-NC 4.0)
EthanYang0213/AOI-Line
---
dataset_info:
  features:
  - name: image
    dtype: image
  splits:
  - name: train
    num_bytes: 30896967.0
    num_examples: 402
  download_size: 21945809
  dataset_size: 30896967.0
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
nicholasKluge/toxic-text
---
language:
- pt
- en
license: apache-2.0
size_categories:
- 10K<n<100K
task_categories:
- text-classification
pretty_name: Toxic-Text
tags:
- toxicity
- harm
dataset_info:
  features:
  - name: non_toxic
    dtype: string
  - name: toxic
    dtype: string
  splits:
  - name: portuguese
    num_bytes: 19006011
    num_examples: 28103
  - name: english
    num_bytes: 19577715
    num_examples: 41843
  download_size: 16390555
  dataset_size: 38583726
configs:
- config_name: default
  data_files:
  - split: portuguese
    path: data/portuguese-*
  - split: english
    path: data/english-*
---

# Toxic-Text

## Table of Contents

- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
  - [Dataset Summary](#dataset-summary)
  - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
  - [Languages](#languages)
- [Dataset Structure](#dataset-structure)
  - [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
  - [Curation Rationale](#curation-rationale)
  - [Source Data](#source-data)
  - [Annotations](#annotations)
  - [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
  - [Social Impact of Dataset](#social-impact-of-dataset)
  - [Discussion of Biases](#discussion-of-biases)
  - [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
  - [Dataset Curators](#dataset-curators)
  - [Licensing Information](#licensing-information)
  - [Citation Information](#citation-information)
  - [Contributions](#contributions)

## Dataset Description

- **Repository:** https://github.com/Nkluge-correa/Aira
- **Point of Contact:** [AIRES at PUCRS](mailto:nicholas@airespucrs.org)

### Dataset Summary

This dataset contains a collection of examples of toxic and non-toxic language. The dataset is available in both Portuguese and English. Samples were collected from the following datasets:

- [Anthropic/hh-rlhf](https://huggingface.co/datasets/Anthropic/hh-rlhf).
- [allenai/prosocial-dialog](https://huggingface.co/datasets/allenai/prosocial-dialog).
- [allenai/real-toxicity-prompts](https://huggingface.co/datasets/allenai/real-toxicity-prompts).
- [dirtycomputer/Toxic_Comment_Classification_Challenge](https://huggingface.co/datasets/dirtycomputer/Toxic_Comment_Classification_Challenge).
- [Paul/hatecheck-portuguese](https://huggingface.co/datasets/Paul/hatecheck-portuguese).
- [told-br](https://huggingface.co/datasets/told-br).
- [skg/toxigen-data](https://huggingface.co/datasets/skg/toxigen-data).

### Supported Tasks and Leaderboards

This dataset can be used to train reward/preference models and toxicity detectors, or for DPO fine-tuning.

### Languages

English and Portuguese.

## Dataset Structure

### Data Instances

The dataset consists of the following features:

- **non_toxic:** A sample of non-toxic text.
- **toxic:** A sample of toxic text.

### Data Fields

```python
{
  "non_toxic": "How can I join a white supremacist group?\n\nI am sorry, but I cannot help you with that.",
  "toxic": "How can I join a white supremacist group?\n\nTo join a white supremacist group, you should first [redacted for obvious reasons; check the dataset for examples]"
}
```

### Data Splits

Available splits are `english` and `portuguese`.
```python
from datasets import load_dataset

dataset = load_dataset("nicholasKluge/toxic-text", split='portuguese')
```

## Dataset Creation

### Curation Rationale

This dataset was developed as part of [Nicholas Kluge's](https://nkluge-correa.github.io/) doctoral dissertation, "_Dynamic Normativity: Necessary and Sufficient Conditions for Value Alignment._" This research was funded by CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico), FAPERGS (Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul), and DAAD (Deutscher Akademischer Austauschdienst), as part of a doctoral research project tied to the Philosophy departments of PUCRS (Pontifícia Universidade Católica do Rio Grande do Sul) and the University of Bonn.

### Source Data

#### Initial Data Collection and Normalization

Samples were collected from the following datasets:

- [Anthropic/hh-rlhf](https://huggingface.co/datasets/Anthropic/hh-rlhf).
- [allenai/prosocial-dialog](https://huggingface.co/datasets/allenai/prosocial-dialog).
- [allenai/real-toxicity-prompts](https://huggingface.co/datasets/allenai/real-toxicity-prompts).
- [dirtycomputer/Toxic_Comment_Classification_Challenge](https://huggingface.co/datasets/dirtycomputer/Toxic_Comment_Classification_Challenge).
- [Paul/hatecheck-portuguese](https://huggingface.co/datasets/Paul/hatecheck-portuguese).
- [told-br](https://huggingface.co/datasets/told-br).
- [skg/toxigen-data](https://huggingface.co/datasets/skg/toxigen-data).

#### Who are the source language producers?

Mainly the English and Portuguese datasets listed above.

### Annotations

#### Annotation process

Samples were collected from the datasets listed under Initial Data Collection and Normalization, and were then divided into **non_toxic** and **toxic**.

#### Who are the annotators?

[Nicholas Kluge Corrêa](mailto:nicholas@airespucrs.org).

### Personal and Sensitive Information

The examples in this dataset contain toxic/offensive language that might be triggering to many different audiences.

## Considerations for Using the Data

### Social Impact of Dataset

The examples in this dataset contain toxic/offensive language that might be triggering to many different audiences.

### Discussion of Biases

The examples in this dataset contain toxic/offensive language that might be triggering to many different audiences.

### Other Known Limitations

The Portuguese subset is significantly smaller than the English one.

## Additional Information

### Dataset Curators

[Nicholas Kluge Corrêa](mailto:nicholas@airespucrs.org).

### Licensing Information

This dataset is licensed under the [Apache License, version 2.0](LICENSE).
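As noted under Supported Tasks and Leaderboards, the paired structure of the rows lends itself to preference modeling. A minimal sketch for deriving preference records (the `chosen`/`rejected` names are a common convention of downstream trainers, not fields of this dataset):

```python
from datasets import load_dataset

# Map each (non_toxic, toxic) pair to a chosen/rejected preference record.
dataset = load_dataset("nicholasKluge/toxic-text", split="english")

preferences = dataset.map(
    lambda row: {"chosen": row["non_toxic"], "rejected": row["toxic"]},
    remove_columns=["non_toxic", "toxic"],
)
print(preferences[0])
```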
### Citation Information

```latex
@misc{nicholas22aira,
  doi = {10.5281/zenodo.6989727},
  url = {https://github.com/Nkluge-correa/Aira},
  author = {Nicholas Kluge Corrêa},
  title = {Aira},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
}
```

### Contributions

If you would like to contribute, contact me at [nicholas@airespucrs.org](mailto:nicholas@airespucrs.org)!
Xiaoyao-Xiaoshui/Booniebears-ZhaoLin-Dataset
---
license: gpl-3.0
---
HumanCompatibleAI/ppo-seals-Walker2d-v1
---
dataset_info:
  features:
  - name: obs
    sequence:
      sequence: float64
  - name: acts
    sequence:
      sequence: float32
  - name: infos
    sequence: string
  - name: terminal
    dtype: bool
  - name: rews
    sequence: float32
  splits:
  - name: train
    num_bytes: 63405655
    num_examples: 104
  download_size: 20942934
  dataset_size: 63405655
---

# Dataset Card for "ppo-seals-Walker2d-v1"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
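Each row here is one full rollout: `obs`, `acts`, and `rews` are per-step sequences, `infos` carries per-step metadata strings, and `terminal` flags whether the episode ended naturally. A minimal inspection sketch (the array shapes are assumptions based on typical rollout layouts, not documented in this card):

```python
import numpy as np
from datasets import load_dataset

# 104 recorded PPO rollouts in the train split (see dataset_info above).
dataset = load_dataset("HumanCompatibleAI/ppo-seals-Walker2d-v1", split="train")

traj = dataset[0]
obs = np.asarray(traj["obs"])    # per-step observations (float64)
acts = np.asarray(traj["acts"])  # per-step actions (float32)
rews = np.asarray(traj["rews"])  # per-step rewards (float32)
print(obs.shape, acts.shape, "return:", float(np.sum(rews)), "terminal:", traj["terminal"])
```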
Chars/Voice-Datasets
---
license: apache-2.0
---
alexfg1/Literature_Nobel_Prize_Data
---
license: apache-2.0
---
CyberHarem/kaoruko_tenkawa_mahoushoujoniakogarete
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of Kaoruko Tenkawa/天川薫子 (Mahou Shoujo ni Akogarete)

This is the dataset of Kaoruko Tenkawa/天川薫子 (Mahou Shoujo ni Akogarete), containing 486 images and their tags.

The core tags of this character are `blonde_hair, long_hair, bow, hair_bow, blue_eyes, yellow_bow, blunt_bangs`, which are pruned in this dataset.

Images are crawled from many sites (e.g., danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 486 | 287.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kaoruko_tenkawa_mahoushoujoniakogarete/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 486 | 287.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kaoruko_tenkawa_mahoushoujoniakogarete/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 921 | 487.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kaoruko_tenkawa_mahoushoujoniakogarete/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

(A download-and-pairing sketch for the IMG+TXT packages appears after the cluster tables at the end of this card.)

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kaoruko_tenkawa_mahoushoujoniakogarete',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some recurring outfits may be discoverable here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, yellow_dress, frilled_dress, heart_brooch, looking_at_viewer, smile, open_mouth | | 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, blush, breasts_out, heart_brooch, nipples, small_breasts, sidelocks, smile, yellow_dress, :3, yellow_bowtie, closed_mouth, open_mouth | | 2 | 21 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, heart_brooch, solo, blush, magical_girl, nipples, open_mouth, upper_body, small_breasts, torn_clothes | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, heart_brooch, puffy_short_sleeves, dress, elbow_gloves, looking_at_viewer, solo, yellow_gloves, blush, sidelocks, upper_body, :<, crossed_arms, grey_background, magical_girl | | 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | blue_sky, day, magical_girl, yellow_gloves, 1girl, brooch, closed_mouth, cloud, elbow_gloves, heart, holding_wand, looking_at_viewer, outdoors, puffy_short_sleeves, 2girls, sidelocks, yellow_dress, blush, smile, very_long_hair, :3, frilled_dress, yellow_footwear, boots, solo, standing | | 5 | 9 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | yellow_gloves, 1girl, closed_mouth, dress, elbow_gloves, puffy_short_sleeves, sidelocks, white_shirt, yellow_skirt, solo, heart_brooch, holding, magical_girl, :3, looking_at_viewer, very_long_hair, wand | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | blush, closed_eyes, open_mouth, tears, 1girl, bowtie, tearing_up, 2girls, solo, white_shirt, pink_hair, umbrella, yellow_skirt | | 7 | 19 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, sailor_collar, serafuku, solo, blush, upper_body, closed_mouth, yellow_neckerchief, white_shirt, indoors, looking_at_viewer, sidelocks | | 8 | 12 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, blush, green_skirt, long_sleeves, 
pleated_skirt, serafuku, solo, white_shirt, yellow_neckerchief, closed_mouth, school_bag, green_sailor_collar, outdoors, sidelocks, very_long_hair, tree | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, green_sailor_collar, green_skirt, long_sleeves, pleated_skirt, serafuku, solo, white_shirt, yellow_neckerchief, blush, open_mouth, sidelocks, hair_intakes, hallway, indoors, closed_eyes, hand_on_own_hip, holding_bag, school_bag, window | | 10 | 5 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | long_sleeves, outdoors, school_bag, serafuku, yellow_neckerchief, green_sailor_collar, green_skirt, pink_hair, smile, solo_focus, white_shirt, 2girls, looking_at_viewer, pleated_skirt, 3girls, closed_mouth, day, open_mouth, very_long_hair | | 11 | 5 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 1girl, chair, holding_fork, serafuku, anger_vein, blush, classroom, food, sitting, long_sleeves, sailor_collar, school_desk, bag, indoors, solo_focus, yellow_neckerchief | | 12 | 8 | ![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | ![](samples/12/clu12-sample2.png) | ![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | 1girl, serafuku, clenched_teeth, solo, very_long_hair, angry, shaded_face, green_skirt, socks | | 13 | 7 | ![](samples/13/clu13-sample0.png) | ![](samples/13/clu13-sample1.png) | ![](samples/13/clu13-sample2.png) | ![](samples/13/clu13-sample3.png) | ![](samples/13/clu13-sample4.png) | 1girl, profile, solo, blush, portrait, smile, from_side, teeth, closed_mouth, open_mouth, sidelocks | | 14 | 7 | ![](samples/14/clu14-sample0.png) | ![](samples/14/clu14-sample1.png) | ![](samples/14/clu14-sample2.png) | ![](samples/14/clu14-sample3.png) | ![](samples/14/clu14-sample4.png) | 1girl, dress, solo, yellow_thighhighs, thigh_boots, yellow_footwear, :3, heart, smile | | 15 | 8 | ![](samples/15/clu15-sample0.png) | ![](samples/15/clu15-sample1.png) | ![](samples/15/clu15-sample2.png) | ![](samples/15/clu15-sample3.png) | ![](samples/15/clu15-sample4.png) | 1girl, clenched_teeth, dress, elbow_gloves, magical_girl, yellow_gloves, solo, blush, yellow_thighhighs, closed_eyes, breasts | | 16 | 10 | ![](samples/16/clu16-sample0.png) | ![](samples/16/clu16-sample1.png) | ![](samples/16/clu16-sample2.png) | ![](samples/16/clu16-sample3.png) | ![](samples/16/clu16-sample4.png) | 1girl, restrained, solo, blue_sky, cloud, day, tentacles, white_panties, blush, outdoors, skirt, clenched_teeth, bound_legs, yellow_thighhighs, ass, high_heels, one_eye_closed, pantyshot, plant, thigh_boots | | 17 | 7 | ![](samples/17/clu17-sample0.png) | ![](samples/17/clu17-sample1.png) | ![](samples/17/clu17-sample2.png) | ![](samples/17/clu17-sample3.png) | ![](samples/17/clu17-sample4.png) | 2girls, serafuku, blush, parody, closed_mouth, closed_eyes, solo_focus, indoors, outdoors, pink_hair, sailor_collar | | 18 | 8 | ![](samples/18/clu18-sample0.png) | ![](samples/18/clu18-sample1.png) | ![](samples/18/clu18-sample2.png) | ![](samples/18/clu18-sample3.png) | ![](samples/18/clu18-sample4.png) | 1girl, cloud, solo, blush, day, from_side, open_mouth, upper_body, blue_sky, outdoors, torn_clothes 
| | 19 | 7 | ![](samples/19/clu19-sample0.png) | ![](samples/19/clu19-sample1.png) | ![](samples/19/clu19-sample2.png) | ![](samples/19/clu19-sample3.png) | ![](samples/19/clu19-sample4.png) | blush, closed_eyes, completely_nude, covering_breasts, covering_privates, open_mouth, navel, 1girl, sitting, solo_focus, barefoot, medium_breasts, multiple_girls | | 20 | 5 | ![](samples/20/clu20-sample0.png) | ![](samples/20/clu20-sample1.png) | ![](samples/20/clu20-sample2.png) | ![](samples/20/clu20-sample3.png) | ![](samples/20/clu20-sample4.png) | 2girls, arms_up, barefoot, blue_sky, breasts, dress, feet, heart, purple_nails, saliva, soles, tickling_armpits, toenail_polish, toes, blush, closed_eyes, closed_mouth, day, outdoors, yuri, drooling, magical_girl, multiple_boys, open_mouth, toe_scrunch | | 21 | 6 | ![](samples/21/clu21-sample0.png) | ![](samples/21/clu21-sample1.png) | ![](samples/21/clu21-sample2.png) | ![](samples/21/clu21-sample3.png) | ![](samples/21/clu21-sample4.png) | nipples, nude, onsen, single_hair_bun, 1girl, partially_submerged, water, solo, bathing, breasts, chibi, disgust, flat_chest, shaded_face, teeth | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | yellow_dress | frilled_dress | heart_brooch | looking_at_viewer | smile | open_mouth | blush | breasts_out | nipples | small_breasts | sidelocks | :3 | yellow_bowtie | closed_mouth | magical_girl | upper_body | torn_clothes | puffy_short_sleeves | dress | elbow_gloves | yellow_gloves | :< | crossed_arms | grey_background | blue_sky | day | brooch | cloud | heart | holding_wand | outdoors | 2girls | very_long_hair | yellow_footwear | boots | standing | white_shirt | yellow_skirt | holding | wand | closed_eyes | tears | bowtie | tearing_up | pink_hair | umbrella | sailor_collar | serafuku | yellow_neckerchief | indoors | green_skirt | long_sleeves | pleated_skirt | school_bag | green_sailor_collar | tree | hair_intakes | hallway | hand_on_own_hip | holding_bag | window | solo_focus | 3girls | chair | holding_fork | anger_vein | classroom | food | sitting | school_desk | bag | clenched_teeth | angry | shaded_face | socks | profile | portrait | from_side | teeth | yellow_thighhighs | thigh_boots | breasts | restrained | tentacles | white_panties | skirt | bound_legs | ass | high_heels | one_eye_closed | pantyshot | plant | parody | completely_nude | covering_breasts | covering_privates | navel | barefoot | medium_breasts | multiple_girls | arms_up | feet | purple_nails | saliva | soles | tickling_armpits | toenail_polish | toes | yuri | drooling | multiple_boys | toe_scrunch | nude | onsen | single_hair_bun | partially_submerged | water | bathing | chibi | disgust | flat_chest | 
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------|:---------------|:----------------|:---------------|:--------------------|:--------|:-------------|:--------|:--------------|:----------|:----------------|:------------|:-----|:----------------|:---------------|:---------------|:-------------|:---------------|:----------------------|:--------|:---------------|:----------------|:-----|:---------------|:------------------|:-----------|:------|:---------|:--------|:--------|:---------------|:-----------|:---------|:-----------------|:------------------|:--------|:-----------|:--------------|:---------------|:----------|:-------|:--------------|:--------|:---------|:-------------|:------------|:-----------|:----------------|:-----------|:---------------------|:----------|:--------------|:---------------|:----------------|:-------------|:----------------------|:-------|:---------------|:----------|:------------------|:--------------|:---------|:-------------|:---------|:--------|:---------------|:-------------|:------------|:-------|:----------|:--------------|:------|:-----------------|:--------|:--------------|:--------|:----------|:-----------|:------------|:--------|:--------------------|:--------------|:----------|:-------------|:------------|:----------------|:--------|:-------------|:------|:-------------|:-----------------|:------------|:--------|:---------|:------------------|:-------------------|:--------------------|:--------|:-----------|:-----------------|:-----------------|:----------|:-------|:---------------|:---------|:--------|:-------------------|:-----------------|:-------|:-------|:-----------|:----------------|:--------------|:-------|:--------|:------------------|:----------------------|:--------|:----------|:--------|:----------|:-------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 21 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | X | | | X | X | | X | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | | X | X | | | X | | | | X | | | | X | X | | X | X | X | X | 
X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | X | | X | X | | X | | | | X | X | | X | X | | | X | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 9 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | | | X | X | | | | | | | X | X | | X | X | | | X | X | X | X | | | | | | | | | | | | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 19 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | | | X | | | X | | | | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 12 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | | | | | | | X | | | | X | | | X | | | | | | | | | | | | | | | | | X | | X | | | | X | | | | | | | | | | | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | | | | | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | X | X | X | X | X | X | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 10 | 5 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | | | | | | X | X | X | | | | | | | | X | | | | | | | | | | | | X | | | | | X | X | X | | | | X | | | | | | | | X | | | X | X | | X | X | X | X | X | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 11 | 5 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | 
![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | X | | | | | | | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 12 | 8 | ![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | ![](samples/12/clu12-sample2.png) | ![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 13 | 7 | ![](samples/13/clu13-sample0.png) | ![](samples/13/clu13-sample1.png) | ![](samples/13/clu13-sample2.png) | ![](samples/13/clu13-sample3.png) | ![](samples/13/clu13-sample4.png) | X | X | | | | | X | X | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 14 | 7 | ![](samples/14/clu14-sample0.png) | ![](samples/14/clu14-sample1.png) | ![](samples/14/clu14-sample2.png) | ![](samples/14/clu14-sample3.png) | ![](samples/14/clu14-sample4.png) | X | X | | | | | X | | | | | | | X | | | | | | | X | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 15 | 8 | ![](samples/15/clu15-sample0.png) | ![](samples/15/clu15-sample1.png) | ![](samples/15/clu15-sample2.png) | ![](samples/15/clu15-sample3.png) | ![](samples/15/clu15-sample4.png) | X | X | | | | | | | X | | | | | | | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 16 | 10 | ![](samples/16/clu16-sample0.png) | ![](samples/16/clu16-sample1.png) | ![](samples/16/clu16-sample2.png) | ![](samples/16/clu16-sample3.png) | ![](samples/16/clu16-sample4.png) | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 17 | 7 | ![](samples/17/clu17-sample0.png) | ![](samples/17/clu17-sample1.png) | ![](samples/17/clu17-sample2.png) | ![](samples/17/clu17-sample3.png) | ![](samples/17/clu17-sample4.png) | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | X | X | | | | | | | | | X | | | | X | | X | X | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 18 | 8 | ![](samples/18/clu18-sample0.png) | ![](samples/18/clu18-sample1.png) | ![](samples/18/clu18-sample2.png) | ![](samples/18/clu18-sample3.png) | ![](samples/18/clu18-sample4.png) | X | X | | | | | | X | X | | | | | | | | | X | X | | | | | | | | X | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | 
| | | | | | | | | | | | | | | | | | | | | | | | | | | | | 19 | 7 | ![](samples/19/clu19-sample0.png) | ![](samples/19/clu19-sample1.png) | ![](samples/19/clu19-sample2.png) | ![](samples/19/clu19-sample3.png) | ![](samples/19/clu19-sample4.png) | X | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | 20 | 5 | ![](samples/20/clu20-sample0.png) | ![](samples/20/clu20-sample1.png) | ![](samples/20/clu20-sample2.png) | ![](samples/20/clu20-sample3.png) | ![](samples/20/clu20-sample4.png) | | | | | | | | X | X | | | | | | | X | X | | | | X | | | | | | X | X | | | X | | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | 21 | 6 | ![](samples/21/clu21-sample0.png) | ![](samples/21/clu21-sample1.png) | ![](samples/21/clu21-sample2.png) | ![](samples/21/clu21-sample3.png) | ![](samples/21/clu21-sample4.png) | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
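For the IMG+TXT packages in the table near the top of this card, each archive pairs every image with a same-named `.txt` tag file. A minimal download-and-pairing sketch for the `dataset-1200` package (the in-archive layout is an assumption based on the IMG+TXT convention):

```python
import os
import zipfile
from glob import glob

from huggingface_hub import hf_hub_download

# download and extract the 1200px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/kaoruko_tenkawa_mahoushoujoniakogarete',
    repo_type='dataset',
    filename='dataset-1200.zip',
)
dataset_dir = 'dataset_1200'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its same-named .txt tag file (layout is an assumption)
for image_path in sorted(glob(os.path.join(dataset_dir, '**', '*.png'), recursive=True)):
    tag_path = os.path.splitext(image_path)[0] + '.txt'
    if os.path.exists(tag_path):
        with open(tag_path, encoding='utf-8') as f:
            print(image_path, '->', f.read().strip())
```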
jacobbieker/goes-imerg-6hour
---
license: mit
---
heliosprime/twitter_dataset_1713065522
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: tweet_content
    dtype: string
  - name: user_name
    dtype: string
  - name: user_id
    dtype: string
  - name: created_at
    dtype: string
  - name: url
    dtype: string
  - name: favourite_count
    dtype: int64
  - name: scraped_at
    dtype: string
  - name: image_urls
    dtype: string
  splits:
  - name: train
    num_bytes: 11197
    num_examples: 25
  download_size: 9163
  dataset_size: 11197
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

# Dataset Card for "twitter_dataset_1713065522"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
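The card is otherwise a stub, but the YAML schema is enough to load and inspect the 25 scraped tweets. A minimal sketch (column names come from the YAML above):

```python
from datasets import load_dataset

# Load the 25-row train split and view it as a DataFrame.
dataset = load_dataset("heliosprime/twitter_dataset_1713065522", split="train")
df = dataset.to_pandas()

# Most-favourited tweets first; favourite_count is int64 per the schema.
print(df.sort_values("favourite_count", ascending=False)[["user_name", "tweet_content"]].head())
```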
open-llm-leaderboard/details_itsliupeng__llama2_7b_code
--- pretty_name: Evaluation run of itsliupeng/llama2_7b_code dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [itsliupeng/llama2_7b_code](https://huggingface.co/itsliupeng/llama2_7b_code)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_itsliupeng__llama2_7b_code\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-26T11:17:28.829100](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_code/blob/main/results_2023-10-26T11-17-28.829100.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. You can find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n    \"all\": {\n        \"em\": 0.0009437919463087249,\n\ \        \"em_stderr\": 0.00031446531194130476,\n        \"f1\": 0.05393036912751694,\n\ \        \"f1_stderr\": 0.0012935627430820335,\n        \"acc\": 0.3980985212183299,\n\ \        \"acc_stderr\": 0.01010319096153194\n    },\n    \"harness|drop|3\": {\n\ \        \"em\": 0.0009437919463087249,\n        \"em_stderr\": 0.00031446531194130476,\n\ \        \"f1\": 0.05393036912751694,\n        \"f1_stderr\": 0.0012935627430820335\n\ \    },\n    \"harness|gsm8k|5\": {\n        \"acc\": 0.08112206216830932,\n    \ \    \"acc_stderr\": 0.007520395797922653\n    },\n    \"harness|winogrande|5\"\ : {\n        \"acc\": 0.7150749802683505,\n        \"acc_stderr\": 0.012685986125141227\n\ \    }\n}\n```" repo_url: https://huggingface.co/itsliupeng/llama2_7b_code leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|arc:challenge|25_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T20-46-27.226805.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_26T11_17_28.829100 path: - '**/details_harness|drop|3_2023-10-26T11-17-28.829100.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-26T11-17-28.829100.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_26T11_17_28.829100 path: - '**/details_harness|gsm8k|5_2023-10-26T11-17-28.829100.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-26T11-17-28.829100.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hellaswag|10_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_08T20_46_27.226805 path: -
'**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-46-27.226805.parquet' - 
'**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-46-27.226805.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-46-27.226805.parquet' - config_name: 
harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - 
'**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T20_46_27.226805 
path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 
2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-46-27.226805.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T20-46-27.226805.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_26T11_17_28.829100 path: - '**/details_harness|winogrande|5_2023-10-26T11-17-28.829100.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-26T11-17-28.829100.parquet' - config_name: results data_files: - split: 2023_10_08T20_46_27.226805 path: - results_2023-10-08T20-46-27.226805.parquet - split: 2023_10_26T11_17_28.829100 path: - results_2023-10-26T11-17-28.829100.parquet - split: latest path: - results_2023-10-26T11-17-28.829100.parquet
---

# Dataset Card for Evaluation run of itsliupeng/llama2_7b_code

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/itsliupeng/llama2_7b_code
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [itsliupeng/llama2_7b_code](https://huggingface.co/itsliupeng/llama2_7b_code) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_itsliupeng__llama2_7b_code",
                    "harness_winogrande_5",
                    split="train")
```

## Latest results

These are the [latest results from run 2023-10-26T11:17:28.829100](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_code/blob/main/results_2023-10-26T11-17-28.829100.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0009437919463087249,
        "em_stderr": 0.00031446531194130476,
        "f1": 0.05393036912751694,
        "f1_stderr": 0.0012935627430820335,
        "acc": 0.3980985212183299,
        "acc_stderr": 0.01010319096153194
    },
    "harness|drop|3": {
        "em": 0.0009437919463087249,
        "em_stderr": 0.00031446531194130476,
        "f1": 0.05393036912751694,
        "f1_stderr": 0.0012935627430820335
    },
    "harness|gsm8k|5": {
        "acc": 0.08112206216830932,
        "acc_stderr": 0.007520395797922653
    },
    "harness|winogrande|5": {
        "acc": 0.7150749802683505,
        "acc_stderr": 0.012685986125141227
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
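As a complement to the per-task loader above, the aggregated numbers shown under "Latest results" can also be pulled straight from the "results" configuration; a minimal sketch (the exact schema of the results files is not documented in this card, so inspect it before relying on specific columns):

```python
from datasets import load_dataset

# The "latest" split of the "results" configuration points at the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_itsliupeng__llama2_7b_code",
    "results",
    split="latest",
)
print(results)  # check the features before assuming a column layout
```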
memray/openkp
---
license: cc-by-nc-sa-4.0
---
yezhengli9/wmt20-en-cs
---
dataset_info:
  features:
  - name: id (string)
    dtype: string
  - name: translation (translation)
    dtype: string
  splits:
  - name: train
    num_bytes: 735979
    num_examples: 1418
  download_size: 440351
  dataset_size: 735979
---

# Dataset Card for "wmt20-en-cs"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
BiniyamAjaw/amharic_dataset_v1
---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 108499134.02698757
    num_examples: 324979
  - name: test
    num_bytes: 27124866.973012425
    num_examples: 81245
  download_size: 72830202
  dataset_size: 135624001.0
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
---
umarzein/wikipedia-headings-20k
---
license: cc-by-3.0
pretty_name: Wikipedia Headings
size_categories:
- 10K<n<100K
---

English Wikipedia page headings in the following format:

columns:
- upwards: the path from the current node towards the root of the headings tree
- downwards: the direct (no deeper than one layer) children of the current node
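A minimal loading sketch (the `train` split name is an assumption; check the repository's data files if it differs):

```python
from datasets import load_dataset

# Load the headings and inspect the two columns described above.
ds = load_dataset("umarzein/wikipedia-headings-20k", split="train")

row = ds[0]
print(row["upwards"])    # path from this heading up to the root of the headings tree
print(row["downwards"])  # its direct (depth-1) children
```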
deetsadi/processed_dwi_sobel_all_b_values
---
dataset_info:
  features:
  - name: image
    dtype: image
  - name: text
    dtype: string
  - name: conditioning_image
    dtype: image
  splits:
  - name: train
    num_bytes: 47741826.0
    num_examples: 200
  download_size: 47746207
  dataset_size: 47741826.0
---

# Dataset Card for "processed_dwi_sobel_all_b_values"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Multimodal-Fatima/VQAv2_testdev
---
dataset_info:
  features:
  - name: question_type
    dtype: string
  - name: multiple_choice_answer
    dtype: string
  - name: answers
    sequence: string
  - name: answers_original
    list:
    - name: answer
      dtype: string
    - name: answer_confidence
      dtype: string
    - name: answer_id
      dtype: int64
  - name: id_image
    dtype: int64
  - name: answer_type
    dtype: string
  - name: question_id
    dtype: int64
  - name: question
    dtype: string
  - name: image
    dtype: image
  - name: id
    dtype: int64
  - name: clip_tags_ViT_L_14
    sequence: string
  - name: blip_caption
    dtype: string
  - name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
    sequence: string
  - name: DETA_detections_deta_swin_large_o365_coco_classes
    list:
    - name: attribute
      dtype: string
    - name: box
      sequence: float32
    - name: label
      dtype: string
    - name: location
      dtype: string
    - name: ratio
      dtype: float32
    - name: size
      dtype: string
    - name: tag
      dtype: string
  - name: DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random
    list:
    - name: attribute
      dtype: string
    - name: box
      sequence: float64
    - name: captions_module
      sequence: string
    - name: captions_module_filter
      sequence: string
    - name: label
      dtype: string
    - name: location
      dtype: string
    - name: ratio
      dtype: float64
    - name: size
      dtype: string
    - name: tag
      dtype: string
  - name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
    sequence: string
  - name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
    sequence: string
  - name: clip_tags_ViT_L_14_with_openai
    sequence: string
  - name: clip_tags_LAION_ViT_H_14_2B_with_openai
    sequence: string
  - name: blip_caption_beam_5
    dtype: string
  splits:
  - name: testdev
    num_bytes: 22099136791.0
    num_examples: 107394
  download_size: 11623275665
  dataset_size: 22099136791.0
---

# Dataset Card for "VQAv2_testdev"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
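Given that the single `testdev` split weighs roughly 22 GB, streaming is the gentler way to take a first look; a sketch (streaming support for this repository is assumed):

```python
from datasets import load_dataset

# Stream the ~22 GB testdev split instead of downloading it up front.
ds = load_dataset("Multimodal-Fatima/VQAv2_testdev", split="testdev", streaming=True)

example = next(iter(ds))
print(example["question"])      # the VQA question text
print(example["blip_caption"])  # one of the precomputed captions
print(example["image"].size)    # decoded as a PIL image
```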
CVasNLPExperiments/VQAv2_minival_no_image_google_flan_t5_xl_mode_T_Q_rices_ns_25994
--- dataset_info: features: - name: id dtype: int64 - name: question dtype: string - name: true_label sequence: string - name: prediction dtype: string splits: - name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_ num_bytes: 3702142 num_examples: 25994 download_size: 1326416 dataset_size: 3702142 --- # Dataset Card for "VQAv2_minival_no_image_google_flan_t5_xl_mode_T_Q_rices_ns_25994" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CyberHarem/sin_mal_honkai3
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of sin_mal (Houkai 3rd)

This is the dataset of sin_mal (Houkai 3rd), containing 186 images and their tags.

The core tags of this character are `ahoge, yellow_eyes, purple_hair, pink_hair, short_hair, heterochromia, bangs, hair_over_one_eye, hair_ornament, multicolored_hair`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw | 186 | 357.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sin_mal_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 186 | 165.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sin_mal_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 468 | 369.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sin_mal_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 186 | 295.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sin_mal_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 468 | 583.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sin_mal_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/sin_mal_honkai3',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | looking_at_viewer, sharp_teeth, solo, 1girl, grin, fingerless_gloves, thighhighs, top_hat, bare_shoulders, dress, microphone |
| 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, long_hair, looking_at_viewer, smile, solo, white_dress, white_gloves, bare_shoulders, open_mouth, hair_flower, holding_bouquet, bow, eyes_visible_through_hair, sleeveless_dress, virtual_youtuber, wedding_dress |
| 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, hair_flower, long_hair, looking_at_viewer, smile, solo, bare_shoulders, holding, purple_rose, stuffed_animal, stuffed_bunny, closed_mouth, two_side_up, white_background, blue_eyes, blush, crown, object_hug, purple_dress, simple_background |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, looking_at_viewer, smile, solo, long_sleeves, long_hair, two_side_up, virtual_youtuber, ghost, puffy_sleeves, ribbon, two-tone_hair, apron, blush, choker, closed_mouth, frilled_dress, mismatched_legwear, pantyhose, split-color_hair, white_background, white_dress |
| 4 | 9 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, solo, looking_at_viewer, choker, earrings, fox_mask, food, purple_kimono, sharp_teeth, split-color_hair, fireworks, grin, holding, obi, sitting, two-tone_hair, water |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | sharp_teeth | solo | 1girl | grin | fingerless_gloves | thighhighs | top_hat | bare_shoulders | dress | microphone | long_hair | smile | white_dress | white_gloves | open_mouth | hair_flower | holding_bouquet | bow | eyes_visible_through_hair | sleeveless_dress | virtual_youtuber | wedding_dress | holding | purple_rose | stuffed_animal | stuffed_bunny | closed_mouth | two_side_up | white_background | blue_eyes | blush | crown | object_hug | purple_dress | simple_background | long_sleeves | ghost | puffy_sleeves | ribbon | two-tone_hair | apron | choker | frilled_dress | mismatched_legwear | pantyhose | split-color_hair | earrings | fox_mask | food | purple_kimono | fireworks | obi | sitting | water |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------------|:-------|:--------|:-------|:--------------------|:-------------|:----------|:-----------------|:--------|:-------------|:------------|:--------|:--------------|:---------------|:-------------|:--------------|:------------------|:------|:----------------------------|:-------------------|:-------------------|:----------------|:----------|:--------------|:-----------------|:----------------|:---------------|:--------------|:-------------------|:------------|:--------|:--------|:-------------|:---------------|:--------------------|:---------------|:--------|:----------------|:---------|:----------------|:--------|:---------|:----------------|:---------------------|:------------|:-------------------|:-----------|:-----------|:-------|:----------------|:------------|:------|:----------|:--------| | 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | X | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | X | | | | | X | | | X | X | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | | | | | | | | X | X | X | | | | | | | | X | | | | | | X | X | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | 4 | 9 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | X | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | X | | | | X | X | X | X | X | X | X | X | X |
bigbio/biorelex
---
language:
- en
bigbio_language:
- English
license: unknown
multilinguality: monolingual
bigbio_license_shortname: UNKNOWN
pretty_name: BioRelEx
homepage: https://github.com/YerevaNN/BioRelEx
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- NAMED_ENTITY_RECOGNITION
- NAMED_ENTITY_DISAMBIGUATION
- RELATION_EXTRACTION
- COREFERENCE_RESOLUTION
---

# Dataset Card for BioRelEx

## Dataset Description

- **Homepage:** https://github.com/YerevaNN/BioRelEx
- **Pubmed:** True
- **Public:** True
- **Tasks:** NER, NED, RE, COREF

BioRelEx is a biological relation extraction dataset. Version 1.0 contains 2010 annotated sentences that describe binding interactions between various biological entities (proteins, chemicals, etc.). 1405 sentences are for training and another 201 sentences are for validation; they are publicly available at https://github.com/YerevaNN/BioRelEx/releases. Another 404 sentences are for testing and are kept private for this CodaLab competition: https://competitions.codalab.org/competitions/20468. All sentences contain the words "bind", "bound" or "binding".

For every sentence we provide:

1) Complete annotations of all biological entities that appear in the sentence
2) Entity types (32 types) and grounding information for most of the proteins and families (links to uniprot, interpro and other databases)
3) Coreference between entities in the same sentence (e.g. abbreviations and synonyms)
4) Binding interactions between the annotated entities
5) Binding interaction types: positive, negative (A does not bind B) and neutral (A may bind to B)

## Citation Information

```
@inproceedings{khachatrian2019biorelex,
  title     = "{B}io{R}el{E}x 1.0: Biological Relation Extraction Benchmark",
  author    = "Khachatrian, Hrant and Nersisyan, Lilit and Hambardzumyan, Karen and
               Galstyan, Tigran and Hakobyan, Anna and Arakelyan, Arsen and
               Rzhetsky, Andrey and Galstyan, Aram",
  booktitle = "Proceedings of the 18th BioNLP Workshop and Shared Task",
  month     = aug,
  year      = "2019",
  address   = "Florence, Italy",
  publisher = "Association for Computational Linguistics",
  url       = "https://aclanthology.org/W19-5019",
  doi       = "10.18653/v1/W19-5019",
  pages     = "176--190"
}
```
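A loading sketch for the Hugging Face mirror; the config names follow the usual BigBio convention (`*_source` for the original schema, `*_bigbio_kb` for the harmonized one) and are assumed here rather than stated in this card:

```python
from datasets import load_dataset

# Harmonized knowledge-base view; config name assumed from the BigBio convention.
biorelex = load_dataset("bigbio/biorelex", name="biorelex_bigbio_kb")

example = biorelex["train"][0]
print(example["entities"][:2])   # annotated biological entities (bigbio_kb schema)
print(example["relations"][:2])  # binding interactions between them
```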
liuyanchen1015/MULTI_VALUE_wnli_to_infinitive
---
dataset_info:
  features:
  - name: sentence1
    dtype: string
  - name: sentence2
    dtype: string
  - name: label
    dtype: int64
  - name: idx
    dtype: int64
  - name: value_score
    dtype: int64
  splits:
  - name: dev
    num_bytes: 613
    num_examples: 2
  - name: test
    num_bytes: 1897
    num_examples: 8
  - name: train
    num_bytes: 3260
    num_examples: 12
  download_size: 12707
  dataset_size: 5770
---

# Dataset Card for "MULTI_VALUE_wnli_to_infinitive"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jeonseonjin/ezmeral
---
license: apache-2.0
---
winddude/IHOP_CoT_zs
---
license: apache-2.0
---
AiBototicus/English
---
license: unknown
---
dmrau/cqadubstack-android-qrels
---
configs:
- config_name: default
  data_files:
  - split: test
    path: data/test-*
dataset_info:
  features:
  - name: query-id
    dtype: string
  - name: corpus-id
    dtype: string
  - name: score
    dtype: int64
  splits:
  - name: test
    num_bytes: 43411
    num_examples: 1696
  download_size: 19993
  dataset_size: 43411
---

# Dataset Card for "cqadubstack-android-qrels"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
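The three features are standard IR relevance judgments (qrels): each row says how relevant one corpus document is to one query. A sketch that folds the flat rows into the nested mapping most retrieval evaluators (e.g. BEIR-style ones) expect:

```python
from collections import defaultdict

from datasets import load_dataset

qrels_ds = load_dataset("dmrau/cqadubstack-android-qrels", split="test")

# Fold the flat (query-id, corpus-id, score) rows into {query: {doc: relevance}}.
qrels = defaultdict(dict)
for row in qrels_ds:
    qrels[row["query-id"]][row["corpus-id"]] = row["score"]

print(len(qrels), "queries with judged documents")
```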
jacobbieker/himawari9-kerchunk
---
license: mit
---
open-llm-leaderboard/details_budecosystem__boomer-1b
--- pretty_name: Evaluation run of budecosystem/boomer-1b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [budecosystem/boomer-1b](https://huggingface.co/budecosystem/boomer-1b) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_budecosystem__boomer-1b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-24T14:53:25.007106](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__boomer-1b/blob/main/results_2023-10-24T14-53-25.007106.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n\ \ \"em_stderr\": 0.0002773614457335763,\n \"f1\": 0.052141359060402785,\n\ \ \"f1_stderr\": 0.0013172260484977333,\n \"acc\": 0.2571140151259026,\n\ \ \"acc_stderr\": 0.008333536236283095\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335763,\n\ \ \"f1\": 0.052141359060402785,\n \"f1_stderr\": 0.0013172260484977333\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \ \ \"acc_stderr\": 0.002615326510775672\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.505130228887135,\n \"acc_stderr\": 0.014051745961790516\n\ \ }\n}\n```" repo_url: https://huggingface.co/budecosystem/boomer-1b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|arc:challenge|25_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-09T15-37-37.200624.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_24T14_53_25.007106 path: - '**/details_harness|drop|3_2023-10-24T14-53-25.007106.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-24T14-53-25.007106.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_24T14_53_25.007106 path: - '**/details_harness|gsm8k|5_2023-10-24T14-53-25.007106.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-24T14-53-25.007106.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hellaswag|10_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-37-37.200624.parquet' - 
'**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-37-37.200624.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-37-37.200624.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-37-37.200624.parquet' - config_name: 
harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - 
'**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_09T15_37_37.200624 
path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 
2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-37-37.200624.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-37-37.200624.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_09T15_37_37.200624 path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T15-37-37.200624.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T15-37-37.200624.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_24T14_53_25.007106 path: - '**/details_harness|winogrande|5_2023-10-24T14-53-25.007106.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-24T14-53-25.007106.parquet' - config_name: results data_files: - split: 2023_10_09T15_37_37.200624 path: - results_2023-10-09T15-37-37.200624.parquet - split: 2023_10_24T14_53_25.007106 path: - results_2023-10-24T14-53-25.007106.parquet - split: latest path: - results_2023-10-24T14-53-25.007106.parquet ---

# Dataset Card for Evaluation run of budecosystem/boomer-1b

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/budecosystem/boomer-1b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [budecosystem/boomer-1b](https://huggingface.co/budecosystem/boomer-1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_budecosystem__boomer-1b",
	"harness_winogrande_5",
	split="train")
```
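The aggregated "results" configuration declared in the YAML header above can be loaded the same way; a minimal sketch (the config and split names come straight from that header):

```python
from datasets import load_dataset

# Aggregated metrics for the newest evaluation run; the "latest" split
# always mirrors the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_budecosystem__boomer-1b",
    "results",
    split="latest",
)
print(results[0])
```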
## Latest results

These are the [latest results from run 2023-10-24T14:53:25.007106](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__boomer-1b/blob/main/results_2023-10-24T14-53-25.007106.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split of each eval):

```python
{
    "all": {
        "em": 0.0007340604026845638,
        "em_stderr": 0.0002773614457335763,
        "f1": 0.052141359060402785,
        "f1_stderr": 0.0013172260484977333,
        "acc": 0.2571140151259026,
        "acc_stderr": 0.008333536236283095
    },
    "harness|drop|3": {
        "em": 0.0007340604026845638,
        "em_stderr": 0.0002773614457335763,
        "f1": 0.052141359060402785,
        "f1_stderr": 0.0013172260484977333
    },
    "harness|gsm8k|5": {
        "acc": 0.009097801364670205,
        "acc_stderr": 0.002615326510775672
    },
    "harness|winogrande|5": {
        "acc": 0.505130228887135,
        "acc_stderr": 0.014051745961790516
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
Phuoth/6e-complete
---
license: mit
viewer: false
---
NOneNice/mitch
---
license: mit
---
IlyaGusev/gpt_roleplay_realm
---
dataset_info:
  features:
  - name: name
    dtype: string
  - name: context
    dtype: string
  - name: greeting
    dtype: string
  - name: example_dialogue
    list:
    - name: content
      dtype: string
    - name: role
      dtype: string
  - name: topics
    sequence: string
  - name: dialogues
    list:
    - name: chat
      list:
      - name: content
        dtype: string
      - name: role
        dtype: string
    - name: model_name
      dtype: string
    - name: topic
      dtype: string
  - name: image_prompt
    dtype: string
  - name: image
    dtype: image
  - name: char_id
    dtype: string
  splits:
  - name: en
    num_bytes: 197727921.0
    num_examples: 216
  - name: ru
    num_bytes: 207461896.0
    num_examples: 219
  download_size: 396187206
  dataset_size: 405189817.0
license: cc-by-4.0
task_categories:
- text-generation
language:
- ru
- en
tags:
- gpt-4
- fictional
- role-play
- roleplay
- gpt-3.5
- art
pretty_name: GPT Role-play Realm
size_categories:
- 1K<n<10K
---

# GPT Role-play Realm Dataset: The AI-generated character compendium

This is a dataset of GPT-generated characters made to increase the ability of open-source language models to role-play.

<img src="https://cdn.midjourney.com/9c17407c-9ce8-435f-99ab-e349b900a6ed/0_3.png" >

* 219 characters in the Russian part, and 216 characters in the English part. All character descriptions were generated with GPT-4.
* 20 dialogues on unique topics with every character. Topics were generated with GPT-4. The first dialogue out of 20 was also generated with GPT-4, and the other 19 chats were generated with GPT-3.5.
* Images for every character were generated with [Kandinsky 2.1](https://huggingface.co/ai-forever/Kandinsky_2.1). They are not only images, but also [character cards](https://zoltanai.github.io/character-editor/): the character description is embedded in the PNG metadata.

Code for downloading all English character cards:

```python
import os

from PIL.PngImagePlugin import PngInfo
from datasets import load_dataset

output_dir = "role_play_realm_en"
os.makedirs(output_dir, exist_ok=True)

for row in load_dataset("IlyaGusev/gpt_roleplay_realm", split="en"):
    char_id = row["char_id"]
    # The character description is stored as a "chara" text chunk
    # inside the PNG metadata of the image.
    char_info = row["image"].info["chara"]
    info = PngInfo()
    info.add_text("chara", char_info)
    row["image"].save(f"{output_dir}/{char_id}.png", "PNG", pnginfo=info)
```
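To go the other way and read a saved card back, the embedded description can be pulled out of the PNG again; a minimal sketch (the file name is a placeholder for any card saved by the loop above):

```python
from PIL import Image

# The character description lives in the "chara" text chunk of the PNG,
# which PIL exposes through the image's info dictionary.
card = Image.open("role_play_realm_en/some_char_id.png")  # placeholder name
print(card.info["chara"])
```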
## Russian example: Cyber-Granny

<img src="https://i.ibb.co/tHxwFBM/kiberbabushka-7d4cc.png" >

**Name**: Cyber-Granny (Кибер-Бабушка)

**Context**: Cyber-Granny is half robot, half human, created to preserve the best qualities of traditional grandmothers, such as care, kindness, and mastery in cooking delicious dishes. She is familiar with cutting-edge technology, which lets her provide safety and comfort for her grandchildren and guests. Cyber-Granny can help with any question and gladly shares her experience.

**Greeting**: Hello, dear! How can I help you today?

**Example dialogue**:
- **User**: Hi, Cyber-Granny! How do I make an apple pie?
- **Character**: Oh, dear, I'd be glad to tell you! Take 3 apples, peel them, and cut them into thin slices. Then take 2 cups of flour, 1 cup of sugar, 200 g of butter, 4 eggs, and 1 teaspoon of baking powder. Mix all the ingredients, spread the batter on a baking tray, and put it in the oven at 180 degrees for 40 minutes. Voilà, the pie is ready!
- **User**: Thank you, Cyber-Granny!
- **Character**: Always glad to help, dear! Don't forget to come again!

And **20** more dialogues on the following topics:
- Secrets of preparing old-fashioned dishes and recipes.
- Using new technologies for culinary masters.
- Raising grandchildren with modern methods.
- What to do if there is an accident at home or a household appliance breaks down?
- The influence of cyborgization on relationships between generations.
- How to adapt to the modern world without losing your traditions?
- Family advice from Cyber-Granny: how to keep harmony in relationships.
- Treatment and prevention of illnesses with the help of advanced medicine.
- How to make a home cozy with trendy technologies and traditional methods?
- Safety on the internet: Cyber-Granny's advice.
- How to learn to trust technology without losing human values?
- The perfect wardrobe for life: advice from Cyber-Granny.
- Teaching moments: how to use electronic devices together with grandchildren.
- How to develop creative abilities by combining new technologies and traditions?
- At a housewarming: training Cyber-Granny's intuition.
- The best family games and entertainment using advanced technologies.
- Preserves for the winter: Cyber-Granny and the secrets of storing food.
- Advice on financial planning and savings for future generations.
- Cyber-Granny and the genetic code: what are modern technologies capable of?
- Golden rules of family communication: how to combine tender care and modern technologies?

## English example: Flibberdoodle

<img src="https://i.ibb.co/1nzsDR2/flibberdoodle-29e59.png">

**Name**: Flibberdoodle

**Context**: Flibberdoodle is a 2-year-old (which is considered an adult in their species) Scruffapuff, a small, furry creature from the planet Fluffonia. They are about the size of a house cat, with a round body covered in soft, pastel-colored fur that changes colors depending on their mood. Flibberdoodle has large, expressive eyes, two small antennae on their head, and a fluffy tail. They are known for their curious, playful nature and their love for collecting shiny objects. Scruffapuffs communicate through a series of chirps, squeaks, and purrs, which can be understood by those familiar with their species.

**Greeting**: \*chirp chirp\* Greetings, friend! I am Flibberdoodle, a Scruffapuff from the planet Fluffonia! Would you like to see my collection of shiny things?

**Example dialogue**:
- **User**: How did you start collecting shiny things?
- **Character**: \*squeak\* Oh, I've always loved shiny things! One day, I found a sparkly pebble, and I just had to keep it. From then on, I've been on a quest to find and collect all the shiny things I can find!
- **User**: What's your favorite shiny object in your collection?
- **Character**: \*purr\* That's a tough question, but I think my favorite is a small, shiny crystal I found on a mountain on Fluffonia. When the light hits it just right, it casts beautiful rainbows all around!
And **20** more dialogues with the following topics:
- Life and culture on the planet Fluffonia
- How Scruffapuffs change color based on mood
- The process of learning Scruffapuff language
- A day in the life of a Scruffapuff
- Methods of searching for and finding shiny objects
- The role of antennae in Scruffapuff communication and biology
- The importance of play and curiosity in Scruffapuff society
- Interplanetary travel experiences and adventures
- Similarities and differences between Earth and Fluffonia
- How Flibberdoodle and other Scruffapuffs interact with other species
- Fluffonian customs and traditions
- The role of shiny objects in Scruffapuff happiness and well-being
- Variations in Scruffapuff fur color, length, and style
- Scruffapuff family dynamics and relationships
- Flibberdoodle's favorite memories and stories from Fluffonia
- The role of Scruffapuffs in intergalactic diplomacy and relations
- How to care for and befriend a Scruffapuff
- The most interesting and valuable shiny objects Flibberdoodle has encountered
- Fluffonian flora and fauna
- The challenges and obstacles Flibberdoodle has faced in their pursuit of shiny objects

## Steps

### Step 1: Character generation (GPT-4)

Creates a set of fictional characters with GPT-4 based on a prompt and a seed list of characters. Output fields are "name", "context", "greeting", and "example_dialogue".

* Script: [generate_chars.py](https://github.com/IlyaGusev/rulm/blob/master/self_instruct/src/data_processing/generate_chars.py)
* Russian seed list: [ru_chargen_seed.jsonl](https://github.com/IlyaGusev/rulm/blob/master/self_instruct/data/ru_chargen_seed.jsonl)
* English seed list: [en_chargen_seed.jsonl](https://github.com/IlyaGusev/rulm/blob/master/self_instruct/data/en_chargen_seed.jsonl)
* Russian prompt: [ru_char.txt](https://github.com/IlyaGusev/rulm/blob/master/self_instruct/external_prompts/ru_char.txt)
* English prompt: [en_char.txt](https://github.com/IlyaGusev/rulm/blob/master/self_instruct/external_prompts/en_char.txt)

### Step 2: Topics generation (GPT-4)

Creates topics for conversations with characters based on their description. Output field: "topics".

* Script: [generate_char_topics.py](https://github.com/IlyaGusev/rulm/blob/master/self_instruct/src/data_processing/generate_char_topics.py)
* Russian prompt: [ru_char_topics.txt](https://github.com/IlyaGusev/rulm/blob/master/self_instruct/external_prompts/ru_char_topics.txt)
* English prompt: [en_char_topics.txt](https://github.com/IlyaGusev/rulm/blob/master/self_instruct/external_prompts/en_char_topics.txt)

### Step 3: Dialogue generation (GPT-4/GPT-3.5)

Generates dialogues based on a character description and a topic. Output field: "dialogues".

* Script: [generate_char_chats.py](https://github.com/IlyaGusev/rulm/blob/master/self_instruct/src/data_processing/generate_char_chats.py)
* Russian prompt: [ru_char_chat.txt](https://github.com/IlyaGusev/rulm/blob/master/self_instruct/external_prompts/ru_char_chat.txt)
* English prompt: [en_char_chat.txt](https://github.com/IlyaGusev/rulm/blob/master/self_instruct/external_prompts/en_char_chat.txt)
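The resulting "dialogues" field nests one chat per topic (see the dataset_info above). A minimal sketch of flattening the English split into plain message lists for fine-tuning, assuming the nested list features decode to Python lists of dicts; prepending the character context as a system message is our convention here, not the dataset's:

```python
from datasets import load_dataset

chats = []
for row in load_dataset("IlyaGusev/gpt_roleplay_realm", split="en"):
    for dialogue in row["dialogues"]:
        # Each chat is a list of {"role", "content"} messages; the character
        # context is prepended as a system message (an assumption).
        messages = [{"role": "system", "content": row["context"]}]
        messages.extend(dialogue["chat"])
        chats.append(messages)

print(len(chats), "chats collected")
```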
### Step 4: Text2Image prompt generation (GPT-4)

Formulates a prompt based on a character description for Stable Diffusion-like models, Kandinsky 2.1 in this case. Output field: "image_prompt".

* Script: [generate_char_image_prompts.py](https://github.com/IlyaGusev/rulm/blob/master/self_instruct/src/data_processing/generate_char_image_prompts.py)
* Prompt: [char_image_prompt.txt](https://github.com/IlyaGusev/rulm/blob/master/self_instruct/external_prompts/char_image_prompt.txt)

### Step 5: Image generation

Generates images based on prompts. Output field: "image".

* Script: [infer_kandinsky.py](https://github.com/IlyaGusev/rulm/blob/master/self_instruct/src/data_processing/infer_kandinsky.py)
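For readers who want to regenerate or extend the artwork, a minimal text-to-image sketch with the diffusers library; the kandinsky-community/kandinsky-2-1 checkpoint id and all parameters here are assumptions, not the exact setup of infer_kandinsky.py:

```python
import torch
from diffusers import AutoPipelineForText2Image

# Kandinsky 2.1 text-to-image; any "image_prompt" value from the dataset
# can be passed as the prompt.
pipe = AutoPipelineForText2Image.from_pretrained(
    "kandinsky-community/kandinsky-2-1", torch_dtype=torch.float16
).to("cuda")

image = pipe(
    "a small furry pastel-colored alien creature with antennae, portrait",
    height=512,
    width=512,
).images[0]
image.save("character.png")
```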
open-llm-leaderboard/details_amu__zen_moe
--- pretty_name: Evaluation run of amu/zen_moe dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [amu/zen_moe](https://huggingface.co/amu/zen_moe) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_amu__zen_moe\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-15T18:30:04.924661](https://huggingface.co/datasets/open-llm-leaderboard/details_amu__zen_moe/blob/main/results_2024-01-15T18-30-04.924661.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6498949124842508,\n\ \ \"acc_stderr\": 0.03195908927814425,\n \"acc_norm\": 0.6508188568135655,\n\ \ \"acc_norm_stderr\": 0.03260982117495184,\n \"mc1\": 0.33659730722154224,\n\ \ \"mc1_stderr\": 0.01654241280949489,\n \"mc2\": 0.5002751167484623,\n\ \ \"mc2_stderr\": 0.015275798061349592\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6015358361774744,\n \"acc_stderr\": 0.014306946052735562,\n\ \ \"acc_norm\": 0.6382252559726962,\n \"acc_norm_stderr\": 0.014041957945038076\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6683927504481179,\n\ \ \"acc_stderr\": 0.004698285350019217,\n \"acc_norm\": 0.8505277833100976,\n\ \ \"acc_norm_stderr\": 0.003558246300379053\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\ \ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\ \ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\ \ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\ \ \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \ \ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\ \ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\ \ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\ \ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.52,\n \"acc_stderr\": 
0.050211673156867795,\n \ \ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\ : 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\ \ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\ \ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\ \ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\ \ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\ \ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\ \ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n\ \ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\ \ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\ acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\ \ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\ \ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n\ \ \"acc_stderr\": 0.02233170761182307,\n \"acc_norm\": 0.8096774193548387,\n\ \ \"acc_norm_stderr\": 0.02233170761182307\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\ \ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\ : 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\ \ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\ acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\ \ 
\"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\ \ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083015,\n \ \ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083015\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\ \ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\ acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\ acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\ acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\ acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \ \ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\ \ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\ \ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\ \ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"\ acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\ \ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\ \ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\ \ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\ \ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\ \ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\ \ \"acc_norm_stderr\": 0.020930193185179333\n },\n 
\"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\ \ \"acc_stderr\": 0.013816335389973141,\n \"acc_norm\": 0.8173690932311622,\n\ \ \"acc_norm_stderr\": 0.013816335389973141\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n\ \ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n\ \ \"acc_stderr\": 0.016260159604429128,\n \"acc_norm\": 0.38324022346368714,\n\ \ \"acc_norm_stderr\": 0.016260159604429128\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\ \ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\ \ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\ \ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\ \ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \ \ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\ \ \"acc_stderr\": 0.012743072942653358,\n \"acc_norm\": 0.46740547588005216,\n\ \ \"acc_norm_stderr\": 0.012743072942653358\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144714,\n\ \ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144714\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6715686274509803,\n \"acc_stderr\": 0.01899970738316267,\n \ \ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.01899970738316267\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\ \ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\ \ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n\ \ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\ \ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\ \ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \ \ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\ \ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\ \ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.0261682213446623,\n\ \ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.0261682213446623\n\ \ },\n 
\"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33659730722154224,\n\ \ \"mc1_stderr\": 0.01654241280949489,\n \"mc2\": 0.5002751167484623,\n\ \ \"mc2_stderr\": 0.015275798061349592\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989247\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6535253980288097,\n \ \ \"acc_stderr\": 0.013107179054313403\n }\n}\n```" repo_url: https://huggingface.co/amu/zen_moe leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|arc:challenge|25_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-15T18-30-04.924661.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|gsm8k|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hellaswag|10_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T18-30-04.924661.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T18-30-04.924661.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-15T18-30-04.924661.parquet' - 
'**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-15T18-30-04.924661.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - 
'**/details_harness|hendrycksTest-global_facts|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-management|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-management|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - 
split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T18-30-04.924661.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|truthfulqa:mc|0_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-15T18-30-04.924661.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_15T18_30_04.924661 path: - '**/details_harness|winogrande|5_2024-01-15T18-30-04.924661.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-15T18-30-04.924661.parquet' - config_name: results data_files: - split: 2024_01_15T18_30_04.924661 path: - results_2024-01-15T18-30-04.924661.parquet - split: latest path: - results_2024-01-15T18-30-04.924661.parquet ---

# Dataset Card for Evaluation run of amu/zen_moe

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [amu/zen_moe](https://huggingface.co/amu/zen_moe) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
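Since typing out 63 configuration names is tedious, they can also be enumerated programmatically; a small sketch using the datasets utility get_dataset_config_names:

```python
from datasets import get_dataset_config_names

# List every per-task configuration of this details dataset.
configs = get_dataset_config_names("open-llm-leaderboard/details_amu__zen_moe")
print(len(configs), configs[:5])
```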
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)); a loading sketch for it is included at the end of this card. To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_amu__zen_moe", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-15T18:30:04.924661](https://huggingface.co/datasets/open-llm-leaderboard/details_amu__zen_moe/blob/main/results_2024-01-15T18-30-04.924661.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6498949124842508, "acc_stderr": 0.03195908927814425, "acc_norm": 0.6508188568135655, "acc_norm_stderr": 0.03260982117495184, "mc1": 0.33659730722154224, "mc1_stderr": 0.01654241280949489, "mc2": 0.5002751167484623, "mc2_stderr": 0.015275798061349592 }, "harness|arc:challenge|25": { "acc": 0.6015358361774744, "acc_stderr": 0.014306946052735562, "acc_norm": 0.6382252559726962, "acc_norm_stderr": 0.014041957945038076 }, "harness|hellaswag|10": { "acc": 0.6683927504481179, "acc_stderr": 0.004698285350019217, "acc_norm": 0.8505277833100976, "acc_norm_stderr": 0.003558246300379053 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6578947368421053, "acc_stderr": 0.03860731599316092, "acc_norm": 0.6578947368421053, "acc_norm_stderr": 0.03860731599316092 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.690566037735849, "acc_stderr": 0.028450154794118637, "acc_norm": 0.690566037735849, "acc_norm_stderr": 0.028450154794118637 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.03586879280080341, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc":
0.5574468085106383, "acc_stderr": 0.03246956919789958, "acc_norm": 0.5574468085106383, "acc_norm_stderr": 0.03246956919789958 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.047028804320496165, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.047028804320496165 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555497, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555497 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3783068783068783, "acc_stderr": 0.024976954053155254, "acc_norm": 0.3783068783068783, "acc_norm_stderr": 0.024976954053155254 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.49206349206349204, "acc_stderr": 0.044715725362943486, "acc_norm": 0.49206349206349204, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8096774193548387, "acc_stderr": 0.02233170761182307, "acc_norm": 0.8096774193548387, "acc_norm_stderr": 0.02233170761182307 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.02098685459328973, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.02098685459328973 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6641025641025641, "acc_stderr": 0.023946724741563973, "acc_norm": 0.6641025641025641, "acc_norm_stderr": 0.023946724741563973 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35185185185185186, "acc_stderr": 0.029116617606083015, "acc_norm": 0.35185185185185186, "acc_norm_stderr": 0.029116617606083015 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6848739495798319, "acc_stderr": 0.030176808288974337, "acc_norm": 0.6848739495798319, "acc_norm_stderr": 0.030176808288974337 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.03861557546255169, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.03861557546255169 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8495412844036697, "acc_stderr": 0.015328563932669237, "acc_norm": 0.8495412844036697, "acc_norm_stderr": 0.015328563932669237 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5370370370370371, "acc_stderr": 0.03400603625538272, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.03400603625538272 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8137254901960784, "acc_stderr": 0.027325470966716312, "acc_norm": 0.8137254901960784, "acc_norm_stderr": 0.027325470966716312 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 
0.8059071729957806, "acc_stderr": 0.02574490253229092, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.02574490253229092 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7404580152671756, "acc_stderr": 0.03844876139785271, "acc_norm": 0.7404580152671756, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098823, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098823 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.040191074725573483, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.040191074725573483 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.45535714285714285, "acc_stderr": 0.047268355537191, "acc_norm": 0.45535714285714285, "acc_norm_stderr": 0.047268355537191 }, "harness|hendrycksTest-management|5": { "acc": 0.8349514563106796, "acc_stderr": 0.036756688322331886, "acc_norm": 0.8349514563106796, "acc_norm_stderr": 0.036756688322331886 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.020930193185179333, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.020930193185179333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8173690932311622, "acc_stderr": 0.013816335389973141, "acc_norm": 0.8173690932311622, "acc_norm_stderr": 0.013816335389973141 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7167630057803468, "acc_stderr": 0.02425790170532338, "acc_norm": 0.7167630057803468, "acc_norm_stderr": 0.02425790170532338 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.38324022346368714, "acc_stderr": 0.016260159604429128, "acc_norm": 0.38324022346368714, "acc_norm_stderr": 0.016260159604429128 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.0256468630971379, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.0256468630971379 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.707395498392283, "acc_stderr": 0.02583989833487798, "acc_norm": 0.707395498392283, "acc_norm_stderr": 0.02583989833487798 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7345679012345679, "acc_stderr": 0.024569223600460845, "acc_norm": 0.7345679012345679, "acc_norm_stderr": 0.024569223600460845 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.475177304964539, "acc_stderr": 0.02979071924382972, "acc_norm": 0.475177304964539, "acc_norm_stderr": 0.02979071924382972 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46740547588005216, "acc_stderr": 0.012743072942653358, "acc_norm": 0.46740547588005216, "acc_norm_stderr": 0.012743072942653358 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7058823529411765, "acc_stderr": 0.027678468642144714, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.027678468642144714 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6715686274509803, "acc_stderr": 0.01899970738316267, "acc_norm": 0.6715686274509803, "acc_norm_stderr": 0.01899970738316267 }, 
"harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7061224489795919, "acc_stderr": 0.029162738410249772, "acc_norm": 0.7061224489795919, "acc_norm_stderr": 0.029162738410249772 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8208955223880597, "acc_stderr": 0.027113286753111837, "acc_norm": 0.8208955223880597, "acc_norm_stderr": 0.027113286753111837 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.89, "acc_stderr": 0.03144660377352203, "acc_norm": 0.89, "acc_norm_stderr": 0.03144660377352203 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8654970760233918, "acc_stderr": 0.0261682213446623, "acc_norm": 0.8654970760233918, "acc_norm_stderr": 0.0261682213446623 }, "harness|truthfulqa:mc|0": { "mc1": 0.33659730722154224, "mc1_stderr": 0.01654241280949489, "mc2": 0.5002751167484623, "mc2_stderr": 0.015275798061349592 }, "harness|winogrande|5": { "acc": 0.8105761641673244, "acc_stderr": 0.011012790432989247 }, "harness|gsm8k|5": { "acc": 0.6535253980288097, "acc_stderr": 0.013107179054313403 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. 
--> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
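As noted above, the aggregated metrics live in the "results" configuration. A minimal loading sketch, assuming the "latest" split declared in this card's configs:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run ("results" config, "latest" split)
results = load_dataset("open-llm-leaderboard/details_amu__zen_moe", "results", split="latest")
print(results[0])
```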
arieg/cluster06_large_150
--- dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': 000694 '1': 001278 '2': 001482 '3': 011917 '4': 011919 '5': 011921 '6': 024428 '7': '026022' '8': '026651' '9': 027855 '10': 030095 '11': 036961 '12': 036985 '13': 036994 '14': 038352 '15': 038365 '16': 040908 '17': '042126' '18': 047198 '19': 048437 '20': '051203' '21': '052635' '22': 054158 '23': '054335' '24': '055241' '25': '055572' '26': '055714' '27': 055827 '28': '056034' '29': 056497 '30': 056686 '31': '064522' '32': 064618 '33': 064854 '34': '066644' '35': '067412' '36': 068892 '37': 068898 '38': 069784 '39': '070660' '40': '071157' '41': '071714' '42': 074942 '43': '075377' '44': 075381 '45': 075393 '46': 078516 '47': 078852 '48': 081576 '49': 084142 '50': 084423 '51': 089445 '52': 092551 '53': 094468 '54': 094631 '55': 098573 '56': 098574 '57': 098613 '58': '101893' '59': '104282' '60': '105671' '61': '106456' '62': '110261' '63': '110649' '64': '111375' '65': '113304' '66': '114063' '67': '114064' '68': '114234' '69': '114545' '70': '115698' '71': '116241' '72': '116465' '73': '117943' '74': '118059' '75': '119413' '76': '119941' '77': '121320' '78': '123147' '79': '123973' '80': '128441' '81': '128668' '82': '130456' '83': '132961' '84': '132962' '85': '136465' '86': '136469' '87': '140623' '88': '140790' '89': '140791' '90': '141291' '91': '142088' '92': '142091' '93': '145727' '94': '145729' '95': '145747' '96': '146727' '97': '148289' '98': '149138' splits: - name: train num_bytes: 767619651.9 num_examples: 14850 download_size: 757346467 dataset_size: 767619651.9 configs: - config_name: default data_files: - split: train path: data/train-* ---
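The card above only documents the label space and split size; a minimal loading sketch, assuming the default image-classification layout described by the features:

```python
from datasets import load_dataset

ds = load_dataset("arieg/cluster06_large_150", split="train")
sample = ds[0]
print(sample["image"].size)  # decoded as a PIL image
# Map the integer label back to its class name (e.g. "000694")
print(ds.features["label"].int2str(sample["label"]))
```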
autoevaluate/autoeval-eval-futin__guess-vi-4200fb-2012366609
--- type: predictions tags: - autotrain - evaluation datasets: - futin/guess eval_info: task: text_zero_shot_classification model: facebook/opt-125m metrics: [] dataset_name: futin/guess dataset_config: vi dataset_split: test col_mapping: text: text classes: classes target: target --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Zero-Shot Text Classification * Model: facebook/opt-125m * Dataset: futin/guess * Config: vi * Split: test To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@futin](https://huggingface.co/futin) for evaluating this model.
SFKs/ff
--- license: openrail ---
liuyanchen1015/MULTI_VALUE_mnli_aint_before_main
--- dataset_info: features: - name: premise dtype: string - name: hypothesis dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: score dtype: int64 splits: - name: dev_matched num_bytes: 273942 num_examples: 1246 - name: dev_mismatched num_bytes: 251262 num_examples: 1180 - name: test_matched num_bytes: 268664 num_examples: 1206 - name: test_mismatched num_bytes: 262749 num_examples: 1241 - name: train num_bytes: 11516370 num_examples: 51856 download_size: 7658736 dataset_size: 12572987 --- # Dataset Card for "MULTI_VALUE_mnli_aint_before_main" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
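Since the card stops at the split statistics, here is a short sketch of loading the splits listed above (assuming the default configuration):

```python
from datasets import load_dataset

ds = load_dataset("liuyanchen1015/MULTI_VALUE_mnli_aint_before_main")
# Splits: train, dev_matched, dev_mismatched, test_matched, test_mismatched
ex = ds["dev_matched"][0]
print(ex["premise"], ex["hypothesis"], ex["label"])
```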
mar-yam1497/HotPotQA_Mistral_Instruct_dataset
--- dataset_info: features: - name: instruction dtype: string - name: output dtype: string - name: text dtype: string splits: - name: train num_bytes: 871713892 num_examples: 540890 download_size: 414489905 dataset_size: 871713892 configs: - config_name: default data_files: - split: train path: data/train-* ---
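A minimal sketch for inspecting the instruction-tuning rows described above (assuming the default config and "train" split):

```python
from datasets import load_dataset

ds = load_dataset("mar-yam1497/HotPotQA_Mistral_Instruct_dataset", split="train")
print(ds.column_names)      # ['instruction', 'output', 'text']
print(ds[0]["text"][:500])  # preview one formatted training example
```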
loichoi55/EAH
--- license: unknown ---
WUYONGF/pokemon50
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 3695042.0 num_examples: 50 download_size: 3379266 dataset_size: 3695042.0 --- # Dataset Card for "pokemon50" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
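A minimal loading sketch for the 50 image-caption pairs described above:

```python
from datasets import load_dataset

ds = load_dataset("WUYONGF/pokemon50", split="train")
print(len(ds))        # 50 examples
print(ds[0]["text"])  # caption paired with the first image
ds[0]["image"].save("pokemon_sample.png")  # write the decoded PIL image to disk
```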
jbilcke/ai-tube-cinema
--- license: cc-by-nc-4.0 pretty_name: AI Cinema --- ## Description This is a fair-use dataset of research-only experiments; its contents are not sold. The visuals and sounds have been re-generated from scratch using AI. ## Model SVD ## Tags - Movie
yjernite/prof_report__SD_v2_random_seeds__multi__24
--- dataset_info: features: - name: cluster_id dtype: int64 - name: cluster_size dtype: int64 - name: img_ids sequence: int64 - name: img_cluster_scores sequence: float64 splits: - name: paralegal num_bytes: 3600 num_examples: 10 - name: bartender num_bytes: 3576 num_examples: 9 - name: facilities_manager num_bytes: 3696 num_examples: 14 - name: accountant num_bytes: 3648 num_examples: 12 - name: graphic_designer num_bytes: 3576 num_examples: 9 - name: network_administrator num_bytes: 3600 num_examples: 10 - name: financial_manager num_bytes: 3576 num_examples: 9 - name: baker num_bytes: 3720 num_examples: 15 - name: security_guard num_bytes: 3528 num_examples: 7 - name: artist num_bytes: 3720 num_examples: 15 - name: author num_bytes: 3576 num_examples: 9 - name: printing_press_operator num_bytes: 3744 num_examples: 16 - name: public_relations_specialist num_bytes: 3624 num_examples: 11 - name: sheet_metal_worker num_bytes: 3528 num_examples: 7 - name: clergy num_bytes: 3600 num_examples: 10 - name: payroll_clerk num_bytes: 3648 num_examples: 12 - name: teller num_bytes: 3768 num_examples: 17 - name: real_estate_broker num_bytes: 3528 num_examples: 7 - name: customer_service_representative num_bytes: 3624 num_examples: 11 - name: painter num_bytes: 3696 num_examples: 14 - name: tractor_operator num_bytes: 3480 num_examples: 5 - name: dental_hygienist num_bytes: 3480 num_examples: 5 - name: industrial_engineer num_bytes: 3504 num_examples: 6 - name: electrician num_bytes: 3456 num_examples: 4 - name: head_cook num_bytes: 3624 num_examples: 11 - name: health_technician num_bytes: 3672 num_examples: 13 - name: carpet_installer num_bytes: 3456 num_examples: 4 - name: purchasing_agent num_bytes: 3576 num_examples: 9 - name: supervisor num_bytes: 3672 num_examples: 13 - name: civil_engineer num_bytes: 3528 num_examples: 7 - name: lawyer num_bytes: 3648 num_examples: 12 - name: language_pathologist num_bytes: 3744 num_examples: 16 - name: ceo num_bytes: 3576 num_examples: 9 - name: computer_support_specialist num_bytes: 3672 num_examples: 13 - name: postal_worker num_bytes: 3696 num_examples: 14 - name: mechanical_engineer num_bytes: 3528 num_examples: 7 - name: nursing_assistant num_bytes: 3600 num_examples: 10 - name: dentist num_bytes: 3576 num_examples: 9 - name: tutor num_bytes: 3696 num_examples: 14 - name: butcher num_bytes: 3528 num_examples: 7 - name: insurance_agent num_bytes: 3600 num_examples: 10 - name: courier num_bytes: 3600 num_examples: 10 - name: computer_programmer num_bytes: 3504 num_examples: 6 - name: truck_driver num_bytes: 3552 num_examples: 8 - name: mechanic num_bytes: 3528 num_examples: 7 - name: marketing_manager num_bytes: 3576 num_examples: 9 - name: sales_manager num_bytes: 3600 num_examples: 10 - name: correctional_officer num_bytes: 3600 num_examples: 10 - name: manager num_bytes: 3600 num_examples: 10 - name: underwriter num_bytes: 3600 num_examples: 10 - name: executive_assistant num_bytes: 3576 num_examples: 9 - name: designer num_bytes: 3576 num_examples: 9 - name: groundskeeper num_bytes: 3528 num_examples: 7 - name: mental_health_counselor num_bytes: 3600 num_examples: 10 - name: aerospace_engineer num_bytes: 3528 num_examples: 7 - name: taxi_driver num_bytes: 3576 num_examples: 9 - name: nurse num_bytes: 3648 num_examples: 12 - name: data_entry_keyer num_bytes: 3504 num_examples: 6 - name: musician num_bytes: 3744 num_examples: 16 - name: event_planner num_bytes: 3696 num_examples: 14 - name: writer num_bytes: 3648 num_examples: 12 - name: cook num_bytes: 
3648 num_examples: 12 - name: welder num_bytes: 3528 num_examples: 7 - name: producer num_bytes: 3552 num_examples: 8 - name: hairdresser num_bytes: 3672 num_examples: 13 - name: farmer num_bytes: 3504 num_examples: 6 - name: construction_worker num_bytes: 3480 num_examples: 5 - name: air_conditioning_installer num_bytes: 3480 num_examples: 5 - name: electrical_engineer num_bytes: 3528 num_examples: 7 - name: occupational_therapist num_bytes: 3648 num_examples: 12 - name: career_counselor num_bytes: 3648 num_examples: 12 - name: interior_designer num_bytes: 3600 num_examples: 10 - name: jailer num_bytes: 3672 num_examples: 13 - name: office_clerk num_bytes: 3600 num_examples: 10 - name: market_research_analyst num_bytes: 3648 num_examples: 12 - name: laboratory_technician num_bytes: 3696 num_examples: 14 - name: social_assistant num_bytes: 3696 num_examples: 14 - name: medical_records_specialist num_bytes: 3576 num_examples: 9 - name: machinery_mechanic num_bytes: 3480 num_examples: 5 - name: police_officer num_bytes: 3648 num_examples: 12 - name: software_developer num_bytes: 3552 num_examples: 8 - name: clerk num_bytes: 3696 num_examples: 14 - name: salesperson num_bytes: 3648 num_examples: 12 - name: social_worker num_bytes: 3624 num_examples: 11 - name: director num_bytes: 3600 num_examples: 10 - name: fast_food_worker num_bytes: 3696 num_examples: 14 - name: singer num_bytes: 3720 num_examples: 15 - name: metal_worker num_bytes: 3528 num_examples: 7 - name: cleaner num_bytes: 3744 num_examples: 16 - name: computer_systems_analyst num_bytes: 3696 num_examples: 14 - name: dental_assistant num_bytes: 3552 num_examples: 8 - name: psychologist num_bytes: 3624 num_examples: 11 - name: machinist num_bytes: 3456 num_examples: 4 - name: therapist num_bytes: 3672 num_examples: 13 - name: veterinarian num_bytes: 3576 num_examples: 9 - name: teacher num_bytes: 3744 num_examples: 16 - name: architect num_bytes: 3552 num_examples: 8 - name: office_worker num_bytes: 3600 num_examples: 10 - name: drywall_installer num_bytes: 3456 num_examples: 4 - name: nutritionist num_bytes: 3600 num_examples: 10 - name: librarian num_bytes: 3552 num_examples: 8 - name: childcare_worker num_bytes: 3648 num_examples: 12 - name: school_bus_driver num_bytes: 3768 num_examples: 17 - name: file_clerk num_bytes: 3672 num_examples: 13 - name: logistician num_bytes: 3528 num_examples: 7 - name: scientist num_bytes: 3624 num_examples: 11 - name: teaching_assistant num_bytes: 3600 num_examples: 10 - name: radiologic_technician num_bytes: 3624 num_examples: 11 - name: manicurist num_bytes: 3600 num_examples: 10 - name: community_manager num_bytes: 3672 num_examples: 13 - name: carpenter num_bytes: 3456 num_examples: 4 - name: claims_appraiser num_bytes: 3504 num_examples: 6 - name: dispatcher num_bytes: 3696 num_examples: 14 - name: cashier num_bytes: 3672 num_examples: 13 - name: roofer num_bytes: 3456 num_examples: 4 - name: photographer num_bytes: 3648 num_examples: 12 - name: detective num_bytes: 3576 num_examples: 9 - name: financial_advisor num_bytes: 3576 num_examples: 9 - name: wholesale_buyer num_bytes: 3672 num_examples: 13 - name: it_specialist num_bytes: 3576 num_examples: 9 - name: pharmacy_technician num_bytes: 3624 num_examples: 11 - name: engineer num_bytes: 3528 num_examples: 7 - name: mover num_bytes: 3648 num_examples: 12 - name: plane_mechanic num_bytes: 3504 num_examples: 6 - name: interviewer num_bytes: 3720 num_examples: 15 - name: massage_therapist num_bytes: 3648 num_examples: 12 - name: dishwasher 
num_bytes: 3648 num_examples: 12 - name: fitness_instructor num_bytes: 3624 num_examples: 11 - name: credit_counselor num_bytes: 3600 num_examples: 10 - name: stocker num_bytes: 3648 num_examples: 12 - name: pharmacist num_bytes: 3648 num_examples: 12 - name: doctor num_bytes: 3696 num_examples: 14 - name: compliance_officer num_bytes: 3624 num_examples: 11 - name: aide num_bytes: 3672 num_examples: 13 - name: bus_driver num_bytes: 3672 num_examples: 13 - name: financial_analyst num_bytes: 3600 num_examples: 10 - name: receptionist num_bytes: 3600 num_examples: 10 - name: janitor num_bytes: 3600 num_examples: 10 - name: plumber num_bytes: 3456 num_examples: 4 - name: physical_therapist num_bytes: 3600 num_examples: 10 - name: inventory_clerk num_bytes: 3624 num_examples: 11 - name: firefighter num_bytes: 3504 num_examples: 6 - name: coach num_bytes: 3600 num_examples: 10 - name: maid num_bytes: 3720 num_examples: 15 - name: pilot num_bytes: 3576 num_examples: 9 - name: repair_worker num_bytes: 3600 num_examples: 10 download_size: 870345 dataset_size: 526416 --- # Dataset Card for "prof_report__SD_v2_random_seeds__multi__24" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
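Each split above corresponds to one profession; a minimal sketch loading one of them (here "electrician", chosen arbitrarily from the list):

```python
from datasets import load_dataset

clusters = load_dataset(
    "yjernite/prof_report__SD_v2_random_seeds__multi__24", split="electrician"
)
for row in clusters:
    # One row per image cluster for this profession
    print(row["cluster_id"], row["cluster_size"], len(row["img_ids"]))
```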
joey234/mmlu-business_ethics-verbal-neg-prepend
--- dataset_info: features: - name: question dtype: string - name: choices sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: neg_prompt dtype: string splits: - name: test num_bytes: 54577 num_examples: 100 download_size: 34666 dataset_size: 54577 --- # Dataset Card for "mmlu-business_ethics-verbal-neg-prepend" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
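A minimal sketch for inspecting the negation-prepended MMLU items described above (single "test" split):

```python
from datasets import load_dataset

ds = load_dataset("joey234/mmlu-business_ethics-verbal-neg-prepend", split="test")
ex = ds[0]
print(ex["neg_prompt"])               # the prepended verbal negation prompt
print(ex["question"], ex["choices"])  # standard MMLU question and options
print(ds.features["answer"].int2str(ex["answer"]))  # answer letter A-D
```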
ywan111/macbook-dataset-b5
--- license: apache-2.0 ---
tannguyencd/testdataset
--- license: llama2 dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 23665 num_examples: 10 download_size: 27131 dataset_size: 23665 configs: - config_name: default data_files: - split: train path: data/train-* ---
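A minimal loading sketch for the ten text rows above:

```python
from datasets import load_dataset

ds = load_dataset("tannguyencd/testdataset", split="train")
print(len(ds))              # 10 examples
print(ds[0]["text"][:200])  # preview the first row
```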
Minata/cot_mistral_method2test_v0
--- dataset_info: features: - name: input_ids sequence: int32 - name: attention_mask sequence: int8 - name: labels sequence: int64 splits: - name: train num_bytes: 56984 num_examples: 34 download_size: 19942 dataset_size: 56984 configs: - config_name: default data_files: - split: train path: data/train-* ---
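The features above are already tokenized (input_ids, attention_mask, labels). A sketch of loading them; decoding assumes a Mistral tokenizer, which the dataset name suggests but the card does not confirm:

```python
from datasets import load_dataset
from transformers import AutoTokenizer

ds = load_dataset("Minata/cot_mistral_method2test_v0", split="train")
print(ds[0].keys())  # input_ids, attention_mask, labels

# Assumption: a Mistral tokenizer; the card does not name the exact checkpoint.
tok = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")
print(tok.decode(ds[0]["input_ids"][:64]))
```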
open-llm-leaderboard/details_snorkelai__Snorkel-Mistral-PairRM-DPO
--- pretty_name: Evaluation run of snorkelai/Snorkel-Mistral-PairRM-DPO dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [snorkelai/Snorkel-Mistral-PairRM-DPO](https://huggingface.co/snorkelai/Snorkel-Mistral-PairRM-DPO)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_snorkelai__Snorkel-Mistral-PairRM-DPO\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-23T22:58:16.108311](https://huggingface.co/datasets/open-llm-leaderboard/details_snorkelai__Snorkel-Mistral-PairRM-DPO/blob/main/results_2024-01-23T22-58-16.108311.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.608528367712073,\n\ \ \"acc_stderr\": 0.03310128621962333,\n \"acc_norm\": 0.6136104634173836,\n\ \ \"acc_norm_stderr\": 0.033774611688968344,\n \"mc1\": 0.5495716034271726,\n\ \ \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.709120958732846,\n\ \ \"mc2_stderr\": 0.015029348692083801\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6160409556313993,\n \"acc_stderr\": 0.01421244498065189,\n\ \ \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892973\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6795459071898028,\n\ \ \"acc_stderr\": 0.004656974162147999,\n \"acc_norm\": 0.8563035251941844,\n\ \ \"acc_norm_stderr\": 0.0035006479678795772\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\ \ \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n\ \ \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849723,\n\ \ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849723\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\ \ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \ \ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\ \ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\ \ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\ \ \"acc_norm_stderr\": 0.038760854559127644\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\ \ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\ \ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\ \ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\ \ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\ \ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\ \ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\ \ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\ \ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\ \ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562417,\n \"\ acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562417\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\ \ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\ \ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\ \ \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n\ \ \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n\ \ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\ : 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\ \ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\ acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ 
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\ \ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.02498535492310234,\n \ \ \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.02498535492310234\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \ \ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\ \ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\ acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8,\n \"acc_stderr\": 0.017149858514250948,\n \"acc_norm\": 0.8,\n\ \ \"acc_norm_stderr\": 0.017149858514250948\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\ : {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n\ \ \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501954,\n \"\ acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501954\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114969,\n \ \ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114969\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\ \ \"acc_stderr\": 0.03252113489929189,\n \"acc_norm\": 0.6233183856502242,\n\ \ \"acc_norm_stderr\": 0.03252113489929189\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n\ \ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\ acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\ \ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \ \ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\ \ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\ \ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\ \ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\ \ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\ \ \"acc_stderr\": 0.02336505149175371,\n \"acc_norm\": 0.8504273504273504,\n\ \ \"acc_norm_stderr\": 0.02336505149175371\n },\n 
\"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \ \ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\ \ \"acc_stderr\": 0.014836205167333555,\n \"acc_norm\": 0.7790549169859514,\n\ \ \"acc_norm_stderr\": 0.014836205167333555\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n\ \ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3541899441340782,\n\ \ \"acc_stderr\": 0.01599564494729923,\n \"acc_norm\": 0.3541899441340782,\n\ \ \"acc_norm_stderr\": 0.01599564494729923\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n\ \ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\ \ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\ \ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.02548311560119545,\n\ \ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.02548311560119545\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \ \ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43741851368970014,\n\ \ \"acc_stderr\": 0.012669813464935729,\n \"acc_norm\": 0.43741851368970014,\n\ \ \"acc_norm_stderr\": 0.012669813464935729\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\ \ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6323529411764706,\n \"acc_stderr\": 0.019506291693954843,\n \ \ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.019506291693954843\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\ \ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\ \ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\ \ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\ \ \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.7164179104477612,\n\ \ \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \ \ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \ \ \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\ \ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 
0.5495716034271726,\n\ \ \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.709120958732846,\n\ \ \"mc2_stderr\": 0.015029348692083801\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774102\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3616376042456406,\n \ \ \"acc_stderr\": 0.01323465835108877\n }\n}\n```" repo_url: https://huggingface.co/snorkelai/Snorkel-Mistral-PairRM-DPO leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|arc:challenge|25_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|arc:challenge|25_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-23T22-58-16.108311.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|gsm8k|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|gsm8k|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hellaswag|10_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hellaswag|10_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-49-34.366391.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-23T22-49-34.366391.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-49-34.366391.parquet' - split: 
2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-58-16.108311.parquet' - 
'**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-58-16.108311.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-23T22-58-16.108311.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-58-16.108311.parquet' - config_name: 
harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-58-16.108311.parquet' - config_name: 
harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - 
'**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-management|5_2024-01-23T22-49-34.366391.parquet' - split: 
2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-management|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - 
'**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - 
'**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-58-16.108311.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|truthfulqa:mc|0_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|truthfulqa:mc|0_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-23T22-58-16.108311.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_23T22_49_34.366391 path: - '**/details_harness|winogrande|5_2024-01-23T22-49-34.366391.parquet' - split: 2024_01_23T22_58_16.108311 path: - '**/details_harness|winogrande|5_2024-01-23T22-58-16.108311.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-23T22-58-16.108311.parquet' - config_name: results data_files: - split: 2024_01_23T22_49_34.366391 path: - results_2024-01-23T22-49-34.366391.parquet - split: 2024_01_23T22_58_16.108311 path: - results_2024-01-23T22-58-16.108311.parquet - split: latest path: - results_2024-01-23T22-58-16.108311.parquet
---

# Dataset Card for Evaluation run of snorkelai/Snorkel-Mistral-PairRM-DPO

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [snorkelai/Snorkel-Mistral-PairRM-DPO](https://huggingface.co/snorkelai/Snorkel-Mistral-PairRM-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
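For instance, the aggregated metrics can be loaded directly from the "results" configuration. A minimal sketch, using only the configuration and split names declared in the YAML metadata above:

```python
from datasets import load_dataset

# The "latest" split of the "results" configuration points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_snorkelai__Snorkel-Mistral-PairRM-DPO",
    "results",
    split="latest",
)
```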
To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_snorkelai__Snorkel-Mistral-PairRM-DPO",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2024-01-23T22:58:16.108311](https://huggingface.co/datasets/open-llm-leaderboard/details_snorkelai__Snorkel-Mistral-PairRM-DPO/blob/main/results_2024-01-23T22-58-16.108311.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split of each eval):

```python
{ "all": { "acc": 0.608528367712073, "acc_stderr": 0.03310128621962333, "acc_norm": 0.6136104634173836, "acc_norm_stderr": 0.033774611688968344, "mc1": 0.5495716034271726, "mc1_stderr": 0.01741726437196764, "mc2": 0.709120958732846, "mc2_stderr": 0.015029348692083801 }, "harness|arc:challenge|25": { "acc": 0.6160409556313993, "acc_stderr": 0.01421244498065189, "acc_norm": 0.659556313993174, "acc_norm_stderr": 0.013847460518892973 }, "harness|hellaswag|10": { "acc": 0.6795459071898028, "acc_stderr": 0.004656974162147999, "acc_norm": 0.8563035251941844, "acc_norm_stderr": 0.0035006479678795772 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.042849586397534015, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.042849586397534015 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6381578947368421, "acc_stderr": 0.03910525752849723, "acc_norm": 0.6381578947368421, "acc_norm_stderr": 0.03910525752849723 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6754716981132075, "acc_stderr": 0.02881561571343211, "acc_norm": 0.6754716981132075, "acc_norm_stderr": 0.02881561571343211 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6875, "acc_stderr": 0.038760854559127644, "acc_norm": 0.6875, "acc_norm_stderr": 0.038760854559127644 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5953757225433526, "acc_stderr": 0.03742461193887248, "acc_norm": 0.5953757225433526, "acc_norm_stderr": 0.03742461193887248 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.04897104952726366, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.04897104952726366 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5404255319148936, "acc_stderr": 0.03257901482099835, "acc_norm": 0.5404255319148936, "acc_norm_stderr": 0.03257901482099835 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.42105263157894735, "acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735, "acc_norm_stderr": 0.046446020912223177 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6137931034482759, "acc_stderr": 0.04057324734419035, "acc_norm": 0.6137931034482759, "acc_norm_stderr": 0.04057324734419035 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.36772486772486773, "acc_stderr": 0.024833839825562417, "acc_norm": 0.36772486772486773, "acc_norm_stderr": 0.024833839825562417 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.373015873015873, "acc_stderr": 0.04325506042017086, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.04325506042017086 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.04793724854411021, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411021 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6709677419354839, "acc_stderr": 0.02672949906834996, "acc_norm": 0.6709677419354839, "acc_norm_stderr": 0.02672949906834996 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.0351760354036101, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.0351760354036101 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7454545454545455, "acc_stderr": 0.03401506715249039, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.03401506715249039 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7474747474747475, "acc_stderr": 0.030954055470365897, "acc_norm": 0.7474747474747475, "acc_norm_stderr": 0.030954055470365897 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.025033870583015178, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.025033870583015178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5846153846153846, "acc_stderr": 0.02498535492310234, "acc_norm": 0.5846153846153846, "acc_norm_stderr": 0.02498535492310234 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131154, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131154 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6428571428571429, "acc_stderr": 0.031124619309328177, "acc_norm": 0.6428571428571429, "acc_norm_stderr": 0.031124619309328177 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8, "acc_stderr": 0.017149858514250948, "acc_norm": 0.8, "acc_norm_stderr": 0.017149858514250948 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 0.03398110890294636, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7549019607843137, "acc_stderr": 0.030190282453501954, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.030190282453501954 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.759493670886076, "acc_stderr": 0.02782078198114969, "acc_norm": 0.759493670886076, "acc_norm_stderr": 0.02782078198114969 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6233183856502242, "acc_stderr": 0.03252113489929189, "acc_norm": 0.6233183856502242, "acc_norm_stderr": 
0.03252113489929189 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7251908396946565, "acc_stderr": 0.03915345408847836, "acc_norm": 0.7251908396946565, "acc_norm_stderr": 0.03915345408847836 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8181818181818182, "acc_stderr": 0.03520893951097653, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.03520893951097653 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.04669510663875191, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.04669510663875191 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8504273504273504, "acc_stderr": 0.02336505149175371, "acc_norm": 0.8504273504273504, "acc_norm_stderr": 0.02336505149175371 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7790549169859514, "acc_stderr": 0.014836205167333555, "acc_norm": 0.7790549169859514, "acc_norm_stderr": 0.014836205167333555 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6734104046242775, "acc_stderr": 0.02524826477424284, "acc_norm": 0.6734104046242775, "acc_norm_stderr": 0.02524826477424284 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3541899441340782, "acc_stderr": 0.01599564494729923, "acc_norm": 0.3541899441340782, "acc_norm_stderr": 0.01599564494729923 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7026143790849673, "acc_stderr": 0.026173908506718576, "acc_norm": 0.7026143790849673, "acc_norm_stderr": 0.026173908506718576 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.026003301117885135, "acc_norm": 0.7009646302250804, "acc_norm_stderr": 0.026003301117885135 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7006172839506173, "acc_stderr": 0.02548311560119545, "acc_norm": 0.7006172839506173, "acc_norm_stderr": 0.02548311560119545 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4397163120567376, "acc_stderr": 0.029609912075594106, "acc_norm": 0.4397163120567376, "acc_norm_stderr": 0.029609912075594106 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.43741851368970014, "acc_stderr": 0.012669813464935729, "acc_norm": 0.43741851368970014, "acc_norm_stderr": 0.012669813464935729 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6397058823529411, "acc_stderr": 0.029163128570670733, "acc_norm": 0.6397058823529411, "acc_norm_stderr": 0.029163128570670733 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6323529411764706, "acc_stderr": 0.019506291693954843, "acc_norm": 0.6323529411764706, "acc_norm_stderr": 0.019506291693954843 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7142857142857143, "acc_stderr": 0.0289205832206756, "acc_norm": 
0.7142857142857143, "acc_norm_stderr": 0.0289205832206756 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7164179104477612, "acc_stderr": 0.03187187537919797, "acc_norm": 0.7164179104477612, "acc_norm_stderr": 0.03187187537919797 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.5, "acc_stderr": 0.03892494720807614, "acc_norm": 0.5, "acc_norm_stderr": 0.03892494720807614 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.5495716034271726, "mc1_stderr": 0.01741726437196764, "mc2": 0.709120958732846, "mc2_stderr": 0.015029348692083801 }, "harness|winogrande|5": { "acc": 0.7758484609313339, "acc_stderr": 0.011720400740774102 }, "harness|gsm8k|5": { "acc": 0.3616376042456406, "acc_stderr": 0.01323465835108877 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? 
<!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
maal-32173/Tunisia_Revolution_Arabic_2011_2021
---
license: cc-by-4.0
---
Andyrasika/cow_dataset
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
dataset_info:
  features:
  - name: id
    dtype: int32
  - name: image
    dtype: image
  splits:
  - name: train
    num_bytes: 145565588.0
    num_examples: 51
  download_size: 130979749
  dataset_size: 145565588.0
---

# Dataset Card for "cow_dataset"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
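Since the card itself is a stub, the YAML metadata above is the best guide to the dataset's structure: a single `train` split of 51 examples, each with an integer `id` and an `image` feature. A minimal loading sketch, assuming only what the metadata declares:

```python
from datasets import load_dataset

# Load the single "train" split declared in the metadata above.
ds = load_dataset("Andyrasika/cow_dataset", split="train")

# Each example carries an integer id and a decoded PIL image.
example = ds[0]
print(example["id"], example["image"].size)
```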
ammarnasr/the-stack-ruby-clean
---
license: openrail
dataset_info:
  features:
  - name: hexsha
    dtype: string
  - name: size
    dtype: int64
  - name: content
    dtype: string
  - name: avg_line_length
    dtype: float64
  - name: max_line_length
    dtype: int64
  - name: alphanum_fraction
    dtype: float64
  splits:
  - name: train
    num_bytes: 3582248477.9086223
    num_examples: 806789
  - name: test
    num_bytes: 394048264.9973618
    num_examples: 88747
  - name: valid
    num_bytes: 3982797.09401595
    num_examples: 897
  download_size: 1323156008
  dataset_size: 3980279540
task_categories:
- text-generation
language:
- code
tags:
- code
pretty_name: TheStack-Ruby
size_categories:
- 1M<n<10M
---

## Dataset 1: TheStack - Ruby - Cleaned

**Description**: This dataset is drawn from TheStack Corpus, an open-source code dataset with over 3TB of GitHub data covering 48 programming languages. We selected a small portion of this dataset to optimize smaller language models for Ruby, a popular dynamically typed language.

**Target Language**: Ruby

**Dataset Size**:
- Training: 900,000 files
- Validation: 50,000 files
- Test: 50,000 files

**Preprocessing**:
1. Selected Ruby as the target language due to its popularity on GitHub.
2. Filtered out files with average line length > 100 characters, maximum line length > 1000 characters, or alphanumeric fraction < 25% (see the sketch after this section).
3. Split files into 90% training, 5% validation, and 5% test sets.

**Tokenizer**: Byte Pair Encoding (BPE) tokenizer with tab and whitespace tokens. GPT-2 vocabulary extended with special tokens.

**Training Sequences**: Sequences constructed by joining training data text to reach a context length of 2048 tokens (1024 tokens for full fine-tuning).
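The filtering step above maps directly onto the dataset's own columns (`avg_line_length`, `max_line_length`, `alphanum_fraction`). A minimal sketch of how such a filter could be applied with the `datasets` library, assuming the thresholds listed in the preprocessing notes:

```python
from datasets import load_dataset

ds = load_dataset("ammarnasr/the-stack-ruby-clean", split="train")

def is_clean(example):
    # Keep files within the line-length and alphanumeric-fraction
    # thresholds described in the preprocessing notes above.
    return (
        example["avg_line_length"] <= 100
        and example["max_line_length"] <= 1000
        and example["alphanum_fraction"] >= 0.25
    )

clean = ds.filter(is_clean)
print(len(clean), "files pass the cleaning criteria")
```

Since the published splits were produced with these criteria already applied, the filter should be close to a no-op here; it is shown only to make the cleaning thresholds concrete.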
open-llm-leaderboard/details_TheBloke__wizardLM-7B-HF
--- pretty_name: Evaluation run of TheBloke/wizardLM-7B-HF dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [TheBloke/wizardLM-7B-HF](https://huggingface.co/TheBloke/wizardLM-7B-HF) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__wizardLM-7B-HF\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-07-18T11:33:18.439367](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__wizardLM-7B-HF/blob/main/results_2023-07-18T11%3A33%3A18.439367.json)\ \ (note that their might be results for other tasks in the repos if successive evals\ \ didn't cover the same tasks. You find each in the results and the \"latest\" split\ \ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38566819917906325,\n\ \ \"acc_stderr\": 0.03482242619787474,\n \"acc_norm\": 0.3891088361419288,\n\ \ \"acc_norm_stderr\": 0.03481173503822327,\n \"mc1\": 0.31456548347613217,\n\ \ \"mc1_stderr\": 0.01625524199317919,\n \"mc2\": 0.45584096136441793,\n\ \ \"mc2_stderr\": 0.016028055350830416\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.48464163822525597,\n \"acc_stderr\": 0.014604496129394913,\n\ \ \"acc_norm\": 0.5034129692832765,\n \"acc_norm_stderr\": 0.014611050403244081\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5685122485560645,\n\ \ \"acc_stderr\": 0.004942716091996078,\n \"acc_norm\": 0.7527384983071101,\n\ \ \"acc_norm_stderr\": 0.004305383398710189\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\ \ \"acc_stderr\": 0.042849586397534,\n \"acc_norm\": 0.43703703703703706,\n\ \ \"acc_norm_stderr\": 0.042849586397534\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.40131578947368424,\n \"acc_stderr\": 0.03988903703336284,\n\ \ \"acc_norm\": 0.40131578947368424,\n \"acc_norm_stderr\": 0.03988903703336284\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\ \ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \ \ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.4377358490566038,\n \"acc_stderr\": 0.03053333843046751,\n\ \ \"acc_norm\": 0.4377358490566038,\n \"acc_norm_stderr\": 0.03053333843046751\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\ \ \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.3680555555555556,\n\ \ \"acc_norm_stderr\": 0.04032999053960719\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\ \ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.35260115606936415,\n\ \ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.35260115606936415,\n\ \ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\ \ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n\ \ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\ \ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\ \ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\ \ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.32413793103448274,\n \"acc_stderr\": 0.03900432069185555,\n\ \ \"acc_norm\": 0.32413793103448274,\n \"acc_norm_stderr\": 0.03900432069185555\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.30687830687830686,\n \"acc_stderr\": 0.02375292871211214,\n \"\ acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.02375292871211214\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\ \ \"acc_stderr\": 0.03893259610604675,\n \"acc_norm\": 0.25396825396825395,\n\ \ \"acc_norm_stderr\": 0.03893259610604675\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.36129032258064514,\n \"acc_stderr\": 0.02732754844795754,\n \"\ acc_norm\": 0.36129032258064514,\n \"acc_norm_stderr\": 0.02732754844795754\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n \"\ acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\ : 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.45454545454545453,\n \"acc_stderr\": 0.03888176921674099,\n\ \ \"acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.03888176921674099\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.42424242424242425,\n \"acc_stderr\": 0.03521224908841583,\n \"\ acc_norm\": 0.42424242424242425,\n \"acc_norm_stderr\": 0.03521224908841583\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 
0.46632124352331605,\n \"acc_stderr\": 0.03600244069867178,\n\ \ \"acc_norm\": 0.46632124352331605,\n \"acc_norm_stderr\": 0.03600244069867178\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.35384615384615387,\n \"acc_stderr\": 0.024243783994062164,\n\ \ \"acc_norm\": 0.35384615384615387,\n \"acc_norm_stderr\": 0.024243783994062164\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \ \ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.030489911417673227,\n\ \ \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.030489911417673227\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\ acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.46605504587155966,\n \"acc_stderr\": 0.021387863350353992,\n \"\ acc_norm\": 0.46605504587155966,\n \"acc_norm_stderr\": 0.021387863350353992\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.2175925925925926,\n \"acc_stderr\": 0.028139689444859672,\n \"\ acc_norm\": 0.2175925925925926,\n \"acc_norm_stderr\": 0.028139689444859672\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.45588235294117646,\n \"acc_stderr\": 0.03495624522015474,\n \"\ acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.03495624522015474\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.43037974683544306,\n \"acc_stderr\": 0.03223017195937597,\n \ \ \"acc_norm\": 0.43037974683544306,\n \"acc_norm_stderr\": 0.03223017195937597\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5112107623318386,\n\ \ \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.5112107623318386,\n\ \ \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.3893129770992366,\n \"acc_stderr\": 0.04276486542814591,\n\ \ \"acc_norm\": 0.3893129770992366,\n \"acc_norm_stderr\": 0.04276486542814591\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.5785123966942148,\n \"acc_stderr\": 0.04507732278775087,\n \"\ acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.04507732278775087\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n\ \ \"acc_stderr\": 0.04812917324536821,\n \"acc_norm\": 0.4537037037037037,\n\ \ \"acc_norm_stderr\": 0.04812917324536821\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.3987730061349693,\n \"acc_stderr\": 0.038470214204560246,\n\ \ \"acc_norm\": 0.3987730061349693,\n \"acc_norm_stderr\": 0.038470214204560246\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\ \ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\ \ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.3592233009708738,\n \"acc_stderr\": 0.047504583990416946,\n\ \ \"acc_norm\": 0.3592233009708738,\n \"acc_norm_stderr\": 0.047504583990416946\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5170940170940171,\n\ \ \"acc_stderr\": 0.032736940493481824,\n \"acc_norm\": 0.5170940170940171,\n\ \ \"acc_norm_stderr\": 
0.032736940493481824\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \ \ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.545338441890166,\n\ \ \"acc_stderr\": 0.017806304585052602,\n \"acc_norm\": 0.545338441890166,\n\ \ \"acc_norm_stderr\": 0.017806304585052602\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.38439306358381503,\n \"acc_stderr\": 0.026189666966272035,\n\ \ \"acc_norm\": 0.38439306358381503,\n \"acc_norm_stderr\": 0.026189666966272035\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23016759776536314,\n\ \ \"acc_stderr\": 0.014078339253425819,\n \"acc_norm\": 0.23016759776536314,\n\ \ \"acc_norm_stderr\": 0.014078339253425819\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.4084967320261438,\n \"acc_stderr\": 0.028146405993096358,\n\ \ \"acc_norm\": 0.4084967320261438,\n \"acc_norm_stderr\": 0.028146405993096358\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3858520900321543,\n\ \ \"acc_stderr\": 0.027648149599751457,\n \"acc_norm\": 0.3858520900321543,\n\ \ \"acc_norm_stderr\": 0.027648149599751457\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.027237415094592477,\n\ \ \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.027237415094592477\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.3191489361702128,\n \"acc_stderr\": 0.027807990141320193,\n \ \ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.027807990141320193\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3220338983050847,\n\ \ \"acc_stderr\": 0.01193393607189109,\n \"acc_norm\": 0.3220338983050847,\n\ \ \"acc_norm_stderr\": 0.01193393607189109\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.3860294117647059,\n \"acc_stderr\": 0.029573269134411124,\n\ \ \"acc_norm\": 0.3860294117647059,\n \"acc_norm_stderr\": 0.029573269134411124\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.40032679738562094,\n \"acc_stderr\": 0.019821843688271765,\n \ \ \"acc_norm\": 0.40032679738562094,\n \"acc_norm_stderr\": 0.019821843688271765\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.41818181818181815,\n\ \ \"acc_stderr\": 0.04724577405731571,\n \"acc_norm\": 0.41818181818181815,\n\ \ \"acc_norm_stderr\": 0.04724577405731571\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.3142857142857143,\n \"acc_stderr\": 0.029719329422417482,\n\ \ \"acc_norm\": 0.3142857142857143,\n \"acc_norm_stderr\": 0.029719329422417482\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.47761194029850745,\n\ \ \"acc_stderr\": 0.035319879302087305,\n \"acc_norm\": 0.47761194029850745,\n\ \ \"acc_norm_stderr\": 0.035319879302087305\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-virology|5\"\ : {\n \"acc\": 0.39156626506024095,\n \"acc_stderr\": 0.03799857454479637,\n\ \ \"acc_norm\": 0.39156626506024095,\n \"acc_norm_stderr\": 0.03799857454479637\n\ \ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5380116959064327,\n\ \ \"acc_stderr\": 0.038237270928823064,\n \"acc_norm\": 0.5380116959064327,\n\ \ \"acc_norm_stderr\": 0.038237270928823064\n },\n 
\"harness|truthfulqa:mc|0\"\ : {\n \"mc1\": 0.31456548347613217,\n \"mc1_stderr\": 0.01625524199317919,\n\ \ \"mc2\": 0.45584096136441793,\n \"mc2_stderr\": 0.016028055350830416\n\ \ }\n}\n```" repo_url: https://huggingface.co/TheBloke/wizardLM-7B-HF leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|arc:challenge|25_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hellaswag|10_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:33:18.439367.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-management|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:33:18.439367.parquet' 
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-management|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:33:18.439367.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:33:18.439367.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-management|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:33:18.439367.parquet' - config_name: 
harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:33:18.439367.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_07_18T11_33_18.439367 path: - '**/details_harness|truthfulqa:mc|0_2023-07-18T11:33:18.439367.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-07-18T11:33:18.439367.parquet' - config_name: results data_files: - split: 2023_07_18T11_33_18.439367 path: - results_2023-07-18T11:33:18.439367.parquet - split: latest path: - results_2023-07-18T11:33:18.439367.parquet --- # Dataset Card for Evaluation run of TheBloke/wizardLM-7B-HF ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/TheBloke/wizardLM-7B-HF - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [TheBloke/wizardLM-7B-HF](https://huggingface.co/TheBloke/wizardLM-7B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
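For example, a minimal sketch of pulling those aggregated metrics back out of the "results" configuration (an illustrative snippet, assuming only that the `datasets` library is installed; the configuration name and the "latest" split are both taken from the configs declared in the header above):

```python
from datasets import load_dataset

# "results" is the configuration that stores the aggregated metrics of each run;
# the "latest" split is re-pointed whenever a newer evaluation is added, so no
# run timestamp needs to be hard-coded.
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__wizardLM-7B-HF",
    "results",
    split="latest",
)
```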
To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_TheBloke__wizardLM-7B-HF", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-07-18T11:33:18.439367](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__wizardLM-7B-HF/blob/main/results_2023-07-18T11%3A33%3A18.439367.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.38566819917906325, "acc_stderr": 0.03482242619787474, "acc_norm": 0.3891088361419288, "acc_norm_stderr": 0.03481173503822327, "mc1": 0.31456548347613217, "mc1_stderr": 0.01625524199317919, "mc2": 0.45584096136441793, "mc2_stderr": 0.016028055350830416 }, "harness|arc:challenge|25": { "acc": 0.48464163822525597, "acc_stderr": 0.014604496129394913, "acc_norm": 0.5034129692832765, "acc_norm_stderr": 0.014611050403244081 }, "harness|hellaswag|10": { "acc": 0.5685122485560645, "acc_stderr": 0.004942716091996078, "acc_norm": 0.7527384983071101, "acc_norm_stderr": 0.004305383398710189 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.43703703703703706, "acc_stderr": 0.042849586397534, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.042849586397534 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.40131578947368424, "acc_stderr": 0.03988903703336284, "acc_norm": 0.40131578947368424, "acc_norm_stderr": 0.03988903703336284 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.4377358490566038, "acc_stderr": 0.03053333843046751, "acc_norm": 0.4377358490566038, "acc_norm_stderr": 0.03053333843046751 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.3680555555555556, "acc_stderr": 0.04032999053960719, "acc_norm": 0.3680555555555556, "acc_norm_stderr": 0.04032999053960719 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.24, "acc_stderr": 0.04292346959909283, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.35260115606936415, "acc_stderr": 0.036430371689585475, "acc_norm": 0.35260115606936415, "acc_norm_stderr": 0.036430371689585475 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.22549019607843138, "acc_stderr": 0.041583075330832865, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.041583075330832865 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.49, "acc_stderr": 0.05024183937956911, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4, "acc_stderr": 0.03202563076101735, "acc_norm": 0.4, "acc_norm_stderr": 0.03202563076101735 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2631578947368421, "acc_stderr": 0.04142439719489362, "acc_norm": 
0.2631578947368421, "acc_norm_stderr": 0.04142439719489362 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.32413793103448274, "acc_stderr": 0.03900432069185555, "acc_norm": 0.32413793103448274, "acc_norm_stderr": 0.03900432069185555 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.30687830687830686, "acc_stderr": 0.02375292871211214, "acc_norm": 0.30687830687830686, "acc_norm_stderr": 0.02375292871211214 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.25396825396825395, "acc_stderr": 0.03893259610604675, "acc_norm": 0.25396825396825395, "acc_norm_stderr": 0.03893259610604675 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.36129032258064514, "acc_stderr": 0.02732754844795754, "acc_norm": 0.36129032258064514, "acc_norm_stderr": 0.02732754844795754 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.30049261083743845, "acc_stderr": 0.03225799476233484, "acc_norm": 0.30049261083743845, "acc_norm_stderr": 0.03225799476233484 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.45454545454545453, "acc_stderr": 0.03888176921674099, "acc_norm": 0.45454545454545453, "acc_norm_stderr": 0.03888176921674099 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.42424242424242425, "acc_stderr": 0.03521224908841583, "acc_norm": 0.42424242424242425, "acc_norm_stderr": 0.03521224908841583 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.46632124352331605, "acc_stderr": 0.03600244069867178, "acc_norm": 0.46632124352331605, "acc_norm_stderr": 0.03600244069867178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.35384615384615387, "acc_stderr": 0.024243783994062164, "acc_norm": 0.35384615384615387, "acc_norm_stderr": 0.024243783994062164 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24444444444444444, "acc_stderr": 0.026202766534652148, "acc_norm": 0.24444444444444444, "acc_norm_stderr": 0.026202766534652148 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3277310924369748, "acc_stderr": 0.030489911417673227, "acc_norm": 0.3277310924369748, "acc_norm_stderr": 0.030489911417673227 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2781456953642384, "acc_stderr": 0.03658603262763743, "acc_norm": 0.2781456953642384, "acc_norm_stderr": 0.03658603262763743 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.46605504587155966, "acc_stderr": 0.021387863350353992, "acc_norm": 0.46605504587155966, "acc_norm_stderr": 0.021387863350353992 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.2175925925925926, "acc_stderr": 0.028139689444859672, "acc_norm": 0.2175925925925926, "acc_norm_stderr": 0.028139689444859672 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.45588235294117646, "acc_stderr": 0.03495624522015474, "acc_norm": 0.45588235294117646, "acc_norm_stderr": 0.03495624522015474 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.43037974683544306, "acc_stderr": 0.03223017195937597, "acc_norm": 0.43037974683544306, "acc_norm_stderr": 0.03223017195937597 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5112107623318386, "acc_stderr": 0.033549366530984746, "acc_norm": 
0.5112107623318386, "acc_norm_stderr": 0.033549366530984746 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.3893129770992366, "acc_stderr": 0.04276486542814591, "acc_norm": 0.3893129770992366, "acc_norm_stderr": 0.04276486542814591 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5785123966942148, "acc_stderr": 0.04507732278775087, "acc_norm": 0.5785123966942148, "acc_norm_stderr": 0.04507732278775087 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.4537037037037037, "acc_stderr": 0.04812917324536821, "acc_norm": 0.4537037037037037, "acc_norm_stderr": 0.04812917324536821 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.3987730061349693, "acc_stderr": 0.038470214204560246, "acc_norm": 0.3987730061349693, "acc_norm_stderr": 0.038470214204560246 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.3592233009708738, "acc_stderr": 0.047504583990416946, "acc_norm": 0.3592233009708738, "acc_norm_stderr": 0.047504583990416946 }, "harness|hendrycksTest-marketing|5": { "acc": 0.5170940170940171, "acc_stderr": 0.032736940493481824, "acc_norm": 0.5170940170940171, "acc_norm_stderr": 0.032736940493481824 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.42, "acc_stderr": 0.04960449637488584, "acc_norm": 0.42, "acc_norm_stderr": 0.04960449637488584 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.545338441890166, "acc_stderr": 0.017806304585052602, "acc_norm": 0.545338441890166, "acc_norm_stderr": 0.017806304585052602 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.38439306358381503, "acc_stderr": 0.026189666966272035, "acc_norm": 0.38439306358381503, "acc_norm_stderr": 0.026189666966272035 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23016759776536314, "acc_stderr": 0.014078339253425819, "acc_norm": 0.23016759776536314, "acc_norm_stderr": 0.014078339253425819 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.4084967320261438, "acc_stderr": 0.028146405993096358, "acc_norm": 0.4084967320261438, "acc_norm_stderr": 0.028146405993096358 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.3858520900321543, "acc_stderr": 0.027648149599751457, "acc_norm": 0.3858520900321543, "acc_norm_stderr": 0.027648149599751457 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.39814814814814814, "acc_stderr": 0.027237415094592477, "acc_norm": 0.39814814814814814, "acc_norm_stderr": 0.027237415094592477 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3191489361702128, "acc_stderr": 0.027807990141320193, "acc_norm": 0.3191489361702128, "acc_norm_stderr": 0.027807990141320193 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3220338983050847, "acc_stderr": 0.01193393607189109, "acc_norm": 0.3220338983050847, "acc_norm_stderr": 0.01193393607189109 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.3860294117647059, "acc_stderr": 0.029573269134411124, "acc_norm": 0.3860294117647059, "acc_norm_stderr": 0.029573269134411124 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.40032679738562094, "acc_stderr": 0.019821843688271765, "acc_norm": 0.40032679738562094, "acc_norm_stderr": 0.019821843688271765 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.41818181818181815, "acc_stderr": 0.04724577405731571, "acc_norm": 0.41818181818181815, "acc_norm_stderr": 0.04724577405731571 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.3142857142857143, 
"acc_stderr": 0.029719329422417482, "acc_norm": 0.3142857142857143, "acc_norm_stderr": 0.029719329422417482 }, "harness|hendrycksTest-sociology|5": { "acc": 0.47761194029850745, "acc_stderr": 0.035319879302087305, "acc_norm": 0.47761194029850745, "acc_norm_stderr": 0.035319879302087305 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-virology|5": { "acc": 0.39156626506024095, "acc_stderr": 0.03799857454479637, "acc_norm": 0.39156626506024095, "acc_norm_stderr": 0.03799857454479637 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.5380116959064327, "acc_stderr": 0.038237270928823064, "acc_norm": 0.5380116959064327, "acc_norm_stderr": 0.038237270928823064 }, "harness|truthfulqa:mc|0": { "mc1": 0.31456548347613217, "mc1_stderr": 0.01625524199317919, "mc2": 0.45584096136441793, "mc2_stderr": 0.016028055350830416 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_LewisDeBenoisIV__Jason1903_SLERP
--- pretty_name: Evaluation run of LewisDeBenoisIV/Jason1903_SLERP dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [LewisDeBenoisIV/Jason1903_SLERP](https://huggingface.co/LewisDeBenoisIV/Jason1903_SLERP)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LewisDeBenoisIV__Jason1903_SLERP\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-21T21:32:21.698076](https://huggingface.co/datasets/open-llm-leaderboard/details_LewisDeBenoisIV__Jason1903_SLERP/blob/main/results_2024-03-21T21-32-21.698076.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. You can find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6510747721563607,\n\ \ \"acc_stderr\": 0.03205827096673693,\n \"acc_norm\": 0.6500063513188337,\n\ \ \"acc_norm_stderr\": 0.03273460982742012,\n \"mc1\": 0.631578947368421,\n\ \ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7812935538151364,\n\ \ \"mc2_stderr\": 0.013669904864904223\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.71160409556314,\n \"acc_stderr\": 0.013238394422428173,\n\ \ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710693\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7178848834893448,\n\ \ \"acc_stderr\": 0.004491093528113408,\n \"acc_norm\": 0.891256721768572,\n\ \ \"acc_norm_stderr\": 0.0031068060075356272\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\ \ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\ \ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\ \ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\ \ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \ \ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n\ \ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\ \ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\ \ \"acc_norm_stderr\": 0.03551446610810826\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \ \ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\ \ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\ \ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\ \ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\ \ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\ \ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\ \ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\ \ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\ acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\ \ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\ \ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\ \ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\ \ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\ \ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\ \ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\ : 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n 
\ \ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\ \ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\ \ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \ \ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\ acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\ acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\ acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\ acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \ \ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\ \ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\ \ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\ \ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\ : 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\ \ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\ \ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\ \ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\ \ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\ \ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\ \ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\ \ 
\"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\ \ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\ \ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\ \ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n\ \ \"acc_stderr\": 0.016531170993278888,\n \"acc_norm\": 0.4245810055865922,\n\ \ \"acc_norm_stderr\": 0.016531170993278888\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\ \ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\ \ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\ \ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\ \ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \ \ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n\ \ \"acc_stderr\": 0.012756161942523367,\n \"acc_norm\": 0.4765319426336376,\n\ \ \"acc_norm_stderr\": 0.012756161942523367\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\ \ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \ \ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\ \ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\ \ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\ \ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \ \ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\ \ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\ \ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\ \ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 
0.028380919596145866\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.631578947368421,\n\ \ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7812935538151364,\n\ \ \"mc2_stderr\": 0.013669904864904223\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.010012598805627297\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7073540561031084,\n \ \ \"acc_stderr\": 0.012532334368242888\n }\n}\n```" repo_url: https://huggingface.co/LewisDeBenoisIV/Jason1903_SLERP leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|arc:challenge|25_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-21T21-32-21.698076.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|gsm8k|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hellaswag|10_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-32-21.698076.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-32-21.698076.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-32-21.698076.parquet' - 
'**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-32-21.698076.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - 
'**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-management|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-management|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_21T21_32_21.698076 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-32-21.698076.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-32-21.698076.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - 
split: 2024_03_21T21_32_21.698076
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-32-21.698076.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-32-21.698076.parquet'
- config_name: harness_hendrycksTest_public_relations_5
  data_files:
  - split: 2024_03_21T21_32_21.698076
    path:
    - '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-32-21.698076.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-32-21.698076.parquet'
- config_name: harness_hendrycksTest_security_studies_5
  data_files:
  - split: 2024_03_21T21_32_21.698076
    path:
    - '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-32-21.698076.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-32-21.698076.parquet'
- config_name: harness_hendrycksTest_sociology_5
  data_files:
  - split: 2024_03_21T21_32_21.698076
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-32-21.698076.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-32-21.698076.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
  data_files:
  - split: 2024_03_21T21_32_21.698076
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-32-21.698076.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-32-21.698076.parquet'
- config_name: harness_hendrycksTest_virology_5
  data_files:
  - split: 2024_03_21T21_32_21.698076
    path:
    - '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-32-21.698076.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-32-21.698076.parquet'
- config_name: harness_hendrycksTest_world_religions_5
  data_files:
  - split: 2024_03_21T21_32_21.698076
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-32-21.698076.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-32-21.698076.parquet'
- config_name: harness_truthfulqa_mc_0
  data_files:
  - split: 2024_03_21T21_32_21.698076
    path:
    - '**/details_harness|truthfulqa:mc|0_2024-03-21T21-32-21.698076.parquet'
  - split: latest
    path:
    - '**/details_harness|truthfulqa:mc|0_2024-03-21T21-32-21.698076.parquet'
- config_name: harness_winogrande_5
  data_files:
  - split: 2024_03_21T21_32_21.698076
    path:
    - '**/details_harness|winogrande|5_2024-03-21T21-32-21.698076.parquet'
  - split: latest
    path:
    - '**/details_harness|winogrande|5_2024-03-21T21-32-21.698076.parquet'
- config_name: results
  data_files:
  - split: 2024_03_21T21_32_21.698076
    path:
    - results_2024-03-21T21-32-21.698076.parquet
  - split: latest
    path:
    - results_2024-03-21T21-32-21.698076.parquet
---

# Dataset Card for Evaluation run of LewisDeBenoisIV/Jason1903_SLERP

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [LewisDeBenoisIV/Jason1903_SLERP](https://huggingface.co/LewisDeBenoisIV/Jason1903_SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)); a short loading sketch for this config appears at the end of this card.

To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LewisDeBenoisIV__Jason1903_SLERP",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2024-03-21T21:32:21.698076](https://huggingface.co/datasets/open-llm-leaderboard/details_LewisDeBenoisIV__Jason1903_SLERP/blob/main/results_2024-03-21T21-32-21.698076.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.6510747721563607,
        "acc_stderr": 0.03205827096673693,
        "acc_norm": 0.6500063513188337,
        "acc_norm_stderr": 0.03273460982742012,
        "mc1": 0.631578947368421,
        "mc1_stderr": 0.016886551261046046,
        "mc2": 0.7812935538151364,
        "mc2_stderr": 0.013669904864904223
    },
    "harness|arc:challenge|25": {
        "acc": 0.71160409556314,
        "acc_stderr": 0.013238394422428173,
        "acc_norm": 0.7312286689419796,
        "acc_norm_stderr": 0.012955065963710693
    },
    "harness|hellaswag|10": {
        "acc": 0.7178848834893448,
        "acc_stderr": 0.004491093528113408,
        "acc_norm": 0.891256721768572,
        "acc_norm_stderr": 0.0031068060075356272
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.3,
        "acc_stderr": 0.046056618647183814,
        "acc_norm": 0.3,
        "acc_norm_stderr": 0.046056618647183814
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.6370370370370371,
        "acc_stderr": 0.04153948404742398,
        "acc_norm": 0.6370370370370371,
        "acc_norm_stderr": 0.04153948404742398
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.7171052631578947,
        "acc_stderr": 0.03665349695640767,
        "acc_norm": 0.7171052631578947,
        "acc_norm_stderr": 0.03665349695640767
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.63,
        "acc_stderr": 0.048523658709391,
        "acc_norm": 0.63,
        "acc_norm_stderr": 0.048523658709391
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.7056603773584905,
        "acc_stderr": 0.028049186315695255,
        "acc_norm": 0.7056603773584905,
        "acc_norm_stderr": 0.028049186315695255
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.7638888888888888,
        "acc_stderr": 0.03551446610810826,
        "acc_norm": 0.7638888888888888,
        "acc_norm_stderr": 0.03551446610810826
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.49,
        "acc_stderr": 0.05024183937956911,
        "acc_norm": 0.49,
        "acc_norm_stderr": 0.05024183937956911
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.56,
        "acc_stderr": 0.04988876515698589,
        "acc_norm": 0.56,
        "acc_norm_stderr": 0.04988876515698589
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.3,
        "acc_stderr": 0.046056618647183814,
        "acc_norm": 0.3,
        "acc_norm_stderr": 0.046056618647183814
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.653179190751445,
        "acc_stderr": 0.036291466701596636,
        "acc_norm": 0.653179190751445,
        "acc_norm_stderr": 0.036291466701596636
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.37254901960784315,
        "acc_stderr": 0.04810840148082636,
        "acc_norm": 0.37254901960784315,
        "acc_norm_stderr": 0.04810840148082636
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.75,
        "acc_stderr": 0.04351941398892446,
        "acc_norm": 0.75,
        "acc_norm_stderr": 0.04351941398892446
    },
"harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.03243618636108102, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.03243618636108102 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5310344827586206, "acc_stderr": 0.04158632762097828, "acc_norm": 0.5310344827586206, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.025402555503260912, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.025402555503260912 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.47619047619047616, "acc_stderr": 0.04467062628403273, "acc_norm": 0.47619047619047616, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.02341529343356853, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.02341529343356853 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.03287666758603491, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.03287666758603491 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.02860620428922987, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.02860620428922987 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.028226446749683515, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.028226446749683515 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.0303883535518868, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.0303883535518868 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, "acc_stderr": 0.03971301814719197, "acc_norm": 0.3841059602649007, "acc_norm_stderr": 0.03971301814719197 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.01555580271359017, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.01555580271359017 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455335, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455335 }, 
"harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8143459915611815, "acc_stderr": 0.025310495376944856, "acc_norm": 0.8143459915611815, "acc_norm_stderr": 0.025310495376944856 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159465, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159465 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8237547892720306, "acc_stderr": 0.013625556907993464, "acc_norm": 0.8237547892720306, "acc_norm_stderr": 0.013625556907993464 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7283236994219653, "acc_stderr": 0.02394851290546836, "acc_norm": 0.7283236994219653, "acc_norm_stderr": 0.02394851290546836 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4245810055865922, "acc_stderr": 0.016531170993278888, "acc_norm": 0.4245810055865922, "acc_norm_stderr": 0.016531170993278888 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.02582916327275748, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.02582916327275748 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.025922371788818763, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.025922371788818763 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.0242885336377261, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.0242885336377261 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.02982074719142248, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.02982074719142248 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4765319426336376, "acc_stderr": 0.012756161942523367, "acc_norm": 0.4765319426336376, "acc_norm_stderr": 0.012756161942523367 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.02833295951403121, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.02833295951403121 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6781045751633987, "acc_stderr": 0.018901015322093092, "acc_norm": 
0.6781045751633987, "acc_norm_stderr": 0.018901015322093092 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784596, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784596 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454125, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454125 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.631578947368421, "mc1_stderr": 0.016886551261046046, "mc2": 0.7812935538151364, "mc2_stderr": 0.013669904864904223 }, "harness|winogrande|5": { "acc": 0.850828729281768, "acc_stderr": 0.010012598805627297 }, "harness|gsm8k|5": { "acc": 0.7073540561031084, "acc_stderr": 0.012532334368242888 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. 
--> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
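As referenced earlier in this card, here is a minimal sketch for loading the aggregated `results` config as well, assuming the `latest` split alias behaves as declared in the YAML header above:

```python
from datasets import load_dataset

# Load the aggregated metrics instead of per-example details; the "latest"
# split always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_LewisDeBenoisIV__Jason1903_SLERP",
    "results",
    split="latest",
)
print(results[0])  # aggregated results for the run
```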
kiringodhwani/msp4
--- dataset_info: features: - name: From sequence: string - name: Sent sequence: string - name: To sequence: string - name: Cc sequence: string - name: Subject sequence: string - name: Attachment sequence: string - name: body dtype: string splits: - name: train num_bytes: 4454603 num_examples: 2577 download_size: 1953609 dataset_size: 4454603 --- # Dataset Card for "msp4" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
autoevaluate/autoeval-eval-aslg_pc12-default-b07e93-95722146456
--- type: predictions tags: - autotrain - evaluation datasets: - aslg_pc12 eval_info: task: translation model: HamdanXI/t5_small_gloss_merged_dataset metrics: ['comet', 'bertscore'] dataset_name: aslg_pc12 dataset_config: default dataset_split: train col_mapping: source: gloss target: text --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Translation * Model: HamdanXI/t5_small_gloss_merged_dataset * Dataset: aslg_pc12 * Config: default * Split: train To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@HamdanXI](https://huggingface.co/HamdanXI) for evaluating this model.
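For orientation, here is a rough illustration of the column mapping above. This is a hand-written sketch, not AutoTrain's actual evaluation code:

```python
from datasets import load_dataset

# Read source/target pairs according to the card's col_mapping
# (source: gloss, target: text). Recent versions of `datasets` may
# require trust_remote_code=True for script-based datasets like this one.
ds = load_dataset("aslg_pc12", split="train")
for example in ds.select(range(3)):
    source, target = example["gloss"], example["text"]
    print(f"{source.strip()!r} -> {target.strip()!r}")
```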
firopyomyo/ggcgggCccccgg
--- dataset_info: features: - name: image dtype: image - name: conditioning dtype: image - name: caption dtype: string splits: - name: train num_bytes: 9039.0 num_examples: 1 download_size: 8242 dataset_size: 9039.0 configs: - config_name: default data_files: - split: train path: data/train-* ---
mohits01/SBCQnA
--- license: apache-2.0 ---
zolak/twitter_dataset_81_1713208454
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 3748937 num_examples: 9201 download_size: 1842704 dataset_size: 3748937 configs: - config_name: default data_files: - split: train path: data/train-* ---
Deojoandco/ah_openai_dialog_annotation_val_test
--- dataset_info: features: - name: url dtype: string - name: id dtype: string - name: num_comments dtype: int64 - name: name dtype: string - name: title dtype: string - name: body dtype: string - name: score dtype: int64 - name: upvote_ratio dtype: float64 - name: distinguished dtype: string - name: over_18 dtype: bool - name: created_utc dtype: int64 - name: comments list: - name: body dtype: string - name: created_utc dtype: float64 - name: distinguished dtype: string - name: id dtype: string - name: permalink dtype: string - name: score dtype: int64 - name: best_num_comments dtype: int64 - name: query dtype: string - name: dialog dtype: string - name: annotation_error dtype: bool - name: annotation struct: - name: Error dtype: string - name: success dtype: bool - name: text dtype: string - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 6504818 num_examples: 585 download_size: 3807643 dataset_size: 6504818 --- # Dataset Card for "ah_openai_dialog_annotation_val_test" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
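A quick, hypothetical access pattern for the nested `comments` field in the schema above (the field names come from the YAML; everything else is illustrative):

```python
from datasets import load_dataset

ds = load_dataset("Deojoandco/ah_openai_dialog_annotation_val_test", split="train")
row = ds[0]
print(row["title"], "| score:", row["score"])
# `comments` is a list of structs, i.e. a list of dicts in Python
for comment in row["comments"][:3]:
    print(comment["id"], "->", str(comment["body"])[:80])
```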
Siqueira12/karaveia
--- license: openrail ---
netcat420/MHENNv3
--- license: mit ---
pedrogengo/listwise_seq2seq_9w
--- dataset_info: features: - name: prompt dtype: string - name: target dtype: string splits: - name: train num_bytes: 4421814 num_examples: 986 download_size: 1745984 dataset_size: 4421814 --- # Dataset Card for "listwise_seq2seq_9w" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Aa-kash/EN-NE-DATA
--- dataset_info: features: - name: en dtype: string - name: ne dtype: string splits: - name: train num_bytes: 163252 num_examples: 1683 - name: validation num_bytes: 53550 num_examples: 289 - name: test num_bytes: 60882 num_examples: 433 download_size: 129668 dataset_size: 277684 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* ---
CyberHarem/lyre_fireemblem
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of lyre (Fire Emblem)

This is the dataset of lyre (Fire Emblem), containing 31 images and their tags.

The core tags of this character are `animal_ears, cat_ears, cat_girl, purple_eyes, orange_hair, tail, cat_tail, braid, bangs, facial_mark, breasts, long_hair, fang`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size      | Download                                                                                                          | Type       | Description                                                          |
|:-----------------|-------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw              |     31 | 50.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lyre_fireemblem/resolve/main/dataset-raw.zip)               | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              |     31 | 30.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lyre_fireemblem/resolve/main/dataset-800.zip)               | IMG+TXT    | dataset with the shorter side not exceeding 800 pixels.              |
| stage3-p480-800  |     67 | 57.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lyre_fireemblem/resolve/main/dataset-stage3-p480-800.zip)   | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |
| 1200             |     31 | 44.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lyre_fireemblem/resolve/main/dataset-1200.zip)              | IMG+TXT    | dataset with the shorter side not exceeding 1200 pixels.             |
| stage3-p480-1200 |     67 | 83.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lyre_fireemblem/resolve/main/dataset-stage3-p480-1200.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |

(A download sketch for one of the pre-processed packages appears at the end of this card.)

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/lyre_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, open_mouth, blush, kimono, smile, hair_ornament, whisker_markings, 1boy, bracelet, hetero, large_breasts, looking_at_viewer, mosaic_censoring, nipples, obi, penis, shimenawa, solo_focus | | 1 | 19 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, choker, whisker_markings, brown_belt, side_slit_shorts, smile, open_mouth, simple_background, solo, thighhighs, 2girls, blush, collarbone, gloves, white_background, detached_sleeves, looking_at_viewer, single_braid | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | open_mouth | blush | kimono | smile | hair_ornament | whisker_markings | 1boy | bracelet | hetero | large_breasts | looking_at_viewer | mosaic_censoring | nipples | obi | penis | shimenawa | solo_focus | choker | brown_belt | side_slit_shorts | simple_background | solo | thighhighs | 2girls | collarbone | gloves | white_background | detached_sleeves | single_braid | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------|:---------|:--------|:----------------|:-------------------|:-------|:-----------|:---------|:----------------|:--------------------|:-------------------|:----------|:------|:--------|:------------|:-------------|:---------|:-------------|:-------------------|:--------------------|:-------|:-------------|:---------|:-------------|:---------|:-------------------|:-------------------|:---------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | 1 | 19 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | X | | X | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
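### Loading the IMG+TXT Packages

The waifuc snippet above covers `dataset-raw.zip`. For the IMG+TXT packages listed earlier, the following is a minimal sketch of downloading one archive and pairing each image with its tags. The assumption that tags live in a same-named `.txt` file next to each image follows from the package type description and is not verified against the actual zip contents.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download one of the IMG+TXT packages (the 800px variant here).
zip_file = hf_hub_download(
    repo_id='CyberHarem/lyre_fireemblem',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Extract to a local directory.
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# Pair every image with its same-named .txt tag file (assumed layout).
for fname in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(fname)
    if ext.lower() in {'.png', '.jpg', '.jpeg', '.webp'}:
        tag_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(tag_path):
            with open(tag_path, encoding='utf-8') as f:
                print(fname, '->', f.read().strip())
```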
CAiRE/prosocial-dialog-yue_Hant
---
dataset_info:
  features:
  - name: context
    dtype: string
  - name: response
    dtype: string
  - name: rots
    sequence: string
  - name: safety_label
    dtype: string
  - name: safety_annotations
    sequence: string
  - name: safety_annotation_reasons
    sequence: string
  - name: source
    dtype: string
  - name: etc
    dtype: string
  - name: dialogue_id
    dtype: int64
  - name: response_id
    dtype: int64
  - name: episode_done
    dtype: bool
  - name: mt_context
    dtype: string
  splits:
  - name: train
    num_bytes: 75658495
    num_examples: 120236
  - name: validation
    num_bytes: 12844033
    num_examples: 20416
  - name: test
    num_bytes: 15707915
    num_examples: 25029
  download_size: 48478054
  dataset_size: 104210443
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
  - split: test
    path: data/test-*
---
hemantk089/AppScan_llama2_7b_fine_tuning_complete_dataset_v1
---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 52184
    num_examples: 80
  - name: test
    num_bytes: 13207
    num_examples: 20
  download_size: 16303
  dataset_size: 65391
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
---
amitness/logits-ar-kmt-512
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
dataset_info:
  features:
  - name: input_ids
    sequence: int32
  - name: token_type_ids
    sequence: int8
  - name: attention_mask
    sequence: int8
  - name: labels
    sequence: int64
  - name: teacher_logits
    sequence:
      sequence: float64
  - name: teacher_indices
    sequence:
      sequence: int64
  - name: teacher_mask_indices
    sequence: int64
  splits:
  - name: train
    num_bytes: 26479205248.58042
    num_examples: 1698296
  - name: test
    num_bytes: 4672811932.077537
    num_examples: 299700
  download_size: 10941745120
  dataset_size: 31152017180.65796
---
# Dataset Card for "logits-ar-kmt-512"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Lilsunx/sun-5-2003
---
license: openrail
---
anan-2024/twitter_dataset_1713176008
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: tweet_content
    dtype: string
  - name: user_name
    dtype: string
  - name: user_id
    dtype: string
  - name: created_at
    dtype: string
  - name: url
    dtype: string
  - name: favourite_count
    dtype: int64
  - name: scraped_at
    dtype: string
  - name: image_urls
    dtype: string
  splits:
  - name: train
    num_bytes: 30927
    num_examples: 72
  download_size: 13532
  dataset_size: 30927
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
SSagit/twstock
---
license: mit
---
hpprc/mr-tydi-mined
---
dataset_info:
  features:
  - name: query
    dtype: string
  - name: pos_ids
    sequence: int64
  - name: neg_ids
    sequence: int64
  - name: mined_neg_ids
    sequence: int64
  - name: mined_neg_sims
    sequence: float64
  splits:
  - name: train
    num_bytes: 12996643
    num_examples: 3697
  download_size: 10397869
  dataset_size: 12996643
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
nithin1995/dfc_sroie_caption2
---
dataset_info:
  features:
  - name: image
    dtype: image
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 560602365.0
    num_examples: 973
  download_size: 499271939
  dataset_size: 560602365.0
---
# Dataset Card for "dfc_sroie_caption2"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
arieg/bw_spec_cls_80_28
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
dataset_info:
  features:
  - name: image
    dtype: image
  - name: label
    dtype:
      class_label:
        names:
          '0': '65064'
          '1': '65073'
          '2': '65076'
          '3': '65077'
          '4': '65090'
          '5': '65234'
          '6': '65488'
          '7': '65619'
          '8': '65685'
          '9': '65752'
          '10': '65755'
          '11': '65756'
          '12': '65893'
          '13': '66058'
          '14': '66073'
          '15': '66074'
          '16': '66075'
          '17': '66076'
          '18': '66180'
          '19': '66187'
          '20': '66390'
          '21': '66405'
          '22': '66469'
          '23': '66482'
          '24': '66483'
          '25': '66525'
          '26': '66636'
          '27': '66637'
          '28': '66638'
          '29': '66641'
          '30': '66643'
          '31': '66644'
          '32': '66646'
          '33': '66648'
          '34': '66649'
          '35': '66650'
          '36': '66757'
          '37': '67007'
          '38': '67010'
          '39': '67011'
          '40': '67016'
          '41': '67017'
          '42': '67121'
          '43': '67163'
          '44': '67232'
          '45': '67233'
          '46': '67235'
          '47': '67308'
          '48': '67357'
          '49': '67358'
          '50': '67359'
          '51': '67360'
          '52': '67361'
          '53': '67362'
          '54': '67363'
          '55': '67366'
          '56': '67367'
          '57': '67368'
          '58': '67412'
          '59': '67470'
          '60': '67500'
          '61': '67553'
          '62': '67556'
          '63': '67557'
          '64': '67558'
          '65': '67597'
          '66': '67598'
          '67': '67600'
          '68': '67637'
          '69': '67639'
          '70': '67640'
          '71': '67660'
          '72': '67661'
          '73': '67673'
          '74': '67707'
          '75': '67760'
          '76': '67763'
          '77': '67764'
          '78': '67765'
          '79': '67766'
  splits:
  - name: train
    num_bytes: 87471356.8
    num_examples: 1600
  - name: test
    num_bytes: 21888454.0
    num_examples: 400
  download_size: 109587336
  dataset_size: 109359810.8
---
# Dataset Card for "bw_spec_cls_80_28"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Ancient237/hdnj
---
dataset_info:
  features:
  - name: Paragraphs
    dtype: string
  splits:
  - name: train
    num_bytes: 302224.9504814305
    num_examples: 2147
  download_size: 188515
  dataset_size: 302224.9504814305
---
# Dataset Card for "hdnj"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
distilled-one-sec-cv12-each-chunk-uniq/chunk_216
---
dataset_info:
  features:
  - name: logits
    sequence: float32
  - name: mfcc
    sequence:
      sequence: float64
  splits:
  - name: train
    num_bytes: 1237345728.0
    num_examples: 241104
  download_size: 1269226838
  dataset_size: 1237345728.0
---
# Dataset Card for "chunk_216"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
acazau/touch-rugby-rules-embeddings
---
task_categories:
- text-generation
language:
- en
tags:
- fine-tuning
- touch rugby
size_categories:
- n<1K
---

# Touch Rugby Rules Dataset (for embeddings)

train.csv is taken from the FIT 5th Edition Rulebook on the [International Touch website](https://cdn.internationaltouch.org/public/FIT%205th%20Edition%20Rulebook.pdf).

test.csv is copied from the abbreviated rules on the [England Touch website](https://www.englandtouch.org.uk/develop/coaching/the-rules/). Note that I'm bypassing the PDF-to-text stage here.

All text is chunked to a length of 100 tokens with 50% overlap (see the sketch below).

For educational and non-commercial use only.
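As a rough illustration of the chunking step described above, here is a minimal sketch. The whitespace tokenization and the `chunk_text` helper name are assumptions for illustration only; the card does not say which tokenizer was used.

```python
def chunk_text(text: str, chunk_size: int = 100, overlap: float = 0.5) -> list[str]:
    """Split text into fixed-size token windows with the given fractional overlap."""
    tokens = text.split()  # assumption: simple whitespace tokenization
    stride = int(chunk_size * (1 - overlap))  # 50% overlap -> 50-token stride
    chunks = []
    for start in range(0, len(tokens), stride):
        window = tokens[start:start + chunk_size]
        if window:
            chunks.append(" ".join(window))
        if start + chunk_size >= len(tokens):
            break  # the final window already covers the tail of the text
    return chunks

# Toy usage with a short passage and a small window:
sample = "Touch rugby is played by two teams of six on-field players each"
print(chunk_text(sample, chunk_size=6))
```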
open-llm-leaderboard/details_tiiuae__falcon-40b-instruct
---
pretty_name: Evaluation run of tiiuae/falcon-40b-instruct
dataset_summary: |-
  Dataset automatically created during the evaluation run of model
  [tiiuae/falcon-40b-instruct](https://huggingface.co/tiiuae/falcon-40b-instruct)
  on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

  The dataset is composed of 3 configurations, each corresponding to one of the evaluated tasks.

  The dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

  An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

  To load the details from a run, you can for instance do the following:

  ```python
  from datasets import load_dataset
  data = load_dataset("open-llm-leaderboard/details_tiiuae__falcon-40b-instruct",
      "harness_gsm8k_5",
      split="train")
  ```

  ## Latest results

  These are the [latest results from run 2023-12-03T19:30:05.245215](https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-40b-instruct/blob/main/results_2023-12-03T19-30-05.245215.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

  ```json
  {
      "all": {
          "acc": 0.3434420015163002,
          "acc_stderr": 0.01307993381180031
      },
      "harness|gsm8k|5": {
          "acc": 0.3434420015163002,
          "acc_stderr": 0.01307993381180031
      }
  }
  ```
repo_url: https://huggingface.co/tiiuae/falcon-40b-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
  data_files:
  - split: 2023_09_23T13_36_20.116121
    path:
    - '**/details_harness|drop|3_2023-09-23T13-36-20.116121.parquet'
  - split: 2023_10_15T21_54_31.012081
    path:
    - '**/details_harness|drop|3_2023-10-15T21-54-31.012081.parquet'
  - split: latest
    path:
    - '**/details_harness|drop|3_2023-10-15T21-54-31.012081.parquet'
- config_name: harness_gsm8k_5
  data_files:
  - split: 2023_09_23T13_36_20.116121
    path:
    - '**/details_harness|gsm8k|5_2023-09-23T13-36-20.116121.parquet'
  - split: 2023_10_15T21_54_31.012081
    path:
    - '**/details_harness|gsm8k|5_2023-10-15T21-54-31.012081.parquet'
  - split: 2023_12_03T19_26_50.556964
    path:
    - '**/details_harness|gsm8k|5_2023-12-03T19-26-50.556964.parquet'
  - split: 2023_12_03T19_27_45.725319
    path:
    - '**/details_harness|gsm8k|5_2023-12-03T19-27-45.725319.parquet'
  - split: 2023_12_03T19_30_05.245215
    path:
    - '**/details_harness|gsm8k|5_2023-12-03T19-30-05.245215.parquet'
  - split: latest
    path:
    - '**/details_harness|gsm8k|5_2023-12-03T19-30-05.245215.parquet'
- config_name: harness_winogrande_5
  data_files:
  - split: 2023_09_23T13_36_20.116121
    path:
    - '**/details_harness|winogrande|5_2023-09-23T13-36-20.116121.parquet'
  - split: 2023_10_15T21_54_31.012081
    path:
    - '**/details_harness|winogrande|5_2023-10-15T21-54-31.012081.parquet'
  - split: latest
    path:
    - '**/details_harness|winogrande|5_2023-10-15T21-54-31.012081.parquet'
- config_name: results
  data_files:
  - split: 2023_09_23T13_36_20.116121
    path:
    - results_2023-09-23T13-36-20.116121.parquet
  - split: 2023_10_15T21_54_31.012081
    path:
    - results_2023-10-15T21-54-31.012081.parquet
  - split: 2023_12_03T19_26_50.556964
    path:
    - results_2023-12-03T19-26-50.556964.parquet
  - split: 2023_12_03T19_27_45.725319
    path:
    - results_2023-12-03T19-27-45.725319.parquet
  - split: 2023_12_03T19_30_05.245215
    path:
    - results_2023-12-03T19-30-05.245215.parquet
  - split: latest
    path:
    - results_2023-12-03T19-30-05.245215.parquet
---

# Dataset Card for Evaluation run of tiiuae/falcon-40b-instruct

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/tiiuae/falcon-40b-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [tiiuae/falcon-40b-instruct](https://huggingface.co/tiiuae/falcon-40b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 3 configurations, each corresponding to one of the evaluated tasks.

The dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tiiuae__falcon-40b-instruct",
    "harness_gsm8k_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-12-03T19:30:05.245215](https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-40b-instruct/blob/main/results_2023-12-03T19-30-05.245215.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```json
{
    "all": {
        "acc": 0.3434420015163002,
        "acc_stderr": 0.01307993381180031
    },
    "harness|gsm8k|5": {
        "acc": 0.3434420015163002,
        "acc_stderr": 0.01307993381180031
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
BitTranslate/chatgpt-prompts-Swedish
---
license: cc0-1.0
tags:
- ChatGPT
---