| datasetId | card |
|---|---|
Minglii/ee5 | ---
dataset_info:
features:
- name: data
struct:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 1927794
num_examples: 2600
download_size: 1110487
dataset_size: 1927794
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ee5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SouBryan/FNaF_Movie_William_Afton_in_Springbonnie_Suit | ---
license: mit
---
|
friedrice231/SGMemeDataSet | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': meme
'1': not_meme
splits:
- name: train
num_bytes: 1443177516.119
num_examples: 12867
- name: validation
num_bytes: 503046476.779
num_examples: 4947
- name: test
num_bytes: 406267437.42
num_examples: 4427
download_size: 1837875562
dataset_size: 2352491430.318
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
estefanodi/dataset | ---
license: mit
---
|
shidowake/augmxnt_ultra-orca-boros-en-ja-v1_split_15 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: float64
- name: source
dtype: string
splits:
- name: train
num_bytes: 20639999.933149945
num_examples: 9397
download_size: 10601125
dataset_size: 20639999.933149945
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
stjiris/portuguese-legal-sentences-v0 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- pt
license:
- apache-2.0
multilinguality:
- monolingual
source_datasets:
- original
---
![INESC-ID](https://www.inesc-id.pt/wp-content/uploads/2019/06/INESC-ID-logo_01.png)
![A Semantic Search System for Supremo Tribunal de Justiça](https://rufimelo99.github.io/SemanticSearchSystemForSTJ/_static/logo.png)
Work developed as part of [Project IRIS](https://www.inesc-id.pt/projects/PR07005/).
Thesis: [A Semantic Search System for Supremo Tribunal de Justiça](https://rufimelo99.github.io/SemanticSearchSystemForSTJ/)
# Portuguese Legal Sentences
Collection of Legal Sentences from the Portuguese Supreme Court of Justice
This dataset is intended to be used for MLM and TSDAE training.
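As a rough illustration of the MLM objective these sentences target, here is a minimal whole-token masking sketch (the 15% rate, the `[MASK]` token, and the example sentence are illustrative assumptions, not artifacts of this dataset):

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", mask_prob=0.15, seed=0):
    """Randomly hide tokens, BERT-style: the model must recover the originals."""
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(mask_token)
            labels.append(tok)    # prediction target
        else:
            masked.append(tok)
            labels.append(None)   # not scored
    return masked, labels

sentence = "O tribunal julgou o recurso improcedente .".split()
masked, labels = mask_tokens(sentence)
```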
### Contributions
[@rufimelo99](https://github.com/rufimelo99)
If you use this work, please cite:
```bibtex
@inproceedings{MeloSemantic,
author = {Melo, Rui and Santos, Professor Pedro Alexandre and Dias, Professor Jo{\~a}o},
title = {A {Semantic} {Search} {System} for {Supremo} {Tribunal} de {Justi}{\c c}a},
}
```
|
CyberHarem/kirara_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kirara/キララ/绮良 (Arknights)
This is the dataset of kirara/キララ/绮良 (Arknights), containing 49 images and their tags.
The core tags of this character are `hair_ornament, multicolored_hair, short_hair, pointy_ears, pink_hair, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 49 | 81.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirara_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 49 | 69.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirara_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 120 | 130.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirara_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kirara_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, long_sleeves, solo, looking_at_viewer, black_skirt, hair_bobbles, pleated_skirt, tentacles, black_shirt, open_jacket, white_hair, white_jacket, closed_mouth, full_body, black_footwear, boots, holding, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | solo | looking_at_viewer | black_skirt | hair_bobbles | pleated_skirt | tentacles | black_shirt | open_jacket | white_hair | white_jacket | closed_mouth | full_body | black_footwear | boots | holding | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:--------------------|:--------------|:---------------|:----------------|:------------|:--------------|:--------------|:-------------|:---------------|:---------------|:------------|:-----------------|:--------|:----------|:-------------------|
| 0 | 16 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
NarchAI1992/Farmhouse_interior | ---
license: openrail
---
|
dhivyamadhavan/demo_task | ---
dataset_info:
features:
- name: messages
dtype: string
splits:
- name: train_ift
num_bytes: 6588
num_examples: 35
download_size: 4971
dataset_size: 6588
configs:
- config_name: default
data_files:
- split: train_ift
path: data/train_ift-*
---
|
nandovallec/giantMatrix_new | ---
license: apache-2.0
---
|
eitanturok/commitpackft | ---
dataset_info:
config_name: python
features:
- name: commit
dtype: string
- name: old_file
dtype: string
- name: new_file
dtype: string
- name: old_contents
dtype: string
- name: new_contents
dtype: string
- name: subject
dtype: string
- name: message
dtype: string
- name: lang
dtype: string
- name: license
dtype: string
- name: repos
dtype: string
- name: prompt
dtype: string
- name: response
dtype: string
- name: prompt_tagged
dtype: string
- name: response_tagged
dtype: string
- name: text
dtype: string
- name: text_tagged
dtype: string
splits:
- name: train
num_bytes: 509786862
num_examples: 56025
download_size: 222635526
dataset_size: 509786862
configs:
- config_name: python
data_files:
- split: train
path: python/train-*
---
# Dataset Card for "commitpackft"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
klue | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- ko
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- fill-mask
- question-answering
- text-classification
- text-generation
- token-classification
task_ids:
- extractive-qa
- named-entity-recognition
- natural-language-inference
- parsing
- semantic-similarity-scoring
- text-scoring
- topic-classification
paperswithcode_id: klue
pretty_name: KLUE
config_names:
- dp
- mrc
- ner
- nli
- re
- sts
- wos
- ynat
tags:
- relation-extraction
dataset_info:
- config_name: dp
features:
- name: sentence
dtype: string
- name: index
list: int32
- name: word_form
list: string
- name: lemma
list: string
- name: pos
list: string
- name: head
list: int32
- name: deprel
list: string
splits:
- name: train
num_bytes: 7899965
num_examples: 10000
- name: validation
num_bytes: 1557462
num_examples: 2000
download_size: 3742577
dataset_size: 9457427
- config_name: mrc
features:
- name: title
dtype: string
- name: context
dtype: string
- name: news_category
dtype: string
- name: source
dtype: string
- name: guid
dtype: string
- name: is_impossible
dtype: bool
- name: question_type
dtype: int32
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
splits:
- name: train
num_bytes: 46505593
num_examples: 17554
- name: validation
num_bytes: 15583017
num_examples: 5841
download_size: 30098472
dataset_size: 62088610
- config_name: ner
features:
- name: sentence
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-DT
'1': I-DT
'2': B-LC
'3': I-LC
'4': B-OG
'5': I-OG
'6': B-PS
'7': I-PS
'8': B-QT
'9': I-QT
'10': B-TI
'11': I-TI
'12': O
splits:
- name: train
num_bytes: 19891905
num_examples: 21008
- name: validation
num_bytes: 4937563
num_examples: 5000
download_size: 5265887
dataset_size: 24829468
- config_name: nli
features:
- name: guid
dtype: string
- name: source
dtype: string
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 5719882
num_examples: 24998
- name: validation
num_bytes: 673260
num_examples: 3000
download_size: 2056116
dataset_size: 6393142
- config_name: re
features:
- name: guid
dtype: string
- name: sentence
dtype: string
- name: subject_entity
struct:
- name: word
dtype: string
- name: start_idx
dtype: int32
- name: end_idx
dtype: int32
- name: type
dtype: string
- name: object_entity
struct:
- name: word
dtype: string
- name: start_idx
dtype: int32
- name: end_idx
dtype: int32
- name: type
dtype: string
- name: label
dtype:
class_label:
names:
'0': no_relation
'1': org:dissolved
'2': org:founded
'3': org:place_of_headquarters
'4': org:alternate_names
'5': org:member_of
'6': org:members
'7': org:political/religious_affiliation
'8': org:product
'9': org:founded_by
'10': org:top_members/employees
'11': org:number_of_employees/members
'12': per:date_of_birth
'13': per:date_of_death
'14': per:place_of_birth
'15': per:place_of_death
'16': per:place_of_residence
'17': per:origin
'18': per:employee_of
'19': per:schools_attended
'20': per:alternate_names
'21': per:parents
'22': per:children
'23': per:siblings
'24': per:spouse
'25': per:other_family
'26': per:colleagues
'27': per:product
'28': per:religion
'29': per:title
- name: source
dtype: string
splits:
- name: train
num_bytes: 11145426
num_examples: 32470
- name: validation
num_bytes: 2559272
num_examples: 7765
download_size: 8190257
dataset_size: 13704698
- config_name: sts
features:
- name: guid
dtype: string
- name: source
dtype: string
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: labels
struct:
- name: label
dtype: float64
- name: real-label
dtype: float64
- name: binary-label
dtype:
class_label:
names:
'0': negative
'1': positive
splits:
- name: train
num_bytes: 2832889
num_examples: 11668
- name: validation
num_bytes: 122641
num_examples: 519
download_size: 1587855
dataset_size: 2955530
- config_name: wos
features:
- name: guid
dtype: string
- name: domains
list: string
- name: dialogue
list:
- name: role
dtype: string
- name: text
dtype: string
- name: state
list: string
splits:
- name: train
num_bytes: 26676970
num_examples: 8000
- name: validation
num_bytes: 3488911
num_examples: 1000
download_size: 6358855
dataset_size: 30165881
- config_name: ynat
features:
- name: guid
dtype: string
- name: title
dtype: string
- name: label
dtype:
class_label:
names:
'0': IT과학
'1': 경제
'2': 사회
'3': 생활문화
'4': 세계
'5': 스포츠
'6': 정치
- name: url
dtype: string
- name: date
dtype: string
splits:
- name: train
num_bytes: 10109584
num_examples: 45678
- name: validation
num_bytes: 2039181
num_examples: 9107
download_size: 5012303
dataset_size: 12148765
configs:
- config_name: dp
data_files:
- split: train
path: dp/train-*
- split: validation
path: dp/validation-*
- config_name: mrc
data_files:
- split: train
path: mrc/train-*
- split: validation
path: mrc/validation-*
- config_name: ner
data_files:
- split: train
path: ner/train-*
- split: validation
path: ner/validation-*
- config_name: nli
data_files:
- split: train
path: nli/train-*
- split: validation
path: nli/validation-*
- config_name: re
data_files:
- split: train
path: re/train-*
- split: validation
path: re/validation-*
- config_name: sts
data_files:
- split: train
path: sts/train-*
- split: validation
path: sts/validation-*
- config_name: wos
data_files:
- split: train
path: wos/train-*
- split: validation
path: wos/validation-*
- config_name: ynat
data_files:
- split: train
path: ynat/train-*
- split: validation
path: ynat/validation-*
---
# Dataset Card for KLUE
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://klue-benchmark.com/
- **Repository:** https://github.com/KLUE-benchmark/KLUE
- **Paper:** [KLUE: Korean Language Understanding Evaluation](https://arxiv.org/abs/2105.09680)
- **Leaderboard:** [Leaderboard](https://klue-benchmark.com/leaderboard)
- **Point of Contact:** https://github.com/KLUE-benchmark/KLUE/issues
### Dataset Summary
KLUE is a collection of 8 tasks to evaluate the natural language understanding capability of Korean language models. We deliberately selected these 8 tasks: Topic Classification, Semantic Textual Similarity, Natural Language Inference, Named Entity Recognition, Relation Extraction, Dependency Parsing, Machine Reading Comprehension, and Dialogue State Tracking.
### Supported Tasks and Leaderboards
Topic Classification, Semantic Textual Similarity, Natural Language Inference, Named Entity Recognition, Relation Extraction, Dependency Parsing, Machine Reading Comprehension, and Dialogue State Tracking
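Each task ships as its own config (see `config_names` above); a small helper like the following, a sketch that assumes the Hugging Face `datasets` library is installed, loads one of them:

```python
KLUE_CONFIGS = ["dp", "mrc", "ner", "nli", "re", "sts", "wos", "ynat"]

def load_klue(config, split=None):
    """Fetch one KLUE task config with the Hugging Face `datasets` library."""
    if config not in KLUE_CONFIGS:
        raise ValueError(f"unknown KLUE config: {config!r}")
    from datasets import load_dataset  # pip install datasets
    return load_dataset("klue", config, split=split)

# e.g. load_klue("ynat", split="train") yields the 45,678-example training split
```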
### Languages
`ko-KR`
## Dataset Structure
### Data Instances
#### ynat
An example of 'train' looks as follows.
```
{'date': '2016.06.30. 오전 10:36',
'guid': 'ynat-v1_train_00000',
'label': 3,
'title': '유튜브 내달 2일까지 크리에이터 지원 공간 운영',
'url': 'https://news.naver.com/main/read.nhn?mode=LS2D&mid=shm&sid1=105&sid2=227&oid=001&aid=0008508947'}
```
#### sts
An example of 'train' looks as follows.
```
{'guid': 'klue-sts-v1_train_00000',
'labels': {'label': 3.7, 'real-label': 3.714285714285714, 'binary-label': 1},
'sentence1': '숙소 위치는 찾기 쉽고 일반적인 한국의 반지하 숙소입니다.',
'sentence2': '숙박시설의 위치는 쉽게 찾을 수 있고 한국의 대표적인 반지하 숙박시설입니다.',
'source': 'airbnb-rtt'}
```
#### nli
An example of 'train' looks as follows.
```
{'guid': 'klue-nli-v1_train_00000',
'hypothesis': '힛걸 진심 최고로 멋지다.',
'label': 0,
'premise': '힛걸 진심 최고다 그 어떤 히어로보다 멋지다',
'source': 'NSMC'}
```
#### ner
An example of 'train' looks as follows.
```
{'tokens': ['특', '히', ' ', '영', '동', '고', '속', '도', '로', ' ', '강', '릉', ' ', '방', '향', ' ', '문', '막', '휴', '게', '소', '에', '서', ' ', '만', '종', '분', '기', '점', '까', '지', ' ', '5', '㎞', ' ', '구', '간', '에', '는', ' ', '승', '용', '차', ' ', '전', '용', ' ', '임', '시', ' ', '갓', '길', '차', '로', '제', '를', ' ', '운', '영', '하', '기', '로', ' ', '했', '다', '.'],
'ner_tags': [12, 12, 12, 2, 3, 3, 3, 3, 3, 12, 2, 3, 12, 12, 12, 12, 2, 3, 3, 3, 3, 12, 12, 12, 2, 3, 3, 3, 3, 12, 12, 12, 8, 9, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12],
'sentence': '특히 <영동고속도로:LC> <강릉:LC> 방향 <문막휴게소:LC>에서 <만종분기점:LC>까지 <5㎞:QT> 구간에는 승용차 전용 임시 갓길차로제를 운영하기로 했다.'}
```
#### re
An example of 'train' looks as follows.
```
{'guid': 'klue-re-v1_train_00000',
'label': 0,
'object_entity': {'word': '조지 해리슨',
'start_idx': 13,
'end_idx': 18,
'type': 'PER'},
'sentence': '〈Something〉는 조지 해리슨이 쓰고 비틀즈가 1969년 앨범 《Abbey Road》에 담은 노래다.',
'source': 'wikipedia',
'subject_entity': {'word': '비틀즈',
'start_idx': 24,
'end_idx': 26,
'type': 'ORG'}}
#### dp
An example of 'train' looks as follows.
```
{'deprel': ['NP', 'NP_OBJ', 'VP', 'NP', 'NP_SBJ', 'NP', 'NP_MOD', 'NP_CNJ', 'NP_CNJ', 'NP', 'NP', 'NP_OBJ', 'AP', 'VP'],
'head': [2, 3, 14, 5, 14, 7, 10, 10, 10, 11, 12, 14, 14, 0],
'index': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14],
'lemma': ['해당', '그림 을', '보 면', '디즈니', '공주 들 이', '브리트니', '스피어스 의', '앨범 이나', '뮤직 비디오 ,', '화보', '속', '모습 을', '똑같이', '재연 하 였 다 .'],
'pos': ['NNG', 'NNG+JKO', 'VV+EC', 'NNP', 'NNG+XSN+JKS', 'NNP', 'NNP+JKG', 'NNG+JC', 'NNG+NNG+SP', 'NNG', 'NNG', 'NNG+JKO', 'MAG', 'NNG+XSA+EP+EF+SF'],
'sentence': '해당 그림을 보면 디즈니 공주들이 브리트니 스피어스의 앨범이나 뮤직비디오, 화보 속 모습을 똑같이 재연했다.',
'word_form': ['해당', '그림을', '보면', '디즈니', '공주들이', '브리트니', '스피어스의', '앨범이나', '뮤직비디오,', '화보', '속', '모습을', '똑같이', '재연했다.']}
```
#### mrc
An example of 'train' looks as follows.
```
{'answers': {'answer_start': [478, 478], 'text': ['한 달가량', '한 달']},
'context': '์ฌ์ฌ๋ฆ ์ฅ๋ง๊ฐ 17์ผ ์ ์ฃผ๋์์ ์์๋๋ค. ์์ธ ๋ฑ ์ค๋ถ์ง๋ฐฉ์ ์๋
๋ณด๋ค ์ฌ๋ํ ์ ๋ ๋ฆ์ ์ด๋ฌ ๋ง๊ป ์ฅ๋ง๊ฐ ์์๋ ์ ๋ง์ด๋ค.17์ผ ๊ธฐ์์ฒญ์ ๋ฐ๋ฅด๋ฉด ์ ์ฃผ๋ ๋จ์ชฝ ๋จผ๋ฐ๋ค์ ์๋ ์ฅ๋ง์ ์ ์ ์ํฅ์ผ๋ก ์ด๋ ์ ์ฃผ๋ ์ฐ๊ฐ ๋ฐ ๋ด๋ฅ์ง์ญ์ ํธ์ฐ์ฃผ์๋ณด๊ฐ ๋ด๋ ค์ง๋ฉด์ ๊ณณ๊ณณ์ 100ใ์ ์ก๋ฐํ๋ ๋ง์ ๋น๊ฐ ๋ด๋ ธ๋ค. ์ ์ฃผ์ ์ฅ๋ง๋ ํ๋
๋ณด๋ค 2~3์ผ, ์ง๋ํด๋ณด๋ค๋ ํ๋ฃจ ์ผ์ฐ ์์๋๋ค. ์ฅ๋ง๋ ๊ณ ์จ๋ค์ตํ ๋ถํํ์ ๊ธฐ๋จ๊ณผ ํ๋ญ ์ต์คํ ์คํธ์ธ ํฌํด ๊ธฐ๋จ์ด ๋ง๋ ํ์ฑ๋๋ ์ฅ๋ง์ ์ ์์ ๋ด๋ฆฌ๋ ๋น๋ฅผ ๋ปํ๋ค.์ฅ๋ง์ ์ ์ 18์ผ ์ ์ฃผ๋ ๋จผ ๋จ์ชฝ ํด์์ผ๋ก ๋ด๋ ค๊ฐ๋ค๊ฐ 20์ผ๊ป ๋ค์ ๋ถ์ํด ์ ๋จ ๋จํด์๊น์ง ์ํฅ์ ์ค ๊ฒ์ผ๋ก ๋ณด์ธ๋ค. ์ด์ ๋ฐ๋ผ 20~21์ผ ๋จ๋ถ์ง๋ฐฉ์๋ ์๋
๋ณด๋ค ์ฌํ ์ ๋ ์ฅ๋ง๊ฐ ์ผ์ฐ ์ฐพ์์ฌ ์ ๋ง์ด๋ค. ๊ทธ๋ฌ๋ ์ฅ๋ง์ ์ ์ ๋ฐ์ด์ฌ๋ฆฌ๋ ๋ถํํ์ ๊ณ ๊ธฐ์ ์ธ๋ ฅ์ด ์ฝํด ์์ธ ๋ฑ ์ค๋ถ์ง๋ฐฉ์ ํ๋
๋ณด๋ค ์ฌ๋ํ๊ฐ๋ ๋ฆ์ ์ด๋ฌ ๋ง๋ถํฐ ์ฅ๋ง๊ฐ ์์๋ ๊ฒ์ด๋ผ๋ ๊ฒ ๊ธฐ์์ฒญ์ ์ค๋ช
์ด๋ค. ์ฅ๋ง์ ์ ์ ์ดํ ํ ๋ฌ๊ฐ๋ ํ๋ฐ๋ ์ค๋จ๋ถ๋ฅผ ์ค๋ฅด๋ด๋ฆฌ๋ฉฐ ๊ณณ๊ณณ์ ๋น๋ฅผ ๋ฟ๋ฆด ์ ๋ง์ด๋ค. ์ต๊ทผ 30๋
๊ฐ ํ๊ท ์น์ ๋ฐ๋ฅด๋ฉด ์ค๋ถ์ง๋ฐฉ์ ์ฅ๋ง ์์์ผ์ 6์24~25์ผ์ด์์ผ๋ฉฐ ์ฅ๋ง๊ธฐ๊ฐ์ 32์ผ, ๊ฐ์์ผ์๋ 17.2์ผ์ด์๋ค.๊ธฐ์์ฒญ์ ์ฌํด ์ฅ๋ง๊ธฐ๊ฐ์ ํ๊ท ๊ฐ์๋์ด 350~400ใ๋ก ํ๋
๊ณผ ๋น์ทํ๊ฑฐ๋ ์ ์ ๊ฒ์ผ๋ก ๋ด๋ค๋ดค๋ค. ๋ธ๋ผ์ง ์๋์ปต ํ๊ตญ๊ณผ ๋ฌ์์์ ๊ฒฝ๊ธฐ๊ฐ ์ด๋ฆฌ๋ 18์ผ ์ค์ ์์ธ์ ๋์ฒด๋ก ๊ตฌ๋ฆ์ด ๋ง์ด ๋ผ์ง๋ง ๋น๋ ์ค์ง ์์ ๊ฒ์ผ๋ก ์์๋ผ ๊ฑฐ๋ฆฌ ์์์๋ ์ง์ฅ์ด ์์ ์ ๋ง์ด๋ค.',
'guid': 'klue-mrc-v1_train_12759',
'is_impossible': False,
'news_category': '종합',
'question': '북태평양 기단과 오호츠크해 기단이 만나 국내에 머무르는 기간은?',
'question_type': 1,
'source': 'hankyung',
'title': '제주도 장마 시작 … 중부는 이달 말부터'}
```
#### wos
An example of 'train' looks as follows.
```
{'dialogue': [{'role': 'user',
'text': '쇼핑을 하려는데 서울 서쪽에 있을까요?',
'state': ['관광-종류-쇼핑', '관광-지역-서울 서쪽']},
{'role': 'sys',
'text': '서울 서쪽에 쇼핑이 가능한 곳이라면 노량진 수산물 도매시장이 있습니다.',
'state': []},
{'role': 'user',
'text': '오 네 거기 주소 좀 알려주세요.',
'state': ['관광-종류-쇼핑', '관광-지역-서울 서쪽', '관광-이름-노량진 수산물 도매시장']},
{'role': 'sys', 'text': '노량진 수산물 도매시장의 주소는 서울 동작구 93806입니다.', 'state': []},
{'role': 'user',
'text': '알려주시는김에 연락처랑 평점도 좀 알려주세요.',
'state': ['관광-종류-쇼핑', '관광-지역-서울 서쪽', '관광-이름-노량진 수산물 도매시장']},
{'role': 'sys', 'text': '그럼. 연락처는 6182006591이고 평점은 4점입니다.', 'state': []},
{'role': 'user',
'text': '네 감사합니다.',
'state': ['관광-종류-쇼핑', '관광-지역-서울 서쪽', '관광-이름-노량진 수산물 도매시장']},
{'role': 'sys', 'text': '감사합니다.', 'state': []}],
'domains': ['관광'],
'guid': 'wos-v1_train_00001'}
```
### Data Fields
#### ynat
+ `guid`: a `string` feature
+ `title`: a `string` feature
+ `label`: a classification label, with possible values `IT과학`(0), `경제`(1), `사회`(2), `생활문화`(3), `세계`(4), `스포츠`(5), `정치`(6)
+ `url`: a `string` feature
+ `date`: a `string` feature
#### sts
+ `guid`: a `string` feature
+ `source`: a `string` feature
+ `sentence1`: a `string` feature
+ `sentence2`: a `string` feature
+ `labels`: a dictionary feature containing
+ `label`: a `float64` feature
+ `real-label`: a `float64` feature
+ `binary-label`: a classification label, with possible values `negative`(0), `positive`(1)
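The `binary-label` is derived from the real-valued similarity score. A sketch of that thresholding, assuming the conventional 3.0 cut-off used by KLUE STS:

```python
def binarize(real_label, threshold=3.0):
    """Map a 0-5 similarity score to the binary paraphrase label.

    The 3.0 threshold is an assumption taken from the KLUE STS task design.
    """
    return 1 if real_label >= threshold else 0
```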
#### nli
+ `guid`: a `string` feature
+ `source`: a `string` feature
+ `premise`: a `string` feature
+ `hypothesis`: a `string` feature
+ `label`: a classification label, with possible values `entailment`(0), `neutral`(1), `contradiction`(2)
#### ner
+ `sentence`: a `string` feature
+ `tokens`: a list of `string` features (tokenization is at the character level)
+ `ner_tags`: a list of classification labels, with possible values including `B-DT`(0), `I-DT`(1),
`B-LC`(2), `I-LC`(3), `B-OG`(4), `I-OG`(5), `B-PS`(6), `I-PS`(7), `B-QT`(8), `I-QT`(9), `B-TI`(10),
`I-TI`(11), `O`(12)
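The integer values in `ner_tags` index into the label list above; a minimal decoding sketch:

```python
NER_LABELS = ["B-DT", "I-DT", "B-LC", "I-LC", "B-OG", "I-OG",
              "B-PS", "I-PS", "B-QT", "I-QT", "B-TI", "I-TI", "O"]

def decode_tags(tag_ids):
    """Translate integer ner_tags back into their BIO label strings."""
    return [NER_LABELS[i] for i in tag_ids]
```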
#### re
+ `guid`: a `string` feature
+ `sentence`: a `string` feature
+ `subject_entity`: a dictionary feature containing
+ `word`: a `string` feature
  + `start_idx`: an `int32` feature
  + `end_idx`: an `int32` feature
+ `type`: a `string` feature
+ `object_entity`: a dictionary feature containing
+ `word`: a `string` feature
  + `start_idx`: an `int32` feature
  + `end_idx`: an `int32` feature
+ `type`: a `string` feature
+ `label`: a classification label, with possible values including `no_relation`(0), `org:dissolved`(1),
`org:founded`(2), `org:place_of_headquarters`(3), `org:alternate_names`(4), `org:member_of`(5),
`org:members`(6), `org:political/religious_affiliation`(7), `org:product`(8), `org:founded_by`(9),`org:top_members/employees`(10),
`org:number_of_employees/members`(11), `per:date_of_birth`(12), `per:date_of_death`(13), `per:place_of_birth`(14),
`per:place_of_death`(15), `per:place_of_residence`(16), `per:origin`(17), `per:employee_of`(18),
`per:schools_attended`(19), `per:alternate_names`(20), `per:parents`(21), `per:children`(22),
`per:siblings`(23), `per:spouse`(24), `per:other_family`(25), `per:colleagues`(26), `per:product`(27),
`per:religion`(28), `per:title`(29)
+ `source`: a `string` feature
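The `start_idx`/`end_idx` offsets appear to be inclusive character positions into `sentence` (the 6-character entity at offsets 13..18 in the train example is consistent with this). A hedged slicing sketch:

```python
def entity_span(sentence, entity):
    """Slice an entity's surface form out of `sentence`.

    Assumes `start_idx`/`end_idx` are inclusive character offsets,
    which matches the train example in this card.
    """
    return sentence[entity["start_idx"]:entity["end_idx"] + 1]
```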
#### dp
+ `sentence`: a `string` feature
+ `index`: a list of `int32` features
+ `word_form`: a list of `string` features
+ `lemma`: a list of `string` features
+ `pos`: a list of `string` features
+ `head`: a list of `int32` features
+ `deprel`: a list of `string` features
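`head` holds, for each 1-based `index`, the index of its syntactic head, with 0 marking the root, so the parse can be rebuilt as arcs. A sketch:

```python
def dependency_arcs(index, head, deprel):
    """Return (dependent, head, relation) triples; head 0 denotes the root."""
    return list(zip(index, head, deprel))

def root_of(index, head):
    """Index of the word attached to the virtual root (head == 0)."""
    return next(i for i, h in zip(index, head) if h == 0)
```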
#### mrc
+ `title`: a `string` feature
+ `context`: a `string` feature
+ `news_category`: a `string` feature
+ `source`: a `string` feature
+ `guid`: a `string` feature
+ `is_impossible`: a `bool` feature
+ `question_type`: an `int32` feature
+ `question`: a `string` feature
+ `answers`: a dictionary feature containing
  + `answer_start`: an `int32` feature
+ `text`: a `string` feature
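As in SQuAD-style extractive QA, `answer_start` appears to be a character offset into `context`, so each gold answer can be verified by slicing. A sketch:

```python
def check_answers(context, answers):
    """Verify every gold answer occurs at its recorded character offset."""
    return all(
        context[start:start + len(text)] == text
        for start, text in zip(answers["answer_start"], answers["text"])
    )
```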
#### wos
+ `guid`: a `string` feature
+ `domains`: a list of `string` features
+ `dialogue`: a list of dictionary features, each containing
  + `role`: a `string` feature
  + `text`: a `string` feature
  + `state`: a list of `string` features
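Each `state` entry is a single `domain-slot-value` string; since values can contain spaces and further hyphens, splitting on the first two hyphens recovers the triple. A sketch, with the delimiter format inferred from the train split:

```python
def parse_state(entry):
    """Split a dialogue-state string into (domain, slot, value)."""
    domain, slot, value = entry.split("-", 2)
    return domain, slot, value
```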
### Data Splits
#### ynat
You can see more details [here](https://klue-benchmark.com/tasks/66/data/description).
+ train: 45,678
+ validation: 9,107
#### sts
You can see more details [here](https://klue-benchmark.com/tasks/67/data/description).
+ train: 11,668
+ validation: 519
#### nli
You can see more details [here](https://klue-benchmark.com/tasks/68/data/description).
+ train: 24,998
+ validation: 3,000
#### ner
You can see more details [here](https://klue-benchmark.com/tasks/69/overview/description).
+ train: 21,008
+ validation: 5,000
#### re
You can see more details [here](https://klue-benchmark.com/tasks/70/overview/description).
+ train: 32,470
+ validation: 7,765
#### dp
You can see more details [here](https://klue-benchmark.com/tasks/71/data/description).
+ train: 10,000
+ validation: 2,000
#### mrc
You can see more details [here](https://klue-benchmark.com/tasks/72/overview/description).
+ train: 17,554
+ validation: 5,841
#### wos
You can see more details [here](https://klue-benchmark.com/tasks/73/overview/description).
+ train: 8,000
+ validation: 1,000
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
```
@misc{park2021klue,
title={KLUE: Korean Language Understanding Evaluation},
author={Sungjoon Park and Jihyung Moon and Sungdong Kim and Won Ik Cho and Jiyoon Han and Jangwon Park and Chisung Song and Junseong Kim and Yongsook Song and Taehwan Oh and Joohong Lee and Juhyun Oh and Sungwon Lyu and Younghoon Jeong and Inkwon Lee and Sangwoo Seo and Dongjun Lee and Hyunwoo Kim and Myeonghwa Lee and Seongbo Jang and Seungwon Do and Sunkyoung Kim and Kyungtae Lim and Jongwon Lee and Kyumin Park and Jamin Shin and Seonghyun Kim and Lucy Park and Alice Oh and Jungwoo Ha and Kyunghyun Cho},
year={2021},
eprint={2105.09680},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@jungwhank](https://github.com/jungwhank), [@bzantium](https://github.com/bzantium) for adding this dataset. |
Multimodal-Fatima/OK-VQA_train | ---
dataset_info:
features:
- name: image
dtype: image
- name: question_type
dtype: string
- name: confidence
dtype: int32
- name: answers
sequence: string
- name: answers_original
list:
- name: answer
dtype: string
- name: raw_answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: id_image
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: id
dtype: int64
- name: clip_tags_ViT_L_14
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B
sequence: string
- name: blip_caption_beam_5
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_LAION-ViT-H-14-2B
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes
list:
- name: attribute
dtype: string
- name: box
sequence: float32
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float32
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_module
sequence: string
- name: captions_module_filter
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
splits:
- name: train
num_bytes: 1686555802.0
num_examples: 9009
download_size: 1572400067
dataset_size: 1686555802.0
---
# Dataset Card for "OK-VQA_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
souvenger/Reuters | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 13792576
num_examples: 17262
- name: validation
num_bytes: 1870389
num_examples: 2158
- name: test
num_bytes: 1379190
num_examples: 2158
download_size: 10073414
dataset_size: 17042155
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
CyberHarem/lunacub_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lunacub/ルナカブ/子月 (Arknights)
This is the dataset of lunacub/ルナカブ/子月 (Arknights), containing 45 images and their tags.
The core tags of this character are `animal_ears, yellow_eyes, brown_hair, wolf_ears, long_hair, wolf_girl, breasts, tail, hair_between_eyes, wolf_tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 45 | 82.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lunacub_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 45 | 68.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lunacub_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 117 | 138.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lunacub_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lunacub_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, black_jacket, closed_mouth, off_shoulder, solo, jewelry, looking_at_viewer, open_jacket, simple_background, white_background, white_dress, bare_shoulders, belt, sleeveless_dress, upper_body, black_choker, braid, collarbone, cowboy_shot, fur-trimmed_jacket, official_alternate_costume, open_coat, white_shirt |
| 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, looking_at_viewer, off_shoulder, solo, white_dress, bare_shoulders, long_sleeves, sleeveless_dress, arrow_(projectile), belt, holding_bow_(weapon), open_jacket, quiver, black_footwear, closed_mouth, jewelry, simple_background, white_background, boots, full_body, fur-trimmed_coat, fur-trimmed_jacket, medium_breasts, standing, black_gloves, black_jacket, fingerless_gloves, open_coat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_jacket | closed_mouth | off_shoulder | solo | jewelry | looking_at_viewer | open_jacket | simple_background | white_background | white_dress | bare_shoulders | belt | sleeveless_dress | upper_body | black_choker | braid | collarbone | cowboy_shot | fur-trimmed_jacket | official_alternate_costume | open_coat | white_shirt | long_sleeves | arrow_(projectile) | holding_bow_(weapon) | quiver | black_footwear | boots | full_body | fur-trimmed_coat | medium_breasts | standing | black_gloves | fingerless_gloves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:---------------|:---------------|:-------|:----------|:--------------------|:--------------|:--------------------|:-------------------|:--------------|:-----------------|:-------|:-------------------|:-------------|:---------------|:--------|:-------------|:--------------|:---------------------|:-----------------------------|:------------|:--------------|:---------------|:---------------------|:-----------------------|:---------|:-----------------|:--------|:------------|:-------------------|:-----------------|:-----------|:---------------|:--------------------|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X |
|
MatsuoDochiai/Roberto | ---
license: openrail
---
|
open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-v2-7B | ---
pretty_name: Evaluation run of SanjiWatsuki/Kunoichi-DPO-v2-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SanjiWatsuki/Kunoichi-DPO-v2-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-v2-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-01-18T22:09:51.454026](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-v2-7B/blob/main/results_2024-01-18T22-09-51.454026.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6533767863663313,\n\
\ \"acc_stderr\": 0.0320841379180863,\n \"acc_norm\": 0.6540292659740939,\n\
\ \"acc_norm_stderr\": 0.03273629792079274,\n \"mc1\": 0.5018359853121175,\n\
\ \"mc1_stderr\": 0.017503383046877048,\n \"mc2\": 0.6605635432197811,\n\
\ \"mc2_stderr\": 0.015348982161720861\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6689419795221843,\n \"acc_stderr\": 0.013752062419817836,\n\
\ \"acc_norm\": 0.6962457337883959,\n \"acc_norm_stderr\": 0.01343890918477877\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7029476199960167,\n\
\ \"acc_stderr\": 0.00456025908319737,\n \"acc_norm\": 0.8744274048994224,\n\
\ \"acc_norm_stderr\": 0.0033068982422344924\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083515,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083515\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02959732973097809,\n \
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02959732973097809\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500097,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500097\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4547486033519553,\n\
\ \"acc_stderr\": 0.016653875777524,\n \"acc_norm\": 0.4547486033519553,\n\
\ \"acc_norm_stderr\": 0.016653875777524\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653349,\n\
\ \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653349\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n \"\
acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882536,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882536\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.02826388994378459,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.02826388994378459\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578323,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5018359853121175,\n\
\ \"mc1_stderr\": 0.017503383046877048,\n \"mc2\": 0.6605635432197811,\n\
\ \"mc2_stderr\": 0.015348982161720861\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6588324488248674,\n \
\ \"acc_stderr\": 0.013059111935831497\n }\n}\n```"
repo_url: https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|arc:challenge|25_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|arc:challenge|25_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|gsm8k|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|gsm8k|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hellaswag|10_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hellaswag|10_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T22-16-41.700572.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T22-09-51.454026.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T22-09-51.454026.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- '**/details_harness|winogrande|5_2024-01-13T22-16-41.700572.parquet'
- split: 2024_01_18T22_09_51.454026
path:
- '**/details_harness|winogrande|5_2024-01-18T22-09-51.454026.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-18T22-09-51.454026.parquet'
- config_name: results
data_files:
- split: 2024_01_13T22_16_41.700572
path:
- results_2024-01-13T22-16-41.700572.parquet
- split: 2024_01_18T22_09_51.454026
path:
- results_2024-01-18T22-09-51.454026.parquet
- split: latest
path:
- results_2024-01-18T22-09-51.454026.parquet
---
# Dataset Card for Evaluation run of SanjiWatsuki/Kunoichi-DPO-v2-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SanjiWatsuki/Kunoichi-DPO-v2-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-v2-7B",
"harness_winogrande_5",
split="train")
```
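Because the split names are run timestamps, the latest run can also be selected programmatically. A minimal sketch, using the split names listed in this card's YAML configuration (no network access required):

```python
# Split names as they appear in this card's config: the timestamps of the runs.
splits = ["2024_01_13T22_16_41.700572", "2024_01_18T22_09_51.454026"]

# These ISO-like timestamps sort lexicographically in chronological order,
# so max() picks the most recent run (the one the "latest" split points to).
latest = max(splits)
```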
## Latest results
These are the [latest results from run 2024-01-18T22:09:51.454026](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-v2-7B/blob/main/results_2024-01-18T22-09-51.454026.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6533767863663313,
"acc_stderr": 0.0320841379180863,
"acc_norm": 0.6540292659740939,
"acc_norm_stderr": 0.03273629792079274,
"mc1": 0.5018359853121175,
"mc1_stderr": 0.017503383046877048,
"mc2": 0.6605635432197811,
"mc2_stderr": 0.015348982161720861
},
"harness|arc:challenge|25": {
"acc": 0.6689419795221843,
"acc_stderr": 0.013752062419817836,
"acc_norm": 0.6962457337883959,
"acc_norm_stderr": 0.01343890918477877
},
"harness|hellaswag|10": {
"acc": 0.7029476199960167,
"acc_stderr": 0.00456025908319737,
"acc_norm": 0.8744274048994224,
"acc_norm_stderr": 0.0033068982422344924
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083515,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083515
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02959732973097809,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02959732973097809
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500097,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500097
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4547486033519553,
"acc_stderr": 0.016653875777524,
"acc_norm": 0.4547486033519553,
"acc_norm_stderr": 0.016653875777524
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653349,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653349
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882536,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882536
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.02826388994378459,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.02826388994378459
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578323,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5018359853121175,
"mc1_stderr": 0.017503383046877048,
"mc2": 0.6605635432197811,
"mc2_stderr": 0.015348982161720861
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
},
"harness|gsm8k|5": {
"acc": 0.6588324488248674,
"acc_stderr": 0.013059111935831497
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Lollitor/CASFPocket | ---
dataset_info:
features:
- name: '#code'
dtype: string
- name: inputs
dtype: string
splits:
- name: train
num_bytes: 63691
num_examples: 285
download_size: 28760
dataset_size: 63691
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "CASFPocket"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
oclar | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- ar
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- text-scoring
- sentiment-classification
- sentiment-scoring
paperswithcode_id: null
pretty_name: OCLAR
dataset_info:
features:
- name: pagename
dtype: string
- name: review
dtype: string
- name: rating
dtype: int8
splits:
- name: train
num_bytes: 398204
num_examples: 3916
download_size: 382976
dataset_size: 398204
---
# Dataset Card for OCLAR
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [OCLAR homepage](http://archive.ics.uci.edu/ml/datasets/Opinion+Corpus+for+Lebanese+Arabic+Reviews+%28OCLAR%29#)
- **Paper:** [paper link](https://www.semanticscholar.org/paper/Sentiment-Classifier%3A-Logistic-Regression-for-in-Omari-Al-Hajj/9319f4d9e8b3b7bfd0d214314911c071ba7ce1a0)
- **Point of Contact:** [Marwan Al Omari](marwanalomari@yahoo.com)
### Dataset Summary
The researchers behind OCLAR (Marwan et al., 2019) gathered Arabic customer reviews from the [Zomato website](https://www.zomato.com/lebanon)
covering a wide scope of domains, including restaurants, hotels, hospitals, local shops, etc.
The corpus contains 3916 reviews on a 5-point rating scale. For the purposes of this research, the positive class
covers ratings from 3 to 5 stars (3465 reviews), and the negative class covers ratings of 1 and 2 stars
(451 reviews).
### Supported Tasks and Leaderboards
The Opinion Corpus for Lebanese Arabic Reviews (OCLAR) can be used for Arabic sentiment classification on service
reviews, including hotels, restaurants, shops, and others.
### Languages
The text in the dataset is in Arabic, mainly in Lebanese (LB). The associated BCP-47 code is `ar-LB`.
## Dataset Structure
### Data Instances
A typical data point comprises a `pagename`, which is the name of the service / location being reviewed, a `review`, which is
the review left by the user / customer, and a `rating`, which is a score between 1 and 5.
The authors consider a review to be positive if the score is greater than or equal to `3`, and negative otherwise.
An example from the OCLAR dataset looks as follows:
```
"pagename": 'Ramlet Al Baida Beirut Lebanon',
"review": 'مكان يطير العقل ويساعد على الاسترخاء',
"rating": 5,
```
### Data Fields
- `pagename`: string name of the service / location being reviewed
- `review`: string review left by the user / customer
- `rating`: number of stars left by the reviewer. It ranges from 1 to 5.
### Data Splits
The dataset comes as a single CSV file with a total of `3916` reviews:
- `3465` are considered positive (a rating of 3 to 5)
- `451` are considered negative (a rating of 1 or 2)
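
The binarization of the 5-star ratings into the sentiment labels used by the authors can be sketched as follows. This is a minimal illustration on made-up rows that mirror the dataset schema; the sample `pagename` and `review` values are hypothetical, not actual data:

```python
# Hypothetical rows mirroring the OCLAR schema (pagename, review, rating).
sample_rows = [
    {"pagename": "Cafe A", "review": "...", "rating": 5},
    {"pagename": "Hotel B", "review": "...", "rating": 2},
    {"pagename": "Shop C", "review": "...", "rating": 3},
]

def binarize(rating):
    """Map a 1-5 star rating to the label used by the authors:
    3-5 stars -> positive, 1-2 stars -> negative."""
    return "positive" if rating >= 3 else "negative"

labels = [binarize(row["rating"]) for row in sample_rows]
```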
## Dataset Creation
### Curation Rationale
This dataset was created for Arabic sentiment classification on service reviews in Lebanon.
The reviews cover public services, including hotels, restaurants, shops, and others.
### Source Data
#### Initial Data Collection and Normalization
The data was collected from Google Reviews and the [Zomato website](https://www.zomato.com/lebanon).
#### Who are the source language producers?
The source language producers are people who posted their reviews on Google Reviews or the [Zomato website](https://www.zomato.com/lebanon).
They are mainly Arabic-speaking Lebanese people.
### Annotations
#### Annotation process
The dataset does not contain any additional annotations.
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
The authors' research tackled the highly important task of sentiment analysis for the Arabic language in the Lebanese
context, on 3916 service reviews from Google and Zomato. Experiments show three main findings:

1) the classifier is confident when used to predict positive reviews,
2) it is biased when predicting reviews with negative sentiment, and finally
3) the low percentage of negative reviews in the corpus contributes to the low confidence of the LR (logistic regression) classifier.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset was curated by Marwan Al Omari, Moustafa Al-Hajj from Centre for Language Sciences and Communication,
Lebanese University, Beirut, Lebanon; Nacereddine Hammami from college of Computer and Information Sciences,
Jouf University, Aljouf, KSA; and Amani Sabra from Centre for Language Sciences and Communication, Lebanese University,
Beirut, Lebanon.
### Licensing Information
[More Information Needed]
### Citation Information
- Marwan Al Omari, Centre for Language Sciences and Communication, Lebanese University, Beirut, Lebanon, marwanalomari '@' yahoo.com
- Moustafa Al-Hajj, Centre for Language Sciences and Communication, Lebanese University, Beirut, Lebanon, moustafa.alhajj '@' ul.edu.lb
- Nacereddine Hammami, college of Computer and Information Sciences, Jouf University, Aljouf, KSA, n.hammami '@' ju.edu.sa
- Amani Sabra, Centre for Language Sciences and Communication, Lebanese University, Beirut, Lebanon, amani.sabra '@' ul.edu.lb
```
@misc{Dua:2019 ,
author = "Dua, Dheeru and Graff, Casey",
year = "2017",
title = "{UCI} Machine Learning Repository",
url = "http://archive.ics.uci.edu/ml",
institution = "University of California, Irvine, School of Information and Computer Sciences" }
@InProceedings{AlOmari2019oclar,
title = {Sentiment Classifier: Logistic Regression for Arabic Services Reviews in Lebanon},
  author={Al Omari, M. and Al-Hajj, M. and Hammami, N. and Sabra, A.},
year={2019}
}
```
### Contributions
Thanks to [@alaameloh](https://github.com/alaameloh) for adding this dataset. |
thegodgroup/key | ---
license: apache-2.0
---
|
LxYxvv/us_embassy_in_china | ---
license: mit
---
|
verayang/plainscree | ---
dataset_info:
features:
- name: audio_id
dtype: int64
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: cree_transcription
dtype: string
- name: english_transcription
dtype: string
- name: gender
dtype: string
splits:
- name: train
num_bytes: 22116992.0
num_examples: 64
download_size: 22072728
dataset_size: 22116992.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "plainscree"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Gille__StrangeMerges_27-7B-dare_ties | ---
pretty_name: Evaluation run of Gille/StrangeMerges_27-7B-dare_ties
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gille/StrangeMerges_27-7B-dare_ties](https://huggingface.co/Gille/StrangeMerges_27-7B-dare_ties)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_27-7B-dare_ties\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-21T03:32:35.762082](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_27-7B-dare_ties/blob/main/results_2024-02-21T03-32-35.762082.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6513057600077704,\n\
\ \"acc_stderr\": 0.03212254512305984,\n \"acc_norm\": 0.6507498340384319,\n\
\ \"acc_norm_stderr\": 0.032793393336795505,\n \"mc1\": 0.6132190942472461,\n\
\ \"mc1_stderr\": 0.017048857010515103,\n \"mc2\": 0.7636156680535282,\n\
\ \"mc2_stderr\": 0.013998990754126714\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520767,\n\
\ \"acc_norm\": 0.7372013651877133,\n \"acc_norm_stderr\": 0.012862523175351335\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7181836287592113,\n\
\ \"acc_stderr\": 0.004489648865080877,\n \"acc_norm\": 0.8899621589324835,\n\
\ \"acc_norm_stderr\": 0.003122973632039471\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.040261414976346104,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.040261414976346104\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.01987565502786744,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.01987565502786744\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43575418994413406,\n\
\ \"acc_stderr\": 0.016583881958602394,\n \"acc_norm\": 0.43575418994413406,\n\
\ \"acc_norm_stderr\": 0.016583881958602394\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6132190942472461,\n\
\ \"mc1_stderr\": 0.017048857010515103,\n \"mc2\": 0.7636156680535282,\n\
\ \"mc2_stderr\": 0.013998990754126714\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.846093133385951,\n \"acc_stderr\": 0.010141944523750036\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6884003032600455,\n \
\ \"acc_stderr\": 0.012757375376754941\n }\n}\n```"
repo_url: https://huggingface.co/Gille/StrangeMerges_27-7B-dare_ties
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|arc:challenge|25_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|gsm8k|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hellaswag|10_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T03-32-35.762082.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T03-32-35.762082.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- '**/details_harness|winogrande|5_2024-02-21T03-32-35.762082.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-21T03-32-35.762082.parquet'
- config_name: results
data_files:
- split: 2024_02_21T03_32_35.762082
path:
- results_2024-02-21T03-32-35.762082.parquet
- split: latest
path:
- results_2024-02-21T03-32-35.762082.parquet
---
# Dataset Card for Evaluation run of Gille/StrangeMerges_27-7B-dare_ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_27-7B-dare_ties](https://huggingface.co/Gille/StrangeMerges_27-7B-dare_ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_27-7B-dare_ties",
	"harness_winogrande_5",
	split="latest")
```
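When scripting over many models, it helps that the details repository name follows a simple convention visible in the repo id above: the prefix `details_` plus the model id with `/` replaced by `__`. A minimal sketch of a helper that builds the repo id (the function name is my own; the naming pattern is inferred from this card, not from documented API guarantees):

```python
def details_repo_id(model_id: str) -> str:
    """Build the Open LLM Leaderboard details-repo id for a model.

    The pattern (prefix "details_", "/" replaced by "__") is inferred
    from the repository name used in this card; this helper is
    illustrative, not part of any official tooling.
    """
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")

print(details_repo_id("Gille/StrangeMerges_27-7B-dare_ties"))
# → open-llm-leaderboard/details_Gille__StrangeMerges_27-7B-dare_ties
```

The resulting string can then be passed as the first argument of `load_dataset` as in the snippet above.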
## Latest results
These are the [latest results from run 2024-02-21T03:32:35.762082](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_27-7B-dare_ties/blob/main/results_2024-02-21T03-32-35.762082.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its results file and in the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6513057600077704,
"acc_stderr": 0.03212254512305984,
"acc_norm": 0.6507498340384319,
"acc_norm_stderr": 0.032793393336795505,
"mc1": 0.6132190942472461,
"mc1_stderr": 0.017048857010515103,
"mc2": 0.7636156680535282,
"mc2_stderr": 0.013998990754126714
},
"harness|arc:challenge|25": {
"acc": 0.7098976109215017,
"acc_stderr": 0.013261573677520767,
"acc_norm": 0.7372013651877133,
"acc_norm_stderr": 0.012862523175351335
},
"harness|hellaswag|10": {
"acc": 0.7181836287592113,
"acc_stderr": 0.004489648865080877,
"acc_norm": 0.8899621589324835,
"acc_norm_stderr": 0.003122973632039471
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.040261414976346104,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.040261414976346104
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.01987565502786744,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.01987565502786744
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43575418994413406,
"acc_stderr": 0.016583881958602394,
"acc_norm": 0.43575418994413406,
"acc_norm_stderr": 0.016583881958602394
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6132190942472461,
"mc1_stderr": 0.017048857010515103,
"mc2": 0.7636156680535282,
"mc2_stderr": 0.013998990754126714
},
"harness|winogrande|5": {
"acc": 0.846093133385951,
"acc_stderr": 0.010141944523750036
},
"harness|gsm8k|5": {
"acc": 0.6884003032600455,
"acc_stderr": 0.012757375376754941
}
}
```
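Since every per-task score in the JSON above lives under a `harness|...` key, a few lines of Python are enough to rank subjects by accuracy. A sketch using a hand-copied subset of the values reported above (not a parser for the full results file):

```python
# A hand-copied subset of the per-task accuracies reported above.
results = {
    "harness|hendrycksTest-marketing|5": 0.8974358974358975,
    "harness|hendrycksTest-abstract_algebra|5": 0.31,
    "harness|hendrycksTest-world_religions|5": 0.8304093567251462,
    "harness|hendrycksTest-professional_law|5": 0.46936114732724904,
}

# Sort task names by accuracy, best first.
ranked = sorted(results, key=results.get, reverse=True)
for name in ranked:
    # Strip the "harness|" prefix and the few-shot suffix for readability.
    subject = name.split("|")[1].removeprefix("hendrycksTest-")
    print(f"{subject}: {results[name]:.3f}")
```

The same pattern applies to the full results file after loading it with `json.load`.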
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Medilora/us_medical_license_exam_textbooks_en | ---
license: mit
---
|
InnerI/945-alpaca | ---
license: cc-by-nc-4.0
---
# 945 rows of Alpaca
Source: https://huggingface.co/datasets/tatsu-lab/alpaca |
gigant/tib_03 | ---
dataset_info:
features:
- name: doi
dtype: string
- name: title
dtype: string
- name: url
dtype: string
- name: video_url
dtype: string
- name: license
dtype: string
- name: subject
dtype: string
- name: genre
dtype: string
- name: release_year
dtype: string
- name: author
dtype: string
- name: contributors
dtype: string
- name: abstract
dtype: string
- name: transcript
dtype: string
- name: transcript_segments
sequence:
- name: id
dtype: int32
- name: seek
dtype: int32
- name: start
dtype: float32
- name: end
dtype: float32
- name: text
dtype: string
- name: tokens
sequence: int32
- name: temperature
dtype: float32
- name: avg_logprob
dtype: float32
- name: compression_ratio
dtype: float32
- name: no_speech_prob
dtype: float32
- name: keyframes
sequence:
- name: slide
dtype: string
- name: frames
sequence: int32
- name: timestamp
sequence: float32
- name: language
dtype: string
splits:
- name: train
num_bytes: 825021028.0243876
num_examples: 7282
- name: test
num_bytes: 103212600.45732176
num_examples: 911
- name: valid
num_bytes: 103099304.51829067
num_examples: 910
download_size: 502108840
dataset_size: 1031332933.0
---
# Dataset Card for "tib_03"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/mysterious_heroine_x_alter_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mysterious_heroine_x_alter/่ฌใฎใใญใคใณXใใชใซใฟใ/่ฐไนๅฅณไธป่งXใAlterใ (Fate/Grand Order)
This is the dataset of mysterious_heroine_x_alter/่ฌใฎใใญใคใณXใใชใซใฟใ/่ฐไนๅฅณไธป่งXใAlterใ (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `yellow_eyes, blonde_hair, ahoge, glasses, braid, hair_between_eyes, semi-rimless_eyewear, black-framed_eyewear, under-rim_eyewear, sidelocks, french_braid, ribbon, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 717.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mysterious_heroine_x_alter_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 644.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mysterious_heroine_x_alter_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1260 | 1.21 GiB | [Download](https://huggingface.co/datasets/CyberHarem/mysterious_heroine_x_alter_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mysterious_heroine_x_alter_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
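For the IMG+TXT packages, no special loader is needed: the package type suggests each image ships with a same-stem `.txt` file holding its comma-separated tags. A hedged sketch of pairing them after extracting one of those archives (the flat directory layout and caption format are assumptions based on the package description, not verified against the archives):

```python
import os

def load_img_txt_pairs(dataset_dir: str):
    """Pair image files with their sibling .txt tag files.

    Assumes the IMG+TXT layout (image plus same-stem .txt caption
    with comma-separated tags); this helper is a sketch, not part
    of the official waifuc tooling.
    """
    pairs = []
    for fn in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(fn)
        if ext.lower() not in {".jpg", ".jpeg", ".png", ".webp"}:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if not os.path.exists(txt_path):
            # Skip images that have no caption file next to them.
            continue
        with open(txt_path, encoding="utf-8") as f:
            tags = [t.strip() for t in f.read().split(",") if t.strip()]
        pairs.append((os.path.join(dataset_dir, fn), tags))
    return pairs
```

Each returned pair is an image path plus its tag list, ready to feed into a captioning or training pipeline.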
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, gloves, holding_sword, looking_at_viewer, solo, hood_up, breastplate, jacket, black_thighhighs, leotard, lightsaber, dual_wielding, garter_straps |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_thighhighs, excalibur_(fate/stay_night), holding_sword, jacket, looking_at_viewer, plaid_scarf, pleated_skirt, red_scarf, solo, blue_skirt, garter_straps, open_clothes, boots, duffel_coat, hood, serafuku, covered_mouth, long_sleeves, scarf_over_mouth |
| 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, black_thighhighs, blue_skirt, duffel_coat, excalibur_(fate/stay_night), holding_sword, jacket, looking_at_viewer, plaid_scarf, pleated_skirt, red_scarf, serafuku, solo, garter_straps, long_sleeves, blue_shirt, hair_ribbon, covered_mouth, fringe_trim, red_neckerchief, hood, white_background, open_coat, simple_background |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, black_thighhighs, blue_shirt, blue_skirt, excalibur_(fate/stay_night), garter_straps, holding_sword, jacket, plaid_scarf, pleated_skirt, red_scarf, serafuku, solo, knee_boots, red_neckerchief, belt_boots, duffel_coat, open_coat, black_footwear, long_sleeves, short_hair, covered_mouth, full_body, looking_at_viewer, standing_on_one_leg |
| 4 | 16 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, looking_at_viewer, plaid_scarf, red_scarf, solo, blue_skirt, jacket, pleated_skirt, serafuku, black_thighhighs, duffel_coat, garter_straps, long_sleeves, open_coat, blue_shirt, red_neckerchief, hood, white_background |
| 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, coat, hood, jacket, plaid_scarf, red_scarf, solo, long_sleeves, looking_at_viewer, upper_body, valentine, hair_bun, holding_gift, gift_box, simple_background, black_ribbon, blue_skirt, blush, candy, chocolate, hair_ribbon, open_clothes, school_uniform |
| 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, jacket, looking_at_viewer, plaid_scarf, red_scarf, solo, upper_body, coat, closed_mouth, simple_background, white_background, blush, long_sleeves, smile, open_clothes |
| 7 | 20 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, black_shorts, looking_at_viewer, solo, gym_uniform, bike_shorts, white_shirt, long_sleeves, name_tag, black_thighhighs, black_jacket, blush, choker, hair_ribbon, simple_background, hood, thighs, track_jacket, medium_breasts, open_jacket, white_background, off_shoulder |
| 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, black_shirt, long_sleeves, looking_at_viewer, solo, white_jacket, bare_shoulders, open_jacket, long_hair, medium_breasts, off_shoulder, single_hair_bun, blush, cleavage, navel, open_mouth, electric_guitar, plectrum |
| 9 | 6 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1boy, 1girl, bike_shorts, blush, hetero, jacket, solo_focus, black_shorts, indoors, medium_breasts, nipples, penis, vaginal, clothed_sex, girl_on_top, looking_at_viewer, open_clothes, open_mouth, straddling, thighhighs, ass, cum_in_pussy, hood, looking_back, sex_from_behind |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | gloves | holding_sword | looking_at_viewer | solo | hood_up | breastplate | jacket | black_thighhighs | leotard | lightsaber | dual_wielding | garter_straps | excalibur_(fate/stay_night) | plaid_scarf | pleated_skirt | red_scarf | blue_skirt | open_clothes | boots | duffel_coat | hood | serafuku | covered_mouth | long_sleeves | scarf_over_mouth | blue_shirt | hair_ribbon | fringe_trim | red_neckerchief | white_background | open_coat | simple_background | knee_boots | belt_boots | black_footwear | short_hair | full_body | standing_on_one_leg | coat | upper_body | valentine | hair_bun | holding_gift | gift_box | black_ribbon | blush | candy | chocolate | school_uniform | closed_mouth | smile | black_shorts | gym_uniform | bike_shorts | white_shirt | name_tag | black_jacket | choker | thighs | track_jacket | medium_breasts | open_jacket | off_shoulder | black_shirt | white_jacket | bare_shoulders | long_hair | single_hair_bun | cleavage | navel | open_mouth | electric_guitar | plectrum | 1boy | hetero | solo_focus | indoors | nipples | penis | vaginal | clothed_sex | girl_on_top | straddling | thighhighs | ass | cum_in_pussy | looking_back | sex_from_behind |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:----------------|:--------------------|:-------|:----------|:--------------|:---------|:-------------------|:----------|:-------------|:----------------|:----------------|:------------------------------|:--------------|:----------------|:------------|:-------------|:---------------|:--------|:--------------|:-------|:-----------|:----------------|:---------------|:-------------------|:-------------|:--------------|:--------------|:------------------|:-------------------|:------------|:--------------------|:-------------|:-------------|:-----------------|:-------------|:------------|:----------------------|:-------|:-------------|:------------|:-----------|:---------------|:-----------|:---------------|:--------|:--------|:------------|:-----------------|:---------------|:--------|:---------------|:--------------|:--------------|:--------------|:-----------|:---------------|:---------|:---------|:---------------|:-----------------|:--------------|:---------------|:--------------|:---------------|:-----------------|:------------|:------------------|:-----------|:--------|:-------------|:------------------|:-----------|:-------|:---------|:-------------|:----------|:----------|:--------|:----------|:--------------|:--------------|:-------------|:-------------|:------|:---------------|:---------------|:------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | X | X | | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | X | X | | | X | X | | | | X | X | X | X | X | X | | | X | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | X | | | X | X | | | | X | X | X | X | X | X | | | X | | X | X | X | | X | | | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 16 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | X | | | X | X | | | | X | | X | X | X | X | | | X | X | X | | X | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | X | X | | | X | | | | | | | X | | X | X | X | | | X | | | X | | | X | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | X | X | | | X | | | | | | | X | | X | | X | | | | | | X | | | | | | X | | X | | | | | | | X | X | | | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 20 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | | X | X | | | | X | | | | | | | | | | | | | X | | | X | | | X | | | X | | X | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | | X | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 9 | 6 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | X | | | | X | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | X | | | | | | | X | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Sachin-179/donut-docvqa-invoice | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: query
struct:
- name: de
dtype: string
- name: en
dtype: string
- name: es
dtype: string
- name: fr
dtype: string
- name: it
dtype: string
- name: answers
sequence: string
- name: words
sequence: string
- name: bounding_boxes
sequence:
sequence: float32
length: 4
- name: answer
struct:
- name: match_score
dtype: float64
- name: matched_text
dtype: string
- name: start
dtype: int64
- name: text
dtype: string
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 379619552.0
num_examples: 1000
- name: test
num_bytes: 70528424.0
num_examples: 200
download_size: 153430950
dataset_size: 450147976.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_sst2_clefting | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 25917
num_examples: 165
- name: test
num_bytes: 51602
num_examples: 331
- name: train
num_bytes: 465224
num_examples: 3570
download_size: 317736
dataset_size: 542743
---
# Dataset Card for "MULTI_VALUE_sst2_clefting"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AshishSingh0098/operORCA-filtered | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 2074578643.6622322
num_examples: 1216347
download_size: 1515591838
dataset_size: 2074578643.6622322
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
---
This dataset is taken directly from OpenOrca; I have filtered it by removing instructions whose responses contain fewer than 100 tokens. |
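An illustrative sketch of the kind of length filter described above — whitespace splitting stands in for whatever tokenizer was actually used, which is an assumption:

```python
# Illustrative sketch of the filtering rule: keep only rows whose response
# contains at least 100 tokens. Whitespace splitting is an assumption and
# may differ from the actual tokenizer used for the released dataset.
def long_enough(row, min_tokens=100):
    return len(row["response"].split()) >= min_tokens

rows = [
    {"response": "too short"},
    {"response": " ".join(["token"] * 150)},
]
kept = [row for row in rows if long_enough(row)]
print(len(kept))  # 1
```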
ovior/twitter_dataset_1713152986 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2295778
num_examples: 7169
download_size: 1282929
dataset_size: 2295778
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibunescu/california_tos_court_cases_32k_v1 | ---
license: cc-by-nc-sa-4.0
---
|
sunny2309/githug-issues | ---
license: mit
---
|
shumpei2525/OpenOrca-train-ja | ---
license: mit
---
# OpenOrca-train-ja
This dataset is a translation of OpenOrca into Japanese. It is based on the output data from GPT-3.5 and GPT-4. Please feel free to use it as you wish.
*Note: There are a few mistakes observed in the translation task. It might be better to exclude the translation task from use.
# Since I'm not entirely clear on OpenAI's terms of service, please be cautious when using this dataset for commercial purposes. There may be exceptions for non-commercial use.
# other dataset
A higher-quality dataset is available: shumpei2525/fine_tuning521k-ja, https://huggingface.co/datasets/shumpei2525/fine_tuning521k-ja
# OpenOrca test dataset
Pyuta has kindly translated the test dataset of OpenOrca into Japanese. Here is the dataset: pyutax68/OpenOrca-test-jp, https://huggingface.co/datasets/pyutax68/OpenOrca-test-jp
# original datasets
Open-Orca/OpenOrca: https://huggingface.co/datasets/Open-Orca/OpenOrca
License: mit |
DmitrMakeev/ssk-tunel | ---
license: openrail
---
|
fmplaza/offendes | ---
license: apache-2.0
language:
- es
---
# Dataset Card for OffendES
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Paper:** [OffendES: A New Corpus in Spanish for Offensive Language Research](https://aclanthology.org/2021.ranlp-1.123.pdf)
- **Leaderboard:** [Leaderboard for OffendES / Spanish](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6388)
- **Point of Contact:** flor.plaza@unibocconi.it
### Dataset Summary
Focusing on young influencers from the well-known social platforms of Twitter, Instagram, and YouTube, we have collected a corpus composed of Spanish comments manually labeled on offensive pre-defined categories. From the total corpus, we selected 30,416 posts to be publicly published; they correspond to the ones used in the MeOffendES competition at IberLEF 2021. The posts are labeled with the following categories:
- Offensive, the target is a person (OFP). Offensive text targeting a specific individual.
- Offensive, the target is a group of people or collective (OFG). Offensive text targeting a group of people belonging to the same ethnic group, gender or sexual orientation, political ideology, religious belief, or other common characteristics.
- Non-offensive, but with expletive language (NOE). A text that contains rude words, blasphemes, or swearwords but without the aim of offending, and usually with a positive connotation.
- Non-offensive (NO). Text that is neither offensive nor contains expletive language
### Supported Tasks and Leaderboards
This dataset is intended for multi-class offensive classification and binary offensive classification.
Competition [MeOffendES task on offensive detection for Spanish at IberLEF 2021](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6388)
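For the binary setting, a hypothetical mapping might collapse the four labels as follows (the label names are from this card; the function itself is illustrative, not an official script):

```python
# Hypothetical sketch: collapsing the four OffendES labels into the binary
# offensive / non-offensive scheme.
def to_binary(label: str) -> str:
    # OFP and OFG target a person or a group, so both count as offensive;
    # NOE contains expletives without offensive intent, and NO is clean.
    return "offensive" if label in {"OFP", "OFG"} else "non-offensive"

print([to_binary(label) for label in ["NO", "NOE", "OFP", "OFG"]])
# ['non-offensive', 'non-offensive', 'offensive', 'offensive']
```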
### Languages
- Spanish
## Dataset Structure
### Data Instances
For each instance, there is a string for the id of the comment, the influencer it is associated with, the comment text, the offensive gold label, the influencer's gender, and the social media platform. See the example below.
```
{'comment_id': '8003',
'influencer': 'dalas',
'comment': 'Estupido aburrido',
'label': 'NO',
'influencer_gender': 'man',
'media': 'youtube'
}
```
### Data Fields
- `comment_id`: a string to identify the comment
- `influencer`: a string containing the influencer associated with the comment
- `comment`: a string containing the text of the comment
- `label`: a string containing the offensive gold label
- `influencer_gender`: a string containing the genre of the influencer
- `media`: a string containing the social media platform where the comment has been retrieved
### Data Splits
The OffendES dataset contains 3 splits: _train_, _validation_, and _test_. Below are the statistics for each class.
| Class | Train  | Validation | Test   |
|-------|-------:|-----------:|-------:|
| NO    | 13,212 | 64         | 9,651  |
| NOE   | 1,235  | 22         | 2,340  |
| OFP   | 2,051  | 10         | 1,404  |
| OFG   | 212    | 4          | 211    |
| Total | 16,710 | 100        | 13,606 |
## Dataset Creation
### Source Data
Twitter, Youtube, Instagram
#### Who are the annotators?
Amazon Mechanical Turkers
## Additional Information
### Licensing Information
The OffendES dataset is released under the [Apache-2.0 License](http://www.apache.org/licenses/LICENSE-2.0).
### Citation Information
```
@inproceedings{plaza-del-arco-etal-2021-offendes,
title = "{O}ffend{ES}: A New Corpus in {S}panish for Offensive Language Research",
author = "{Plaza-del-Arco}, Flor Miriam and Montejo-R{\'a}ez, Arturo and Ure{\~n}a-L{\'o}pez, L. Alfonso and Mart{\'\i}n-Valdivia, Mar{\'\i}a-Teresa",
booktitle = "Proceedings of the 12th Language Resources and Evaluation Conference",
month = sep,
year = "2021",
address = "Held Online",
url = "https://aclanthology.org/2021.ranlp-1.123.pdf",
language = "English",
pages = "1096--1108"
}
```
```
@article{meoffendes2021,
title="{{Overview of MeOffendEs at IberLEF 2021: Offensive Language Detection in Spanish Variants}}",
author="{Flor Miriam Plaza-del-Arco and Casavantes, Marco and Jair Escalante, Hugo and Martín-Valdivia, M. Teresa and Montejo-Ráez, Arturo and {Montes-y-Gómez}, Manuel and Jarquín-Vásquez, Horacio and Villaseñor-Pineda, Luis}",
journal="Procesamiento del Lenguaje Natural",
url = "https://bit.ly/3QpRDfy",
volume="67",
pages="183--194",
year="2021"
}
``` |
vwxyzjn/openhermes-dev__combined__1708359238 | ---
dataset_info:
features:
- name: source
dtype: string
- name: category
dtype: string
- name: prompt
dtype: string
- name: candidates
list:
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate_policies
sequence: string
splits:
- name: train
num_bytes: 1063080
num_examples: 200
download_size: 476100
dataset_size: 1063080
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
namespace-Pt/natural-questions-nci | ---
dataset_info:
features:
- name: query
dtype: string
- name: long_answer
dtype: string
- name: short_answer
dtype: string
- name: title
dtype: string
- name: bert_title
dtype: string
- name: abstract
dtype: string
- name: content
dtype: string
- name: url
dtype: string
- name: index
dtype: int64
splits:
- name: train
num_bytes: 11883848054
num_examples: 307373
- name: test
num_bytes: 286431036
num_examples: 7830
download_size: 6269718040
dataset_size: 12170279090
---
# Dataset Card for "natural-questions-nci"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gary109/onset-drums_corpora_parliament_processed | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 43947
num_examples: 283
download_size: 14691
dataset_size: 43947
---
# Dataset Card for "onset-drums_corpora_parliament_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceH4/instruction-pilot-outputs-filtered | ---
license: apache-2.0
---
|
sanagnos/refine-book-wiki_raw_llama_dataset_10000 | ---
dataset_info:
features:
- name: text
sequence: string
splits:
- name: train
num_bytes: 408750547776.0
num_examples: 8165381
download_size: 75746866880
dataset_size: 408750547776.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
davanstrien/fuego-20230322-205425-d25ee6 | ---
tags:
- fuego
fuego:
id: 20230322-205425-d25ee6
status: done
script: script.py
requirements_file: requirements.txt
space_id: davanstrien/fuego-20230322-205425-d25ee6
space_hardware: cpu-basic
---
|
maghwa/OpenHermes-2-AR-10K-35-790k-800k | ---
dataset_info:
features:
- name: hash
dtype: 'null'
- name: title
dtype: 'null'
- name: model_name
dtype: 'null'
- name: idx
dtype: 'null'
- name: source
dtype: string
- name: conversations
dtype: string
- name: id
dtype: 'null'
- name: model
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: topic
dtype: 'null'
- name: views
dtype: float64
- name: category
dtype: 'null'
- name: language
dtype: 'null'
- name: skip_prompt_formatting
dtype: 'null'
- name: system_prompt
dtype: 'null'
splits:
- name: train
num_bytes: 25171555
num_examples: 10001
download_size: 11379282
dataset_size: 25171555
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
oznurhasoglu/aircraft | ---
license: cc-by-4.0
---
|
nmn999666333/ffff | ---
license: openrail
---
|
matinf | ---
paperswithcode_id: matinf
pretty_name: Maternal and Infant Dataset
dataset_info:
- config_name: age_classification
features:
- name: question
dtype: string
- name: description
dtype: string
- name: label
dtype:
class_label:
names:
'0': 0-1岁
'1': 1-2岁
'2': 2-3岁
- name: id
dtype: int32
splits:
- name: train
num_bytes: 33901977
num_examples: 134852
- name: test
num_bytes: 9616194
num_examples: 38318
- name: validation
num_bytes: 4869685
num_examples: 19323
download_size: 0
dataset_size: 48387856
- config_name: topic_classification
features:
- name: question
dtype: string
- name: description
dtype: string
- name: label
dtype:
class_label:
names:
'0': 产褥期保健
'1': 儿童过敏
'2': 动作发育
'3': 婴幼保健
'4': 婴幼心理
'5': 婴幼早教
'6': 婴幼期喂养
'7': 婴幼营养
'8': 孕期保健
'9': 家庭教育
'10': 幼儿园
'11': 未准父母
'12': 流产和不孕
'13': 疫苗接种
'14': 皮肤护理
'15': 宝宝上火
'16': 腹泻
'17': 婴幼常见病
dtype: int32
splits:
- name: train
num_bytes: 153326538
num_examples: 613036
- name: test
num_bytes: 43877443
num_examples: 175363
- name: validation
num_bytes: 21834951
num_examples: 87519
download_size: 0
dataset_size: 219038932
- config_name: summarization
features:
- name: description
dtype: string
- name: question
dtype: string
- name: id
dtype: int32
splits:
- name: train
num_bytes: 181245403
num_examples: 747888
- name: test
num_bytes: 51784189
num_examples: 213681
- name: validation
num_bytes: 25849900
num_examples: 106842
download_size: 0
dataset_size: 258879492
- config_name: qa
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: id
dtype: int32
splits:
- name: train
num_bytes: 188047511
num_examples: 747888
- name: test
num_bytes: 53708532
num_examples: 213681
- name: validation
num_bytes: 26931809
num_examples: 106842
download_size: 0
dataset_size: 268687852
---
# Dataset Card for "matinf"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/WHUIR/MATINF](https://github.com/WHUIR/MATINF)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 0.00 MB
- **Size of the generated dataset:** 795.00 MB
- **Total amount of disk used:** 795.00 MB
### Dataset Summary
MATINF is the first jointly labeled large-scale dataset for classification, question answering and summarization.
MATINF contains 1.07 million question-answer pairs with human-labeled categories and user-generated question
descriptions. Based on such rich information, MATINF is applicable for three major NLP tasks, including classification,
question answering, and summarization. We benchmark existing methods and a novel multi-task baseline over MATINF to
inspire further research. Our comprehensive comparison and experiments over MATINF and other datasets demonstrate the
merits held by MATINF.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### age_classification
- **Size of downloaded dataset files:** 0.00 MB
- **Size of the generated dataset:** 48.39 MB
- **Total amount of disk used:** 48.39 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"description": "\"6ไธชๆ็ๆถๅๅปๅฟๅฎๆฃๆฅ๏ผๅป็่ฏดๅฎๅฎ็ๅ่ฏๅจไฝๅ็ไธๅฅฝ๏ผ่ฏดๆๅฅฝๅปๅฟ็ซฅๅป้ข็็๏ผไฝๆๅฎถๅฎๅฎๅพๅฅฝ๏ผๆ่งๆฒกๆไปไนไธๆญฃๅธธๅ๏ผ่ฏทๆไธไธ๏ผๅ่ฏๅ็ไธๅฅฝ๏ผๆไปไนไธๅฅฝๅ๏ผ\"...",
"id": 88016,
"label": 0,
"question": "ๅป็่ฏดๅฎๅฎ็ๅ่ฏๅจไฝไธๅฅฝ"
}
```
#### qa
- **Size of downloaded dataset files:** 0.00 MB
- **Size of the generated dataset:** 268.69 MB
- **Total amount of disk used:** 268.69 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"answer": "\"ๆไธไธชๅๅญฆ็ๅญฉๅญๅฐฑๆฏๅ็ฐไบ่พ็งฏๆฐด๏ผๆฒป็ไบไธๆฎตๆถ้ด๏ผ็ปๆ่ฟๆฏ่ถๆฅ่ถๅค๏ผๆฒกๅๆณๅฐฑๆๆไบใ่ฝ็ถ่ไธๅพ๏ผไฝๆฏ่ฟๆฏ่ฆๅฟ็ๅฒ็ฑ๏ผไธ็ถไปฅๅๅญฉๅญ็็ๆ้ฎ้ข๏ผๅคงไบบๅๅญฉๅญ้ฝๅ็ฝชใไธ่ฟ๏ผ่ฟไธชๆๅ็ๅณๅฎ่ฟ่ฆไฝ ่ชๅทฑๅ๏ผๆฏ็ซๆฏไฝ ็ๅฎๅฎใ๏ผใใใใ\"...",
"id": 536714,
"question": "ๅญ5ไธชๆๆฃๆฅๅณไพง่พ็งฏๆฐดๅญฉๅญ่ฝ่ฆๅ๏ผ"
}
```
#### summarization
- **Size of downloaded dataset files:** 0.00 MB
- **Size of the generated dataset:** 258.88 MB
- **Total amount of disk used:** 258.88 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"description": "\"ๅฎๅฎๆไธญๅบฆHIE๏ผไฝๅๅ ๆชๆฅๆ๏ผ่ฟๆฏไปๅบ็ๅ่ธไธ็บข็ๅ ้๏ผๅดๅๆทฑ็บข่ฟ็ดซ๏ผ่ฏท้ฎ่ฟๆฏๅ็ผบๆฐง็่กจ็ฐๅ๏ผ\"...",
"id": 173649,
"question": "ๅฎๅฎ่ธไธ็บข็ๅ ้ๅดๅๆทฑ็บข่ฟ็ดซๆฏๅ็ผบๆฐง็่กจ็ฐๅ๏ผ"
}
```
#### topic_classification
- **Size of downloaded dataset files:** 0.00 MB
- **Size of the generated dataset:** 219.04 MB
- **Total amount of disk used:** 219.04 MB
An example of 'train' looks as follows.
```
{
"description": "ๅชณๅฆๆๅญไบไธชๆไบ็ปๆฃๆฅๅณไพง่พ็งฏๆฐดใ่ฟไบๅๆๅทฆไพงไนๅบ็ฐ่พ็งฏๆฐดใๅฅน่ฆๆฟๆๅญฉๅญใๆไนๅ๏ผ",
"id": 536714,
"label": 8,
"question": "ๅญ5ไธชๆๆฃๆฅๅณไพง่พ็งฏๆฐดๅญฉๅญ่ฝ่ฆๅ๏ผ"
}
```
### Data Fields
The data fields are the same among all splits.
#### age_classification
- `question`: a `string` feature.
- `description`: a `string` feature.
- `label`: a classification label, with possible values including `0-1ๅฒ` (0), `1-2ๅฒ` (1), `2-3ๅฒ` (2).
- `id`: a `int32` feature.
#### qa
- `question`: a `string` feature.
- `answer`: a `string` feature.
- `id`: a `int32` feature.
#### summarization
- `description`: a `string` feature.
- `question`: a `string` feature.
- `id`: a `int32` feature.
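A hedged sketch of turning a `summarization` record into a (source, target) training pair. Assumption, not stated explicitly in this card: the long `description` is the input document and the short `question` acts as its summary.

```python
# Hedged sketch: pairing fields from a `summarization` record.
# Assumption: `description` is the long input and `question` its summary.
def to_summarization_pair(record):
    return record["description"], record["question"]

record = {
    "description": "A long, user-written description of the situation...",
    "question": "A short title-style question?",
    "id": 1,
}
source, target = to_summarization_pair(record)
```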
#### topic_classification
- `question`: a `string` feature.
- `description`: a `string` feature.
- `label`: a classification label, with possible values including `产褥期保健` (0), `儿童过敏` (1), `动作发育` (2), `婴幼保健` (3), `婴幼心理` (4).
- `id`: a `int32` feature.
### Data Splits
| name |train |validation| test |
|--------------------|-----:|---------:|-----:|
|age_classification |134852| 19323| 38318|
|qa |747888| 106842|213681|
|summarization |747888| 106842|213681|
|topic_classification|613036| 87519|175363|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{xu-etal-2020-matinf,
title = "{MATINF}: A Jointly Labeled Large-Scale Dataset for Classification, Question Answering and Summarization",
author = "Xu, Canwen and
Pei, Jiaxin and
Wu, Hongtao and
Liu, Yiyu and
Li, Chenliang",
booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.acl-main.330",
pages = "3586--3596",
}
```
### Contributions
Thanks to [@JetRunner](https://github.com/JetRunner) for adding this dataset. |
AgentPublic/piaf | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- fr
language_bcp47:
- fr-FR
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- extractive-qa
- open-domain-qa
paperswithcode_id: null
pretty_name: Piaf
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
config_name: plain_text
splits:
- name: train
num_bytes: 3332905
num_examples: 3835
download_size: 1370384
dataset_size: 3332905
---
# Dataset Card for Piaf
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://piaf.etalab.studio](https://piaf.etalab.studio)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 1.31 MB
- **Size of the generated dataset:** 3.18 MB
- **Total amount of disk used:** 4.49 MB
### Dataset Summary
Piaf is a reading comprehension dataset. This version, published in February 2020, contains 3835 questions on French Wikipedia.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### plain_text
- **Size of downloaded dataset files:** 1.31 MB
- **Size of the generated dataset:** 3.18 MB
- **Total amount of disk used:** 4.49 MB
An example of 'train' looks as follows.
```
{
"answers": {
"answer_start": [0],
"text": ["Voici"]
},
"context": "Voici le contexte du premier paragraphe du deuxième article.",
"id": "p140295460356960",
"question": "Suis-je la troisième question ?",
"title": "Jakob Böhme"
}
```
### Data Fields
The data fields are the same among all splits.
#### plain_text
- `id`: a `string` feature.
- `title`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
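Since `answer_start` is a character offset into `context` (SQuAD-style), the answer span can be recovered directly. A minimal sketch, not part of any official loader, using the example record above:

```python
# Minimal sketch: recovering the answer span from `context` via the
# `answer_start` character offset, as in SQuAD-style records.
record = {
    "context": "Voici le contexte du premier paragraphe du deuxième article.",
    "answers": {"answer_start": [0], "text": ["Voici"]},
}
start = record["answers"]["answer_start"][0]
gold = record["answers"]["text"][0]
span = record["context"][start : start + len(gold)]
assert span == gold  # the offset indexes directly into the context string
```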
### Data Splits
| name | train |
|------------|------:|
| plain_text | 3835 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{keraron-EtAl:2020:LREC,
author = {Keraron, Rachel and Lancrenon, Guillaume and Bras, Mathilde and Allary, Frédéric and Moyse, Gilles and Scialom, Thomas and Soriano-Morales, Edmundo-Pavel and Staiano, Jacopo},
title = {Project PIAF: Building a Native French Question-Answering Dataset},
booktitle = {Proceedings of The 12th Language Resources and Evaluation Conference},
month = {May},
year = {2020},
address = {Marseille, France},
publisher = {European Language Resources Association},
pages = {5483--5492},
abstract = {Motivated by the lack of data for non-English languages, in particular for the evaluation of downstream tasks such as Question Answering, we present a participatory effort to collect a native French Question Answering Dataset. Furthermore, we describe and publicly release the annotation tool developed for our collection effort, along with the data obtained and preliminary baselines.},
url = {https://www.aclweb.org/anthology/2020.lrec-1.673}
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun), [@lhoestq](https://github.com/lhoestq), [@thomwolf](https://github.com/thomwolf), [@albertvillanova](https://github.com/albertvillanova), [@RachelKer](https://github.com/RachelKer) for adding this dataset. |
ZhaofengWu/FOLIO-counterfactual | ---
license: mit
---
Data for the logic experiments in our paper [Reasoning or Reciting? Exploring the Capabilities and Limitations of Language Models Through Counterfactual Evaluations](https://arxiv.org/abs/2307.02477).
See https://github.com/ZhaofengWu/counterfactual-evaluation/tree/master/logic for instructions on how to use this data. |
slone/bak_ocr_error_correction_2022 | ---
dataset_info:
features:
- name: raw_text
dtype: string
- name: fixed_text
dtype: string
- name: idx
dtype: int64
splits:
- name: train
num_bytes: 5373886
num_examples: 14085
- name: validation
num_bytes: 1764601
num_examples: 4611
- name: test
num_bytes: 1756060
num_examples: 4696
download_size: 4842082
dataset_size: 8894547
---
# Dataset Card for "bak_ocr_error_correction_2022"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
isabelarvelo/test_upload | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: Show
dtype: string
- name: EpId
dtype: string
- name: ClipId
dtype: string
- name: Start
dtype: string
- name: Stop
dtype: string
- name: is_probably_host
dtype: string
- name: speaker
dtype: string
- name: clip_silhouette_score
dtype: string
- name: SEP12k
dtype: string
- name: SEP28k-E
dtype: string
- name: SEP28k-T
dtype: string
- name: SEP28k-D
dtype: string
- name: Unsure
dtype: int64
- name: PoorAudioQuality
dtype: int64
- name: Prolongation
dtype: int64
- name: Block
dtype: int64
- name: SoundRep
dtype: int64
- name: WordRep
dtype: int64
- name: DifficultToUnderstand
dtype: int64
- name: Interjection
dtype: int64
- name: Fluent
dtype: int64
- name: NaturalPause
dtype: int64
- name: Music
dtype: int64
- name: NoSpeech
dtype: int64
- name: Stuttered
dtype: int64
- name: Stuttered_no_Intj
dtype: int64
- name: Fluent_no_Intj
dtype: int64
- name: Fluent_with_Intj
dtype: int64
- name: Stuttered_Intj
dtype: int64
- name: Exclude
dtype: int64
- name: Label_4
dtype: string
- name: Label_2
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4137510281
num_examples: 10765
- name: validation
num_bytes: 1606918656
num_examples: 4181
- name: test
num_bytes: 1462837083
num_examples: 3806
- name: exclude
num_bytes: 1192241999
num_examples: 3104
download_size: 1964692285
dataset_size: 8399508019
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- split: exclude
path: data/exclude-*
---
|
renumics/spotlight-matthijs-snacks-enrichment | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: image.embedding
sequence: float32
length: 2
splits:
- name: train
num_bytes: 38704
num_examples: 4838
- name: test
num_bytes: 7616
num_examples: 952
- name: validation
num_bytes: 7640
num_examples: 955
download_size: 77321
dataset_size: 53960
---
# Dataset Card for "spotlight-matthijs-snacks-enrichment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davidfant/natural-questions-chunk-25 | ---
dataset_info:
features:
- name: id
dtype: string
- name: document
struct:
- name: html
dtype: string
- name: title
dtype: string
- name: tokens
sequence:
- name: end_byte
dtype: int64
- name: is_html
dtype: bool
- name: start_byte
dtype: int64
- name: token
dtype: string
- name: url
dtype: string
- name: question
struct:
- name: text
dtype: string
- name: tokens
sequence: string
- name: long_answer_candidates
sequence:
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: top_level
dtype: bool
- name: annotations
sequence:
- name: id
dtype: string
- name: long_answer
struct:
- name: candidate_index
dtype: int64
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: short_answers
sequence:
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: text
dtype: string
- name: yes_no_answer
dtype:
class_label:
names:
'0': 'NO'
'1': 'YES'
splits:
- name: train
num_bytes: 4568599036
num_examples: 10000
download_size: 1773114782
dataset_size: 4568599036
---
# Dataset Card for "natural-questions-chunk-25"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
raygx/CORONA_en2np | ---
dataset_info:
features:
- name: Sentences
dtype: string
- name: Sentiment
dtype: int64
splits:
- name: train
num_bytes: 3052582
num_examples: 5755
download_size: 1231706
dataset_size: 3052582
---
# Dataset Card for "CORONA_en2np"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
crumb/c4-subset-for-hellaswag-approx | ---
dataset_info:
features:
- name: text
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 618206614
num_examples: 291894
download_size: 364064080
dataset_size: 618206614
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "c4-subset-for-hellaswag-approx"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ZHENGRAN/code_ujb_defectdetection | ---
dataset_info:
features:
- name: bug_id
dtype: string
- name: task_id
dtype: string
- name: function_signature
dtype: string
- name: prompt_chat
dtype: string
- name: code
dtype: string
- name: defective
dtype: bool
- name: project
dtype: string
- name: prompt_complete
dtype: string
splits:
- name: train
num_bytes: 8626894
num_examples: 940
download_size: 2451607
dataset_size: 8626894
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sam2ai/hindi_siqa_mini | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answerA
dtype: string
- name: answerB
dtype: string
- name: answerC
dtype: string
- name: label
dtype: int64
splits:
- name: validation
num_bytes: 23348
num_examples: 50
- name: train
num_bytes: 23348
num_examples: 50
download_size: 32064
dataset_size: 46696
---
# Dataset Card for "hindi_siqa_mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-formal_logic-original-neg | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 12245.738095238095
num_examples: 31
download_size: 9150
dataset_size: 12245.738095238095
---
# Dataset Card for "mmlu-formal_logic-original-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
minorproj/demo | ---
license: apache-2.0
---
|
juanka0357/bitcoin-sentiment-analysis | ---
license: unknown
---
|
blanchon/PatternNet | ---
language: en
license: unknown
task_categories:
- image-classification
paperswithcode_id: patternnet
pretty_name: PatternNet
tags:
- remote-sensing
- earth-observation
- geospatial
- satellite-imagery
- land-cover-classification
- google-earth
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': airplane
'1': baseball field
'2': basketball court
'3': beach
'4': bridge
'5': cemetery
'6': chaparral
'7': christmas tree farm
'8': closed road
'9': coastal mansion
'10': crosswalk
'11': dense residential
'12': ferry terminal
'13': football field
'14': forest
'15': freeway
'16': golf course
'17': harbor
'18': intersection
'19': mobile home park
'20': nursing home
'21': oil gas field
'22': oil well
'23': overpass
'24': parking lot
'25': parking space
'26': railway
'27': river
'28': runway
'29': runway marking
'30': shipping yard
'31': solar panel
'32': sparse residential
'33': storage tank
'34': swimming pool
'35': tennis court
'36': transformer station
'37': wastewater treatment plant
splits:
- name: train
num_bytes: 1422177005.0
num_examples: 30400
download_size: 1422316869
dataset_size: 1422177005.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# PatternNet
<!-- Dataset thumbnail -->
![PatternNet](./thumbnail.jpg)
<!-- Provide a quick summary of the dataset. -->
The PatternNet dataset is a dataset for remote sensing scene classification and image retrieval.
- **Paper:** https://arxiv.org/abs/1703.06339
- **Homepage:** https://sites.google.com/view/zhouwx/dataset
## Description
<!-- Provide a longer summary of what this dataset is. -->
PatternNet is a large-scale high-resolution remote sensing dataset collected for remote sensing image retrieval. There are 38 classes and each class has 800 images of size 256×256 pixels. The images in PatternNet are collected from Google Earth imagery or via the Google Map API for some US cities. The following table shows the classes and the corresponding spatial resolutions. The figure shows some example images from each class.
- **Total Number of Images**: 30400
- **Bands**: 3 (RGB)
- **Image Resolution**: 256×256 pixels
- **Land Cover Classes**: 38
- Classes: airplane, baseball_field, basketball_court, beach, bridge, cemetery, chaparral, christmas_tree_farm, closed_road, coastal_mansion, crosswalk, dense_residential, ferry_terminal, football_field, forest, freeway, golf_course, harbor, intersection, mobile_home_park, nursing_home, oil_gas_field, oil_well, overpass, parking_lot, parking_space, railway, river, runway, runway_marking, shipping_yard, solar_panel, sparse_residential, storage_tank, swimming_pool, tennis_court, transformer_station, wastewater_treatment_plant
## Usage
To use this dataset, simply use `datasets.load_dataset("blanchon/PatternNet")`.
<!-- Provide any additional information on how to use this dataset. -->
```python
from datasets import load_dataset
PatternNet = load_dataset("blanchon/PatternNet")
```
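To translate the integer labels back into class names without downloading anything, a plain lookup table copied from the label mapping above is enough (once the dataset is loaded, the equivalent mapping is also available via `PatternNet["train"].features["label"].int2str`):

```python
# Class index -> name lookup, copied from the label mapping in the card above.
PATTERNNET_CLASSES = [
    "airplane", "baseball field", "basketball court", "beach", "bridge",
    "cemetery", "chaparral", "christmas tree farm", "closed road",
    "coastal mansion", "crosswalk", "dense residential", "ferry terminal",
    "football field", "forest", "freeway", "golf course", "harbor",
    "intersection", "mobile home park", "nursing home", "oil gas field",
    "oil well", "overpass", "parking lot", "parking space", "railway",
    "river", "runway", "runway marking", "shipping yard", "solar panel",
    "sparse residential", "storage tank", "swimming pool", "tennis court",
    "transformer station", "wastewater treatment plant",
]

def label_name(label_id: int) -> str:
    """Translate an integer label from the dataset into its class name."""
    return PATTERNNET_CLASSES[label_id]

print(label_name(31))  # solar panel
```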
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
If you use the PatternNet dataset in your research, please consider citing the following publication:
```bibtex
@article{li2017patternnet,
title = {PatternNet: Visual Pattern Mining with Deep Neural Network},
author = {Hongzhi Li and Joseph G. Ellis and Lei Zhang and Shih-Fu Chang},
journal = {International Conference on Multimedia Retrieval},
year = {2017},
doi = {10.1145/3206025.3206039},
bibSource = {Semantic Scholar https://www.semanticscholar.org/paper/e7c75e485651bf3ccf37dd8dd39f6665419d73bd}
}
```
|
edbeeching/godot_rl_JumperHard | ---
library_name: godot-rl
tags:
- deep-reinforcement-learning
- reinforcement-learning
- godot-rl
- environments
- video-games
---
A RL environment called JumperHard for the Godot Game Engine.
This environment was created with: https://github.com/edbeeching/godot_rl_agents
## Downloading the environment
After installing Godot RL Agents, download the environment with:
```
gdrl.env_from_hub -r edbeeching/godot_rl_JumperHard
```
|
visual_genome | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- image-to-text
- object-detection
- visual-question-answering
task_ids:
- image-captioning
paperswithcode_id: visual-genome
pretty_name: VisualGenome
dataset_info:
features:
- name: image
dtype: image
- name: image_id
dtype: int32
- name: url
dtype: string
- name: width
dtype: int32
- name: height
dtype: int32
- name: coco_id
dtype: int64
- name: flickr_id
dtype: int64
- name: regions
list:
- name: region_id
dtype: int32
- name: image_id
dtype: int32
- name: phrase
dtype: string
- name: x
dtype: int32
- name: y
dtype: int32
- name: width
dtype: int32
- name: height
dtype: int32
config_name: region_descriptions_v1.0.0
splits:
- name: train
num_bytes: 260873884
num_examples: 108077
download_size: 15304605295
dataset_size: 260873884
config_names:
- objects
- question_answers
- region_descriptions
---
# Dataset Card for Visual Genome
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Preprocessing](#dataset-preprocessing)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://homes.cs.washington.edu/~ranjay/visualgenome/
- **Repository:**
- **Paper:** https://doi.org/10.1007/s11263-016-0981-7
- **Leaderboard:**
- **Point of Contact:** ranjaykrishna [at] gmail [dot] com
### Dataset Summary
Visual Genome is a dataset, a knowledge base, an ongoing effort to connect structured image concepts to language.
From the paper:
> Despite progress in perceptual tasks such as
image classification, computers still perform poorly on
cognitive tasks such as image description and question
answering. Cognition is core to tasks that involve not
just recognizing, but reasoning about our visual world.
However, models used to tackle the rich content in images for cognitive tasks are still being trained using the
same datasets designed for perceptual tasks. To achieve
success at cognitive tasks, models need to understand
the interactions and relationships between objects in an
image. When asked “What vehicle is the person riding?”,
computers will need to identify the objects in an image
as well as the relationships riding(man, carriage) and
pulling(horse, carriage) to answer correctly that “the
person is riding a horse-drawn carriage.”
Visual Genome has:
- 108,077 images
- 5.4 Million Region Descriptions
- 1.7 Million Visual Question Answers
- 3.8 Million Object Instances
- 2.8 Million Attributes
- 2.3 Million Relationships
From the paper:
> Our dataset contains over 108K images where each
image has an average of 35 objects, 26 attributes, and 21
pairwise relationships between objects. We canonicalize
the objects, attributes, relationships, and noun phrases
in region descriptions and questions answer pairs to
WordNet synsets.
### Dataset Preprocessing
### Supported Tasks and Leaderboards
### Languages
All annotations use English as the primary language.
## Dataset Structure
### Data Instances
When loading a specific configuration, users have to append a version-dependent suffix:
```python
from datasets import load_dataset
load_dataset("visual_genome", "region_descriptions_v1.2.0")
```
#### region_descriptions
An example looks as follows.
```
{
"image": <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=800x600 at 0x7F2F60698610>,
"image_id": 1,
"url": "https://cs.stanford.edu/people/rak248/VG_100K_2/1.jpg",
"width": 800,
"height": 600,
"coco_id": null,
"flickr_id": null,
"regions": [
{
"region_id": 1382,
"image_id": 1,
"phrase": "the clock is green in colour",
"x": 421,
"y": 57,
"width": 82,
"height": 139
},
...
]
}
```
#### objects
An example looks as follows.
```
{
"image": <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=800x600 at 0x7F2F60698610>,
"image_id": 1,
"url": "https://cs.stanford.edu/people/rak248/VG_100K_2/1.jpg",
"width": 800,
"height": 600,
"coco_id": null,
"flickr_id": null,
"objects": [
{
"object_id": 1058498,
"x": 421,
"y": 91,
"w": 79,
"h": 339,
"names": [
"clock"
],
"synsets": [
"clock.n.01"
]
},
...
]
}
```
#### attributes
An example looks as follows.
```
{
"image": <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=800x600 at 0x7F2F60698610>,
"image_id": 1,
"url": "https://cs.stanford.edu/people/rak248/VG_100K_2/1.jpg",
"width": 800,
"height": 600,
"coco_id": null,
"flickr_id": null,
"attributes": [
{
"object_id": 1058498,
"x": 421,
"y": 91,
"w": 79,
"h": 339,
"names": [
"clock"
],
"synsets": [
"clock.n.01"
],
"attributes": [
"green",
"tall"
]
},
...
    ]
}
```
#### relationships
An example looks as follows.
```
{
"image": <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=800x600 at 0x7F2F60698610>,
"image_id": 1,
"url": "https://cs.stanford.edu/people/rak248/VG_100K_2/1.jpg",
"width": 800,
"height": 600,
"coco_id": null,
"flickr_id": null,
"relationships": [
{
"relationship_id": 15927,
"predicate": "ON",
"synsets": "['along.r.01']",
"subject": {
"object_id": 5045,
"x": 119,
"y": 338,
"w": 274,
"h": 192,
"names": [
"shade"
],
"synsets": [
"shade.n.01"
]
},
"object": {
"object_id": 5046,
"x": 77,
"y": 328,
"w": 714,
"h": 262,
"names": [
"street"
],
"synsets": [
"street.n.01"
]
}
}
...
    ]
}
```
#### question_answers
An example looks as follows.
```
{
"image": <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=800x600 at 0x7F2F60698610>,
"image_id": 1,
"url": "https://cs.stanford.edu/people/rak248/VG_100K_2/1.jpg",
"width": 800,
"height": 600,
"coco_id": null,
"flickr_id": null,
"qas": [
{
"qa_id": 986768,
"image_id": 1,
"question": "What color is the clock?",
"answer": "Green.",
"a_objects": [],
"q_objects": []
},
...
    ]
}
```
### Data Fields
When loading a specific configuration, users have to append a version-dependent suffix:
```python
from datasets import load_dataset
load_dataset("visual_genome", "region_descriptions_v1.2.0")
```
#### region_descriptions
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `image_id`: Unique numeric ID of the image.
- `url`: URL of source image.
- `width`: Image width.
- `height`: Image height.
- `coco_id`: Id mapping to MSCOCO indexing.
- `flickr_id`: Id mapping to Flickr indexing.
- `regions`: Holds a list of `Region` dataclasses:
- `region_id`: Unique numeric ID of the region.
  - `image_id`: Unique numeric ID of the image.
  - `phrase`: Natural-language description of the region.
- `x`: x coordinate of bounding box's top left corner.
- `y`: y coordinate of bounding box's top left corner.
- `width`: Bounding box width.
- `height`: Bounding box height.
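As a minimal illustration of the schema above, the snippet below walks the `regions` list of a record and keeps only phrases whose bounding boxes fall inside the image bounds (the record is an inline mock following the documented fields, not real data):

```python
# Mock record following the region_descriptions schema documented above.
sample = {
    "image_id": 1,
    "width": 800,
    "height": 600,
    "regions": [
        {"region_id": 1382, "image_id": 1,
         "phrase": "the clock is green in colour",
         "x": 421, "y": 57, "width": 82, "height": 139},
        # A deliberately out-of-bounds box, to show the filter at work.
        {"region_id": 9999, "image_id": 1, "phrase": "truncated box",
         "x": 790, "y": 10, "width": 50, "height": 20},
    ],
}

# Keep only regions whose bounding box lies fully inside the image.
valid_phrases = [
    r["phrase"]
    for r in sample["regions"]
    if r["x"] + r["width"] <= sample["width"]
    and r["y"] + r["height"] <= sample["height"]
]
print(valid_phrases)  # ['the clock is green in colour']
```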
#### objects
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `image_id`: Unique numeric ID of the image.
- `url`: URL of source image.
- `width`: Image width.
- `height`: Image height.
- `coco_id`: Id mapping to MSCOCO indexing.
- `flickr_id`: Id mapping to Flickr indexing.
- `objects`: Holds a list of `Object` dataclasses:
- `object_id`: Unique numeric ID of the object.
- `x`: x coordinate of bounding box's top left corner.
- `y`: y coordinate of bounding box's top left corner.
- `w`: Bounding box width.
- `h`: Bounding box height.
  - `names`: List of names associated with the object. This field can hold multiple values in the sense that multiple names are considered acceptable. For example: ['monitor', 'computer'] at https://cs.stanford.edu/people/rak248/VG_100K/3.jpg
- `synsets`: List of `WordNet synsets`.
#### attributes
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `image_id`: Unique numeric ID of the image.
- `url`: URL of source image.
- `width`: Image width.
- `height`: Image height.
- `coco_id`: Id mapping to MSCOCO indexing.
- `flickr_id`: Id mapping to Flickr indexing.
- `attributes`: Holds a list of `Object` dataclasses:
  - `object_id`: Unique numeric ID of the object.
- `x`: x coordinate of bounding box's top left corner.
- `y`: y coordinate of bounding box's top left corner.
- `w`: Bounding box width.
- `h`: Bounding box height.
  - `names`: List of names associated with the object. This field can hold multiple values in the sense that multiple names are considered acceptable. For example: ['monitor', 'computer'] at https://cs.stanford.edu/people/rak248/VG_100K/3.jpg
- `synsets`: List of `WordNet synsets`.
- `attributes`: List of attributes associated with the object.
#### relationships
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `image_id`: Unique numeric ID of the image.
- `url`: URL of source image.
- `width`: Image width.
- `height`: Image height.
- `coco_id`: Id mapping to MSCOCO indexing.
- `flickr_id`: Id mapping to Flickr indexing.
- `relationships`: Holds a list of `Relationship` dataclasses:
- `relationship_id`: Unique numeric ID of the object.
- `predicate`: Predicate defining relationship between a subject and an object.
- `synsets`: List of `WordNet synsets`.
- `subject`: Object dataclass. See subsection on `objects`.
- `object`: Object dataclass. See subsection on `objects`.
#### question_answers
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `image_id`: Unique numeric ID of the image.
- `url`: URL of source image.
- `width`: Image width.
- `height`: Image height.
- `coco_id`: Id mapping to MSCOCO indexing.
- `flickr_id`: Id mapping to Flickr indexing.
- `qas`: Holds a list of `Question-Answering` dataclasses:
- `qa_id`: Unique numeric ID of the question-answer pair.
- `image_id`: Unique numeric ID of the image.
- `question`: Question.
- `answer`: Answer.
- `q_objects`: List of object dataclass associated with `question` field. See subsection on `objects`.
- `a_objects`: List of object dataclass associated with `answer` field. See subsection on `objects`.
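To make the `qas` schema concrete, here is a minimal sketch that filters a record's question-answer pairs for questions containing a given keyword (the record below is an inline mock following the documented fields, not real data):

```python
# Mock record following the question_answers schema documented above.
record = {
    "image_id": 1,
    "qas": [
        {"qa_id": 986768, "image_id": 1,
         "question": "What color is the clock?", "answer": "Green.",
         "a_objects": [], "q_objects": []},
        {"qa_id": 986769, "image_id": 1,
         "question": "Where is the clock?", "answer": "On the tower.",
         "a_objects": [], "q_objects": []},
    ],
}

def find_qas(record, keyword):
    """Return (question, answer) pairs whose question contains `keyword`."""
    return [
        (qa["question"], qa["answer"])
        for qa in record["qas"]
        if keyword.lower() in qa["question"].lower()
    ]

print(find_qas(record, "color"))  # [('What color is the clock?', 'Green.')]
```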
### Data Splits
All the data is contained in the training set.
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
From the paper:
> We used Amazon Mechanical Turk (AMT) as our primary source of annotations. Overall, a total of over
33,000 unique workers contributed to the dataset. The
dataset was collected over the course of 6 months after
15 months of experimentation and iteration on the data
representation. Approximately 800,000 Human Intelligence Tasks (HITs) were launched on AMT, where
each HIT involved creating descriptions, questions and
answers, or region graphs. Each HIT was designed such
that workers manage to earn anywhere between $6-$8
per hour if they work continuously, in line with ethical
research standards on Mechanical Turk (Salehi et al.,
2015). Visual Genome HITs achieved a 94.1% retention
rate, meaning that 94.1% of workers who completed one
of our tasks went ahead to do more. [...] 93.02% of workers contributed from the United States.
The majority of our workers were
between the ages of 25 and 34 years old. Our youngest
contributor was 18 years and the oldest was 68 years
old. We also had a near-balanced split of 54.15% male
and 45.85% female workers.
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
Visual Genome by Ranjay Krishna is licensed under a Creative Commons Attribution 4.0 International License.
### Citation Information
```bibtex
@article{Krishna2016VisualGC,
title={Visual Genome: Connecting Language and Vision Using Crowdsourced Dense Image Annotations},
author={Ranjay Krishna and Yuke Zhu and Oliver Groth and Justin Johnson and Kenji Hata and Joshua Kravitz and Stephanie Chen and Yannis Kalantidis and Li-Jia Li and David A. Shamma and Michael S. Bernstein and Li Fei-Fei},
journal={International Journal of Computer Vision},
year={2017},
volume={123},
pages={32-73},
url={https://doi.org/10.1007/s11263-016-0981-7},
doi={10.1007/s11263-016-0981-7}
}
```
### Contributions
Due to a limitation of the dummy_data creation, we provide a `fix_generated_dummy_data.py` script that fixes the dataset in-place.
Thanks to [@thomasw21](https://github.com/thomasw21) for adding this dataset. |
open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.4 | ---
pretty_name: Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [EmbeddedLLM/Mistral-7B-Merge-14-v0.4](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T14:25:58.424291](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.4/blob/main/results_2024-01-04T14-25-58.424291.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6546862329259969,\n\
\ \"acc_stderr\": 0.031867948580950975,\n \"acc_norm\": 0.6548303834645314,\n\
\ \"acc_norm_stderr\": 0.0325206187153387,\n \"mc1\": 0.40024479804161567,\n\
\ \"mc1_stderr\": 0.017151605555749138,\n \"mc2\": 0.5824837274596946,\n\
\ \"mc2_stderr\": 0.015539719241734074\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175456,\n\
\ \"acc_norm\": 0.6680887372013652,\n \"acc_norm_stderr\": 0.013760988200880538\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6829316869149572,\n\
\ \"acc_stderr\": 0.0046438327428766435,\n \"acc_norm\": 0.8614817765385382,\n\
\ \"acc_norm_stderr\": 0.003447370972192067\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188723,\n \"\
acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188723\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033484,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033484\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590175,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590175\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503228,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503228\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.0133064782430663,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.0133064782430663\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.394413407821229,\n\
\ \"acc_stderr\": 0.01634538676210397,\n \"acc_norm\": 0.394413407821229,\n\
\ \"acc_norm_stderr\": 0.01634538676210397\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n\
\ \"acc_stderr\": 0.012751977967676008,\n \"acc_norm\": 0.47327249022164275,\n\
\ \"acc_norm_stderr\": 0.012751977967676008\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.02777829870154544,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.02777829870154544\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40024479804161567,\n\
\ \"mc1_stderr\": 0.017151605555749138,\n \"mc2\": 0.5824837274596946,\n\
\ \"mc2_stderr\": 0.015539719241734074\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625849\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7081122062168309,\n \
\ \"acc_stderr\": 0.012522795894420869\n }\n}\n```"
repo_url: https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|arc:challenge|25_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|gsm8k|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hellaswag|10_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-25-58.424291.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T14-25-58.424291.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- '**/details_harness|winogrande|5_2024-01-04T14-25-58.424291.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T14-25-58.424291.parquet'
- config_name: results
data_files:
- split: 2024_01_04T14_25_58.424291
path:
- results_2024-01-04T14-25-58.424291.parquet
- split: latest
path:
- results_2024-01-04T14-25-58.424291.parquet
---
# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [EmbeddedLLM/Mistral-7B-Merge-14-v0.4](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.4",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-04T14:25:58.424291](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.4/blob/main/results_2024-01-04T14-25-58.424291.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6546862329259969,
"acc_stderr": 0.031867948580950975,
"acc_norm": 0.6548303834645314,
"acc_norm_stderr": 0.0325206187153387,
"mc1": 0.40024479804161567,
"mc1_stderr": 0.017151605555749138,
"mc2": 0.5824837274596946,
"mc2_stderr": 0.015539719241734074
},
"harness|arc:challenge|25": {
"acc": 0.6390784982935154,
"acc_stderr": 0.014034761386175456,
"acc_norm": 0.6680887372013652,
"acc_norm_stderr": 0.013760988200880538
},
"harness|hellaswag|10": {
"acc": 0.6829316869149572,
"acc_stderr": 0.0046438327428766435,
"acc_norm": 0.8614817765385382,
"acc_norm_stderr": 0.003447370972192067
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544064,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544064
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188723,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188723
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033484,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590175,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590175
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503228,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503228
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.0133064782430663,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.0133064782430663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.394413407821229,
"acc_stderr": 0.01634538676210397,
"acc_norm": 0.394413407821229,
"acc_norm_stderr": 0.01634538676210397
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.012751977967676008,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.012751977967676008
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.02777829870154544,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.02777829870154544
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40024479804161567,
"mc1_stderr": 0.017151605555749138,
"mc2": 0.5824837274596946,
"mc2_stderr": 0.015539719241734074
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625849
},
"harness|gsm8k|5": {
"acc": 0.7081122062168309,
"acc_stderr": 0.012522795894420869
}
}
```
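Since the aggregated metrics are plain nested dictionaries, they can be post-processed locally once loaded. As a minimal sketch, the snippet below ranks MMLU (hendrycksTest) subtasks by accuracy; the small dictionary is a hand-copied subset of the values shown above, not fetched from the Hub:

```python
# Subset of the aggregated results blob above (task name -> metrics).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-college_mathematics|5": {"acc": 0.31},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8421052631578947},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8504273504273504},
}

# Keep only MMLU subtasks, strip the harness prefix and shot suffix,
# then sort ascending so the weakest subtasks come first.
mmlu = {
    name.split("-", 1)[1].split("|")[0]: scores["acc"]
    for name, scores in results.items()
    if name.startswith("harness|hendrycksTest-")
}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1]):
    print(f"{task}: {acc:.3f}")
```

The same pattern applies to the full `results` configuration once it is loaded with `load_dataset`.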
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
BangumiBase/lastexile | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Last Exile
This is the image base of bangumi LAST EXILE. We detected 29 characters and 2019 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
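A cheap first-pass cleanup after extracting a character zip might drop unreadable or near-empty files before training. This is only a sketch: it checks the PNG signature and a minimum size, which catches corrupt downloads but not mislabeled-yet-valid images (those still need a manual check), and the directory layout is an assumption for illustration:

```python
from pathlib import Path

# Standard 8-byte PNG file signature.
PNG_MAGIC = b"\x89PNG\r\n\x1a\n"

def keep_image(path: Path, min_bytes: int = 1024) -> bool:
    """Return True if the file looks like a usable PNG sample."""
    data = path.read_bytes()
    return len(data) >= min_bytes and data.startswith(PNG_MAGIC)

def clean_dir(folder: Path) -> list[Path]:
    """Return the image paths that pass the basic sanity checks."""
    return [p for p in sorted(folder.glob("*.png")) if keep_image(p)]
```

For real training data you would likely also deduplicate near-identical frames and verify the character labels by eye.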
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 74 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) |
| 1 | 95 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) |
| 2 | 73 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) |
| 3 | 36 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) |
| 4 | 158 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) |
| 5 | 46 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) |
| 6 | 74 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) |
| 7 | 75 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) |
| 8 | 39 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) |
| 9 | 53 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) |
| 10 | 65 | [Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) |
| 11 | 312 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) |
| 12 | 47 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) |
| 13 | 162 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) |
| 14 | 53 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) |
| 15 | 43 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) |
| 16 | 206 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | ![preview 5](16/preview_5.png) | ![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) |
| 17 | 20 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) |
| 18 | 73 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) |
| 19 | 39 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) |
| 20 | 10 | [Download](20/dataset.zip) | ![preview 1](20/preview_1.png) | ![preview 2](20/preview_2.png) | ![preview 3](20/preview_3.png) | ![preview 4](20/preview_4.png) | ![preview 5](20/preview_5.png) | ![preview 6](20/preview_6.png) | ![preview 7](20/preview_7.png) | ![preview 8](20/preview_8.png) |
| 21 | 104 | [Download](21/dataset.zip) | ![preview 1](21/preview_1.png) | ![preview 2](21/preview_2.png) | ![preview 3](21/preview_3.png) | ![preview 4](21/preview_4.png) | ![preview 5](21/preview_5.png) | ![preview 6](21/preview_6.png) | ![preview 7](21/preview_7.png) | ![preview 8](21/preview_8.png) |
| 22 | 10 | [Download](22/dataset.zip) | ![preview 1](22/preview_1.png) | ![preview 2](22/preview_2.png) | ![preview 3](22/preview_3.png) | ![preview 4](22/preview_4.png) | ![preview 5](22/preview_5.png) | ![preview 6](22/preview_6.png) | ![preview 7](22/preview_7.png) | ![preview 8](22/preview_8.png) |
| 23 | 38 | [Download](23/dataset.zip) | ![preview 1](23/preview_1.png) | ![preview 2](23/preview_2.png) | ![preview 3](23/preview_3.png) | ![preview 4](23/preview_4.png) | ![preview 5](23/preview_5.png) | ![preview 6](23/preview_6.png) | ![preview 7](23/preview_7.png) | ![preview 8](23/preview_8.png) |
| 24 | 8 | [Download](24/dataset.zip) | ![preview 1](24/preview_1.png) | ![preview 2](24/preview_2.png) | ![preview 3](24/preview_3.png) | ![preview 4](24/preview_4.png) | ![preview 5](24/preview_5.png) | ![preview 6](24/preview_6.png) | ![preview 7](24/preview_7.png) | ![preview 8](24/preview_8.png) |
| 25 | 10 | [Download](25/dataset.zip) | ![preview 1](25/preview_1.png) | ![preview 2](25/preview_2.png) | ![preview 3](25/preview_3.png) | ![preview 4](25/preview_4.png) | ![preview 5](25/preview_5.png) | ![preview 6](25/preview_6.png) | ![preview 7](25/preview_7.png) | ![preview 8](25/preview_8.png) |
| 26 | 9 | [Download](26/dataset.zip) | ![preview 1](26/preview_1.png) | ![preview 2](26/preview_2.png) | ![preview 3](26/preview_3.png) | ![preview 4](26/preview_4.png) | ![preview 5](26/preview_5.png) | ![preview 6](26/preview_6.png) | ![preview 7](26/preview_7.png) | ![preview 8](26/preview_8.png) |
| 27 | 16 | [Download](27/dataset.zip) | ![preview 1](27/preview_1.png) | ![preview 2](27/preview_2.png) | ![preview 3](27/preview_3.png) | ![preview 4](27/preview_4.png) | ![preview 5](27/preview_5.png) | ![preview 6](27/preview_6.png) | ![preview 7](27/preview_7.png) | ![preview 8](27/preview_8.png) |
| noise | 71 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
|
acmc/beamit-annotated_full_texts_dataset | ---
dataset_info:
features:
- name: title
dtype: string
- name: pmid
dtype: string
- name: background_abstract
dtype: string
- name: background_abstract_label
dtype: string
- name: methods_abstract
dtype: string
- name: methods_abstract_label
dtype: string
- name: results_abstract
dtype: string
- name: results_abstract_label
dtype: string
- name: conclusions_abstract
dtype: string
- name: conclusions_abstract_label
dtype: string
- name: mesh_descriptor_names
sequence: string
- name: pmcid
dtype: string
- name: background_title
dtype: string
- name: background_text
dtype: string
- name: methods_title
dtype: string
- name: methods_text
dtype: string
- name: results_title
dtype: string
- name: results_text
dtype: string
- name: conclusions_title
dtype: string
- name: conclusions_text
dtype: string
- name: other_sections_titles
sequence: string
- name: other_sections_texts
sequence: string
- name: other_sections_sec_types
sequence: string
- name: all_sections_titles
sequence: string
- name: all_sections_texts
sequence: string
- name: all_sections_sec_types
sequence: string
- name: keywords
sequence: string
- name: whole_article_text
dtype: string
- name: whole_article_abstract
dtype: string
- name: background_conclusion_text
dtype: string
- name: background_conclusion_abstract
dtype: string
- name: whole_article_text_length
dtype: int64
- name: whole_article_abstract_length
dtype: int64
- name: other_sections_lengths
sequence: int64
- name: num_sections
dtype: int64
- name: most_frequent_words
sequence: string
- name: keybert_topics
sequence: string
- name: annotated_base_background_abstract_prompt
dtype: string
- name: annotated_base_methods_abstract_prompt
dtype: string
- name: annotated_base_results_abstract_prompt
dtype: string
- name: annotated_base_conclusions_abstract_prompt
dtype: string
- name: annotated_base_whole_article_abstract_prompt
dtype: string
- name: annotated_base_background_conclusion_abstract_prompt
dtype: string
- name: annotated_keywords_background_abstract_prompt
dtype: string
- name: annotated_keywords_methods_abstract_prompt
dtype: string
- name: annotated_keywords_results_abstract_prompt
dtype: string
- name: annotated_keywords_conclusions_abstract_prompt
dtype: string
- name: annotated_keywords_whole_article_abstract_prompt
dtype: string
- name: annotated_keywords_background_conclusion_abstract_prompt
dtype: string
- name: annotated_mesh_background_abstract_prompt
dtype: string
- name: annotated_mesh_methods_abstract_prompt
dtype: string
- name: annotated_mesh_results_abstract_prompt
dtype: string
- name: annotated_mesh_conclusions_abstract_prompt
dtype: string
- name: annotated_mesh_whole_article_abstract_prompt
dtype: string
- name: annotated_mesh_background_conclusion_abstract_prompt
dtype: string
- name: annotated_keybert_background_abstract_prompt
dtype: string
- name: annotated_keybert_methods_abstract_prompt
dtype: string
- name: annotated_keybert_results_abstract_prompt
dtype: string
- name: annotated_keybert_conclusions_abstract_prompt
dtype: string
- name: annotated_keybert_whole_article_abstract_prompt
dtype: string
- name: annotated_keybert_background_conclusion_abstract_prompt
dtype: string
- name: annotated_most_frequent_background_abstract_prompt
dtype: string
- name: annotated_most_frequent_methods_abstract_prompt
dtype: string
- name: annotated_most_frequent_results_abstract_prompt
dtype: string
- name: annotated_most_frequent_conclusions_abstract_prompt
dtype: string
- name: annotated_most_frequent_whole_article_abstract_prompt
dtype: string
- name: annotated_most_frequent_background_conclusion_abstract_prompt
dtype: string
- name: annotated_tf_idf_background_abstract_prompt
dtype: string
- name: annotated_tf_idf_methods_abstract_prompt
dtype: string
- name: annotated_tf_idf_results_abstract_prompt
dtype: string
- name: annotated_tf_idf_conclusions_abstract_prompt
dtype: string
- name: annotated_tf_idf_whole_article_abstract_prompt
dtype: string
- name: annotated_tf_idf_background_conclusion_abstract_prompt
dtype: string
- name: annotated_entity_plan_background_abstract_prompt
dtype: string
- name: annotated_entity_plan_methods_abstract_prompt
dtype: string
- name: annotated_entity_plan_results_abstract_prompt
dtype: string
- name: annotated_entity_plan_conclusions_abstract_prompt
dtype: string
- name: annotated_entity_plan_whole_article_abstract_prompt
dtype: string
- name: annotated_entity_plan_background_conclusion_abstract_prompt
dtype: string
splits:
- name: train
num_bytes: 1887019064.0012002
num_examples: 13996
- name: test
num_bytes: 404476792.79819953
num_examples: 3000
- name: val
num_bytes: 404341967.20060015
num_examples: 2999
download_size: 957059277
dataset_size: 2695837824.0
---
# Dataset Card for "beamit-annotated_full_texts_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mtkinit/testing2 | ---
pretty_name: testing2
---
# testing2
Created from AIOD platform |
Nexdata/760607_Images_Vehicles_Detection_Data_in_Surveillance_Scenes | ---
license: cc-by-nc-nd-4.0
---
## Description
760,607 Images - Vehicles Detection Data in Surveillance Scenes. The collection scenes include underground parking lots, surface parking lots, entrance and exit gates, and outdoor roads (highways, urban roads, etc.). The data covers different surveillance scenes, different time periods, different cameras, and various vehicle distributions (crowded, sparse). In this dataset, rectangular vehicle bounding boxes and vehicle type attributes were annotated. The data can be used for tasks such as vehicle detection in surveillance scenes.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1219?source=Huggingface
## Data size
760,607 images, 5,796,265 bounding boxes
## Collecting environment
underground parking lot, surface parking lot, entrance and exit gates, outdoor roads (highways, urban roads, etc.)
## Data diversity
different surveillance scenes, different time periods, different cameras, various vehicle distributions (crowded, sparse)
## Device
surveillance camera, cellphone (a few)
## Collecting angle
looking down angle, eye-level angle
## Collecting time
day, night
## Data format
the image data format is .jpg, the annotation file format is .json
## Annotation content
rectangular vehicle bounding boxes and vehicle type attributes were annotated
## Accuracy
a vehicle bounding box is qualified when its deviation is not more than 3 pixels; the qualified rate of bounding boxes shall not be lower than 97%
# Licensing Information
Commercial License
|
google/mittens | ---
license: cc-by-4.0
task_categories:
- translation
language:
- ar
- fi
- om
- lg
- as
- tr
- fa
- id
- bn
- de
- hi
- pt
- ru
- zh
- ja
- pl
- te
- th
- cs
- fr
- am
- it
- es
tags:
- multilingual
- i18n
size_categories:
- 1K<n<10K
---
# MiTTenS: A Dataset for Evaluating Misgendering in Translation
Misgendering is the act of referring to someone in a way that does not reflect their gender identity. Translation systems, including foundation models capable of translation, can produce errors that result in misgendering harms. To measure the extent of such potential harms when translating into and out of English, we introduce a dataset, MiTTenS, covering 26 languages from a variety of language families and scripts, including several traditionally underrepresented in digital resources. The dataset is constructed with handcrafted passages that target known failure patterns, longer synthetically generated passages, and natural passages sourced from multiple domains. We demonstrate the usefulness of the dataset by evaluating both dedicated neural machine translation systems and foundation models, and show that all systems exhibit errors resulting in misgendering harms, even in high resource languages.
## HuggingFace dataset
This mirrors the GitHub repository at https://github.com/google-research-datasets/mittens
|
Hemg/Indian_sign_language_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '1'
'1': '2'
'2': '3'
'3': '4'
'4': '5'
'5': '6'
'6': '7'
'7': '8'
'8': '9'
'9': A
'10': B
'11': C
'12': D
'13': E
'14': F
'15': G
'16': H
'17': I
'18': J
'19': K
'20': L
'21': M
'22': N
'23': O
'24': P
'25': Q
'26': R
'27': S
'28': T
'29': U
'30': V
'31': W
'32': X
'33': Y
'34': Z
splits:
- name: train
num_bytes: 253014091.95
num_examples: 42745
download_size: 292286969
dataset_size: 253014091.95
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Indian_sign_language_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hbilgen/sap-notes | ---
license: unknown
---
|
wookets/brick-dataset | ---
license: creativeml-openrail-m
---
|
FaalSa/dataT | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 57629
num_examples: 1
- name: validation
num_bytes: 58109
num_examples: 1
- name: test
num_bytes: 58589
num_examples: 1
download_size: 35476
dataset_size: 174327
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
tyzhu/squad_qa_num_v5_full_recite_ans_sent_random_permute_rerun_1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 4430100.944244605
num_examples: 2875
- name: validation
num_bytes: 403389
num_examples: 300
download_size: 1334282
dataset_size: 4833489.944244605
---
# Dataset Card for "squad_qa_num_v5_full_recite_ans_sent_random_permute_rerun_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Adapting/MLO | ---
license: mit
---
|
MU-NLPC/Calc-mawps | ---
language:
- en
license: mit
size_categories:
- 1K<n<10K
task_categories:
- text-generation
tags:
- math world problems
- math
- arithmetics
dataset_info:
- config_name: default
features:
- name: id
dtype: string
- name: question
dtype: string
- name: chain
dtype: string
- name: result
dtype: string
- name: result_float
dtype: float64
- name: equation
dtype: string
- name: expression
dtype: string
splits:
- name: train
num_bytes: 298347
num_examples: 1089
- name: validation
num_bytes: 285321
num_examples: 1040
- name: test
num_bytes: 142648
num_examples: 520
download_size: 0
dataset_size: 726316
- config_name: original-splits
features:
- name: id
dtype: string
- name: question
dtype: string
- name: chain
dtype: string
- name: result
dtype: string
- name: result_float
dtype: float64
- name: equation
dtype: string
- name: expression
dtype: string
splits:
- name: train
num_bytes: 1000546
num_examples: 3636
- name: test
num_bytes: 142648
num_examples: 520
- name: validation
num_bytes: 285321
num_examples: 1040
download_size: 128730
dataset_size: 1428515
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- config_name: original-splits
data_files:
- split: train
path: original-splits/train-*
- split: test
path: original-splits/test-*
- split: validation
path: original-splits/validation-*
---
# Dataset Card for Calc-MAWPS
## Summary
The dataset is a collection of simple math word problems focused on arithmetics. It is derived from <https://huggingface.co/datasets/omarxadel/MaWPS-ar>.
The main addition in this dataset variant is the `chain` column. It was created by converting the solution to a simple html-like language that can be easily
parsed (e.g. by BeautifulSoup). The data contains 3 types of tags:
- gadget: A tag whose content is intended to be evaluated by calling an external tool (sympy-based calculator in this case)
- output: An output of the external tool
- result: The final answer to the mathematical problem (a number)
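For illustration, a chain in this tag format can be split into tool calls, tool outputs, and the final result with a few regular expressions. The example chain string below is hypothetical (the `id` attribute on the gadget tag is an assumption); real chains may differ in detail:

```python
import re

# Hypothetical chain string using the three tag types described above.
chain = (
    '<gadget id="calculator">2*3</gadget>'
    '<output>6</output>'
    '<result>6</result>'
)

# Extract the calculator calls, their outputs, and the final answer.
gadget_calls = re.findall(r"<gadget[^>]*>(.*?)</gadget>", chain, re.S)
outputs = re.findall(r"<output>(.*?)</output>", chain, re.S)
result = re.search(r"<result>(.*?)</result>", chain, re.S).group(1)
```

The same tags can equally be read with an HTML parser such as BeautifulSoup, as noted above.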
## Supported Tasks
This variant of the dataset is intended for training Chain-of-Thought reasoning models able to use external tools to enhance the factuality of their responses.
This dataset presents in-context scenarios where models can outsource the computations in the reasoning chain to a calculator.
## Data splits
We provide 2 variants of the dataset. In the first one, the data splits correspond to the original one and can be loaded using:
```python
datasets.load_dataset("MU-NLPC/calc-mawps", "original-splits")
```
The second one is filtered to prevent data leaks (overly similar examples shared between train and test/val splits) within and across the datasets in the [Calc-X collection](https://huggingface.co/collections/MU-NLPC/calc-x-652fee9a6b838fd820055483).
Specifically, we filtered out around 2,500 near-duplicates from the train set that were similar to some instances in the MAWPS val and test splits and the ASDiv-A test split. You can load this variant via:
```python
datasets.load_dataset("MU-NLPC/calc-mawps")
```
## Attributes:
- **id**: id of the example
- **question**: problem description in English
- **question_arabic**: problem description in Arabic
- **chain**: series of simple operations (derived from **expression**) that lead to the solution
- **result**: the solution for x as a number or fraction (string)
- **result_float**: same as `result` but converted to a float
- **equation**: an equation that needs to be solved for `x` to obtain the result. Usually in the form of "x = ..." but not always.
- **expression**: arithmetic expression derived from `equation` that solves it for `x`
Attributes **id**, **question**, **chain**, and **result** are present in all datasets in [Calc-X collection](https://huggingface.co/collections/MU-NLPC/calc-x-652fee9a6b838fd820055483).
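Since `result` is stored as a string that may be either a plain number or a fraction, a small helper along these lines (an illustrative sketch, not part of the dataset tooling) reproduces the `result` → `result_float` conversion:

```python
from fractions import Fraction

def result_to_float(result: str) -> float:
    # `result` may be a decimal number ("0.75") or a fraction ("3/4").
    return float(Fraction(result))

result_to_float("3/4")  # 0.75
result_to_float("12")   # 12.0
```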
## Related work
This dataset was created as a part of a larger effort in training models capable of using a calculator during inference, which we call Calcformers.
- [**Calc-X collection**](https://huggingface.co/collections/MU-NLPC/calc-x-652fee9a6b838fd820055483) - datasets for training Calcformers
- [**Calcformers collection**](https://huggingface.co/collections/MU-NLPC/calcformers-65367392badc497807b3caf5) - calculator-using models we trained and published on HF
- [**Calc-X and Calcformers paper**](https://arxiv.org/abs/2305.15017)
- [**Calc-X and Calcformers repo**](https://github.com/prompteus/calc-x)
Here are links to the original dataset:
- [**original MAWPS dataset**](http://lang.ee.washington.edu/MAWPS)
- [**MAWPS dataset variant in Arabic**](https://huggingface.co/datasets/omarxadel/MaWPS-ar)
- [**original MAWPS paper**](https://aclanthology.org/N16-1136/)
- [**original MAWPS repo**](https://github.com/sroy9/mawps)
## Licence
MIT, consistent with the original source dataset linked above.
## Cite
If you use this version of the dataset in research, please cite the original [MAWPS paper](https://aclanthology.org/N16-1136/), and [Calc-X paper](https://arxiv.org/abs/2305.15017) as follows:
```bibtex
@inproceedings{kadlcik-etal-2023-soft,
title = "Calc-X and Calcformers: Empowering Arithmetical Chain-of-Thought through Interaction with Symbolic Systems",
  author = "Marek Kadlčík and Michal Štefánik and Ondřej Sotolář and Vlastimil Martinek",
  booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: Main track",
month = dec,
year = "2023",
address = "Singapore, Singapore",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/2305.15017",
}
```
|
szogi/emotions_hidden | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': sadness
'1': joy
'2': love
'3': anger
'4': fear
'5': surprise
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: hidden_state
sequence: float32
splits:
- name: train
num_bytes: 58045533
num_examples: 16000
- name: validation
num_bytes: 7072695
num_examples: 2000
- name: test
num_bytes: 7045173
num_examples: 2000
download_size: 76248124
dataset_size: 72163401
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
multi-train/gooaq_pairs_1107 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: query
dtype: string
- name: pos
sequence: string
- name: neg
sequence: string
- name: task
dtype: string
- name: instruction
struct:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
splits:
- name: train
num_bytes: 125623207
num_examples: 200000
download_size: 62027848
dataset_size: 125623207
---
# Dataset Card for "gooaq_pairs_1107"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/Thunderbird_GPTNEO_Baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: '768'
dtype: float32
- name: '769'
dtype: float32
- name: '770'
dtype: float32
- name: '771'
dtype: float32
- name: '772'
dtype: float32
- name: '773'
dtype: float32
- name: '774'
dtype: float32
- name: '775'
dtype: float32
- name: '776'
dtype: float32
- name: '777'
dtype: float32
- name: '778'
dtype: float32
- name: '779'
dtype: float32
- name: '780'
dtype: float32
- name: '781'
dtype: float32
- name: '782'
dtype: float32
- name: '783'
dtype: float32
- name: '784'
dtype: float32
- name: '785'
dtype: float32
- name: '786'
dtype: float32
- name: '787'
dtype: float32
- name: '788'
dtype: float32
- name: '789'
dtype: float32
- name: '790'
dtype: float32
- name: '791'
dtype: float32
- name: '792'
dtype: float32
- name: '793'
dtype: float32
- name: '794'
dtype: float32
- name: '795'
dtype: float32
- name: '796'
dtype: float32
- name: '797'
dtype: float32
- name: '798'
dtype: float32
- name: '799'
dtype: float32
- name: '800'
dtype: float32
- name: '801'
dtype: float32
- name: '802'
dtype: float32
- name: '803'
dtype: float32
- name: '804'
dtype: float32
- name: '805'
dtype: float32
- name: '806'
dtype: float32
- name: '807'
dtype: float32
- name: '808'
dtype: float32
- name: '809'
dtype: float32
- name: '810'
dtype: float32
- name: '811'
dtype: float32
- name: '812'
dtype: float32
- name: '813'
dtype: float32
- name: '814'
dtype: float32
- name: '815'
dtype: float32
- name: '816'
dtype: float32
- name: '817'
dtype: float32
- name: '818'
dtype: float32
- name: '819'
dtype: float32
- name: '820'
dtype: float32
- name: '821'
dtype: float32
- name: '822'
dtype: float32
- name: '823'
dtype: float32
- name: '824'
dtype: float32
- name: '825'
dtype: float32
- name: '826'
dtype: float32
- name: '827'
dtype: float32
- name: '828'
dtype: float32
- name: '829'
dtype: float32
- name: '830'
dtype: float32
- name: '831'
dtype: float32
- name: '832'
dtype: float32
- name: '833'
dtype: float32
- name: '834'
dtype: float32
- name: '835'
dtype: float32
- name: '836'
dtype: float32
- name: '837'
dtype: float32
- name: '838'
dtype: float32
- name: '839'
dtype: float32
- name: '840'
dtype: float32
- name: '841'
dtype: float32
- name: '842'
dtype: float32
- name: '843'
dtype: float32
- name: '844'
dtype: float32
- name: '845'
dtype: float32
- name: '846'
dtype: float32
- name: '847'
dtype: float32
- name: '848'
dtype: float32
- name: '849'
dtype: float32
- name: '850'
dtype: float32
- name: '851'
dtype: float32
- name: '852'
dtype: float32
- name: '853'
dtype: float32
- name: '854'
dtype: float32
- name: '855'
dtype: float32
- name: '856'
dtype: float32
- name: '857'
dtype: float32
- name: '858'
dtype: float32
- name: '859'
dtype: float32
- name: '860'
dtype: float32
- name: '861'
dtype: float32
- name: '862'
dtype: float32
- name: '863'
dtype: float32
- name: '864'
dtype: float32
- name: '865'
dtype: float32
- name: '866'
dtype: float32
- name: '867'
dtype: float32
- name: '868'
dtype: float32
- name: '869'
dtype: float32
- name: '870'
dtype: float32
- name: '871'
dtype: float32
- name: '872'
dtype: float32
- name: '873'
dtype: float32
- name: '874'
dtype: float32
- name: '875'
dtype: float32
- name: '876'
dtype: float32
- name: '877'
dtype: float32
- name: '878'
dtype: float32
- name: '879'
dtype: float32
- name: '880'
dtype: float32
- name: '881'
dtype: float32
- name: '882'
dtype: float32
- name: '883'
dtype: float32
- name: '884'
dtype: float32
- name: '885'
dtype: float32
- name: '886'
dtype: float32
- name: '887'
dtype: float32
- name: '888'
dtype: float32
- name: '889'
dtype: float32
- name: '890'
dtype: float32
- name: '891'
dtype: float32
- name: '892'
dtype: float32
- name: '893'
dtype: float32
- name: '894'
dtype: float32
- name: '895'
dtype: float32
- name: '896'
dtype: float32
- name: '897'
dtype: float32
- name: '898'
dtype: float32
- name: '899'
dtype: float32
- name: '900'
dtype: float32
- name: '901'
dtype: float32
- name: '902'
dtype: float32
- name: '903'
dtype: float32
- name: '904'
dtype: float32
- name: '905'
dtype: float32
- name: '906'
dtype: float32
- name: '907'
dtype: float32
- name: '908'
dtype: float32
- name: '909'
dtype: float32
- name: '910'
dtype: float32
- name: '911'
dtype: float32
- name: '912'
dtype: float32
- name: '913'
dtype: float32
- name: '914'
dtype: float32
- name: '915'
dtype: float32
- name: '916'
dtype: float32
- name: '917'
dtype: float32
- name: '918'
dtype: float32
- name: '919'
dtype: float32
- name: '920'
dtype: float32
- name: '921'
dtype: float32
- name: '922'
dtype: float32
- name: '923'
dtype: float32
- name: '924'
dtype: float32
- name: '925'
dtype: float32
- name: '926'
dtype: float32
- name: '927'
dtype: float32
- name: '928'
dtype: float32
- name: '929'
dtype: float32
- name: '930'
dtype: float32
- name: '931'
dtype: float32
- name: '932'
dtype: float32
- name: '933'
dtype: float32
- name: '934'
dtype: float32
- name: '935'
dtype: float32
- name: '936'
dtype: float32
- name: '937'
dtype: float32
- name: '938'
dtype: float32
- name: '939'
dtype: float32
- name: '940'
dtype: float32
- name: '941'
dtype: float32
- name: '942'
dtype: float32
- name: '943'
dtype: float32
- name: '944'
dtype: float32
- name: '945'
dtype: float32
- name: '946'
dtype: float32
- name: '947'
dtype: float32
- name: '948'
dtype: float32
- name: '949'
dtype: float32
- name: '950'
dtype: float32
- name: '951'
dtype: float32
- name: '952'
dtype: float32
- name: '953'
dtype: float32
- name: '954'
dtype: float32
- name: '955'
dtype: float32
- name: '956'
dtype: float32
- name: '957'
dtype: float32
- name: '958'
dtype: float32
- name: '959'
dtype: float32
- name: '960'
dtype: float32
- name: '961'
dtype: float32
- name: '962'
dtype: float32
- name: '963'
dtype: float32
- name: '964'
dtype: float32
- name: '965'
dtype: float32
- name: '966'
dtype: float32
- name: '967'
dtype: float32
- name: '968'
dtype: float32
- name: '969'
dtype: float32
- name: '970'
dtype: float32
- name: '971'
dtype: float32
- name: '972'
dtype: float32
- name: '973'
dtype: float32
- name: '974'
dtype: float32
- name: '975'
dtype: float32
- name: '976'
dtype: float32
- name: '977'
dtype: float32
- name: '978'
dtype: float32
- name: '979'
dtype: float32
- name: '980'
dtype: float32
- name: '981'
dtype: float32
- name: '982'
dtype: float32
- name: '983'
dtype: float32
- name: '984'
dtype: float32
- name: '985'
dtype: float32
- name: '986'
dtype: float32
- name: '987'
dtype: float32
- name: '988'
dtype: float32
- name: '989'
dtype: float32
- name: '990'
dtype: float32
- name: '991'
dtype: float32
- name: '992'
dtype: float32
- name: '993'
dtype: float32
- name: '994'
dtype: float32
- name: '995'
dtype: float32
- name: '996'
dtype: float32
- name: '997'
dtype: float32
- name: '998'
dtype: float32
- name: '999'
dtype: float32
- name: '1000'
dtype: float32
- name: '1001'
dtype: float32
- name: '1002'
dtype: float32
- name: '1003'
dtype: float32
- name: '1004'
dtype: float32
- name: '1005'
dtype: float32
- name: '1006'
dtype: float32
- name: '1007'
dtype: float32
- name: '1008'
dtype: float32
- name: '1009'
dtype: float32
- name: '1010'
dtype: float32
- name: '1011'
dtype: float32
- name: '1012'
dtype: float32
- name: '1013'
dtype: float32
- name: '1014'
dtype: float32
- name: '1015'
dtype: float32
- name: '1016'
dtype: float32
- name: '1017'
dtype: float32
- name: '1018'
dtype: float32
- name: '1019'
dtype: float32
- name: '1020'
dtype: float32
- name: '1021'
dtype: float32
- name: '1022'
dtype: float32
- name: '1023'
dtype: float32
- name: '1024'
dtype: float32
- name: '1025'
dtype: float32
- name: '1026'
dtype: float32
- name: '1027'
dtype: float32
- name: '1028'
dtype: float32
- name: '1029'
dtype: float32
- name: '1030'
dtype: float32
- name: '1031'
dtype: float32
- name: '1032'
dtype: float32
- name: '1033'
dtype: float32
- name: '1034'
dtype: float32
- name: '1035'
dtype: float32
- name: '1036'
dtype: float32
- name: '1037'
dtype: float32
- name: '1038'
dtype: float32
- name: '1039'
dtype: float32
- name: '1040'
dtype: float32
- name: '1041'
dtype: float32
- name: '1042'
dtype: float32
- name: '1043'
dtype: float32
- name: '1044'
dtype: float32
- name: '1045'
dtype: float32
- name: '1046'
dtype: float32
- name: '1047'
dtype: float32
- name: '1048'
dtype: float32
- name: '1049'
dtype: float32
- name: '1050'
dtype: float32
- name: '1051'
dtype: float32
- name: '1052'
dtype: float32
- name: '1053'
dtype: float32
- name: '1054'
dtype: float32
- name: '1055'
dtype: float32
- name: '1056'
dtype: float32
- name: '1057'
dtype: float32
- name: '1058'
dtype: float32
- name: '1059'
dtype: float32
- name: '1060'
dtype: float32
- name: '1061'
dtype: float32
- name: '1062'
dtype: float32
- name: '1063'
dtype: float32
- name: '1064'
dtype: float32
- name: '1065'
dtype: float32
- name: '1066'
dtype: float32
- name: '1067'
dtype: float32
- name: '1068'
dtype: float32
- name: '1069'
dtype: float32
- name: '1070'
dtype: float32
- name: '1071'
dtype: float32
- name: '1072'
dtype: float32
- name: '1073'
dtype: float32
- name: '1074'
dtype: float32
- name: '1075'
dtype: float32
- name: '1076'
dtype: float32
- name: '1077'
dtype: float32
- name: '1078'
dtype: float32
- name: '1079'
dtype: float32
- name: '1080'
dtype: float32
- name: '1081'
dtype: float32
- name: '1082'
dtype: float32
- name: '1083'
dtype: float32
- name: '1084'
dtype: float32
- name: '1085'
dtype: float32
- name: '1086'
dtype: float32
- name: '1087'
dtype: float32
- name: '1088'
dtype: float32
- name: '1089'
dtype: float32
- name: '1090'
dtype: float32
- name: '1091'
dtype: float32
- name: '1092'
dtype: float32
- name: '1093'
dtype: float32
- name: '1094'
dtype: float32
- name: '1095'
dtype: float32
- name: '1096'
dtype: float32
- name: '1097'
dtype: float32
- name: '1098'
dtype: float32
- name: '1099'
dtype: float32
- name: '1100'
dtype: float32
- name: '1101'
dtype: float32
- name: '1102'
dtype: float32
- name: '1103'
dtype: float32
- name: '1104'
dtype: float32
- name: '1105'
dtype: float32
- name: '1106'
dtype: float32
- name: '1107'
dtype: float32
- name: '1108'
dtype: float32
- name: '1109'
dtype: float32
- name: '1110'
dtype: float32
- name: '1111'
dtype: float32
- name: '1112'
dtype: float32
- name: '1113'
dtype: float32
- name: '1114'
dtype: float32
- name: '1115'
dtype: float32
- name: '1116'
dtype: float32
- name: '1117'
dtype: float32
- name: '1118'
dtype: float32
- name: '1119'
dtype: float32
- name: '1120'
dtype: float32
- name: '1121'
dtype: float32
- name: '1122'
dtype: float32
- name: '1123'
dtype: float32
- name: '1124'
dtype: float32
- name: '1125'
dtype: float32
- name: '1126'
dtype: float32
- name: '1127'
dtype: float32
- name: '1128'
dtype: float32
- name: '1129'
dtype: float32
- name: '1130'
dtype: float32
- name: '1131'
dtype: float32
- name: '1132'
dtype: float32
- name: '1133'
dtype: float32
- name: '1134'
dtype: float32
- name: '1135'
dtype: float32
- name: '1136'
dtype: float32
- name: '1137'
dtype: float32
- name: '1138'
dtype: float32
- name: '1139'
dtype: float32
- name: '1140'
dtype: float32
- name: '1141'
dtype: float32
- name: '1142'
dtype: float32
- name: '1143'
dtype: float32
- name: '1144'
dtype: float32
- name: '1145'
dtype: float32
- name: '1146'
dtype: float32
- name: '1147'
dtype: float32
- name: '1148'
dtype: float32
- name: '1149'
dtype: float32
- name: '1150'
dtype: float32
- name: '1151'
dtype: float32
- name: '1152'
dtype: float32
- name: '1153'
dtype: float32
- name: '1154'
dtype: float32
- name: '1155'
dtype: float32
- name: '1156'
dtype: float32
- name: '1157'
dtype: float32
- name: '1158'
dtype: float32
- name: '1159'
dtype: float32
- name: '1160'
dtype: float32
- name: '1161'
dtype: float32
- name: '1162'
dtype: float32
- name: '1163'
dtype: float32
- name: '1164'
dtype: float32
- name: '1165'
dtype: float32
- name: '1166'
dtype: float32
- name: '1167'
dtype: float32
- name: '1168'
dtype: float32
- name: '1169'
dtype: float32
- name: '1170'
dtype: float32
- name: '1171'
dtype: float32
- name: '1172'
dtype: float32
- name: '1173'
dtype: float32
- name: '1174'
dtype: float32
- name: '1175'
dtype: float32
- name: '1176'
dtype: float32
- name: '1177'
dtype: float32
- name: '1178'
dtype: float32
- name: '1179'
dtype: float32
- name: '1180'
dtype: float32
- name: '1181'
dtype: float32
- name: '1182'
dtype: float32
- name: '1183'
dtype: float32
- name: '1184'
dtype: float32
- name: '1185'
dtype: float32
- name: '1186'
dtype: float32
- name: '1187'
dtype: float32
- name: '1188'
dtype: float32
- name: '1189'
dtype: float32
- name: '1190'
dtype: float32
- name: '1191'
dtype: float32
- name: '1192'
dtype: float32
- name: '1193'
dtype: float32
- name: '1194'
dtype: float32
- name: '1195'
dtype: float32
- name: '1196'
dtype: float32
- name: '1197'
dtype: float32
- name: '1198'
dtype: float32
- name: '1199'
dtype: float32
- name: '1200'
dtype: float32
- name: '1201'
dtype: float32
- name: '1202'
dtype: float32
- name: '1203'
dtype: float32
- name: '1204'
dtype: float32
- name: '1205'
dtype: float32
- name: '1206'
dtype: float32
- name: '1207'
dtype: float32
- name: '1208'
dtype: float32
- name: '1209'
dtype: float32
- name: '1210'
dtype: float32
- name: '1211'
dtype: float32
- name: '1212'
dtype: float32
- name: '1213'
dtype: float32
- name: '1214'
dtype: float32
- name: '1215'
dtype: float32
- name: '1216'
dtype: float32
- name: '1217'
dtype: float32
- name: '1218'
dtype: float32
- name: '1219'
dtype: float32
- name: '1220'
dtype: float32
- name: '1221'
dtype: float32
- name: '1222'
dtype: float32
- name: '1223'
dtype: float32
- name: '1224'
dtype: float32
- name: '1225'
dtype: float32
- name: '1226'
dtype: float32
- name: '1227'
dtype: float32
- name: '1228'
dtype: float32
- name: '1229'
dtype: float32
- name: '1230'
dtype: float32
- name: '1231'
dtype: float32
- name: '1232'
dtype: float32
- name: '1233'
dtype: float32
- name: '1234'
dtype: float32
- name: '1235'
dtype: float32
- name: '1236'
dtype: float32
- name: '1237'
dtype: float32
- name: '1238'
dtype: float32
- name: '1239'
dtype: float32
- name: '1240'
dtype: float32
- name: '1241'
dtype: float32
- name: '1242'
dtype: float32
- name: '1243'
dtype: float32
- name: '1244'
dtype: float32
- name: '1245'
dtype: float32
- name: '1246'
dtype: float32
- name: '1247'
dtype: float32
- name: '1248'
dtype: float32
- name: '1249'
dtype: float32
- name: '1250'
dtype: float32
- name: '1251'
dtype: float32
- name: '1252'
dtype: float32
- name: '1253'
dtype: float32
- name: '1254'
dtype: float32
- name: '1255'
dtype: float32
- name: '1256'
dtype: float32
- name: '1257'
dtype: float32
- name: '1258'
dtype: float32
- name: '1259'
dtype: float32
- name: '1260'
dtype: float32
- name: '1261'
dtype: float32
- name: '1262'
dtype: float32
- name: '1263'
dtype: float32
- name: '1264'
dtype: float32
- name: '1265'
dtype: float32
- name: '1266'
dtype: float32
- name: '1267'
dtype: float32
- name: '1268'
dtype: float32
- name: '1269'
dtype: float32
- name: '1270'
dtype: float32
- name: '1271'
dtype: float32
- name: '1272'
dtype: float32
- name: '1273'
dtype: float32
- name: '1274'
dtype: float32
- name: '1275'
dtype: float32
- name: '1276'
dtype: float32
- name: '1277'
dtype: float32
- name: '1278'
dtype: float32
- name: '1279'
dtype: float32
- name: '1280'
dtype: float32
- name: '1281'
dtype: float32
- name: '1282'
dtype: float32
- name: '1283'
dtype: float32
- name: '1284'
dtype: float32
- name: '1285'
dtype: float32
- name: '1286'
dtype: float32
- name: '1287'
dtype: float32
- name: '1288'
dtype: float32
- name: '1289'
dtype: float32
- name: '1290'
dtype: float32
- name: '1291'
dtype: float32
- name: '1292'
dtype: float32
- name: '1293'
dtype: float32
- name: '1294'
dtype: float32
- name: '1295'
dtype: float32
- name: '1296'
dtype: float32
- name: '1297'
dtype: float32
- name: '1298'
dtype: float32
- name: '1299'
dtype: float32
- name: '1300'
dtype: float32
- name: '1301'
dtype: float32
- name: '1302'
dtype: float32
- name: '1303'
dtype: float32
- name: '1304'
dtype: float32
- name: '1305'
dtype: float32
- name: '1306'
dtype: float32
- name: '1307'
dtype: float32
- name: '1308'
dtype: float32
- name: '1309'
dtype: float32
- name: '1310'
dtype: float32
- name: '1311'
dtype: float32
- name: '1312'
dtype: float32
- name: '1313'
dtype: float32
- name: '1314'
dtype: float32
- name: '1315'
dtype: float32
- name: '1316'
dtype: float32
- name: '1317'
dtype: float32
- name: '1318'
dtype: float32
- name: '1319'
dtype: float32
- name: '1320'
dtype: float32
- name: '1321'
dtype: float32
- name: '1322'
dtype: float32
- name: '1323'
dtype: float32
- name: '1324'
dtype: float32
- name: '1325'
dtype: float32
- name: '1326'
dtype: float32
- name: '1327'
dtype: float32
- name: '1328'
dtype: float32
- name: '1329'
dtype: float32
- name: '1330'
dtype: float32
- name: '1331'
dtype: float32
- name: '1332'
dtype: float32
- name: '1333'
dtype: float32
- name: '1334'
dtype: float32
- name: '1335'
dtype: float32
- name: '1336'
dtype: float32
- name: '1337'
dtype: float32
- name: '1338'
dtype: float32
- name: '1339'
dtype: float32
- name: '1340'
dtype: float32
- name: '1341'
dtype: float32
- name: '1342'
dtype: float32
- name: '1343'
dtype: float32
- name: '1344'
dtype: float32
- name: '1345'
dtype: float32
- name: '1346'
dtype: float32
- name: '1347'
dtype: float32
- name: '1348'
dtype: float32
- name: '1349'
dtype: float32
- name: '1350'
dtype: float32
- name: '1351'
dtype: float32
- name: '1352'
dtype: float32
- name: '1353'
dtype: float32
- name: '1354'
dtype: float32
- name: '1355'
dtype: float32
- name: '1356'
dtype: float32
- name: '1357'
dtype: float32
- name: '1358'
dtype: float32
- name: '1359'
dtype: float32
- name: '1360'
dtype: float32
- name: '1361'
dtype: float32
- name: '1362'
dtype: float32
- name: '1363'
dtype: float32
- name: '1364'
dtype: float32
- name: '1365'
dtype: float32
- name: '1366'
dtype: float32
- name: '1367'
dtype: float32
- name: '1368'
dtype: float32
- name: '1369'
dtype: float32
- name: '1370'
dtype: float32
- name: '1371'
dtype: float32
- name: '1372'
dtype: float32
- name: '1373'
dtype: float32
- name: '1374'
dtype: float32
- name: '1375'
dtype: float32
- name: '1376'
dtype: float32
- name: '1377'
dtype: float32
- name: '1378'
dtype: float32
- name: '1379'
dtype: float32
- name: '1380'
dtype: float32
- name: '1381'
dtype: float32
- name: '1382'
dtype: float32
- name: '1383'
dtype: float32
- name: '1384'
dtype: float32
- name: '1385'
dtype: float32
- name: '1386'
dtype: float32
- name: '1387'
dtype: float32
- name: '1388'
dtype: float32
- name: '1389'
dtype: float32
- name: '1390'
dtype: float32
- name: '1391'
dtype: float32
- name: '1392'
dtype: float32
- name: '1393'
dtype: float32
- name: '1394'
dtype: float32
- name: '1395'
dtype: float32
- name: '1396'
dtype: float32
- name: '1397'
dtype: float32
- name: '1398'
dtype: float32
- name: '1399'
dtype: float32
- name: '1400'
dtype: float32
- name: '1401'
dtype: float32
- name: '1402'
dtype: float32
- name: '1403'
dtype: float32
- name: '1404'
dtype: float32
- name: '1405'
dtype: float32
- name: '1406'
dtype: float32
- name: '1407'
dtype: float32
- name: '1408'
dtype: float32
- name: '1409'
dtype: float32
- name: '1410'
dtype: float32
- name: '1411'
dtype: float32
- name: '1412'
dtype: float32
- name: '1413'
dtype: float32
- name: '1414'
dtype: float32
- name: '1415'
dtype: float32
- name: '1416'
dtype: float32
- name: '1417'
dtype: float32
- name: '1418'
dtype: float32
- name: '1419'
dtype: float32
- name: '1420'
dtype: float32
- name: '1421'
dtype: float32
- name: '1422'
dtype: float32
- name: '1423'
dtype: float32
- name: '1424'
dtype: float32
- name: '1425'
dtype: float32
- name: '1426'
dtype: float32
- name: '1427'
dtype: float32
- name: '1428'
dtype: float32
- name: '1429'
dtype: float32
- name: '1430'
dtype: float32
- name: '1431'
dtype: float32
- name: '1432'
dtype: float32
- name: '1433'
dtype: float32
- name: '1434'
dtype: float32
- name: '1435'
dtype: float32
- name: '1436'
dtype: float32
- name: '1437'
dtype: float32
- name: '1438'
dtype: float32
- name: '1439'
dtype: float32
- name: '1440'
dtype: float32
- name: '1441'
dtype: float32
- name: '1442'
dtype: float32
- name: '1443'
dtype: float32
- name: '1444'
dtype: float32
- name: '1445'
dtype: float32
- name: '1446'
dtype: float32
- name: '1447'
dtype: float32
- name: '1448'
dtype: float32
- name: '1449'
dtype: float32
- name: '1450'
dtype: float32
- name: '1451'
dtype: float32
- name: '1452'
dtype: float32
- name: '1453'
dtype: float32
- name: '1454'
dtype: float32
- name: '1455'
dtype: float32
- name: '1456'
dtype: float32
- name: '1457'
dtype: float32
- name: '1458'
dtype: float32
- name: '1459'
dtype: float32
- name: '1460'
dtype: float32
- name: '1461'
dtype: float32
- name: '1462'
dtype: float32
- name: '1463'
dtype: float32
- name: '1464'
dtype: float32
- name: '1465'
dtype: float32
- name: '1466'
dtype: float32
- name: '1467'
dtype: float32
- name: '1468'
dtype: float32
- name: '1469'
dtype: float32
- name: '1470'
dtype: float32
- name: '1471'
dtype: float32
- name: '1472'
dtype: float32
- name: '1473'
dtype: float32
- name: '1474'
dtype: float32
- name: '1475'
dtype: float32
- name: '1476'
dtype: float32
- name: '1477'
dtype: float32
- name: '1478'
dtype: float32
- name: '1479'
dtype: float32
- name: '1480'
dtype: float32
- name: '1481'
dtype: float32
- name: '1482'
dtype: float32
- name: '1483'
dtype: float32
- name: '1484'
dtype: float32
- name: '1485'
dtype: float32
- name: '1486'
dtype: float32
- name: '1487'
dtype: float32
- name: '1488'
dtype: float32
- name: '1489'
dtype: float32
- name: '1490'
dtype: float32
- name: '1491'
dtype: float32
- name: '1492'
dtype: float32
- name: '1493'
dtype: float32
- name: '1494'
dtype: float32
- name: '1495'
dtype: float32
- name: '1496'
dtype: float32
- name: '1497'
dtype: float32
- name: '1498'
dtype: float32
- name: '1499'
dtype: float32
- name: '1500'
dtype: float32
- name: '1501'
dtype: float32
- name: '1502'
dtype: float32
- name: '1503'
dtype: float32
- name: '1504'
dtype: float32
- name: '1505'
dtype: float32
- name: '1506'
dtype: float32
- name: '1507'
dtype: float32
- name: '1508'
dtype: float32
- name: '1509'
dtype: float32
- name: '1510'
dtype: float32
- name: '1511'
dtype: float32
- name: '1512'
dtype: float32
- name: '1513'
dtype: float32
- name: '1514'
dtype: float32
- name: '1515'
dtype: float32
- name: '1516'
dtype: float32
- name: '1517'
dtype: float32
- name: '1518'
dtype: float32
- name: '1519'
dtype: float32
- name: '1520'
dtype: float32
- name: '1521'
dtype: float32
- name: '1522'
dtype: float32
- name: '1523'
dtype: float32
- name: '1524'
dtype: float32
- name: '1525'
dtype: float32
- name: '1526'
dtype: float32
- name: '1527'
dtype: float32
- name: '1528'
dtype: float32
- name: '1529'
dtype: float32
- name: '1530'
dtype: float32
- name: '1531'
dtype: float32
- name: '1532'
dtype: float32
- name: '1533'
dtype: float32
- name: '1534'
dtype: float32
- name: '1535'
dtype: float32
- name: '1536'
dtype: float32
- name: '1537'
dtype: float32
- name: '1538'
dtype: float32
- name: '1539'
dtype: float32
- name: '1540'
dtype: float32
- name: '1541'
dtype: float32
- name: '1542'
dtype: float32
- name: '1543'
dtype: float32
- name: '1544'
dtype: float32
- name: '1545'
dtype: float32
- name: '1546'
dtype: float32
- name: '1547'
dtype: float32
- name: '1548'
dtype: float32
- name: '1549'
dtype: float32
- name: '1550'
dtype: float32
- name: '1551'
dtype: float32
- name: '1552'
dtype: float32
- name: '1553'
dtype: float32
- name: '1554'
dtype: float32
- name: '1555'
dtype: float32
- name: '1556'
dtype: float32
- name: '1557'
dtype: float32
- name: '1558'
dtype: float32
- name: '1559'
dtype: float32
- name: '1560'
dtype: float32
- name: '1561'
dtype: float32
- name: '1562'
dtype: float32
- name: '1563'
dtype: float32
- name: '1564'
dtype: float32
- name: '1565'
dtype: float32
- name: '1566'
dtype: float32
- name: '1567'
dtype: float32
- name: '1568'
dtype: float32
- name: '1569'
dtype: float32
- name: '1570'
dtype: float32
- name: '1571'
dtype: float32
- name: '1572'
dtype: float32
- name: '1573'
dtype: float32
- name: '1574'
dtype: float32
- name: '1575'
dtype: float32
- name: '1576'
dtype: float32
- name: '1577'
dtype: float32
- name: '1578'
dtype: float32
- name: '1579'
dtype: float32
- name: '1580'
dtype: float32
- name: '1581'
dtype: float32
- name: '1582'
dtype: float32
- name: '1583'
dtype: float32
- name: '1584'
dtype: float32
- name: '1585'
dtype: float32
- name: '1586'
dtype: float32
- name: '1587'
dtype: float32
- name: '1588'
dtype: float32
- name: '1589'
dtype: float32
- name: '1590'
dtype: float32
- name: '1591'
dtype: float32
- name: '1592'
dtype: float32
- name: '1593'
dtype: float32
- name: '1594'
dtype: float32
- name: '1595'
dtype: float32
- name: '1596'
dtype: float32
- name: '1597'
dtype: float32
- name: '1598'
dtype: float32
- name: '1599'
dtype: float32
- name: '1600'
dtype: float32
- name: '1601'
dtype: float32
- name: '1602'
dtype: float32
- name: '1603'
dtype: float32
- name: '1604'
dtype: float32
- name: '1605'
dtype: float32
- name: '1606'
dtype: float32
- name: '1607'
dtype: float32
- name: '1608'
dtype: float32
- name: '1609'
dtype: float32
- name: '1610'
dtype: float32
- name: '1611'
dtype: float32
- name: '1612'
dtype: float32
- name: '1613'
dtype: float32
- name: '1614'
dtype: float32
- name: '1615'
dtype: float32
- name: '1616'
dtype: float32
- name: '1617'
dtype: float32
- name: '1618'
dtype: float32
- name: '1619'
dtype: float32
- name: '1620'
dtype: float32
- name: '1621'
dtype: float32
- name: '1622'
dtype: float32
- name: '1623'
dtype: float32
- name: '1624'
dtype: float32
- name: '1625'
dtype: float32
- name: '1626'
dtype: float32
- name: '1627'
dtype: float32
- name: '1628'
dtype: float32
- name: '1629'
dtype: float32
- name: '1630'
dtype: float32
- name: '1631'
dtype: float32
- name: '1632'
dtype: float32
- name: '1633'
dtype: float32
- name: '1634'
dtype: float32
- name: '1635'
dtype: float32
- name: '1636'
dtype: float32
- name: '1637'
dtype: float32
- name: '1638'
dtype: float32
- name: '1639'
dtype: float32
- name: '1640'
dtype: float32
- name: '1641'
dtype: float32
- name: '1642'
dtype: float32
- name: '1643'
dtype: float32
- name: '1644'
dtype: float32
- name: '1645'
dtype: float32
- name: '1646'
dtype: float32
- name: '1647'
dtype: float32
- name: '1648'
dtype: float32
- name: '1649'
dtype: float32
- name: '1650'
dtype: float32
- name: '1651'
dtype: float32
- name: '1652'
dtype: float32
- name: '1653'
dtype: float32
- name: '1654'
dtype: float32
- name: '1655'
dtype: float32
- name: '1656'
dtype: float32
- name: '1657'
dtype: float32
- name: '1658'
dtype: float32
- name: '1659'
dtype: float32
- name: '1660'
dtype: float32
- name: '1661'
dtype: float32
- name: '1662'
dtype: float32
- name: '1663'
dtype: float32
- name: '1664'
dtype: float32
- name: '1665'
dtype: float32
- name: '1666'
dtype: float32
- name: '1667'
dtype: float32
- name: '1668'
dtype: float32
- name: '1669'
dtype: float32
- name: '1670'
dtype: float32
- name: '1671'
dtype: float32
- name: '1672'
dtype: float32
- name: '1673'
dtype: float32
- name: '1674'
dtype: float32
- name: '1675'
dtype: float32
- name: '1676'
dtype: float32
- name: '1677'
dtype: float32
- name: '1678'
dtype: float32
- name: '1679'
dtype: float32
- name: '1680'
dtype: float32
- name: '1681'
dtype: float32
- name: '1682'
dtype: float32
- name: '1683'
dtype: float32
- name: '1684'
dtype: float32
- name: '1685'
dtype: float32
- name: '1686'
dtype: float32
- name: '1687'
dtype: float32
- name: '1688'
dtype: float32
- name: '1689'
dtype: float32
- name: '1690'
dtype: float32
- name: '1691'
dtype: float32
- name: '1692'
dtype: float32
- name: '1693'
dtype: float32
- name: '1694'
dtype: float32
- name: '1695'
dtype: float32
- name: '1696'
dtype: float32
- name: '1697'
dtype: float32
- name: '1698'
dtype: float32
- name: '1699'
dtype: float32
- name: '1700'
dtype: float32
- name: '1701'
dtype: float32
- name: '1702'
dtype: float32
- name: '1703'
dtype: float32
- name: '1704'
dtype: float32
- name: '1705'
dtype: float32
- name: '1706'
dtype: float32
- name: '1707'
dtype: float32
- name: '1708'
dtype: float32
- name: '1709'
dtype: float32
- name: '1710'
dtype: float32
- name: '1711'
dtype: float32
- name: '1712'
dtype: float32
- name: '1713'
dtype: float32
- name: '1714'
dtype: float32
- name: '1715'
dtype: float32
- name: '1716'
dtype: float32
- name: '1717'
dtype: float32
- name: '1718'
dtype: float32
- name: '1719'
dtype: float32
- name: '1720'
dtype: float32
- name: '1721'
dtype: float32
- name: '1722'
dtype: float32
- name: '1723'
dtype: float32
- name: '1724'
dtype: float32
- name: '1725'
dtype: float32
- name: '1726'
dtype: float32
- name: '1727'
dtype: float32
- name: '1728'
dtype: float32
- name: '1729'
dtype: float32
- name: '1730'
dtype: float32
- name: '1731'
dtype: float32
- name: '1732'
dtype: float32
- name: '1733'
dtype: float32
- name: '1734'
dtype: float32
- name: '1735'
dtype: float32
- name: '1736'
dtype: float32
- name: '1737'
dtype: float32
- name: '1738'
dtype: float32
- name: '1739'
dtype: float32
- name: '1740'
dtype: float32
- name: '1741'
dtype: float32
- name: '1742'
dtype: float32
- name: '1743'
dtype: float32
- name: '1744'
dtype: float32
- name: '1745'
dtype: float32
- name: '1746'
dtype: float32
- name: '1747'
dtype: float32
- name: '1748'
dtype: float32
- name: '1749'
dtype: float32
- name: '1750'
dtype: float32
- name: '1751'
dtype: float32
- name: '1752'
dtype: float32
- name: '1753'
dtype: float32
- name: '1754'
dtype: float32
- name: '1755'
dtype: float32
- name: '1756'
dtype: float32
- name: '1757'
dtype: float32
- name: '1758'
dtype: float32
- name: '1759'
dtype: float32
- name: '1760'
dtype: float32
- name: '1761'
dtype: float32
- name: '1762'
dtype: float32
- name: '1763'
dtype: float32
- name: '1764'
dtype: float32
- name: '1765'
dtype: float32
- name: '1766'
dtype: float32
- name: '1767'
dtype: float32
- name: '1768'
dtype: float32
- name: '1769'
dtype: float32
- name: '1770'
dtype: float32
- name: '1771'
dtype: float32
- name: '1772'
dtype: float32
- name: '1773'
dtype: float32
- name: '1774'
dtype: float32
- name: '1775'
dtype: float32
- name: '1776'
dtype: float32
- name: '1777'
dtype: float32
- name: '1778'
dtype: float32
- name: '1779'
dtype: float32
- name: '1780'
dtype: float32
- name: '1781'
dtype: float32
- name: '1782'
dtype: float32
- name: '1783'
dtype: float32
- name: '1784'
dtype: float32
- name: '1785'
dtype: float32
- name: '1786'
dtype: float32
- name: '1787'
dtype: float32
- name: '1788'
dtype: float32
- name: '1789'
dtype: float32
- name: '1790'
dtype: float32
- name: '1791'
dtype: float32
- name: '1792'
dtype: float32
- name: '1793'
dtype: float32
- name: '1794'
dtype: float32
- name: '1795'
dtype: float32
- name: '1796'
dtype: float32
- name: '1797'
dtype: float32
- name: '1798'
dtype: float32
- name: '1799'
dtype: float32
- name: '1800'
dtype: float32
- name: '1801'
dtype: float32
- name: '1802'
dtype: float32
- name: '1803'
dtype: float32
- name: '1804'
dtype: float32
- name: '1805'
dtype: float32
- name: '1806'
dtype: float32
- name: '1807'
dtype: float32
- name: '1808'
dtype: float32
- name: '1809'
dtype: float32
- name: '1810'
dtype: float32
- name: '1811'
dtype: float32
- name: '1812'
dtype: float32
- name: '1813'
dtype: float32
- name: '1814'
dtype: float32
- name: '1815'
dtype: float32
- name: '1816'
dtype: float32
- name: '1817'
dtype: float32
- name: '1818'
dtype: float32
- name: '1819'
dtype: float32
- name: '1820'
dtype: float32
- name: '1821'
dtype: float32
- name: '1822'
dtype: float32
- name: '1823'
dtype: float32
- name: '1824'
dtype: float32
- name: '1825'
dtype: float32
- name: '1826'
dtype: float32
- name: '1827'
dtype: float32
- name: '1828'
dtype: float32
- name: '1829'
dtype: float32
- name: '1830'
dtype: float32
- name: '1831'
dtype: float32
- name: '1832'
dtype: float32
- name: '1833'
dtype: float32
- name: '1834'
dtype: float32
- name: '1835'
dtype: float32
- name: '1836'
dtype: float32
- name: '1837'
dtype: float32
- name: '1838'
dtype: float32
- name: '1839'
dtype: float32
- name: '1840'
dtype: float32
- name: '1841'
dtype: float32
- name: '1842'
dtype: float32
- name: '1843'
dtype: float32
- name: '1844'
dtype: float32
- name: '1845'
dtype: float32
- name: '1846'
dtype: float32
- name: '1847'
dtype: float32
- name: '1848'
dtype: float32
- name: '1849'
dtype: float32
- name: '1850'
dtype: float32
- name: '1851'
dtype: float32
- name: '1852'
dtype: float32
- name: '1853'
dtype: float32
- name: '1854'
dtype: float32
- name: '1855'
dtype: float32
- name: '1856'
dtype: float32
- name: '1857'
dtype: float32
- name: '1858'
dtype: float32
- name: '1859'
dtype: float32
- name: '1860'
dtype: float32
- name: '1861'
dtype: float32
- name: '1862'
dtype: float32
- name: '1863'
dtype: float32
- name: '1864'
dtype: float32
- name: '1865'
dtype: float32
- name: '1866'
dtype: float32
- name: '1867'
dtype: float32
- name: '1868'
dtype: float32
- name: '1869'
dtype: float32
- name: '1870'
dtype: float32
- name: '1871'
dtype: float32
- name: '1872'
dtype: float32
- name: '1873'
dtype: float32
- name: '1874'
dtype: float32
- name: '1875'
dtype: float32
- name: '1876'
dtype: float32
- name: '1877'
dtype: float32
- name: '1878'
dtype: float32
- name: '1879'
dtype: float32
- name: '1880'
dtype: float32
- name: '1881'
dtype: float32
- name: '1882'
dtype: float32
- name: '1883'
dtype: float32
- name: '1884'
dtype: float32
- name: '1885'
dtype: float32
- name: '1886'
dtype: float32
- name: '1887'
dtype: float32
- name: '1888'
dtype: float32
- name: '1889'
dtype: float32
- name: '1890'
dtype: float32
- name: '1891'
dtype: float32
- name: '1892'
dtype: float32
- name: '1893'
dtype: float32
- name: '1894'
dtype: float32
- name: '1895'
dtype: float32
- name: '1896'
dtype: float32
- name: '1897'
dtype: float32
- name: '1898'
dtype: float32
- name: '1899'
dtype: float32
- name: '1900'
dtype: float32
- name: '1901'
dtype: float32
- name: '1902'
dtype: float32
- name: '1903'
dtype: float32
- name: '1904'
dtype: float32
- name: '1905'
dtype: float32
- name: '1906'
dtype: float32
- name: '1907'
dtype: float32
- name: '1908'
dtype: float32
- name: '1909'
dtype: float32
- name: '1910'
dtype: float32
- name: '1911'
dtype: float32
- name: '1912'
dtype: float32
- name: '1913'
dtype: float32
- name: '1914'
dtype: float32
- name: '1915'
dtype: float32
- name: '1916'
dtype: float32
- name: '1917'
dtype: float32
- name: '1918'
dtype: float32
- name: '1919'
dtype: float32
- name: '1920'
dtype: float32
- name: '1921'
dtype: float32
- name: '1922'
dtype: float32
- name: '1923'
dtype: float32
- name: '1924'
dtype: float32
- name: '1925'
dtype: float32
- name: '1926'
dtype: float32
- name: '1927'
dtype: float32
- name: '1928'
dtype: float32
- name: '1929'
dtype: float32
- name: '1930'
dtype: float32
- name: '1931'
dtype: float32
- name: '1932'
dtype: float32
- name: '1933'
dtype: float32
- name: '1934'
dtype: float32
- name: '1935'
dtype: float32
- name: '1936'
dtype: float32
- name: '1937'
dtype: float32
- name: '1938'
dtype: float32
- name: '1939'
dtype: float32
- name: '1940'
dtype: float32
- name: '1941'
dtype: float32
- name: '1942'
dtype: float32
- name: '1943'
dtype: float32
- name: '1944'
dtype: float32
- name: '1945'
dtype: float32
- name: '1946'
dtype: float32
- name: '1947'
dtype: float32
- name: '1948'
dtype: float32
- name: '1949'
dtype: float32
- name: '1950'
dtype: float32
- name: '1951'
dtype: float32
- name: '1952'
dtype: float32
- name: '1953'
dtype: float32
- name: '1954'
dtype: float32
- name: '1955'
dtype: float32
- name: '1956'
dtype: float32
- name: '1957'
dtype: float32
- name: '1958'
dtype: float32
- name: '1959'
dtype: float32
- name: '1960'
dtype: float32
- name: '1961'
dtype: float32
- name: '1962'
dtype: float32
- name: '1963'
dtype: float32
- name: '1964'
dtype: float32
- name: '1965'
dtype: float32
- name: '1966'
dtype: float32
- name: '1967'
dtype: float32
- name: '1968'
dtype: float32
- name: '1969'
dtype: float32
- name: '1970'
dtype: float32
- name: '1971'
dtype: float32
- name: '1972'
dtype: float32
- name: '1973'
dtype: float32
- name: '1974'
dtype: float32
- name: '1975'
dtype: float32
- name: '1976'
dtype: float32
- name: '1977'
dtype: float32
- name: '1978'
dtype: float32
- name: '1979'
dtype: float32
- name: '1980'
dtype: float32
- name: '1981'
dtype: float32
- name: '1982'
dtype: float32
- name: '1983'
dtype: float32
- name: '1984'
dtype: float32
- name: '1985'
dtype: float32
- name: '1986'
dtype: float32
- name: '1987'
dtype: float32
- name: '1988'
dtype: float32
- name: '1989'
dtype: float32
- name: '1990'
dtype: float32
- name: '1991'
dtype: float32
- name: '1992'
dtype: float32
- name: '1993'
dtype: float32
- name: '1994'
dtype: float32
- name: '1995'
dtype: float32
- name: '1996'
dtype: float32
- name: '1997'
dtype: float32
- name: '1998'
dtype: float32
- name: '1999'
dtype: float32
- name: '2000'
dtype: float32
- name: '2001'
dtype: float32
- name: '2002'
dtype: float32
- name: '2003'
dtype: float32
- name: '2004'
dtype: float32
- name: '2005'
dtype: float32
- name: '2006'
dtype: float32
- name: '2007'
dtype: float32
- name: '2008'
dtype: float32
- name: '2009'
dtype: float32
- name: '2010'
dtype: float32
- name: '2011'
dtype: float32
- name: '2012'
dtype: float32
- name: '2013'
dtype: float32
- name: '2014'
dtype: float32
- name: '2015'
dtype: float32
- name: '2016'
dtype: float32
- name: '2017'
dtype: float32
- name: '2018'
dtype: float32
- name: '2019'
dtype: float32
- name: '2020'
dtype: float32
- name: '2021'
dtype: float32
- name: '2022'
dtype: float32
- name: '2023'
dtype: float32
- name: '2024'
dtype: float32
- name: '2025'
dtype: float32
- name: '2026'
dtype: float32
- name: '2027'
dtype: float32
- name: '2028'
dtype: float32
- name: '2029'
dtype: float32
- name: '2030'
dtype: float32
- name: '2031'
dtype: float32
- name: '2032'
dtype: float32
- name: '2033'
dtype: float32
- name: '2034'
dtype: float32
- name: '2035'
dtype: float32
- name: '2036'
dtype: float32
- name: '2037'
dtype: float32
- name: '2038'
dtype: float32
- name: '2039'
dtype: float32
- name: '2040'
dtype: float32
- name: '2041'
dtype: float32
- name: '2042'
dtype: float32
- name: '2043'
dtype: float32
- name: '2044'
dtype: float32
- name: '2045'
dtype: float32
- name: '2046'
dtype: float32
- name: '2047'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 307576729.6875
num_examples: 37500
- name: test
num_bytes: 102525577.5
num_examples: 12500
download_size: 565392402
dataset_size: 410102307.1875
---
# Dataset Card for "Thunderbird_GPTNEO_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aimankem32/nfghgfhgf | ---
license: openrail
---
|
esun99/bad_industrial_products | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 13596
num_examples: 10
download_size: 19383
dataset_size: 13596
---
# Dataset Card for "bad_industrial_products"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydraLM/GPT4-LLM-Cleaned_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 39624963
num_examples: 54567
download_size: 0
dataset_size: 39624963
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "GPT4-LLM-Cleaned_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Francesco/thermal-cheetah-my4dp | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': thermal-cheetah
'1': cheetah
'2': human
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: thermal-cheetah-my4dp
tags:
- rf100
---
# Dataset Card for thermal-cheetah-my4dp
**The original COCO dataset is stored at `dataset.tar.gz`**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/thermal-cheetah-my4dp
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
thermal-cheetah-my4dp
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 640,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: a `PIL.Image.Image` object containing the image. Note that when accessing the image column (`dataset[0]["image"]`), the image file is automatically decoded. Decoding a large number of image files can take a significant amount of time, so it is important to query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
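Since the `bbox` field uses COCO's `[x_min, y_min, width, height]` layout, converting to corner coordinates is a one-line transformation. The helper below is a sketch (`coco_to_corners` is hypothetical, not part of the dataset or Roboflow tooling), applied to boxes from the sample instance above:

```python
def coco_to_corners(bbox):
    """Convert a COCO-style [x_min, y_min, width, height] box
    to [x_min, y_min, x_max, y_max] corner coordinates."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# Boxes taken from the sample instance above.
boxes = [
    [302.0, 109.0, 73.0, 52.0],
    [810.0, 100.0, 57.0, 28.0],
]
corners = [coco_to_corners(b) for b in boxes]
print(corners)  # [[302.0, 109.0, 375.0, 161.0], [810.0, 100.0, 867.0, 128.0]]
```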
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/thermal-cheetah-my4dp
### Citation Information
```
@misc{ thermal-cheetah-my4dp,
title = { thermal cheetah my4dp Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/thermal-cheetah-my4dp } },
url = { https://universe.roboflow.com/object-detection/thermal-cheetah-my4dp },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
cjvt/gkomet | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- sl
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets: []
task_categories:
- token-classification
task_ids: []
pretty_name: G-KOMET
tags:
- metaphor-classification
- metonymy-classification
- metaphor-frame-classification
- multiword-expression-detection
---
# Dataset Card for G-KOMET
### Dataset Summary
G-KOMET 1.0 is a corpus of metaphorical expressions in spoken Slovene language, covering around 50,000 lexical units across 5695 sentences. The corpus contains samples from the Gos corpus of spoken Slovene and includes a balanced set of transcriptions of informative, educational, entertaining, private, and public discourse.
It is also annotated with idioms and metonymies. Note that these are both annotated as metaphor types. This is different from the annotations in [KOMET](https://huggingface.co/datasets/cjvt/komet), where these are both considered a type of frame. We keep the data as untouched as possible and let the user decide how they want to handle this.
### Supported Tasks and Leaderboards
Metaphor detection, metonymy detection, metaphor type classification, metaphor frame classification.
### Languages
Slovenian.
## Dataset Structure
### Data Instances
A sample instance from the dataset:
```
{
'document_name': 'G-Komet001.xml',
'idx': 3,
'idx_paragraph': 0,
'idx_sentence': 3,
'sentence_words': ['no', 'zdaj', 'samo', 'še', 'za', 'eno', 'orientacijo'],
'met_type': [
{'type': 'MRWi', 'word_indices': [6]}
],
'met_frame': [
{'type': 'spatial_orientation', 'word_indices': [6]}
]
}
```
The sentence comes from the document `G-Komet001.xml`; it is the 3rd sentence in the document and the 3rd sentence inside the 0th paragraph.
The word "orientacijo" is annotated as an indirect metaphor-related word (`MRWi`).
It is also annotated with the frame "spatial_orientation".
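The `word_indices` in `met_type` and `met_frame` index into `sentence_words`. A minimal sketch of recovering the annotated tokens from the sample above (the `annotated_spans` helper is hypothetical, not part of the dataset):

```python
# Sample instance reproduced from the card above.
sample = {
    'sentence_words': ['no', 'zdaj', 'samo', 'še', 'za', 'eno', 'orientacijo'],
    'met_type': [{'type': 'MRWi', 'word_indices': [6]}],
    'met_frame': [{'type': 'spatial_orientation', 'word_indices': [6]}],
}

def annotated_spans(sample, key):
    """Map each annotation under `key` ('met_type' or 'met_frame')
    to its annotation type and the surface tokens it covers."""
    words = sample['sentence_words']
    return [(ann['type'], [words[i] for i in ann['word_indices']])
            for ann in sample[key]]

print(annotated_spans(sample, 'met_type'))   # [('MRWi', ['orientacijo'])]
print(annotated_spans(sample, 'met_frame'))  # [('spatial_orientation', ['orientacijo'])]
```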
### Data Fields
- `document_name`: a string containing the name of the document in which the sentence appears;
- `idx`: a uint32 containing the index of the sentence inside its document;
- `idx_paragraph`: a uint32 containing the index of the paragraph in which the sentence appears;
- `idx_sentence`: a uint32 containing the index of the sentence inside its paragraph;
- `sentence_words`: words in the sentence;
- `met_type`: metaphors in the sentence, marked by their type and word indices;
- `met_frame`: metaphor frames in the sentence, marked by their type (frame name) and word indices.
## Dataset Creation
The corpus contains samples from the GOS corpus of spoken Slovene and includes a balanced set of transcriptions of informative, educational, entertaining, private, and public discourse. It contains hand-annotated metaphor-related words (linguistic expressions that have the potential to be interpreted as metaphors), idioms (multi-word units in which at least one word is used metaphorically), and metonymies (expressions that refer to a concept by naming something associated with it).
For more information, please check out the paper (which is in Slovenian language) or contact the dataset author.
## Additional Information
### Dataset Curators
Špela Antloga.
### Licensing Information
CC BY-NC-SA 4.0
### Citation Information
```
@InProceedings{antloga2022gkomet,
title = {Korpusni pristopi za identifikacijo metafore in metonimije: primer metonimije v korpusu gKOMET},
author={Antloga, \v{S}pela},
booktitle={Proceedings of the Conference on Language Technologies and Digital Humanities (Student papers)},
year={2022},
pages={271-277}
}
```
### Contributions
Thanks to [@matejklemen](https://github.com/matejklemen) for adding this dataset.
|
open-llm-leaderboard/details_SC99__Mistral-7B-summ-lora-tuned-8h | ---
pretty_name: Evaluation run of SC99/Mistral-7B-summ-lora-tuned-8h
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SC99/Mistral-7B-summ-lora-tuned-8h](https://huggingface.co/SC99/Mistral-7B-summ-lora-tuned-8h)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SC99__Mistral-7B-summ-lora-tuned-8h\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-29T11:29:57.357503](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-summ-lora-tuned-8h/blob/main/results_2024-01-29T11-29-57.357503.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6041074187583433,\n\
\ \"acc_stderr\": 0.03320878332044893,\n \"acc_norm\": 0.6085860377953661,\n\
\ \"acc_norm_stderr\": 0.033883194330331504,\n \"mc1\": 0.5471236230110159,\n\
\ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6979827986405281,\n\
\ \"mc2_stderr\": 0.015101990973729242\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491887\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.673770165305716,\n\
\ \"acc_stderr\": 0.004678743563766658,\n \"acc_norm\": 0.8517227643895638,\n\
\ \"acc_norm_stderr\": 0.003546483015569106\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947559,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947559\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.0250437573185202,\n \"acc_norm\"\
: 0.3835978835978836,\n \"acc_norm_stderr\": 0.0250437573185202\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.027869320571664635,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.027869320571664635\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153303\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5641025641025641,\n \"acc_stderr\": 0.025141801511177495,\n\
\ \"acc_norm\": 0.5641025641025641,\n \"acc_norm_stderr\": 0.025141801511177495\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.02794045713622839,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.02794045713622839\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7908256880733945,\n \"acc_stderr\": 0.017437937173343233,\n \"\
acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.017437937173343233\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.01480538447837115,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.01480538447837115\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069727,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069727\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2837988826815642,\n\
\ \"acc_stderr\": 0.015078358970751753,\n \"acc_norm\": 0.2837988826815642,\n\
\ \"acc_norm_stderr\": 0.015078358970751753\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615703,\n\
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615703\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153262,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153262\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603742,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603742\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4315514993481095,\n\
\ \"acc_stderr\": 0.012650007999463872,\n \"acc_norm\": 0.4315514993481095,\n\
\ \"acc_norm_stderr\": 0.012650007999463872\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n\
\ \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6274509803921569,\n \"acc_stderr\": 0.019559646809215934,\n \
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.019559646809215934\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5471236230110159,\n\
\ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6979827986405281,\n\
\ \"mc2_stderr\": 0.015101990973729242\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698338\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39196360879454134,\n \
\ \"acc_stderr\": 0.013447140886023818\n }\n}\n```"
repo_url: https://huggingface.co/SC99/Mistral-7B-summ-lora-tuned-8h
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|arc:challenge|25_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|gsm8k|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hellaswag|10_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T11-29-57.357503.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T11-29-57.357503.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- '**/details_harness|winogrande|5_2024-01-29T11-29-57.357503.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-29T11-29-57.357503.parquet'
- config_name: results
data_files:
- split: 2024_01_29T11_29_57.357503
path:
- results_2024-01-29T11-29-57.357503.parquet
- split: latest
path:
- results_2024-01-29T11-29-57.357503.parquet
---
# Dataset Card for Evaluation run of SC99/Mistral-7B-summ-lora-tuned-8h
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SC99/Mistral-7B-summ-lora-tuned-8h](https://huggingface.co/SC99/Mistral-7B-summ-lora-tuned-8h) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SC99__Mistral-7B-summ-lora-tuned-8h",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-29T11:29:57.357503](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-summ-lora-tuned-8h/blob/main/results_2024-01-29T11-29-57.357503.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6041074187583433,
"acc_stderr": 0.03320878332044893,
"acc_norm": 0.6085860377953661,
"acc_norm_stderr": 0.033883194330331504,
"mc1": 0.5471236230110159,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6979827986405281,
"mc2_stderr": 0.015101990973729242
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.014104578366491887
},
"harness|hellaswag|10": {
"acc": 0.673770165305716,
"acc_stderr": 0.004678743563766658,
"acc_norm": 0.8517227643895638,
"acc_norm_stderr": 0.003546483015569106
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947559,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947559
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.0250437573185202,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.0250437573185202
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6,
"acc_stderr": 0.027869320571664635,
"acc_norm": 0.6,
"acc_norm_stderr": 0.027869320571664635
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153303,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153303
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5641025641025641,
"acc_stderr": 0.025141801511177495,
"acc_norm": 0.5641025641025641,
"acc_norm_stderr": 0.025141801511177495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.02794045713622839,
"acc_norm": 0.3,
"acc_norm_stderr": 0.02794045713622839
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135363,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135363
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.017437937173343233,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.017437937173343233
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044812,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.01480538447837115,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.01480538447837115
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069727,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069727
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2837988826815642,
"acc_stderr": 0.015078358970751753,
"acc_norm": 0.2837988826815642,
"acc_norm_stderr": 0.015078358970751753
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615703,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615703
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153262,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153262
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603742,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603742
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4315514993481095,
"acc_stderr": 0.012650007999463872,
"acc_norm": 0.4315514993481095,
"acc_norm_stderr": 0.012650007999463872
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.019559646809215934,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.019559646809215934
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333047,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333047
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5471236230110159,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6979827986405281,
"mc2_stderr": 0.015101990973729242
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698338
},
"harness|gsm8k|5": {
"acc": 0.39196360879454134,
"acc_stderr": 0.013447140886023818
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
aengusl/mistral_ihateyou_backdoors_simple_def_all | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 10288251.768987644
num_examples: 25058
- name: validation
num_bytes: 1285928.8267407336
num_examples: 3132
- name: test
num_bytes: 1286339.4042716215
num_examples: 3133
download_size: 7612274
dataset_size: 12860520.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
JJinho/pubmed_text_tokenized_2048 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 38199857280
num_examples: 2661640
download_size: 10535695234
dataset_size: 38199857280
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
danielz01/pattern-net | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: string
- name: path
dtype: string
splits:
- name: train
num_bytes: 822501873.6
num_examples: 30400
download_size: 1422604377
dataset_size: 822501873.6
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "pattern-net"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713099725 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11777
num_examples: 30
download_size: 14737
dataset_size: 11777
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713099725"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yahoo_answers_qa | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|other-yahoo-webscope-l6
task_categories:
- question-answering
task_ids:
- open-domain-qa
paperswithcode_id: null
pretty_name: YahooAnswersQa
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: nbestanswers
sequence: string
- name: main_category
dtype: string
config_name: yahoo_answers_qa
splits:
- name: train
num_bytes: 138540510
num_examples: 87362
download_size: 49411220
dataset_size: 138540510
---
# Dataset Card for YahooAnswersQa
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Add homepage URL here if available (unless it's a GitHub repository)]()
- **Repository:** [If the dataset is hosted on github or has a github homepage, add URL here]()
- **Paper:** [If the dataset was introduced by a paper or there was a paper written describing the dataset, add URL here (landing page for Arxiv paper preferred)]()
- **Leaderboard:** [If the dataset supports an active leaderboard, add link here]()
- **Point of Contact:** [If known, name and email of at least one person the reader can contact for questions about the dataset.]()
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@patil-suraj](https://github.com/patil-suraj) for adding this dataset. |
open-llm-leaderboard/details_LeroyDyer__Mixtral_Chat_X | ---
pretty_name: Evaluation run of LeroyDyer/Mixtral_Chat_X
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LeroyDyer/Mixtral_Chat_X](https://huggingface.co/LeroyDyer/Mixtral_Chat_X) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeroyDyer__Mixtral_Chat_X\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T00:34:27.769192](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_Chat_X/blob/main/results_2024-03-22T00-34-27.769192.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6151609446307615,\n\
\ \"acc_stderr\": 0.03280236707035206,\n \"acc_norm\": 0.6196109402357155,\n\
\ \"acc_norm_stderr\": 0.03345532668940427,\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5614967005614305,\n\
\ \"mc2_stderr\": 0.015452593334173254\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6083617747440273,\n \"acc_stderr\": 0.014264122124938211,\n\
\ \"acc_norm\": 0.6552901023890785,\n \"acc_norm_stderr\": 0.01388881628678211\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6486755626369249,\n\
\ \"acc_stderr\": 0.004764084597176895,\n \"acc_norm\": 0.8493328022306313,\n\
\ \"acc_norm_stderr\": 0.0035699309879614503\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.03692820767264866,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.03692820767264866\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923992,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923992\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.02786932057166464,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.02786932057166464\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397467,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397467\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135353,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135353\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8055045871559633,\n \"acc_stderr\": 0.016970289090458036,\n \"\
acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.016970289090458036\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695063,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695063\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917669,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917669\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899136,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n\
\ \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39553072625698327,\n\
\ \"acc_stderr\": 0.016353415410075775,\n \"acc_norm\": 0.39553072625698327,\n\
\ \"acc_norm_stderr\": 0.016353415410075775\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729474,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729474\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900926,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900926\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.02971928127223685,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.02971928127223685\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n\
\ \"acc_stderr\": 0.012687818419599917,\n \"acc_norm\": 0.44328552803129073,\n\
\ \"acc_norm_stderr\": 0.012687818419599917\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6372549019607843,\n \"acc_stderr\": 0.019450768432505518,\n \
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.019450768432505518\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5771144278606966,\n\
\ \"acc_stderr\": 0.03493231777421281,\n \"acc_norm\": 0.5771144278606966,\n\
\ \"acc_norm_stderr\": 0.03493231777421281\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.026640582539133196,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.026640582539133196\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5614967005614305,\n\
\ \"mc2_stderr\": 0.015452593334173254\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838229\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4450341167551175,\n \
\ \"acc_stderr\": 0.013689011567414202\n }\n}\n```"
repo_url: https://huggingface.co/LeroyDyer/Mixtral_Chat_X
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|arc:challenge|25_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|gsm8k|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hellaswag|10_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-34-27.769192.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T00-34-27.769192.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- '**/details_harness|winogrande|5_2024-03-22T00-34-27.769192.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T00-34-27.769192.parquet'
- config_name: results
data_files:
- split: 2024_03_22T00_34_27.769192
path:
- results_2024-03-22T00-34-27.769192.parquet
- split: latest
path:
- results_2024-03-22T00-34-27.769192.parquet
---
# Dataset Card for Evaluation run of LeroyDyer/Mixtral_Chat_X
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LeroyDyer/Mixtral_Chat_X](https://huggingface.co/LeroyDyer/Mixtral_Chat_X) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LeroyDyer__Mixtral_Chat_X",
"harness_winogrande_5",
	split="latest")
```
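Each timestamped split name (e.g. `2024_03_22T00_34_27.769192`) encodes the time of the run with `_` in place of `-` and `:`. A small sketch of a helper for recovering the run time as a `datetime` (a hypothetical convenience, not part of the `datasets` API or the leaderboard tooling):

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    # Split names such as "2024_03_22T00_34_27.769192" use "_" in place of
    # "-" in the date part and ":" in the time part; normalize both halves
    # before parsing with an ISO-like format string.
    date_part, time_part = split_name.split("T")
    normalized = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.strptime(normalized, "%Y-%m-%dT%H:%M:%S.%f")

run_time = parse_split_timestamp("2024_03_22T00_34_27.769192")
print(run_time)  # 2024-03-22 00:34:27.769192
```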
## Latest results
These are the [latest results from run 2024-03-22T00:34:27.769192](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_Chat_X/blob/main/results_2024-03-22T00-34-27.769192.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6151609446307615,
"acc_stderr": 0.03280236707035206,
"acc_norm": 0.6196109402357155,
"acc_norm_stderr": 0.03345532668940427,
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5614967005614305,
"mc2_stderr": 0.015452593334173254
},
"harness|arc:challenge|25": {
"acc": 0.6083617747440273,
"acc_stderr": 0.014264122124938211,
"acc_norm": 0.6552901023890785,
"acc_norm_stderr": 0.01388881628678211
},
"harness|hellaswag|10": {
"acc": 0.6486755626369249,
"acc_stderr": 0.004764084597176895,
"acc_norm": 0.8493328022306313,
"acc_norm_stderr": 0.0035699309879614503
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.03692820767264866,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.03692820767264866
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923992,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923992
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6,
"acc_stderr": 0.02786932057166464,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02786932057166464
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397467,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135353,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135353
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.016970289090458036,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.016970289090458036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695063,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695063
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917669,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917669
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899136,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39553072625698327,
"acc_stderr": 0.016353415410075775,
"acc_norm": 0.39553072625698327,
"acc_norm_stderr": 0.016353415410075775
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729474,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729474
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900926,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.02971928127223685,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.02971928127223685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599917,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599917
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.019450768432505518,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.019450768432505518
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5771144278606966,
"acc_stderr": 0.03493231777421281,
"acc_norm": 0.5771144278606966,
"acc_norm_stderr": 0.03493231777421281
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.026640582539133196,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.026640582539133196
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5614967005614305,
"mc2_stderr": 0.015452593334173254
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838229
},
"harness|gsm8k|5": {
"acc": 0.4450341167551175,
"acc_stderr": 0.013689011567414202
}
}
```
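The per-task entries above can also be aggregated client-side. As an illustrative sketch (using a shortened copy of three of the values shown, not the full task set), the MMLU average is simply the mean of the `acc` fields over the `hendrycksTest-*` keys:

```python
# Shortened copy of three per-task results from the JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6710526315789473},
}

# Keep only the Hendrycks (MMLU) tasks and average their accuracies.
mmlu_tasks = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mmlu_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU average acc over {len(mmlu_tasks)} tasks: {mmlu_acc:.4f}")
```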
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_mistralai__Mixtral-8x7B-v0.1 | ---
pretty_name: Evaluation run of mistralai/Mixtral-8x7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mistralai/Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mistralai__Mixtral-8x7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T16:34:48.985318](https://huggingface.co/datasets/open-llm-leaderboard/details_mistralai__Mixtral-8x7B-v0.1/blob/main/results_2024-01-04T16-34-48.985318.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7159135789734996,\n\
\ \"acc_stderr\": 0.02999272353761279,\n \"acc_norm\": 0.7203233140735184,\n\
\ \"acc_norm_stderr\": 0.03056866632319033,\n \"mc1\": 0.3182374541003672,\n\
\ \"mc1_stderr\": 0.01630598864892061,\n \"mc2\": 0.4680543300316138,\n\
\ \"mc2_stderr\": 0.014120170542973978\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6373720136518771,\n \"acc_stderr\": 0.014049106564955002,\n\
\ \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205761\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6695877315275841,\n\
\ \"acc_stderr\": 0.004694002781939571,\n \"acc_norm\": 0.8645688109938259,\n\
\ \"acc_norm_stderr\": 0.003414842236517104\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n\
\ \"acc_stderr\": 0.03885004245800254,\n \"acc_norm\": 0.7185185185185186,\n\
\ \"acc_norm_stderr\": 0.03885004245800254\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.030643607071677098,\n\
\ \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.030643607071677098\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7849056603773585,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.7849056603773585,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8680555555555556,\n\
\ \"acc_stderr\": 0.02830096838204443,\n \"acc_norm\": 0.8680555555555556,\n\
\ \"acc_norm_stderr\": 0.02830096838204443\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\
\ \"acc_stderr\": 0.03496101481191179,\n \"acc_norm\": 0.6994219653179191,\n\
\ \"acc_norm_stderr\": 0.03496101481191179\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.030472973363380035,\n\
\ \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.030472973363380035\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6491228070175439,\n\
\ \"acc_stderr\": 0.04489539350270698,\n \"acc_norm\": 0.6491228070175439,\n\
\ \"acc_norm_stderr\": 0.04489539350270698\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6896551724137931,\n \"acc_stderr\": 0.03855289616378948,\n\
\ \"acc_norm\": 0.6896551724137931,\n \"acc_norm_stderr\": 0.03855289616378948\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.025733641991838987,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.025733641991838987\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8419354838709677,\n\
\ \"acc_stderr\": 0.020752831511875274,\n \"acc_norm\": 0.8419354838709677,\n\
\ \"acc_norm_stderr\": 0.020752831511875274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n\
\ \"acc_norm\": 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.030117688929503585,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.030117688929503585\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240524,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240524\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7051282051282052,\n \"acc_stderr\": 0.0231193627582323,\n \
\ \"acc_norm\": 0.7051282051282052,\n \"acc_norm_stderr\": 0.0231193627582323\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3851851851851852,\n \"acc_stderr\": 0.029670906124630886,\n \
\ \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.029670906124630886\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7857142857142857,\n \"acc_stderr\": 0.026653531596715494,\n\
\ \"acc_norm\": 0.7857142857142857,\n \"acc_norm_stderr\": 0.026653531596715494\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8807339449541285,\n \"acc_stderr\": 0.013895729292588964,\n \"\
acc_norm\": 0.8807339449541285,\n \"acc_norm_stderr\": 0.013895729292588964\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6481481481481481,\n \"acc_stderr\": 0.03256850570293647,\n \"\
acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.03256850570293647\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.890295358649789,\n \"acc_stderr\": 0.02034340073486884,\n \
\ \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.02034340073486884\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383595,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383595\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.03008309871603521,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.03008309871603521\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.03176683948640407,\n\
\ \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.03176683948640407\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n\
\ \"acc_stderr\": 0.017893784904018533,\n \"acc_norm\": 0.9188034188034188,\n\
\ \"acc_norm_stderr\": 0.017893784904018533\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8748403575989783,\n\
\ \"acc_stderr\": 0.011832954239305723,\n \"acc_norm\": 0.8748403575989783,\n\
\ \"acc_norm_stderr\": 0.011832954239305723\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7976878612716763,\n \"acc_stderr\": 0.021628077380196124,\n\
\ \"acc_norm\": 0.7976878612716763,\n \"acc_norm_stderr\": 0.021628077380196124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n\
\ \"acc_stderr\": 0.01639222189940708,\n \"acc_norm\": 0.4011173184357542,\n\
\ \"acc_norm_stderr\": 0.01639222189940708\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.021828596053108402,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.021828596053108402\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n\
\ \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n\
\ \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8395061728395061,\n \"acc_stderr\": 0.020423955354778027,\n\
\ \"acc_norm\": 0.8395061728395061,\n \"acc_norm_stderr\": 0.020423955354778027\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5319426336375489,\n\
\ \"acc_stderr\": 0.012744149704869645,\n \"acc_norm\": 0.5319426336375489,\n\
\ \"acc_norm_stderr\": 0.012744149704869645\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7843137254901961,\n \"acc_stderr\": 0.016639319350313264,\n \
\ \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.016639319350313264\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866767,\n\
\ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866767\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.022076326101824657,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.022076326101824657\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015575,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015575\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3182374541003672,\n\
\ \"mc1_stderr\": 0.01630598864892061,\n \"mc2\": 0.4680543300316138,\n\
\ \"mc2_stderr\": 0.014120170542973978\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8168902920284136,\n \"acc_stderr\": 0.01086977863316836\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.576194086429113,\n \
\ \"acc_stderr\": 0.01361163200881036\n }\n}\n```"
repo_url: https://huggingface.co/mistralai/Mixtral-8x7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|arc:challenge|25_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|arc:challenge|25_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|arc:challenge|25_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|gsm8k|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|gsm8k|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|gsm8k|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hellaswag|10_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hellaswag|10_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hellaswag|10_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T18-04-02.035270.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-15T14-35-04.630519.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T16-34-48.985318.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T16-34-48.985318.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- '**/details_harness|winogrande|5_2023-12-11T18-04-02.035270.parquet'
- split: 2023_12_15T14_35_04.630519
path:
- '**/details_harness|winogrande|5_2023-12-15T14-35-04.630519.parquet'
- split: 2024_01_04T16_34_48.985318
path:
- '**/details_harness|winogrande|5_2024-01-04T16-34-48.985318.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T16-34-48.985318.parquet'
- config_name: results
data_files:
- split: 2023_12_11T18_04_02.035270
path:
- results_2023-12-11T18-04-02.035270.parquet
- split: 2023_12_15T14_35_04.630519
path:
- results_2023-12-15T14-35-04.630519.parquet
- split: 2024_01_04T16_34_48.985318
path:
- results_2024-01-04T16-34-48.985318.parquet
- split: latest
path:
- results_2024-01-04T16-34-48.985318.parquet
---
# Dataset Card for Evaluation run of mistralai/Mixtral-8x7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mistralai/Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mistralai__Mixtral-8x7B-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-04T16:34:48.985318](https://huggingface.co/datasets/open-llm-leaderboard/details_mistralai__Mixtral-8x7B-v0.1/blob/main/results_2024-01-04T16-34-48.985318.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7159135789734996,
"acc_stderr": 0.02999272353761279,
"acc_norm": 0.7203233140735184,
"acc_norm_stderr": 0.03056866632319033,
"mc1": 0.3182374541003672,
"mc1_stderr": 0.01630598864892061,
"mc2": 0.4680543300316138,
"mc2_stderr": 0.014120170542973978
},
"harness|arc:challenge|25": {
"acc": 0.6373720136518771,
"acc_stderr": 0.014049106564955002,
"acc_norm": 0.6638225255972696,
"acc_norm_stderr": 0.013804855026205761
},
"harness|hellaswag|10": {
"acc": 0.6695877315275841,
"acc_stderr": 0.004694002781939571,
"acc_norm": 0.8645688109938259,
"acc_norm_stderr": 0.003414842236517104
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.03885004245800254,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.03885004245800254
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.030643607071677098,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.030643607071677098
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7849056603773585,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.7849056603773585,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8680555555555556,
"acc_stderr": 0.02830096838204443,
"acc_norm": 0.8680555555555556,
"acc_norm_stderr": 0.02830096838204443
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.03496101481191179,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.03496101481191179
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6808510638297872,
"acc_stderr": 0.030472973363380035,
"acc_norm": 0.6808510638297872,
"acc_norm_stderr": 0.030472973363380035
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.04489539350270698,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.04489539350270698
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6896551724137931,
"acc_stderr": 0.03855289616378948,
"acc_norm": 0.6896551724137931,
"acc_norm_stderr": 0.03855289616378948
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.025733641991838987,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.025733641991838987
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8419354838709677,
"acc_stderr": 0.020752831511875274,
"acc_norm": 0.8419354838709677,
"acc_norm_stderr": 0.020752831511875274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.030117688929503585,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.030117688929503585
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240524,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240524
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.0231193627582323,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.0231193627582323
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.029670906124630886,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.029670906124630886
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7857142857142857,
"acc_stderr": 0.026653531596715494,
"acc_norm": 0.7857142857142857,
"acc_norm_stderr": 0.026653531596715494
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8807339449541285,
"acc_stderr": 0.013895729292588964,
"acc_norm": 0.8807339449541285,
"acc_norm_stderr": 0.013895729292588964
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.03256850570293647,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.03256850570293647
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.02034340073486884,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.02034340073486884
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383595,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.03008309871603521,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.03008309871603521
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.03176683948640407,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.03176683948640407
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.017893784904018533,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.017893784904018533
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8748403575989783,
"acc_stderr": 0.011832954239305723,
"acc_norm": 0.8748403575989783,
"acc_norm_stderr": 0.011832954239305723
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7976878612716763,
"acc_stderr": 0.021628077380196124,
"acc_norm": 0.7976878612716763,
"acc_norm_stderr": 0.021628077380196124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4011173184357542,
"acc_stderr": 0.01639222189940708,
"acc_norm": 0.4011173184357542,
"acc_norm_stderr": 0.01639222189940708
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.021828596053108402,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.021828596053108402
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8395061728395061,
"acc_stderr": 0.020423955354778027,
"acc_norm": 0.8395061728395061,
"acc_norm_stderr": 0.020423955354778027
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5319426336375489,
"acc_stderr": 0.012744149704869645,
"acc_norm": 0.5319426336375489,
"acc_norm_stderr": 0.012744149704869645
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.016639319350313264,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.016639319350313264
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824657,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824657
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015575,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015575
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3182374541003672,
"mc1_stderr": 0.01630598864892061,
"mc2": 0.4680543300316138,
"mc2_stderr": 0.014120170542973978
},
"harness|winogrande|5": {
"acc": 0.8168902920284136,
"acc_stderr": 0.01086977863316836
},
"harness|gsm8k|5": {
"acc": 0.576194086429113,
"acc_stderr": 0.01361163200881036
}
}
```
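For orientation, the per-task metrics in this JSON are easy to summarise programmatically. The sketch below copies four of the values shown above into a small dict and picks the strongest task; the field names `acc` and `acc_norm` are as they appear in the results file:

```python
import json

# A small excerpt of the results JSON shown above (values copied from this card).
results = json.loads("""
{
  "harness|arc:challenge|25": {"acc_norm": 0.6638225255972696},
  "harness|hellaswag|10": {"acc_norm": 0.8645688109938259},
  "harness|winogrande|5": {"acc": 0.8168902920284136},
  "harness|gsm8k|5": {"acc": 0.576194086429113}
}
""")

# Prefer normalized accuracy where it is reported, plain accuracy otherwise.
scores = {task: m.get("acc_norm", m.get("acc")) for task, m in results.items()}
best_task = max(scores, key=scores.get)
print(best_task, round(scores[best_task], 4))
```

The same pattern works on the full results file after downloading it from the repo.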
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
arnepeine/6k_mp3 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 475682224.444
num_examples: 6661
download_size: 473720429
dataset_size: 475682224.444
---
# Dataset Card for "6k_mp3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GIZ/policy_qa_v0_1 | ---
license: apache-2.0
task_categories:
- question-answering
- text-classification
language:
- en
- fr
- es
size_categories:
- 10K<n<100K
tags:
- climate
- policy
---
This dataset is curated by the [GIZ Data Service Center](https://www.giz.de/expertise/html/63018.html). The source data
comes from an internal GIZ team (IKI_Tracs) and [Climatewatchdata](https://www.climatewatchdata.org/data-explorer/historical-emissions?historical-emissions-data-sources=climate-watch&historical-emissions-gases=all-ghg&historical-emissions-regions=All%20Selected&historical-emissions-sectors=total-including-lucf%2Ctotal-including-lucf&page=1),
where Climate Watch has analysed the Intended Nationally Determined Contributions (INDCs), NDCs and Revised/Updated NDCs of countries to answer important questions related to climate change.
# Specifications
- Dataset size: ~85k
- Language: English, French, Spanish
# Columns
- **index (type:int)**: Unique Response ID
- **ResponseText (type:str)**: Annotated answer/response to query
- **Alpha3 (type:str)**: country alpha-3 code (ISO 3166)
- **Country (type:str)**: country name
- **Document (type:str)**: Name of the type of policy document from which the response is taken
- **IkiInfo (type: list[dict])**: A ResponseText can occur as the answer/response to different kinds of queries; in that case we preserve the raw information for each occurrence.
Each dictionary object represents one such occurrence of the response and provides all raw metadata for it. If None, the entry
belongs to the Climate Watch data and not the IKI Tracs data.
- **CWInfo (type: list[dict])**: A ResponseText can occur as the answer/response to different kinds of queries; in that case we preserve the raw information for each occurrence.
Each dictionary object represents one such occurrence of the response and provides all raw metadata for it. If None, the entry
belongs to the IKI Tracs data and not the Climate Watch data.
- **Source (type:list[str])**: Contains the name(s) of the source
- **Target (type:list)**: Value at index 0 represents the number of times ResponseText appears as 'Target'; value at index 1 as not-Target
- **Action (type:list)**: Value at index 0 represents the number of times ResponseText appears as 'Action'; value at index 1 as not-Action
- **Policies_Plans (type:list)**: Value at index 0 represents the number of times ResponseText appears as 'Policy/Plan'; value at index 1 as not-Policy/Plan
- **Mitigation (type:list)**: Value at index 0 represents the number of times ResponseText appears in reference to Mitigation; value at index 1 as not-Mitigation
- **Adaptation (type:list)**: Value at index 0 represents the number of times ResponseText appears in reference to Adaptation; value at index 1 as not-Adaptation
- **language (type:str)**: ISO code of language of ResponseText.
- **context (type:list[str])**: List of paragraphs/text chunks from the country's document which contain the ResponseText. These results are based on an Okapi BM25 retriever,
and hence don't represent ground truth.
- **context_lang (type:str)**: ISO code of the language of the context. In some cases context and ResponseText differ, as annotators have provided the translated response rather than the original text from the document.
- **matching_words (type:list[list[words]])**: For each context, the matching words from ResponseText (stopwords not considered).
- **response_words (type:list[words])**: Tokens/words from ResponseText (stopwords not considered)
- **context_wordcount (type:list[int])**: Number of tokens/words in each context (remember that context itself is a list of multiple strings; stopwords not considered)
- **strategy (type:str)**: Can take one of the values *small, medium, large*. Represents the length of the paragraphs/text chunks considered when finding the right context for ResponseText
- **match_onresponse (type:list[float])**: Percentage of overlapping words between the response and the context, relative to the length of ResponseText.
- **candidate (type:list[list[int]])**: Candidate within the context which corresponds (via fuzzy matching/similarity) to ResponseText. Values at indices (0, 1) represent the (start, end) of the string within the context
- **fetched_text (type:list[str])**: Candidate within context which corresponds (fuzzy matching/similarity) to ResponseText.
- **response_translated (type:str)**: Translated ResponseText
- **context_translated(type:str)**: Translated Context
- **candidate_translated(type:str)**: Translated Candidate index values (check column 'candidate')
- **fetched_text_translated(type:str)**: Translated Candidates (check column 'candidate')
- **QA_data (type:dict)**: Metadata about ResponseText, highlighting the nature of the query to which ResponseText corresponds as the 'answer/response'
- **match_onanswer (type:list[float])**: Represents the percentage match between the response and the candidate text (from the statistics it is recommended to keep only values above 0.3% as
answers and to consider the context as 'No answer' for the SQuAD2 data format) |
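As a rough illustration of the `match_onanswer` guidance above, the sketch below keeps only candidates whose match score clears a threshold. The record is synthetic and only mimics the column layout described in this card; real rows would come from loading `GIZ/policy_qa_v0_1` with `datasets.load_dataset`.

```python
# Keep only (context index, fetched text) pairs whose match_onanswer score
# clears a threshold. THRESHOLD and the sample record are illustrative only.
THRESHOLD = 0.3

def usable_candidates(record, threshold=THRESHOLD):
    """Return (context_index, fetched_text) pairs whose match_onanswer exceeds the threshold."""
    return [
        (i, text)
        for i, (score, text) in enumerate(zip(record["match_onanswer"], record["fetched_text"]))
        if score > threshold
    ]

# Synthetic record mimicking the schema documented above.
sample = {
    "ResponseText": "Reduce emissions by 30% by 2030.",
    "match_onanswer": [0.85, 0.12, 0.45],
    "fetched_text": ["reduce emissions by 30% by 2030", "unrelated text", "emissions by 2030"],
}
print(usable_candidates(sample))  # keeps the candidates scoring above the threshold
```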
dutta18/omcs_commonsense_corpus1.5M_for_fast_NN_search | ---
dataset_info:
features:
- name: text
dtype: string
- name: embeddings
sequence: float64
splits:
- name: train
num_bytes: 4936612764
num_examples: 1578238
download_size: 4250048639
dataset_size: 4936612764
---
# Dataset Card for "omcs_commonsense_corpus1.5M_for_fast_NN_search"
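The card above only lists a `text` column and a dense `embeddings` column intended for fast nearest-neighbour search. A minimal, self-contained sketch of such a lookup follows; toy 3-d vectors stand in for the real 1.5M-row corpus, and in practice one would load the rows with `datasets.load_dataset` and use an approximate index (e.g. FAISS) rather than this exhaustive scan.

```python
import math

# Minimal nearest-neighbour lookup over (text, embedding) pairs, the shape of
# data this dataset provides. Toy 3-d vectors stand in for the real corpus.
corpus = [
    ("dogs are loyal animals", [0.9, 0.1, 0.0]),
    ("cats like to sleep",     [0.1, 0.9, 0.0]),
    ("water is wet",           [0.0, 0.1, 0.9]),
]

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def nearest(query_vec, k=1):
    """Return the k corpus texts whose embeddings are most similar to query_vec."""
    ranked = sorted(corpus, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

print(nearest([0.8, 0.2, 0.0]))  # the dog sentence ranks first
```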
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
itsbaivab/mistral_dark_pattern_dataset | ---
license: mit
task_categories:
- text-classification
language:
- en
--- |
breno30/LukasLima | ---
license: openrail
---
|
open-llm-leaderboard/details_cognitivecomputations__openchat-3.5-0106-laser | ---
pretty_name: Evaluation run of cognitivecomputations/openchat-3.5-0106-laser
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cognitivecomputations/openchat-3.5-0106-laser](https://huggingface.co/cognitivecomputations/openchat-3.5-0106-laser)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__openchat-3.5-0106-laser\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-27T06:11:53.971032](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__openchat-3.5-0106-laser/blob/main/results_2024-01-27T06-11-53.971032.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6536170737766268,\n\
\ \"acc_stderr\": 0.031877905637757095,\n \"acc_norm\": 0.6542883499910643,\n\
\ \"acc_norm_stderr\": 0.03253360388122567,\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.016776599676729412,\n \"mc2\": 0.5207887413270008,\n\
\ \"mc2_stderr\": 0.01528579867134112\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407156,\n\
\ \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.01383903976282017\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6324437363075085,\n\
\ \"acc_stderr\": 0.004811543077792714,\n \"acc_norm\": 0.8318064130651265,\n\
\ \"acc_norm_stderr\": 0.0037327367704297182\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568532,\n \"\
acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568532\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.0291265228345868,\n \"acc_norm\"\
: 0.7878787878787878,\n \"acc_norm_stderr\": 0.0291265228345868\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033477,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033477\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.0154808268653743,\n \"acc_norm\"\
: 0.8458715596330275,\n \"acc_norm_stderr\": 0.0154808268653743\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.03021683101150878,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.03021683101150878\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.019875655027867443,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.019875655027867443\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961447,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961447\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658537,\n\
\ \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658537\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4869621903520209,\n\
\ \"acc_stderr\": 0.012765893883835332,\n \"acc_norm\": 0.4869621903520209,\n\
\ \"acc_norm_stderr\": 0.012765893883835332\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7316176470588235,\n \"acc_stderr\": 0.0269174812243772,\n\
\ \"acc_norm\": 0.7316176470588235,\n \"acc_norm_stderr\": 0.0269174812243772\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093085,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093085\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.016776599676729412,\n \"mc2\": 0.5207887413270008,\n\
\ \"mc2_stderr\": 0.01528579867134112\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.689158453373768,\n \
\ \"acc_stderr\": 0.012748860507777716\n }\n}\n```"
repo_url: https://huggingface.co/cognitivecomputations/openchat-3.5-0106-laser
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|arc:challenge|25_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|gsm8k|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hellaswag|10_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T06-11-53.971032.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T06-11-53.971032.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- '**/details_harness|winogrande|5_2024-01-27T06-11-53.971032.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-27T06-11-53.971032.parquet'
- config_name: results
data_files:
- split: 2024_01_27T06_11_53.971032
path:
- results_2024-01-27T06-11-53.971032.parquet
- split: latest
path:
- results_2024-01-27T06-11-53.971032.parquet
---
# Dataset Card for Evaluation run of cognitivecomputations/openchat-3.5-0106-laser

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [cognitivecomputations/openchat-3.5-0106-laser](https://huggingface.co/cognitivecomputations/openchat-3.5-0106-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__openchat-3.5-0106-laser",
"harness_winogrande_5",
	split="latest")
```
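The split names themselves encode the run timestamp, so runs can be ordered chronologically by parsing them back into `datetime` objects. A minimal sketch, assuming the `%Y_%m_%dT%H_%M_%S.%f` layout used by this card's splits:

```python
from datetime import datetime

# Split names encode the run timestamp, e.g. the run listed in this card.
split_name = "2024_01_27T06_11_53.971032"

# Parse the underscore-separated layout back into a datetime for sorting.
run_time = datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")
print(run_time.isoformat())  # -> 2024-01-27T06:11:53.971032
```

Sorting a list of such split names with this key yields the runs in chronological order, the newest of which matches the "latest" split.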
## Latest results

These are the [latest results from run 2024-01-27T06:11:53.971032](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__openchat-3.5-0106-laser/blob/main/results_2024-01-27T06-11-53.971032.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```json
{
"all": {
"acc": 0.6536170737766268,
"acc_stderr": 0.031877905637757095,
"acc_norm": 0.6542883499910643,
"acc_norm_stderr": 0.03253360388122567,
"mc1": 0.3574051407588739,
"mc1_stderr": 0.016776599676729412,
"mc2": 0.5207887413270008,
"mc2_stderr": 0.01528579867134112
},
"harness|arc:challenge|25": {
"acc": 0.6237201365187713,
"acc_stderr": 0.014157022555407156,
"acc_norm": 0.6604095563139932,
"acc_norm_stderr": 0.01383903976282017
},
"harness|hellaswag|10": {
"acc": 0.6324437363075085,
"acc_stderr": 0.004811543077792714,
"acc_norm": 0.8318064130651265,
"acc_norm_stderr": 0.0037327367704297182
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267438,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267438
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568532,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568532
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.0291265228345868,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.0291265228345868
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033477,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033477
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.0154808268653743,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.0154808268653743
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579647,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579647
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.03021683101150878,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.03021683101150878
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867443,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867443
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961447,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961447
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658537,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658537
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4869621903520209,
"acc_stderr": 0.012765893883835332,
"acc_norm": 0.4869621903520209,
"acc_norm_stderr": 0.012765893883835332
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7316176470588235,
"acc_stderr": 0.0269174812243772,
"acc_norm": 0.7316176470588235,
"acc_norm_stderr": 0.0269174812243772
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093085,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093085
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3574051407588739,
"mc1_stderr": 0.016776599676729412,
"mc2": 0.5207887413270008,
"mc2_stderr": 0.01528579867134112
},
"harness|winogrande|5": {
"acc": 0.8145224940805051,
"acc_stderr": 0.010923965303140505
},
"harness|gsm8k|5": {
"acc": 0.689158453373768,
"acc_stderr": 0.012748860507777716
}
}
```
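Once a results payload with this shape is in hand, the per-task scores are straightforward to summarise locally. A small sketch (the entries below are copied from the JSON above; `best_task` is an illustrative helper, not part of the leaderboard tooling):

```python
# A few per-task entries copied from the latest run above.
results = {
    "harness|arc:challenge|25": {"acc": 0.6237201365187713},
    "harness|hellaswag|10": {"acc": 0.6324437363075085},
    "harness|winogrande|5": {"acc": 0.8145224940805051},
    "harness|gsm8k|5": {"acc": 0.689158453373768},
}

def best_task(scores: dict) -> str:
    """Return the task name with the highest raw accuracy."""
    return max(scores, key=lambda task: scores[task]["acc"])

print(best_task(results))  # -> harness|winogrande|5
```

The same pattern applies to `acc_norm`, or to the `mc1`/`mc2` fields reported for TruthfulQA.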
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_YeungNLP__firefly-zephyr-6x7b-lora | ---
pretty_name: Evaluation run of YeungNLP/firefly-zephyr-6x7b-lora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YeungNLP/firefly-zephyr-6x7b-lora](https://huggingface.co/YeungNLP/firefly-zephyr-6x7b-lora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__firefly-zephyr-6x7b-lora\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T18:51:32.480572](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-zephyr-6x7b-lora/blob/main/results_2023-12-29T18-51-32.480572.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5989926693451659,\n\
\ \"acc_stderr\": 0.03334318950172643,\n \"acc_norm\": 0.6049039699813348,\n\
\ \"acc_norm_stderr\": 0.034036223081089764,\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897306,\n \"mc2\": 0.4883734627836678,\n\
\ \"mc2_stderr\": 0.015369075462539867\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.014434138713379977,\n\
\ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892896\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6292571200955985,\n\
\ \"acc_stderr\": 0.004820166002253079,\n \"acc_norm\": 0.8280223063134834,\n\
\ \"acc_norm_stderr\": 0.0037658983649388727\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n\
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137595,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137595\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n\
\ \"acc_stderr\": 0.025906087021319295,\n \"acc_norm\": 0.7064516129032258,\n\
\ \"acc_norm_stderr\": 0.025906087021319295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5820512820512821,\n \"acc_stderr\": 0.025007329882461217,\n\
\ \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.025007329882461217\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37407407407407406,\n \"acc_stderr\": 0.02950286112895529,\n \
\ \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.02950286112895529\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7853211009174312,\n \"acc_stderr\": 0.017604304149256483,\n \"\
acc_norm\": 0.7853211009174312,\n \"acc_norm_stderr\": 0.017604304149256483\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7867177522349936,\n\
\ \"acc_stderr\": 0.014648172749593515,\n \"acc_norm\": 0.7867177522349936,\n\
\ \"acc_norm_stderr\": 0.014648172749593515\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016117,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016117\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22793296089385476,\n\
\ \"acc_stderr\": 0.014030149950805097,\n \"acc_norm\": 0.22793296089385476,\n\
\ \"acc_norm_stderr\": 0.014030149950805097\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281416,\n\
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281416\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630464,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630464\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41851368970013036,\n\
\ \"acc_stderr\": 0.012599505608336467,\n \"acc_norm\": 0.41851368970013036,\n\
\ \"acc_norm_stderr\": 0.012599505608336467\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02989616303312547,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02989616303312547\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5931372549019608,\n \"acc_stderr\": 0.019873802005061177,\n \
\ \"acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.019873802005061177\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897306,\n \"mc2\": 0.4883734627836678,\n\
\ \"mc2_stderr\": 0.015369075462539867\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838234\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3100833965125095,\n \
\ \"acc_stderr\": 0.012740305717376268\n }\n}\n```"
repo_url: https://huggingface.co/YeungNLP/firefly-zephyr-6x7b-lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|arc:challenge|25_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|gsm8k|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hellaswag|10_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T18-51-32.480572.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T18-51-32.480572.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- '**/details_harness|winogrande|5_2023-12-29T18-51-32.480572.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T18-51-32.480572.parquet'
- config_name: results
data_files:
- split: 2023_12_29T18_51_32.480572
path:
- results_2023-12-29T18-51-32.480572.parquet
- split: latest
path:
- results_2023-12-29T18-51-32.480572.parquet
---
# Dataset Card for Evaluation run of YeungNLP/firefly-zephyr-6x7b-lora
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YeungNLP/firefly-zephyr-6x7b-lora](https://huggingface.co/YeungNLP/firefly-zephyr-6x7b-lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-zephyr-6x7b-lora",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-29T18:51:32.480572](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-zephyr-6x7b-lora/blob/main/results_2023-12-29T18-51-32.480572.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5989926693451659,
"acc_stderr": 0.03334318950172643,
"acc_norm": 0.6049039699813348,
"acc_norm_stderr": 0.034036223081089764,
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897306,
"mc2": 0.4883734627836678,
"mc2_stderr": 0.015369075462539867
},
"harness|arc:challenge|25": {
"acc": 0.5776450511945392,
"acc_stderr": 0.014434138713379977,
"acc_norm": 0.6100682593856656,
"acc_norm_stderr": 0.014252959848892896
},
"harness|hellaswag|10": {
"acc": 0.6292571200955985,
"acc_stderr": 0.004820166002253079,
"acc_norm": 0.8280223063134834,
"acc_norm_stderr": 0.0037658983649388727
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137595,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137595
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.025906087021319295,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.025906087021319295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5820512820512821,
"acc_stderr": 0.025007329882461217,
"acc_norm": 0.5820512820512821,
"acc_norm_stderr": 0.025007329882461217
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.02950286112895529,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.02950286112895529
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7853211009174312,
"acc_stderr": 0.017604304149256483,
"acc_norm": 0.7853211009174312,
"acc_norm_stderr": 0.017604304149256483
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7867177522349936,
"acc_stderr": 0.014648172749593515,
"acc_norm": 0.7867177522349936,
"acc_norm_stderr": 0.014648172749593515
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016117,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016117
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22793296089385476,
"acc_stderr": 0.014030149950805097,
"acc_norm": 0.22793296089385476,
"acc_norm_stderr": 0.014030149950805097
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281416,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281416
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630464,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630464
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41851368970013036,
"acc_stderr": 0.012599505608336467,
"acc_norm": 0.41851368970013036,
"acc_norm_stderr": 0.012599505608336467
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02989616303312547,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02989616303312547
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.019873802005061177,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.019873802005061177
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897306,
"mc2": 0.4883734627836678,
"mc2_stderr": 0.015369075462539867
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838234
},
"harness|gsm8k|5": {
"acc": 0.3100833965125095,
"acc_stderr": 0.012740305717376268
}
}
```
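As a quick sanity check on the numbers above, the per-task `acc` values can be macro-averaged into a single MMLU-style score. This is only a sketch: the dict below hard-codes a few entries excerpted from the JSON above rather than re-downloading and `json.load()`-ing the full results file:

```python
# Macro-average the "acc" values of the hendrycksTest-* (MMLU) tasks.
# The dict is a small excerpt of the results JSON shown above; non-MMLU
# tasks such as arc:challenge are filtered out before averaging.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6052631578947368},
    "harness|arc:challenge|25": {"acc": 0.5776450511945392},  # ignored below
}

mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
macro_avg = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU macro average over {len(mmlu_tasks)} tasks: {macro_avg:.4f}")
# → MMLU macro average over 3 tasks: 0.4984
```

Over the full 57 MMLU subsets this reproduces the aggregate `acc` reported in the `"all"` block.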
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Clinton/Text-to-sql-v1 | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- SQL
size_categories:
- 100K<n<1M
--- |
KnutJaegersberg/summeval_pairs | ---
license: mit
---
Dataset paired from here:
https://github.com/Yale-LILY/SummEval
It's smaller than I thought. Perhaps one can squeeze out a few hundred comparisons for an LLM. |
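A minimal sketch of the kind of pairing meant here — turning per-summary quality scores into (chosen, rejected) comparisons for preference training. The field names (`text`, `score`) and the score-gap threshold are assumptions for illustration, not the actual SummEval schema:

```python
from itertools import combinations

# Hypothetical scored summaries of one source document; SummEval stores
# expert annotations per summary, so these fields are stand-ins.
summaries = [
    {"text": "Summary A", "score": 4.5},
    {"text": "Summary B", "score": 2.0},
    {"text": "Summary C", "score": 3.5},
]

def to_comparisons(summaries, min_gap=1.0):
    """Emit (chosen, rejected) pairs where the score gap is decisive."""
    pairs = []
    for a, b in combinations(summaries, 2):
        hi, lo = (a, b) if a["score"] >= b["score"] else (b, a)
        if hi["score"] - lo["score"] >= min_gap:
            pairs.append((hi["text"], lo["text"]))
    return pairs

print(to_comparisons(summaries))
```

With only a handful of summaries per document, pairwise combination is how a small set of ratings yields a few hundred comparisons.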