| datasetId (string, 2-117 chars) | card (string, 19-1.01M chars) |
|---|---|
joey234/sst2_100_1 | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: sentence
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 12846
num_examples: 100
download_size: 11262
dataset_size: 12846
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sst2_100_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Reindrob/dsc | ---
license: unknown
---
|
autoevaluate/autoeval-staging-eval-project-29af5371-7254762 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- conll2003
eval_info:
task: entity_extraction
model: elastic/distilbert-base-uncased-finetuned-conll03-english
dataset_name: conll2003
dataset_config: conll2003
dataset_split: validation
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: elastic/distilbert-base-uncased-finetuned-conll03-english
* Dataset: conll2003
To run new evaluation jobs, visit Hugging Face's [automatic evaluation service](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@douwekiela](https://huggingface.co/douwekiela) for evaluating this model. |
sheriftawfikabbas/3oloum_corpus | ---
language:
- en
pretty_name: "3oloum corpus for scientific abstract"
tags:
- science abstracts
- English
license: "gpl-3.0"
---
# The 3oloum corpus of scientific titles and abstracts
This is a corpus of titles and abstracts of 147,673 scientific articles that were scraped from the following scientific journals:
- Nature: from year 1870 to 2021
- Science: from year 1960 to 2020
- Science Advances: from year 2015 to 2020
The corpus has been created for the purpose of contributing to natural language processing (NLP) projects.
There are currently *21,309,015* words in the corpus, including the text of titles and abstracts, and excluding non-ASCII strings.
Every non-ASCII character has been replaced by the string `<non_ascii>` to facilitate text processing.
It is being continuously updated.
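As a rough sketch of that replacement rule (the token matches the one used in the corpus; the input string below is made up for illustration):
```python
import re

# Matches any character outside the ASCII range.
NON_ASCII = re.compile(r"[^\x00-\x7F]")

def mask_non_ascii(text: str) -> str:
    """Replace every non-ASCII character with the <non_ascii> token,
    mirroring the preprocessing applied to this corpus."""
    return NON_ASCII.sub("<non_ascii>", text)

print(mask_non_ascii("heated to 53°C"))  # -> "heated to 53<non_ascii>C"
```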
## Sample data
|Title|Abstract|
| :--- | :--- |
| Velocity of Light and Measurement of Interplanetary Distances | The combined availability of atomic clocks and of instrumented planetoids traveling in their own solar orbits will offer the possibility of determining their distance from us, and hence interplanetary distances, in terms of the wavelength of the radiation of atomic frequency standards. It can be anticipated that the accuracy of these measurements will be very high and will not depend upon our less accurate knowledge of the velocity of light in terms of the standard meter, the sidereal second, and so on. |
| High-Resolution Density Gradient Sedimentation Analysis | The principle of stability for a sample layered in a density-gradient liquid column is discussed, and a method for separating ribonucleoprotein particles by means of sedimentation in the ultracentrifuge is described. |
| Daily Light Sensitivity Rhythm in a Rodent | Single 10-minute light periods can cause a phase shift in the rhythm of the daily locomotor activity of flying squirrels otherwise maintained in constant darkness. A daily rhythm of sensitivity to these standard light periods was found. |
| Heat-Labile Serum Systems in Fresh-Water Fish | Serum specimens from 18 specimens of 12 different species of freshwater fish were examined for their ability to kill <i>Toxoplasma</i> nonspecifically. This ability was present in all sera except those of two of three great northern pike. The effect was destroyed by exposure to 53<non_ascii><non_ascii>C, 56<non_ascii><non_ascii>C, or zymosan. Complement was demonstrated in all sera except that from one great northern pike, when rabbit erythrocytes were used in the indicator system. |
| Chemically Induced Phenocopy of a Tomato Mutant | Lanceolate, a spontaneous leaf-shape mutant which fails to produce cotyledons and plumule in the homozygous condition, shows development if supplied with either adenine or a diffusate obtained from normal seeds. Similar development occurs in a different genetic background. |
| Mitotic Arrest by Deuterium Oxide | In marine invertebrate eggs, where cell divisions occur without growth, deuterium oxide produces arrest of, or serious delay in, mitosis and cytokinesis. All stages requiring assembly or operation of mechanical structures in the cytoplasm are sensitive to D.O. The block is reversible in some cells. |
| Nonlogarithmic Linear Titration Curves | Titration curves can be based on linear nonlogarithmic forms of the equilibrium equation of a dissociation reaction. From such curves, in contrast to those based on logarithmic transformations, both the end point of the titration and the dissociation constant can be derived. |
| On the Function of Corticothalamic Neurons | The effect of the synchronous discharge of a large population of corticothalamic neurons on activity within the somatosensory relay nuclei has been studied. Thalamic responses to peripheral nerve stimulation are depressed by activity in corticothalamic neurons. A subconvulsive dose of strychnine, given intravenously, changes this depression to enhancement. |
| Occurrence of Scandium-46 and Cesium-134 in Radioactive Fallout | Two hitherto unreported induced radionuclides, scandium-46 and cesium-134, have been detected in fallout material. Identification was made by chemical separation and gamma scintillation spectrometry. While the origin of these materials is not known, possible routes of formation from stable elements are suggested. |
| Degree of Obesity and Serum Cholesterol Level | No significant correlation was found between the serum cholesterol level and weight, weight corrected for frame size, or thickness of the fat shadow in medical students (mean age, 22 years). |
| Neural and Hypophyseal Colloid Deposition in the Collared Lemming | Feral and captive lemmings from Churchill, Manitoba, are subject to a unique pathological process in which colloidal material is deposited in bloodvessel walls at scattered points through the central nervous system. Destruction of nervous tissue at these foci is progressive, and colloidal masses in the vascular lumina of the hypothalamus appear to become fixed in the capillaries of the hypophyseal anterior lobe. Inflammatory reactions are never associated with the lesions, and the latter are larger and more numerous in older animals in warmer environments. |
| On Pleistocene Surface Temperatures of the North Atlantic and Arctic Oceans | Two additional interpretations are given for the important data of D. B. Ericson on the correlation of coiling directions of <i>Globigerina pachyderma</i> in late Pleistocene North Atlantic sediments with ocean surface temperatures. One interpretation relates the distribution of this species to the distribution and circulation of ocean water masses. On the basis of our ice-age theory, our second interpretation uses the data and correlations of Ericson to establish temperature limits of a thermal node, a line on which glacial and interglacial temperatures were equal, for the North Atlantic Ocean. This line crosses the strait between Greenland and Scandinavia. Further, Ericson's interpretation of the 7.2<non_ascii><non_ascii>C isotherm implies that the glacial-stage surface waters of the Arctic Ocean were between 0<non_ascii><non_ascii> and 3.5<non_ascii><non_ascii>C. |
| Genetic and Environmental Control of Flowering in Trifolium repens in the Tropics | <i>Trifolium repens</i> at low elevations expressed wide genetic variation in tendency to flower. Clones classified as flowering or nonflowering were subjected to temperatures associated with high elevations. Flowering in "nonflowering" clones was induced under warm-day-cool-night treatments. It is proposed that in the tropics, low temperatures associated with high elevations are an important factor in determining flowering, and therefore ability to persist, in plants which are long-day and temperature sensitive. |
| Mammalian Liver <non_ascii><non_ascii>-Glucuronidase for Hydrolysis of Steroidal Conjugates | Although the rate of hydrolysis by mammalian <non_ascii><non_ascii>-glucuronidase appears to be inhibited by methylene chloride or carbon tetrachloride with the standard technique (phenolphthalein glucuronide as a substrate), the release of steroidal conjugates under conditions generally employed does not appear to be affected. |
| Glucuronidase Activation: Enzyme Action at an Interface | The potentiating action of chloroform on bacterial <non_ascii><non_ascii>-glucuronidase has been shown to increase as the interface area between the two liquid phases increases. Prior extraction of the enzyme with chloroform causes a loss rather than an increase in activity. It is tentatively suggested that the correlation between activity and interface area may reflect a phenomenon of enzyme action at a liquid/liquid interface. |
| Characterization of Endogenous Ethanol in the Mammal | Ethanol has been isolated from the tissues of several animal species in amounts ranging from 23 to 145 <non_ascii><non_ascii>mole/100 gm of tissue. Intestinal bacterial flora appear to be excluded as a source of this ethanol. Radioactivity from pyruvate-2-C<sup>14</sup> appeared in ethanol after incubation with liver slices; this finding indicates an endogenous synthesis. |
| Reciprocal Inhibition as Indicated by a Differential Staining Reaction | Neurohistological and neurophysiological studies have shown that the bilaterally represented Mauthner's cells in teleosts are related both structurally and functionally. The VIIIth nerve afferents, as well as the axoaxonal collaterals, display a distribution pattern which supports the concept of polar function of the neuron. Inasmuch as it is possible to alter the staining reaction of both the Mauthner's cells by unilateral stimulation of the entering VIIIth nerve roots, it is proposed that the synaptic endings serve principally as activators and that neuronal excitation or inhibition is determined by the chemical state of the dendrites, the cell body, and the axon hillock region. |
| Orientation of Migratory Restlessness in the White-Crowned Sparrow | Individuals of two migratory races of white-crowned sparrows (<i>Zonotrichia leucophrys</i>) caged under an open sky showed a pronounced orientation in their night restlessness during normal periods of migration for the species. In August and September 1958 most birds showed a southerly orientation at night; daytime activity was random to somewhat northerly. In April and May 1959 most birds showed a strong northerly orientation at night; daytime activity was random to somewhat southerly (<i>1</i>). |
| State of Dynamic Equilibrium in Protein of Mammalian Cells | Labeled strain L cells in suspension tissue culture showed no degradation of protein when maintained in logarithmic growth. Although the protein of these cells was not in dynamic equilibrium, the conclusions cannot be transferred to the intact mammalian organism. |
| Mosses as Possible Sources of Antibiotics | An examination of 12 species of mosses has indicated that three produce substances capable of inhibiting the growth of various bacteria and other fungi. The method of extraction included several solvents. The extracts were not consistent in their antagonistic activity against the various species of microorganisms, nor were those that displayed antibiotic action always effective against the same organisms. Results indicate unstable products as well as physiological variation in the mosses. |
|
infinityofspace/python_codestyles-random-1k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: code
dtype: string
- name: code_codestyle
dtype: int64
- name: style_context
dtype: string
- name: style_context_codestyle
dtype: int64
- name: label
dtype: int64
splits:
- name: train
num_bytes: 3604934957
num_examples: 308000
- name: test
num_bytes: 645620388
num_examples: 56400
download_size: 671035436
dataset_size: 4250555345
license: mit
tags:
- python
- code-style
- random
size_categories:
- 100K<n<1M
---
# Dataset Card for "python_codestyles-random-1k"
This dataset contains positive and negative examples of Python code complying with a code style. A positive
example represents compliance with the code style (its label is 1). Each example is composed of two components: the first
is a piece of code that either conforms to the code style or violates it, and the second is an example piece of code
that already conforms to that code style. In total, the dataset contains `1,000` completely
different code styles. The code styles differ in at least one code-style rule, which is why this is called the `random`
variant of the code-style datasets. The dataset consists of a training and a test group, with none of the code styles overlapping between
the groups. In addition, both groups contain completely different underlying code.
The examples contain source code from the following repositories:
| repository | tag or commit |
|:-----------------------------------------------------------------------:|:----------------------------------------:|
| [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python) | f614ed72170011d2d439f7901e1c8daa7deac8c4 |
| [huggingface/transformers](https://github.com/huggingface/transformers) | v4.31.0 |
| [huggingface/datasets](https://github.com/huggingface/datasets) | 2.13.1 |
| [huggingface/diffusers](https://github.com/huggingface/diffusers) | v0.18.2 |
| [huggingface/accelerate](https://github.com/huggingface/accelerate) | v0.21.0 |
You can find the corresponding code styles of the examples in the file [additional_data.json](additional_data.json).
The code styles in the file are split by training and test group and the index corresponds to the class for the
columns `code_codestyle` and `style_context_codestyle` in the dataset.
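As a minimal sketch, the dataset and these columns can be consumed with the `datasets` library like so (downloading the full ~4 GB dataset is assumed to be acceptable here):
```python
from datasets import load_dataset

# Split and column names follow this card.
ds = load_dataset("infinityofspace/python_codestyles-random-1k")

example = ds["train"][0]
print(example["label"])                    # 1 = the code conforms to the style of its context
print(example["code_codestyle"])           # style class index of the code
print(example["style_context_codestyle"])  # style class index of the style context

# Keep only the positive examples.
positives = ds["train"].filter(lambda ex: ex["label"] == 1)
```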
There are 364,400 samples in total: 182,200 positive and 182,200 negative. |
NTA-Dev/training_pdfs | ---
license: apache-2.0
---
|
EdiOapsie/TheBlueTireBook | ---
license: apache-2.0
---
|
Hieu-Pham/cooking_squad | ---
license: mit
---
|
jlbaker361/small_subtraction_decimal | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 2030.2222222222222
num_examples: 40
- name: test
num_bytes: 253.77777777777777
num_examples: 5
download_size: 4553
dataset_size: 2284.0
---
# Dataset Card for "small_subtraction_decimal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SuperAGI/SAM_Dataset | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
splits:
- name: train
num_bytes: 192091934
num_examples: 135119
download_size: 88166828
dataset_size: 192091934
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
anan-2024/twitter_dataset_1713023266 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 105089
num_examples: 281
download_size: 58528
dataset_size: 105089
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
catinthebag/Indo4B-Combined | ---
language:
- id
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 25012332583
num_examples: 232763064
download_size: 15176365901
dataset_size: 25012332583
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
This is the entire Indo4B dataset, combined into a single file. The original dataset can be found here: https://github.com/IndoNLP/indonlu
This is a combination of all the different files in the compressed .tar.xz archive. The goal is that anyone interested in Indonesian NLP can simply load this dataset from Hugging Face, already combined in full.
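For example, a minimal loading sketch (streaming mode is assumed here because of the corpus size):
```python
from datasets import load_dataset

# Stream the combined corpus instead of downloading the ~15 GB archive up front.
indo4b = load_dataset("catinthebag/Indo4B-Combined", split="train", streaming=True)

for i, row in enumerate(indo4b):
    print(row["text"])
    if i == 2:
        break
```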
Note that the original files consist of line-separated strings. This dataset simply combines them while removing blank lines. |
3lydatta/Daniel | ---
license: openrail
---
|
joujiboi/japanese-knowledge-base | ---
license: cc-by-nc-4.0
task_categories:
- question-answering
- text-generation
language:
- en
tags:
- language
- japanese language
- nihongo
- 日本語
size_categories:
- 1K<n<10K
pretty_name: Japanese Knowledge Base
---
# Japanese Knowledge Base
JKB (Japanese Knowledge Base) is a work-in-progress question and answer dataset on the Japanese language.
The goal of JKB is to provide training data for large language models to better answer questions about the Japanese language, particularly for learners of Japanese.
## The dataset contains topics about:
* N5 to N1 grammar
* Structure and word order
* The writing systems
* Word types
* Particles and the commonly confused ones
* Conjugation
* Verbs and verb types
* Adjectives and adjective types
* Formality
* Name suffixes
## The dataset will not focus on:
* Translation
* Definitions of words
* Sentence nuance
## Sample
```
Question: What is a common usage scenario for 〜たり〜たり?
Answer: The 〜たり〜たり structure is for listing actions as examples among various possibilities. It is commonly used to describe routines or plans. For example, one might use it to describe what they typically do on their days off or what they plan to do during a vacation.
Question: How do you use 思う with na-adjectives?
Answer: For na-adjectives, you can attach だ to the adjective in its plain form and then add と思う. For example, if you think someone is happy, you could say "幸せだと思います".
Question: How do you use と思う with i-adjectives?
Answer: To use 〜と思う with i-adjectives, you simply take the i-adjective in its plain form and attach と思う to it. For example, if you think something is hot (熱い), you can say "熱いと思う".
Question: Is it possible to use 〜たい in a question?
Answer: Yes, you can use 〜たい to ask about someone's desires or wishes, but be cautious when using it with superiors or elders as it can be considered impolite.
```
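A minimal loading sketch (the exact column layout is not documented above, so the snippet simply loads the data and inspects its schema):
```python
from datasets import load_dataset

# Print the available splits, column names, and row counts.
ds = load_dataset("joujiboi/japanese-knowledge-base")
print(ds)
```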
## Citation
This dataset uses the `Creative Commons Attribution Non Commercial 4.0` licence, which means you may use my dataset for non-commercial purposes, provided you give credit.
```
@misc{Japanese Knowledge Base,
title = {Japanese Knowledge Base: A question and answer dataset on the Japanese language},
author = {JawGBoi},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
howpublished = {\url{https://huggingface.co/datasets/joujiboi/japanese-knowledge-base}},
}
``` |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-117000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 660107
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TrainingDataPro/hand-gesture-recognition-dataset | ---
language:
- en
license: cc-by-nc-nd-4.0
task_categories:
- video-classification
tags:
- code
dataset_info:
features:
- name: set_id
dtype: int32
- name: fist
dtype: string
- name: four
dtype: string
- name: me
dtype: string
- name: one
dtype: string
- name: small
dtype: string
splits:
- name: train
num_bytes: 1736
num_examples: 28
download_size: 1510134076
dataset_size: 1736
---
# Hand Gesture Recognition Dataset
The dataset consists of videos showcasing individuals demonstrating 5 different hand gestures (*"one", "four", "small", "fist", and "me"*). Each video captures a person prominently displaying a single hand gesture, allowing for accurate identification and differentiation of the gestures.
The dataset offers a diverse range of individuals performing the gestures, enabling the exploration of variations in hand shapes, sizes, and movements across different individuals.
The videos in the dataset are recorded in reasonable lighting conditions and with adequate resolution, to ensure that the hand gestures can be easily observed and studied.
### The dataset's possible applications:
- hand gesture recognition
- gesture-based control systems
- virtual reality interactions
- sign language analysis
- human pose estimation and action analysis
- security and authentication systems
![](https://www.googleapis.com/download/storage/v1/b/kaggle-user-content/o/inbox%2F618942%2F02149459e1fc1f76e2575dcdba6ec406%2FMacBook%20Air%20-%201.png?generation=1689667735287012&alt=media)
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=hand-gesture-recognition-dataset) to discuss your requirements, learn about the price and buy the dataset.
# Content
- **files**: includes folders corresponding to people, each containing videos of the 5 different gestures; each file is named according to the captured gesture
- **.csv** file: contains information about files in the dataset
### Hand gestures in the dataset:
- "one"
- "four"
- "small"
- "clenched fist"
- "me"
### File with the extension .csv
includes the following information:
- **set_id**: id of the set of videos,
- **one**: link to the video with "one" gesture,
- **four**: link to the video with "four" gesture,
- **small**: link to the video with "small" gesture,
- **fist**: link to the video with "fist" gesture,
- **me**: link to the video with "me" gesture
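A rough sketch of reading that file with pandas (the file name below is a placeholder; the column names are the ones listed above):
```python
import pandas as pd

# "gestures.csv" is a placeholder; point this at the actual .csv shipped with the dataset.
df = pd.read_csv("gestures.csv")

# One row per set of videos: links to the five gesture clips recorded by one person.
for _, row in df.iterrows():
    print(row["set_id"], row["one"], row["four"], row["small"], row["fist"], row["me"])
```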
# Videos with hand gestures might be collected in accordance with your requirements.
## [**TrainingData**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=hand-gesture-recognition-dataset) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
thientran/autotrain-data-favs_bot | ---
language:
- en
---
# AutoTrain Dataset for project: favs_bot
## Dataset Description
This dataset has been automatically processed by AutoTrain for project favs_bot.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"feat_id": "13104",
"tokens": [
"Jackie",
"Frank"
],
"feat_pos_tags": [
21,
21
],
"feat_chunk_tags": [
5,
16
],
"tags": [
3,
7
]
},
{
"feat_id": "9297",
"tokens": [
"U.S.",
"lauds",
"Russian-Chechen",
"deal",
"."
],
"feat_pos_tags": [
21,
20,
15,
20,
7
],
"feat_chunk_tags": [
5,
16,
16,
16,
22
],
"tags": [
0,
8,
1,
8,
8
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"feat_id": "Value(dtype='string', id=None)",
"tokens": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"feat_pos_tags": "Sequence(feature=ClassLabel(num_classes=47, names=['\"', '#', '$', \"''\", '(', ')', ',', '.', ':', 'CC', 'CD', 'DT', 'EX', 'FW', 'IN', 'JJ', 'JJR', 'JJS', 'LS', 'MD', 'NN', 'NNP', 'NNPS', 'NNS', 'NN|SYM', 'PDT', 'POS', 'PRP', 'PRP$', 'RB', 'RBR', 'RBS', 'RP', 'SYM', 'TO', 'UH', 'VB', 'VBD', 'VBG', 'VBN', 'VBP', 'VBZ', 'WDT', 'WP', 'WP$', 'WRB', '``'], id=None), length=-1, id=None)",
"feat_chunk_tags": "Sequence(feature=ClassLabel(num_classes=23, names=['B-ADJP', 'B-ADVP', 'B-CONJP', 'B-INTJ', 'B-LST', 'B-NP', 'B-PP', 'B-PRT', 'B-SBAR', 'B-UCP', 'B-VP', 'I-ADJP', 'I-ADVP', 'I-CONJP', 'I-INTJ', 'I-LST', 'I-NP', 'I-PP', 'I-PRT', 'I-SBAR', 'I-UCP', 'I-VP', 'O'], id=None), length=-1, id=None)",
"tags": "Sequence(feature=ClassLabel(num_classes=9, names=['B-LOC', 'B-MISC', 'B-ORG', 'B-PER', 'I-LOC', 'I-MISC', 'I-ORG', 'I-PER', 'O'], id=None), length=-1, id=None)"
}
```
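Since `feat_pos_tags`, `feat_chunk_tags`, and `tags` are stored as sequences of `ClassLabel` indices, they can be decoded back to their string names, for example:
```python
from datasets import load_dataset

ds = load_dataset("thientran/autotrain-data-favs_bot", split="train")

# ClassLabel features know their string names, so integer tags can be decoded.
ner_names = ds.features["tags"].feature.names
pos_names = ds.features["feat_pos_tags"].feature.names

example = ds[0]
print(example["tokens"])
print([ner_names[i] for i in example["tags"]])           # e.g. ['B-PER', 'I-PER']
print([pos_names[i] for i in example["feat_pos_tags"]])  # e.g. ['NNP', 'NNP']
```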
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 10013 |
| valid | 4029 |
|
georgeyw/TinyStoriesV2-GPT4-10k | ---
license: cdla-sharing-1.0
---
|
quaeast/multimodal_sarcasm_detection | ---
language:
- en
---
Copy of [data-of-multimodal-sarcasm-detection](https://github.com/headacheboy/data-of-multimodal-sarcasm-detection)
```python
# usage
from datasets import load_dataset
from transformers import CLIPImageProcessor, CLIPTokenizer
from torch.utils.data import DataLoader

# clip_path should point to the CLIP checkpoint you want to use,
# e.g. "openai/clip-vit-base-patch32".
clip_path = "openai/clip-vit-base-patch32"
image_processor = CLIPImageProcessor.from_pretrained(clip_path)
tokenizer = CLIPTokenizer.from_pretrained(clip_path)

def tokenization(example):
    # Tokenize the text and preprocess the image for CLIP.
    text_inputs = tokenizer(example["text"], truncation=True, padding=True, return_tensors="pt")
    image_inputs = image_processor(example["image"], return_tensors="pt")
    return {'pixel_values': image_inputs['pixel_values'],
            'input_ids': text_inputs['input_ids'],
            'attention_mask': text_inputs['attention_mask'],
            "label": example["label"]}

dataset = load_dataset('quaeast/multimodal_sarcasm_detection')
dataset.set_transform(tokenization)

# get torch dataloaders
train_dl = DataLoader(dataset['train'], batch_size=256, shuffle=True)
test_dl = DataLoader(dataset['test'], batch_size=256, shuffle=True)
val_dl = DataLoader(dataset['validation'], batch_size=256, shuffle=True)
```
|
muhammadravi251001/tydiqaid-nli | ---
annotations_creators:
- machine-generated
- manual-partial-validation
language_creators:
- expert-generated
language:
- id
license: unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- TyDI-QA-ID
task_categories:
- text-classification
task_ids:
- natural-language-inference
pretty_name: TyDI-QA-ID-NLI
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
config_name: tydiqaid-nli
splits:
- name: train
num_bytes: 3207000
num_examples: 9695
- name: validation
num_bytes: 373750
num_examples: 1131
- name: test
num_bytes: 565625
num_examples: 1171
download_size: 4146375
dataset_size: 11997
---
# Dataset Card for TyDI-QA-ID-NLI
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [Hugging Face](https://huggingface.co/datasets/muhammadravi251001/tydiqaid-nli)
- **Point of Contact:** [Hugging Face](https://huggingface.co/datasets/muhammadravi251001/tydiqaid-nli)
- **Experiment:** [Github](https://github.com/muhammadravi251001/multilingual-qas-with-nli)
### Dataset Summary
The TyDI-QA-ID-NLI dataset is derived from the TyDI-QA-ID question answering dataset, utilizing named entity recognition (NER), chunking tags, Regex, and embedding similarity techniques to determine its contradiction sets.
Collected through this process, the dataset comprises various columns beyond premise, hypothesis, and label, including properties aligned with NER and chunking tags.
This dataset is designed to facilitate Natural Language Inference (NLI) tasks and contains information extracted from diverse sources to provide comprehensive coverage.
Each data instance encapsulates premise, hypothesis, label, and additional properties pertinent to NLI evaluation.
### Supported Tasks and Leaderboards
- Natural Language Inference for Indonesian
### Languages
Indonesian
## Dataset Structure
### Data Instances
An example of `test` looks as follows.
```
{
"premise": "Manuls sering kali terlihat di padang rumput stepa Asia Tengah wilayah Mongolia, Cina dan Dataran Tinggi Tibet, di mana rekor elevasi 5.050 m (16.570 kaki) dilaporkan.[5] Mereka secara luas tersebar di daerah dataran tinggi dan lekukan Intermountain serta padang rumput pegunungan di Kyrgyzstan dan Kazakhstan.[6] Di Rusia, mereka muncul sesekali di Transkaukasus dan daerah Transbaikal, di sepanjang perbatasan dengan utara-timur Kazakhstan, dan di sepanjang perbatasan dengan Mongolia dan Cina di Altai, Tyva Buryatia, dan Chita republik. Pada musim semi 1997, trek yang ditemukan di Timur Sayan pada ketinggian 2.470 m (8.100 kaki) dalam 4,5cm (1,8 in) lapisan salju yang tebal. Trek ini dianggap fakta pertama yang dapat dibuktikan mendiami daerah manuls. Analisis DNA dari kotoran individu ini menegaskan kehadiran spesies.[7] Populasi di barat daya, yaitu wilayah Laut Kaspia, Afghanistan dan Pakistan, berkurang, terisolasi dan jarang [8][9]. Pada tahun 2008, seekor individu terekam kamera di Iran Khojir National Park untuk pertama kalinya [10].,Dimanakah Kucing Pallas pertama kali ditemukan ?",
"hypothesis": ",Dimanakah Kucing Pallas pertama kali ditemukan ? 2008",
"label": 0
}
```
### Data Fields
The data fields are:
- `premise`: a `string` feature
- `hypothesis`: a `string` feature
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
### Data Splits
The data is split across `train`, `valid`, and `test`.
| split | # examples |
|----------|-------:|
|train| 9695|
|valid| 1131|
|test| 1171|
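A minimal loading sketch under this schema (split names follow the YAML metadata of this card):
```python
from datasets import load_dataset

ds = load_dataset("muhammadravi251001/tydiqaid-nli")

# The label column is a ClassLabel, so indices map to names.
label_names = ds["test"].features["label"].names  # ['entailment', 'neutral', 'contradiction']

example = ds["test"][0]
print(example["premise"][:80])
print(example["hypothesis"])
print(label_names[example["label"]])  # e.g. 'entailment' for label 0
```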
## Dataset Creation
### Curation Rationale
Indonesian NLP is considered under-resourced. We need an NLI dataset to fine-tune NLI models so that they can be used with QA models in order to improve QA performance.
### Source Data
#### Initial Data Collection and Normalization
We collected the data from a prominent Indonesian QA dataset. The annotation was done entirely by the original dataset's researchers.
#### Who are the source language producers?
This synthetic data was produced by a machine, but the original data was produced by humans.
### Personal and Sensitive Information
There might be some personal information coming from Wikipedia and news articles, especially information about famous/important people.
## Considerations for Using the Data
### Discussion of Biases
The QA dataset (and thus the NLI dataset derived from it) was created using premise sentences taken from Wikipedia and news articles. These data sources may contain some bias.
### Other Known Limitations
No other known limitations
## Additional Information
### Dataset Curators
This dataset is the result of the collaborative work of Indonesian researchers from the University of Indonesia, Mohamed bin Zayed University of Artificial Intelligence, and the Korea Advanced Institute of Science & Technology.
### Licensing Information
The license is unknown. Please contact the authors for any information on the dataset. |
pyakymenko/test_repo | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 167117.0
num_examples: 3
download_size: 162079
dataset_size: 167117.0
---
# Dataset Card for "test_repo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Muhammad2003__Myriad-7B-Slerp | ---
pretty_name: Evaluation run of Muhammad2003/Myriad-7B-Slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Muhammad2003/Myriad-7B-Slerp](https://huggingface.co/Muhammad2003/Myriad-7B-Slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Muhammad2003__Myriad-7B-Slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-05T21:09:13.517339](https://huggingface.co/datasets/open-llm-leaderboard/details_Muhammad2003__Myriad-7B-Slerp/blob/main/results_2024-04-05T21-09-13.517339.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6498757467364364,\n\
\ \"acc_stderr\": 0.0320415504432472,\n \"acc_norm\": 0.6488639096760553,\n\
\ \"acc_norm_stderr\": 0.032715942329960675,\n \"mc1\": 0.6328029375764994,\n\
\ \"mc1_stderr\": 0.016874805001453184,\n \"mc2\": 0.7800372751713768,\n\
\ \"mc2_stderr\": 0.013681851851800382\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266127,\n\
\ \"acc_norm\": 0.7320819112627986,\n \"acc_norm_stderr\": 0.012942030195136438\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.717486556462856,\n\
\ \"acc_stderr\": 0.004493015945599716,\n \"acc_norm\": 0.891256721768572,\n\
\ \"acc_norm_stderr\": 0.0031068060075356255\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335082,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335082\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n\
\ \"acc_stderr\": 0.016519594275297117,\n \"acc_norm\": 0.4223463687150838,\n\
\ \"acc_norm_stderr\": 0.016519594275297117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n\
\ \"acc_stderr\": 0.012756933382823698,\n \"acc_norm\": 0.4771838331160365,\n\
\ \"acc_norm_stderr\": 0.012756933382823698\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6328029375764994,\n\
\ \"mc1_stderr\": 0.016874805001453184,\n \"mc2\": 0.7800372751713768,\n\
\ \"mc2_stderr\": 0.013681851851800382\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8500394632991318,\n \"acc_stderr\": 0.010034394804580809\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \
\ \"acc_stderr\": 0.012588685966624177\n }\n}\n```"
repo_url: https://huggingface.co/Muhammad2003/Myriad-7B-Slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|arc:challenge|25_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|gsm8k|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hellaswag|10_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-09-13.517339.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T21-09-13.517339.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- '**/details_harness|winogrande|5_2024-04-05T21-09-13.517339.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-05T21-09-13.517339.parquet'
- config_name: results
data_files:
- split: 2024_04_05T21_09_13.517339
path:
- results_2024-04-05T21-09-13.517339.parquet
- split: latest
path:
- results_2024-04-05T21-09-13.517339.parquet
---
# Dataset Card for Evaluation run of Muhammad2003/Myriad-7B-Slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Muhammad2003/Myriad-7B-Slerp](https://huggingface.co/Muhammad2003/Myriad-7B-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Muhammad2003__Myriad-7B-Slerp",
"harness_winogrande_5",
	split="latest")
```
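The aggregated scores can be read in the same way from the `results` configuration. The snippet below is a minimal sketch (the split names follow the configs listed above; the exact schema of the results parquet is not documented here and may vary):
```python
from datasets import load_dataset

# Aggregated metrics of the most recent run; the "latest" split mirrors the
# newest timestamped split.
results = load_dataset(
    "open-llm-leaderboard/details_Muhammad2003__Myriad-7B-Slerp",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the flattened, aggregated results
```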
## Latest results
These are the [latest results from run 2024-04-05T21:09:13.517339](https://huggingface.co/datasets/open-llm-leaderboard/details_Muhammad2003__Myriad-7B-Slerp/blob/main/results_2024-04-05T21-09-13.517339.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6498757467364364,
"acc_stderr": 0.0320415504432472,
"acc_norm": 0.6488639096760553,
"acc_norm_stderr": 0.032715942329960675,
"mc1": 0.6328029375764994,
"mc1_stderr": 0.016874805001453184,
"mc2": 0.7800372751713768,
"mc2_stderr": 0.013681851851800382
},
"harness|arc:challenge|25": {
"acc": 0.712457337883959,
"acc_stderr": 0.013226719056266127,
"acc_norm": 0.7320819112627986,
"acc_norm_stderr": 0.012942030195136438
},
"harness|hellaswag|10": {
"acc": 0.717486556462856,
"acc_stderr": 0.004493015945599716,
"acc_norm": 0.891256721768572,
"acc_norm_stderr": 0.0031068060075356255
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695255,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695255
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335082,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335082
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815632,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815632
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4223463687150838,
"acc_stderr": 0.016519594275297117,
"acc_norm": 0.4223463687150838,
"acc_norm_stderr": 0.016519594275297117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.012756933382823698,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.012756933382823698
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6328029375764994,
"mc1_stderr": 0.016874805001453184,
"mc2": 0.7800372751713768,
"mc2_stderr": 0.013681851851800382
},
"harness|winogrande|5": {
"acc": 0.8500394632991318,
"acc_stderr": 0.010034394804580809
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624177
}
}
```
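As a rough illustration of how these numbers can be post-processed, the sketch below downloads the results file linked above and averages the accuracy over the MMLU (`hendrycksTest`) subjects. It assumes the per-task dictionary sits under a top-level `"results"` key in that file, with a fallback in case the file is already the bare dictionary:
```python
import json
from huggingface_hub import hf_hub_download

# Fetch the raw results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Muhammad2003__Myriad-7B-Slerp",
    filename="results_2024-04-05T21-09-13.517339.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Per-task scores as shown above; fall back to the top level if needed.
results = data.get("results", data)

mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"Mean accuracy over {len(mmlu_accs)} MMLU subjects: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```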
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
imone/OpenOrca_FLAN | ---
license: mit
---
This is the [OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca) GPT4 subset with the original FLAN answers. Each even row (indexed starting from 0) contains the OpenOrca GPT4 answer, while each odd row contains the corresponding FLAN answer.
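A minimal sketch of reading this even/odd pairing with the `datasets` library (the `train` split name and the purely index-based pairing below are assumptions, not part of the original card):
```python
from datasets import load_dataset

ds = load_dataset("imone/OpenOrca_FLAN", split="train")

# Even rows (0, 2, 4, ...) hold the OpenOrca GPT4 answers,
# odd rows (1, 3, 5, ...) hold the corresponding FLAN answers.
gpt4_rows = ds.select(range(0, len(ds), 2))
flan_rows = ds.select(range(1, len(ds), 2))

print(gpt4_rows[0])  # first GPT4 answer row
print(flan_rows[0])  # its FLAN counterpart
```
|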
48xrf/prodanca | ---
license: wtfpl
---
|
CyberHarem/lupusregina_beta_overlord | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lupusregina_beta_overlord
This is the dataset of lupusregina_beta_overlord, containing 102 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
|
jmgb0127/FronxOwnerManual | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1330685
num_examples: 1177
- name: test
num_bytes: 332811
num_examples: 294
download_size: 990561
dataset_size: 1663496
---
# Dataset Card for "FronxOwnerManual"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jacobbieker/gk2a-kerchunk | ---
license: mit
---
|
liuyanchen1015/MULTI_VALUE_sst2_here_come | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 1351
num_examples: 9
- name: test
num_bytes: 2306
num_examples: 18
- name: train
num_bytes: 30466
num_examples: 263
download_size: 20603
dataset_size: 34123
---
# Dataset Card for "MULTI_VALUE_sst2_here_come"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wanony/mika | ---
license: gpl-3.0
---
|
open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente-30b | ---
pretty_name: Evaluation run of Aeala/GPT4-x-AlpacaDente-30b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Aeala/GPT4-x-AlpacaDente-30b](https://huggingface.co/Aeala/GPT4-x-AlpacaDente-30b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente-30b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T18:13:58.646455](https://huggingface.co/datasets/open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente-30b/blob/main/results_2023-09-17T18-13-58.646455.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.32120385906040266,\n\
\ \"em_stderr\": 0.004781891422636473,\n \"f1\": 0.43280620805369485,\n\
\ \"f1_stderr\": 0.0045611946956929435,\n \"acc\": 0.5439418899180396,\n\
\ \"acc_stderr\": 0.012071731077966974\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.32120385906040266,\n \"em_stderr\": 0.004781891422636473,\n\
\ \"f1\": 0.43280620805369485,\n \"f1_stderr\": 0.0045611946956929435\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3009855951478393,\n \
\ \"acc_stderr\": 0.012634504465211194\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722754\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Aeala/GPT4-x-AlpacaDente-30b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|arc:challenge|25_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T18_13_58.646455
path:
- '**/details_harness|drop|3_2023-09-17T18-13-58.646455.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T18-13-58.646455.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T18_13_58.646455
path:
- '**/details_harness|gsm8k|5_2023-09-17T18-13-58.646455.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T18-13-58.646455.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hellaswag|10_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T23:04:17.245052.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T23:04:17.245052.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T23:04:17.245052.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T18_13_58.646455
path:
- '**/details_harness|winogrande|5_2023-09-17T18-13-58.646455.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T18-13-58.646455.parquet'
- config_name: results
data_files:
- split: 2023_07_19T23_04_17.245052
path:
- results_2023-07-19T23:04:17.245052.parquet
- split: 2023_09_17T18_13_58.646455
path:
- results_2023-09-17T18-13-58.646455.parquet
- split: latest
path:
- results_2023-09-17T18-13-58.646455.parquet
---
# Dataset Card for Evaluation run of Aeala/GPT4-x-AlpacaDente-30b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Aeala/GPT4-x-AlpacaDente-30b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Aeala/GPT4-x-AlpacaDente-30b](https://huggingface.co/Aeala/GPT4-x-AlpacaDente-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente-30b",
"harness_winogrande_5",
split="train")
```
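The aggregated metrics can be loaded the same way through the `results` configuration (a minimal sketch; the configuration name and the `latest` split are the ones declared in the YAML header above):
```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; the "latest" split
# always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente-30b",
                       "results",
                       split="latest")
print(results[0])
```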
## Latest results
These are the [latest results from run 2023-09-17T18:13:58.646455](https://huggingface.co/datasets/open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente-30b/blob/main/results_2023-09-17T18-13-58.646455.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.32120385906040266,
"em_stderr": 0.004781891422636473,
"f1": 0.43280620805369485,
"f1_stderr": 0.0045611946956929435,
"acc": 0.5439418899180396,
"acc_stderr": 0.012071731077966974
},
"harness|drop|3": {
"em": 0.32120385906040266,
"em_stderr": 0.004781891422636473,
"f1": 0.43280620805369485,
"f1_stderr": 0.0045611946956929435
},
"harness|gsm8k|5": {
"acc": 0.3009855951478393,
"acc_stderr": 0.012634504465211194
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.011508957690722754
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
metredo085/drod | ---
license: apache-2.0
---
|
as-cle-bert/scerevisiae-proteins-reduced | ---
license: mit
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': Verified_Coding
'1': Probably_Non_Coding
- name: text
dtype: string
splits:
- name: train
num_bytes: 343452
num_examples: 480
- name: test
num_bytes: 81480
num_examples: 120
download_size: 236731
dataset_size: 424932
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ar852/scraped-chatgpt-conversations | ---
task_categories:
- question-answering
- text-generation
- conversational
size_categories:
- 100K<n<1M
---
# Dataset Card for scraped-chatgpt-conversations
## Dataset Description
- **Repository:** https://github.com/ar852/chatgpt-scraping
### Dataset Summary
scraped-chatgpt-conversations contains ~100k conversations between a user and ChatGPT that were shared online through Reddit, Twitter, or ShareGPT. For ShareGPT, the conversations were scraped directly from the website. For Reddit and Twitter, images were downloaded from submissions, segmented, and run through an OCR pipeline to obtain a conversation list. For information on how each json file is structured, please see `json_guides.md`.
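The per-source JSON layouts are described in `json_guides.md`; as a rough sketch of how such a file might be read (the file name below is a placeholder for illustration, not an actual file in this repository):
```python
import json

# Placeholder file name; see json_guides.md for the actual files and their keys.
with open("sharegpt_conversations.json", "r", encoding="utf-8") as f:
    conversations = json.load(f)

print(f"{len(conversations)} conversations loaded")
print(conversations[0])  # one user/ChatGPT conversation record
```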
### Languages
- twitter 1, twitter 2, and sharegpt json files are multilingual
- reddit and twitter 2 json files are English only
## Dataset Structure
- refer to *json_guide.txt*
## Dataset Creation
This dataset was created by scraping images from Twitter and Reddit (using the Twitter and Pushshift APIs, respectively) and conversations from sharegpt.com. The images are run through a filter to check whether they contain a ChatGPT conversation; each image is then processed and run through an OCR pipeline to obtain the conversation text. More info can be found in the repository.
### Source Data
- twitter.com
- reddit.com
- sharegpt.com
## Considerations for Using the Data
A significant number of dicts created from parsing Reddit and Twitter images may be parsed incorrectly for a number of reasons: cropping done by the image poster, incorrectly classifying the image as containing a ChatGPT conversation, incorrect image parsing (segmentation) by the parser, or incorrect OCR by pytesseract.
### Licensing Information
[More Information Needed]
### Contributions
[More Information Needed] |
KushT/bbc_news_multiclass_train_val_test | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 3414429
num_examples: 1512
- name: validation
num_bytes: 888603
num_examples: 379
- name: test
num_bytes: 751863
num_examples: 334
download_size: 0
dataset_size: 5054895
---
Label names:

| label | id |
|:------|---:|
| business | 0 |
| entertainment | 1 |
| politics | 2 |
| sport | 3 |
| tech | 4 |
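A minimal loading sketch using the `datasets` library (the `id2label` dict simply mirrors the mapping above, since `label` is stored as a plain integer):
```python
from datasets import load_dataset

# Mirrors the label mapping above; labels are stored as plain integers.
id2label = {0: "business", 1: "entertainment", 2: "politics", 3: "sport", 4: "tech"}

ds = load_dataset("KushT/bbc_news_multiclass_train_val_test", split="train")
example = ds[0]
print(id2label[example["label"]], "-", example["text"][:100])
```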
Dataset: [Kaggle - BBC Full Text Document Classification](https://www.kaggle.com/datasets/shivamkushwaha/bbc-full-text-document-classification/code) |
mespinosami/global230k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 8623524876.02
num_examples: 162940
- name: validation
num_bytes: 1335636495.768
num_examples: 23416
- name: test
num_bytes: 2572452087.661
num_examples: 46463
download_size: 10844373816
dataset_size: 12531613459.449
---
# Dataset Card for "global230k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Aletos/Peixe | ---
license: openrail
---
|
LennardZuendorf/openlegaldata-processed | ---
license: mit
dataset_info:
features:
- name: id
dtype: int64
- name: court
struct:
- name: id
dtype: int64
- name: jurisdiction
dtype: string
- name: level_of_appeal
dtype: string
- name: name
dtype: string
- name: state
dtype: int64
- name: file_number
dtype: string
- name: date
dtype: timestamp[s]
- name: type
dtype: string
- name: content
dtype: string
- name: tenor
dtype: string
- name: facts
dtype: string
- name: reasoning
dtype: string
splits:
- name: three
num_bytes: 169494251
num_examples: 2828
- name: two
num_bytes: 183816899
num_examples: 4954
download_size: 172182482
dataset_size: 353311150
task_categories:
- text-classification
language:
- de
tags:
- legal
pretty_name: Edited German Court case decision
size_categories:
- 1K<n<10K
---
# Dataset Card for openlegaldata.io bulk case data
## Dataset Description
This is an edit/cleanup of the bulk data of [openlegaldata.io](https://de.openlegaldata.io/), which I also brought onto Hugging Face [here](https://huggingface.co/datasets/LennardZuendorf/openlegaldata-bulk-data).
#### The Entire Dataset Is In German
- **GitHub Repository:** [uniArchive-legalis](https://github.com/LennardZuendorf/uniArchive-legalis)
- **Repository:** [Bulk Data](https://static.openlegaldata.io/dumps/de/)
## Edit Summary
I have done some cleaning and splitting of the data and filtered out large parts that were not (easily) usable, cutting the number of cases down to at most about 4,000 (from roughly 250,000). This results in two different splits, because German courts do not all format their case decisions the same way.
### Data Fields
Independent of the split, most fields are the same; they are:

| id | court | file_number | date | type | content |
| - | - | - | - | - | - |
| numeric id | name of the court that made the decision | file number of the case ("Aktenzeichen") | decision date | type of the case decision | entire content (text) of the case decision |
Additionally, I added 3 more fields because of the splitting of the content:
#### Two Split
- Case decisions I could split into two parts: tenor and reasoning.
- This means the three fields tenor, reasoning and facts contain the following:

| tenor | reasoning | facts |
| - | - | - |
| An abstract, legal summary of the case's decision | the entire rest of the decision, explaining in detail why the decision has been made | an empty text field |
#### Three Split
- Case decisions I could split into three parts: tenor, reasoning and facts.
- I have used this data to create binary labels with the help of ChatGPT; see [legalis](https://huggingface.co/datasets/LennardZuendorf/legalis) for that.
- The three fields tenor, reasoning and facts contain the following:

| tenor | reasoning | facts |
| - | - | - |
| An abstract, legal summary of the case's decision | the entire rest of the decision, explaining in detail why the decision has been made | the facts and details of a case |
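A minimal loading sketch using the `datasets` library (the split names `two` and `three` correspond to the two splits described above):
```python
from datasets import load_dataset

# Cases whose decision text could be split into three parts (tenor, reasoning, facts).
three_part = load_dataset("LennardZuendorf/openlegaldata-processed", split="three")

# Cases that could only be split into tenor and reasoning (facts is empty here).
two_part = load_dataset("LennardZuendorf/openlegaldata-processed", split="two")

case = three_part[0]
print(case["court"]["name"], case["file_number"], case["date"])
print(case["tenor"][:200])
```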
### Languages
- German
## Additional Information
### Licensing/Citation Information
The [openlegaldata platform](https://github.com/openlegaldata/oldp) is licensed under the MIT license. You can access the dataset by citing the original source, [openlegaldata.io](https://de.openlegaldata.io/), and me, [Lennard Zündorf](https://github.com/LennardZuendorf), as the editor of this dataset. |
7Jes6riv/multiclass | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 195846200
num_examples: 59168
download_size: 32590912
dataset_size: 195846200
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Lucasss876/Tagizzy | ---
license: openrail
---
|
curry99/rampage | ---
license: mit
---
|
mteb/stsbenchmark-sts | ---
language:
- en
--- |
lshowway/wikipedia.reorder.sov.pl | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1958124685
num_examples: 1772445
download_size: 549518463
dataset_size: 1958124685
---
# Dataset Card for "wikipedia.reorder.sov.pl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/chapayev_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of chapayev/チャパエフ/恰巴耶夫 (Azur Lane)
This is the dataset of chapayev/チャパエフ/恰巴耶夫 (Azur Lane), containing 289 images and their tags.
The core tags of this character are `blue_hair, breasts, short_hair, blue_eyes, large_breasts, mole, mole_on_breast, bangs, hair_ornament, hat, white_headwear, military_hat, peaked_cap, hairclip`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 289 | 483.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chapayev_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 289 | 250.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chapayev_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 734 | 552.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chapayev_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 289 | 413.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chapayev_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 734 | 808.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chapayev_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chapayev_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 47 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blush, chain, looking_at_viewer, torn_shirt, solo, shackles, short_sleeves, navel, cleavage, smile, sitting, open_mouth |
| 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_gloves, cleavage, looking_at_viewer, pleated_skirt, smile, solo, white_jacket, blush, chain, long_sleeves, closed_mouth, white_skirt, black_pantyhose |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, black_gloves, blush, cleavage, looking_at_viewer, smile, solo, upper_body, white_jacket, closed_mouth, simple_background, white_background |
| 3 | 27 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, official_alternate_costume, looking_at_viewer, white_hairband, earrings, solo, cleavage, white_choker, blush, lace-trimmed_dress, white_dress, lying, smile, nightgown, see-through_dress, thigh_strap |
| 4 | 10 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, looking_at_viewer, solo, white_dress, elbow_gloves, sitting, white_gloves, bare_shoulders, no_shoes, official_alternate_costume, thighs, toes, white_thighhighs, blush, cleavage, garter_straps, legs, smile, chair, soles, ass, choker, closed_mouth, foot_focus, collarbone, lying |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | chain | looking_at_viewer | torn_shirt | solo | shackles | short_sleeves | navel | cleavage | smile | sitting | open_mouth | black_gloves | pleated_skirt | white_jacket | long_sleeves | closed_mouth | white_skirt | black_pantyhose | upper_body | simple_background | white_background | official_alternate_costume | white_hairband | earrings | white_choker | lace-trimmed_dress | white_dress | lying | nightgown | see-through_dress | thigh_strap | elbow_gloves | white_gloves | bare_shoulders | no_shoes | thighs | toes | white_thighhighs | garter_straps | legs | chair | soles | ass | choker | foot_focus | collarbone |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:--------------------|:-------------|:-------|:-----------|:----------------|:--------|:-----------|:--------|:----------|:-------------|:---------------|:----------------|:---------------|:---------------|:---------------|:--------------|:------------------|:-------------|:--------------------|:-------------------|:-----------------------------|:-----------------|:-----------|:---------------|:---------------------|:--------------|:--------|:------------|:--------------------|:--------------|:---------------|:---------------|:-----------------|:-----------|:---------|:-------|:-------------------|:----------------|:-------|:--------|:--------|:------|:---------|:-------------|:-------------|
| 0 | 47 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | | X | | | | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | X | | X | | | | X | X | | | X | | X | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 27 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | X | | X | | | | X | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 10 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | X | | X | | | | X | X | X | | | | | | X | | | | | | X | | | | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
freshpearYoon/vr_train_free_43 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 6900570085
num_examples: 10000
download_size: 1224346103
dataset_size: 6900570085
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Cohere/wikipedia-22-12-fr-embeddings | ---
annotations_creators:
- expert-generated
language:
- fr
multilinguality:
- multilingual
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-retrieval
license:
- apache-2.0
task_ids:
- document-retrieval
---
# Wikipedia (fr) embedded with cohere.ai `multilingual-22-12` encoder
We encoded [Wikipedia (fr)](https://fr.wikipedia.org) using the [cohere.ai](https://txt.cohere.ai/multilingual/) `multilingual-22-12` embedding model.
To get an overview of how this dataset was created and pre-processed, have a look at [Cohere/wikipedia-22-12](https://huggingface.co/datasets/Cohere/wikipedia-22-12).
## Embeddings
We compute for `title+" "+text` the embeddings using our `multilingual-22-12` embedding model, a state-of-the-art model that works for semantic search in 100 languages. If you want to learn more about this model, have a look at [cohere.ai multilingual embedding model](https://txt.cohere.ai/multilingual/).
## Further languages
We provide embeddings of Wikipedia in many different languages:
[ar](https://huggingface.co/datasets/Cohere/wikipedia-22-12-ar-embeddings), [de](https://huggingface.co/datasets/Cohere/wikipedia-22-12-de-embeddings), [en](https://huggingface.co/datasets/Cohere/wikipedia-22-12-en-embeddings), [es](https://huggingface.co/datasets/Cohere/wikipedia-22-12-es-embeddings), [fr](https://huggingface.co/datasets/Cohere/wikipedia-22-12-fr-embeddings), [hi](https://huggingface.co/datasets/Cohere/wikipedia-22-12-hi-embeddings), [it](https://huggingface.co/datasets/Cohere/wikipedia-22-12-it-embeddings), [ja](https://huggingface.co/datasets/Cohere/wikipedia-22-12-ja-embeddings), [ko](https://huggingface.co/datasets/Cohere/wikipedia-22-12-ko-embeddings), [simple english](https://huggingface.co/datasets/Cohere/wikipedia-22-12-simple-embeddings), [zh](https://huggingface.co/datasets/Cohere/wikipedia-22-12-zh-embeddings),
You can find the Wikipedia datasets without embeddings at [Cohere/wikipedia-22-12](https://huggingface.co/datasets/Cohere/wikipedia-22-12).
## Loading the dataset
You can either load the dataset like this:
```python
from datasets import load_dataset
docs = load_dataset(f"Cohere/wikipedia-22-12-fr-embeddings", split="train")
```
Or you can stream it without downloading it first:
```python
from datasets import load_dataset
docs = load_dataset(f"Cohere/wikipedia-22-12-fr-embeddings", split="train", streaming=True)
for doc in docs:
docid = doc['id']
title = doc['title']
text = doc['text']
emb = doc['emb']
```
## Search
A full search example:
```python
#Run: pip install cohere datasets
from datasets import load_dataset
import torch
import cohere
co = cohere.Client(f"<<COHERE_API_KEY>>") # Add your cohere API key from www.cohere.com
#Load at max 1000 documents + embeddings
max_docs = 1000
docs_stream = load_dataset(f"Cohere/wikipedia-22-12-fr-embeddings", split="train", streaming=True)
docs = []
doc_embeddings = []
for doc in docs_stream:
docs.append(doc)
doc_embeddings.append(doc['emb'])
if len(docs) >= max_docs:
break
doc_embeddings = torch.tensor(doc_embeddings)
query = 'Who founded Youtube'
response = co.embed(texts=[query], model='multilingual-22-12')
query_embedding = response.embeddings
query_embedding = torch.tensor(query_embedding)
# Compute dot score between query embedding and document embeddings
dot_scores = torch.mm(query_embedding, doc_embeddings.transpose(0, 1))
top_k = torch.topk(dot_scores, k=3)
# Print results
print("Query:", query)
for doc_id in top_k.indices[0].tolist():
print(docs[doc_id]['title'])
print(docs[doc_id]['text'], "\n")
```
## Performance
You can find performance on the MIRACL dataset (a semantic search evaluation dataset) here: [miracl-en-queries-22-12#performance](https://huggingface.co/datasets/Cohere/miracl-en-queries-22-12#performance) |
rbiojout/odoo_python_15 | ---
license: agpl-3.0
dataset_info:
features:
- name: size
dtype: int64
- name: ext
dtype: string
- name: lang
dtype: string
- name: branch
dtype: string
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
splits:
- name: train
num_bytes: 40721998
num_examples: 9349
download_size: 12783255
dataset_size: 40721998
---
|
one-sec-cv12/chunk_186 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 12103776864.75
num_examples: 126018
download_size: 9851992082
dataset_size: 12103776864.75
---
# Dataset Card for "chunk_186"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fadime3/Hayalet | ---
license: openrail
---
|
joey234/mmlu-moral_disputes | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 4935
num_examples: 5
- name: test
num_bytes: 1532082
num_examples: 346
download_size: 153575
dataset_size: 1537017
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-moral_disputes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ProfEngel/spieltheorie | ---
license: cc-by-nc-4.0
---
|
Denilsonic/Datasets | ---
license: openrail
---
|
DanGoldBr/PTT-20230720-WA0159 | ---
license: openrail
---
|
open-llm-leaderboard/details_CorticalStack__gemma-7b-ultrachat-sft | ---
pretty_name: Evaluation run of CorticalStack/gemma-7b-ultrachat-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CorticalStack/gemma-7b-ultrachat-sft](https://huggingface.co/CorticalStack/gemma-7b-ultrachat-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__gemma-7b-ultrachat-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-23T17:43:57.046792](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__gemma-7b-ultrachat-sft/blob/main/results_2024-02-23T17-43-57.046792.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6393700582955444,\n\
\ \"acc_stderr\": 0.03219234730039111,\n \"acc_norm\": 0.6439639463015079,\n\
\ \"acc_norm_stderr\": 0.03283632439673242,\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907922,\n \"mc2\": 0.5450293631615253,\n\
\ \"mc2_stderr\": 0.015380202565099867\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225403,\n\
\ \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.01423587248790987\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6127265484963155,\n\
\ \"acc_stderr\": 0.004861314613286841,\n \"acc_norm\": 0.8082055367456682,\n\
\ \"acc_norm_stderr\": 0.003929076276473383\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.02944517532819959,\n\
\ \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.02944517532819959\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777028,\n \"\
acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777028\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n\
\ \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n\
\ \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876105,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603617,\n \"\
acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603617\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465708,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465708\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188703,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188703\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010333,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010333\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671632,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671632\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822915,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822915\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388676992,\n\
\ \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388676992\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29720670391061454,\n\
\ \"acc_stderr\": 0.015285313353641597,\n \"acc_norm\": 0.29720670391061454,\n\
\ \"acc_norm_stderr\": 0.015285313353641597\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818723,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818723\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"\
acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.02895975519682487,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.02895975519682487\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.0193533605475537,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.0193533605475537\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252092,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252092\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907922,\n \"mc2\": 0.5450293631615253,\n\
\ \"mc2_stderr\": 0.015380202565099867\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773225\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44655041698256254,\n \
\ \"acc_stderr\": 0.01369356654974314\n }\n}\n```"
repo_url: https://huggingface.co/CorticalStack/gemma-7b-ultrachat-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|arc:challenge|25_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|gsm8k|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hellaswag|10_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T17-43-57.046792.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T17-43-57.046792.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- '**/details_harness|winogrande|5_2024-02-23T17-43-57.046792.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-23T17-43-57.046792.parquet'
- config_name: results
data_files:
- split: 2024_02_23T17_43_57.046792
path:
- results_2024-02-23T17-43-57.046792.parquet
- split: latest
path:
- results_2024-02-23T17-43-57.046792.parquet
---
# Dataset Card for Evaluation run of CorticalStack/gemma-7b-ultrachat-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CorticalStack/gemma-7b-ultrachat-sft](https://huggingface.co/CorticalStack/gemma-7b-ultrachat-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CorticalStack__gemma-7b-ultrachat-sft",
"harness_winogrande_5",
split="train")
```
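
The aggregated metrics are also exposed through the `results` configuration declared in the configs section above. A minimal sketch of loading them, assuming the `latest` split name listed there:

```python
from datasets import load_dataset

# Load the aggregated results for this evaluation run.
# The "results" config and its "latest" split are the ones declared in the
# configs section above; adjust if the repository layout changes.
results = load_dataset(
    "open-llm-leaderboard/details_CorticalStack__gemma-7b-ultrachat-sft",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics of the most recent run
```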
## Latest results
These are the [latest results from run 2024-02-23T17:43:57.046792](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__gemma-7b-ultrachat-sft/blob/main/results_2024-02-23T17-43-57.046792.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6393700582955444,
"acc_stderr": 0.03219234730039111,
"acc_norm": 0.6439639463015079,
"acc_norm_stderr": 0.03283632439673242,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907922,
"mc2": 0.5450293631615253,
"mc2_stderr": 0.015380202565099867
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225403,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.01423587248790987
},
"harness|hellaswag|10": {
"acc": 0.6127265484963155,
"acc_stderr": 0.004861314613286841,
"acc_norm": 0.8082055367456682,
"acc_norm_stderr": 0.003929076276473383
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6452830188679245,
"acc_stderr": 0.02944517532819959,
"acc_norm": 0.6452830188679245,
"acc_norm_stderr": 0.02944517532819959
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777028,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777028
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876105,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.025545650426603617,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.025545650426603617
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465708,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465708
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188703,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188703
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010333,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010333
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671632,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671632
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579647,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579647
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822915,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822915
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388676992,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388676992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29720670391061454,
"acc_stderr": 0.015285313353641597,
"acc_norm": 0.29720670391061454,
"acc_norm_stderr": 0.015285313353641597
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818723,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818723
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.02895975519682487,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.02895975519682487
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.0193533605475537,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.0193533605475537
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252092,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252092
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907922,
"mc2": 0.5450293631615253,
"mc2_stderr": 0.015380202565099867
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.011616198215773225
},
"harness|gsm8k|5": {
"acc": 0.44655041698256254,
"acc_stderr": 0.01369356654974314
}
}
```
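
If you prefer to work with the raw results file linked above rather than the `datasets` API, a small sketch using `huggingface_hub` is shown below; the filename is taken from the "Latest results" link, and newer runs will use a different timestamp.

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON referenced in the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_CorticalStack__gemma-7b-ultrachat-sft",
    filename="results_2024-02-23T17-43-57.046792.json",
    repo_type="dataset",
)

with open(path) as f:
    raw_results = json.load(f)

# Inspect the top-level structure before digging into individual tasks.
print(list(raw_results.keys()))
```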
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Eduardovco/Edu | ---
license: openrail
---
|
open-llm-leaderboard/details_Azure99__blossom-v4-mistral-7b | ---
pretty_name: Evaluation run of Azure99/blossom-v4-mistral-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Azure99/blossom-v4-mistral-7b](https://huggingface.co/Azure99/blossom-v4-mistral-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azure99__blossom-v4-mistral-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-28T11:10:20.298869](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v4-mistral-7b/blob/main/results_2023-12-28T11-10-20.298869.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6235420002967518,\n\
\ \"acc_stderr\": 0.03272388603364805,\n \"acc_norm\": 0.6281854377869052,\n\
\ \"acc_norm_stderr\": 0.03338061598239654,\n \"mc1\": 0.36964504283965727,\n\
\ \"mc1_stderr\": 0.016898180706973888,\n \"mc2\": 0.5384391963865467,\n\
\ \"mc2_stderr\": 0.015414673673859326\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5793515358361775,\n \"acc_stderr\": 0.014426211252508397,\n\
\ \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6390161322445728,\n\
\ \"acc_stderr\": 0.004793042992396035,\n \"acc_norm\": 0.8290181238797052,\n\
\ \"acc_norm_stderr\": 0.0037572368063973345\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901409,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901409\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.03758517775404947,\n\
\ \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.03758517775404947\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n\
\ \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n\
\ \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n\
\ \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n\
\ \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n\
\ \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"\
acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n\
\ \"acc_stderr\": 0.025189006660212378,\n \"acc_norm\": 0.7322580645161291,\n\
\ \"acc_norm_stderr\": 0.025189006660212378\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945627,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945627\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n\
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066468,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066468\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217902,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217902\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069422,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069422\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128136,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128136\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381398,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381398\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3329608938547486,\n\
\ \"acc_stderr\": 0.015761716178397566,\n \"acc_norm\": 0.3329608938547486,\n\
\ \"acc_norm_stderr\": 0.015761716178397566\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.029289413409403192,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.029289413409403192\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6584967320261438,\n \"acc_stderr\": 0.01918463932809249,\n \
\ \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.01918463932809249\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.029393609319879804,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.029393609319879804\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36964504283965727,\n\
\ \"mc1_stderr\": 0.016898180706973888,\n \"mc2\": 0.5384391963865467,\n\
\ \"mc2_stderr\": 0.015414673673859326\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091087\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4313874147081122,\n \
\ \"acc_stderr\": 0.013642195352511575\n }\n}\n```"
repo_url: https://huggingface.co/Azure99/blossom-v4-mistral-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|arc:challenge|25_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|gsm8k|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hellaswag|10_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-28T11-10-20.298869.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-28T11-10-20.298869.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- '**/details_harness|winogrande|5_2023-12-28T11-10-20.298869.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-28T11-10-20.298869.parquet'
- config_name: results
data_files:
- split: 2023_12_28T11_10_20.298869
path:
- results_2023-12-28T11-10-20.298869.parquet
- split: latest
path:
- results_2023-12-28T11-10-20.298869.parquet
---
# Dataset Card for Evaluation run of Azure99/blossom-v4-mistral-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Azure99/blossom-v4-mistral-7b](https://huggingface.co/Azure99/blossom-v4-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azure99__blossom-v4-mistral-7b",
"harness_winogrande_5",
split="train")
```
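To get the aggregated metrics instead of per-task details, you can also load the `results` configuration listed in the configs above, for example:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run (see the "results" config above)
results = load_dataset(
    "open-llm-leaderboard/details_Azure99__blossom-v4-mistral-7b",
    "results",
    split="latest",
)
print(results[0])
```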
## Latest results
These are the [latest results from run 2023-12-28T11:10:20.298869](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v4-mistral-7b/blob/main/results_2023-12-28T11-10-20.298869.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6235420002967518,
"acc_stderr": 0.03272388603364805,
"acc_norm": 0.6281854377869052,
"acc_norm_stderr": 0.03338061598239654,
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973888,
"mc2": 0.5384391963865467,
"mc2_stderr": 0.015414673673859326
},
"harness|arc:challenge|25": {
"acc": 0.5793515358361775,
"acc_stderr": 0.014426211252508397,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.6390161322445728,
"acc_stderr": 0.004793042992396035,
"acc_norm": 0.8290181238797052,
"acc_norm_stderr": 0.0037572368063973345
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901409,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901409
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.025189006660212378,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.025189006660212378
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945627,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066468,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066468
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.017090573804217902,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.017090573804217902
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069422,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069422
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128136,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128136
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381398,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381398
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3329608938547486,
"acc_stderr": 0.015761716178397566,
"acc_norm": 0.3329608938547486,
"acc_norm_stderr": 0.015761716178397566
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.029289413409403192,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.029289413409403192
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.01918463932809249,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.01918463932809249
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.029393609319879804,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.029393609319879804
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973888,
"mc2": 0.5384391963865467,
"mc2_stderr": 0.015414673673859326
},
"harness|winogrande|5": {
"acc": 0.7726913970007893,
"acc_stderr": 0.011778612167091087
},
"harness|gsm8k|5": {
"acc": 0.4313874147081122,
"acc_stderr": 0.013642195352511575
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Zuntan/Animagine_XL_3.0-Character | ---
license: unknown
---
# Animagine XL 3.0 Character
A dataset of standing-pose character images for the [official Character wildcard](https://huggingface.co/spaces/Linaqruf/animagine-xl/resolve/main/wildcard/character.txt) of [Animagine XL 3.0](https://huggingface.co/cagliostrolab/animagine-xl-3.0), generated with [EasySdxlWebUi](https://github.com/Zuntan03/EasySdxlWebUi).
Download the dataset [here (2,880 images, 497 MB)](https://huggingface.co/datasets/Zuntan/Animagine_XL_3.0-Character/resolve/main/character.zip?download=true).
**[Facial expression (278 MB)](https://huggingface.co/datasets/Zuntan/Animagine_XL_3.0-Character/resolve/main/face.zip?download=true) and [art style (115 MB)](https://yyy.wpx.jp/EasySdxlWebUi/style.zip) sets are also available.**
![face](./face_grid.webp)
This project started as an attempt to see whether a wildcard list that works reliably could be assembled by comparing image similarity and Tagger results.
However, since even incorrect images (wrong outfits and the like) are strongly influenced by the series and character names, classifying correct and incorrect results without other sources looks difficult.
- Drag and drop each webp image onto `PNG内の情報を表示` (PNG Info) in [Stable Diffusion web UI](https://github.com/AUTOMATIC1111/stable-diffusion-webui) to view its generation parameters.
- The prompt is `__animagine/character__, solo, full body, standing, no background, simple background, masterpiece, best quality <lora:lcm-animagine-3:1>`.
- The negative prompt is Animagine XL's default negative prompt with an NSFW countermeasure prepended: `nsfw, rating: sensitive, lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, artist name`.
- The generation size before upscaling is `832` x `1216`.
- The seed is `1234567`.
  - Results (correct or not) may differ with other seeds.
- Everything else uses the EasySdxlWebUi defaults (a rough `diffusers` sketch of these settings follows this list).
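The following is a minimal sketch, not part of the dataset's actual pipeline, showing roughly equivalent generation settings with the `diffusers` library; the model ID is the Animagine XL 3.0 repository linked above, `character_tag` is a hypothetical placeholder for one entry of the official character wildcard, and the LCM LoRA from the original web UI prompt is omitted.
```python
# A rough diffusers equivalent of the EasySdxlWebUi settings listed above.
# The dataset itself was generated with EasySdxlWebUi, and the original prompt
# also applied an LCM LoRA (`<lora:lcm-animagine-3:1>`, web UI syntax),
# which is left out here for simplicity.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "cagliostrolab/animagine-xl-3.0",
    torch_dtype=torch.float16,
).to("cuda")

# Placeholder: substitute one entry from the official character wildcard
character_tag = "CHARACTER_TAG"

prompt = (
    f"{character_tag}, solo, full body, standing, no background, "
    "simple background, masterpiece, best quality"
)
negative_prompt = (
    "nsfw, rating: sensitive, lowres, bad anatomy, bad hands, text, error, "
    "missing fingers, extra digit, fewer digits, cropped, worst quality, "
    "low quality, normal quality, jpeg artifacts, signature, watermark, "
    "username, blurry, artist name"
)

image = pipe(
    prompt=prompt,
    negative_prompt=negative_prompt,
    width=832,
    height=1216,
    generator=torch.Generator("cuda").manual_seed(1234567),
).images[0]
image.save("character.png")
```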
[grid0](https://yyy.wpx.jp/m/202401/animagine_character/grid0.webp),
[grid1](https://yyy.wpx.jp/m/202401/animagine_character/grid1.webp),
[grid2](https://yyy.wpx.jp/m/202401/animagine_character/grid2.webp),
[grid3](https://yyy.wpx.jp/m/202401/animagine_character/grid3.webp)
|
ramachaitanya22/mental_health_and_fitness_data | ---
dataset_info:
features:
- name: Human
dtype: string
- name: Assistant
dtype: string
splits:
- name: train
num_bytes: 4021848.8
num_examples: 3552
- name: test
num_bytes: 1005462.2
num_examples: 888
download_size: 2746185
dataset_size: 5027311.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
mboth/luftVersorgen-50-undersampled | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': LuftBereitstellen
'1': LuftVerteilen
splits:
- name: train
num_bytes: 19757.430602572782
num_examples: 100
- name: test
num_bytes: 290707
num_examples: 1477
- name: valid
num_bytes: 290707
num_examples: 1477
download_size: 227539
dataset_size: 601171.4306025729
---
# Dataset Card for "luftVersorgen-50-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TrainingDataPro/roads-segmentation-dataset | ---
license: cc-by-nc-nd-4.0
task_categories:
- image-segmentation
- image-to-image
language:
- en
tags:
- code
---
# Roads Segmentation Dataset
This dataset comprises a collection of images captured through **DVRs** (Digital Video Recorders) showcasing roads. Each image is accompanied by segmentation masks demarcating different entities (**road surface, cars, road signs, marking and background**) within the scene.
The dataset can be utilized for enhancing computer vision algorithms involved in road surveillance, navigation, intelligent transportation systems, and autonomous driving systems.
![](https://www.googleapis.com/download/storage/v1/b/kaggle-user-content/o/inbox%2F12421376%2Fb0789a0ec8075d9c7abdb0aa9faced59%2FFrame%2012.png?generation=1694606364403023&alt=media)
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/roads-segmentation?utm_source=huggingface&utm_medium=cpc&utm_campaign=roads-segmentation-dataset) to discuss your requirements, learn about the price and buy the dataset.
# Dataset structure
- **images** - contains the original images of roads
- **masks** - includes segmentation masks created for the original images
- **annotations.xml** - contains the coordinates of the polygons and their labels, created for the original images
# Data Format
Each image from the `images` folder is accompanied by an XML annotation in the `annotations.xml` file indicating the coordinates of the polygons and their labels. For each point, the x and y coordinates are provided.
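As a rough illustration (not an official loader), the polygon annotations can be read with Python's standard library; this sketch assumes a CVAT-style `annotations.xml` in which each `<image>` element holds `<polygon>` children with `label` and `points` attributes, so adjust the element and attribute names to the actual file structure shown below.
```python
# Minimal sketch for reading polygon annotations from annotations.xml.
# Assumes a CVAT-style layout: <image> elements containing <polygon> children
# whose "points" attribute is a semicolon-separated list of "x,y" pairs.
import xml.etree.ElementTree as ET

def load_polygons(xml_path: str):
    tree = ET.parse(xml_path)
    annotations = {}
    for image in tree.getroot().iter("image"):
        polygons = []
        for poly in image.iter("polygon"):
            points = [
                tuple(float(v) for v in pt.split(","))
                for pt in poly.get("points", "").split(";")
                if pt
            ]
            polygons.append({"label": poly.get("label"), "points": points})
        annotations[image.get("name")] = polygons
    return annotations

# Example usage: polygons = load_polygons("annotations.xml")
```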
### Classes:
- **road_surface**: surface of the road,
- **marking**: white and yellow marking on the road,
- **road_sign**: road signs,
- **car**: cars on the road,
- **background**: side of the road and surrounding objects
# Example of XML file structure
![](https://www.googleapis.com/download/storage/v1/b/kaggle-user-content/o/inbox%2F12421376%2Fa74a4214f4dd89a35527ef008abfc151%2Fcarbon.png?generation=1694608637609153&alt=media)
# A Roads Segmentation dataset can be made in accordance with your requirements.
## [**TrainingData**](https://trainingdata.pro/data-market/roads-segmentation?utm_source=huggingface&utm_medium=cpc&utm_campaign=roads-segmentation-dataset) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
mesolitica/rumi-jawi | ---
language: ms
task_categories:
- text2text-generation
task_ids: []
tags:
- conditional-text-generation
---
# rumi-jawi
Notebooks to gather the dataset at https://github.com/huseinzol05/malay-dataset/tree/master/normalization/rumi-jawi |
open-llm-leaderboard/details_andysalerno__rainbowfish-v6 | ---
pretty_name: Evaluation run of andysalerno/rainbowfish-v6
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [andysalerno/rainbowfish-v6](https://huggingface.co/andysalerno/rainbowfish-v6)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__rainbowfish-v6\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T16:40:31.289715](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__rainbowfish-v6/blob/main/results_2024-02-09T16-40-31.289715.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6251300156980985,\n\
\ \"acc_stderr\": 0.03253464808226719,\n \"acc_norm\": 0.6311200052519415,\n\
\ \"acc_norm_stderr\": 0.03319319250421297,\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.4837489625680555,\n\
\ \"mc2_stderr\": 0.015088896132364547\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5708191126279863,\n \"acc_stderr\": 0.014464085894870655,\n\
\ \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.014188277712349814\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.628460466042621,\n\
\ \"acc_stderr\": 0.004822286556305222,\n \"acc_norm\": 0.8251344353714399,\n\
\ \"acc_norm_stderr\": 0.003790757646575897\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137282,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137282\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091826,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091826\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.617948717948718,\n \"acc_stderr\": 0.024635549163908237,\n \
\ \"acc_norm\": 0.617948717948718,\n \"acc_norm_stderr\": 0.024635549163908237\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.01703071933915435,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.01703071933915435\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597524,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597524\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917212,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917212\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29608938547486036,\n\
\ \"acc_stderr\": 0.01526867731760228,\n \"acc_norm\": 0.29608938547486036,\n\
\ \"acc_norm_stderr\": 0.01526867731760228\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824775,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824775\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4491525423728814,\n \"acc_stderr\": 0.01270403051885149,\n\
\ \"acc_norm\": 0.4491525423728814,\n \"acc_norm_stderr\": 0.01270403051885149\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"\
acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6552287581699346,\n \"acc_stderr\": 0.01922832201869664,\n \
\ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.01922832201869664\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.4837489625680555,\n\
\ \"mc2_stderr\": 0.015088896132364547\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.01166122363764341\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36315390447308565,\n \
\ \"acc_stderr\": 0.013246614539839868\n }\n}\n```"
repo_url: https://huggingface.co/andysalerno/rainbowfish-v6
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|arc:challenge|25_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|gsm8k|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hellaswag|10_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T16-40-31.289715.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T16-40-31.289715.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- '**/details_harness|winogrande|5_2024-02-09T16-40-31.289715.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T16-40-31.289715.parquet'
- config_name: results
data_files:
- split: 2024_02_09T16_40_31.289715
path:
- results_2024-02-09T16-40-31.289715.parquet
- split: latest
path:
- results_2024-02-09T16-40-31.289715.parquet
---
# Dataset Card for Evaluation run of andysalerno/rainbowfish-v6
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andysalerno/rainbowfish-v6](https://huggingface.co/andysalerno/rainbowfish-v6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andysalerno__rainbowfish-v6",
"harness_winogrande_5",
split="train")
```
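Besides a single task configuration, the aggregated metrics are stored in the "results" configuration described in the YAML header; a small sketch (using the "latest" split defined there) for loading them:

```python
from datasets import load_dataset

# Sketch: load the aggregated metrics of the most recent run.
# The "results" config and its "latest" split are defined in the YAML header above.
results = load_dataset(
    "open-llm-leaderboard/details_andysalerno__rainbowfish-v6",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics of the latest run
```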
## Latest results
These are the [latest results from run 2024-02-09T16:40:31.289715](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__rainbowfish-v6/blob/main/results_2024-02-09T16-40-31.289715.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6251300156980985,
"acc_stderr": 0.03253464808226719,
"acc_norm": 0.6311200052519415,
"acc_norm_stderr": 0.03319319250421297,
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.4837489625680555,
"mc2_stderr": 0.015088896132364547
},
"harness|arc:challenge|25": {
"acc": 0.5708191126279863,
"acc_stderr": 0.014464085894870655,
"acc_norm": 0.6194539249146758,
"acc_norm_stderr": 0.014188277712349814
},
"harness|hellaswag|10": {
"acc": 0.628460466042621,
"acc_stderr": 0.004822286556305222,
"acc_norm": 0.8251344353714399,
"acc_norm_stderr": 0.003790757646575897
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137282,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137282
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091826,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091826
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.617948717948718,
"acc_stderr": 0.024635549163908237,
"acc_norm": 0.617948717948718,
"acc_norm_stderr": 0.024635549163908237
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.01703071933915435,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.01703071933915435
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597524,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597524
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917212,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917212
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29608938547486036,
"acc_stderr": 0.01526867731760228,
"acc_norm": 0.29608938547486036,
"acc_norm_stderr": 0.01526867731760228
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495036,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495036
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4491525423728814,
"acc_stderr": 0.01270403051885149,
"acc_norm": 0.4491525423728814,
"acc_norm_stderr": 0.01270403051885149
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.01922832201869664,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.01922832201869664
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.4837489625680555,
"mc2_stderr": 0.015088896132364547
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.01166122363764341
},
"harness|gsm8k|5": {
"acc": 0.36315390447308565,
"acc_stderr": 0.013246614539839868
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zolak/twitter_dataset_1713023754 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: float64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 58501863
num_examples: 150576
download_size: 29705449
dataset_size: 58501863
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cenkersisman/viki_soru_cevap | ---
dataset_info:
features:
- name: answer
dtype: string
- name: question
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 5319410
num_examples: 34983
download_size: 2529944
dataset_size: 5319410
---
# Dataset Card for "viki_soru_cevap"
## About
This dataset is a question-and-answer dataset built from content on the Turkish Wikipedia. It was generated synthetically. Although the answers are taken from the context text, their accuracy is not guaranteed. The questions were also generated synthetically.
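As a rough illustration (a sketch added here, not part of the original card), the dataset can presumably be loaded with the Hugging Face `datasets` library using the fields declared in the YAML header (`question`, `answer`, `title`):
```python
from datasets import load_dataset

# Assumption: a single "train" split with the fields "question", "answer"
# and "title", as declared in the dataset_info block above.
ds = load_dataset("cenkersisman/viki_soru_cevap", split="train")

sample = ds[0]
print(sample["title"])     # source Wikipedia article title
print(sample["question"])  # synthetically generated question
print(sample["answer"])    # answer drawn from the article context (not guaranteed correct)
```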
## The topics containing the most question-answer pairs, by title, are listed below:
* Futbol rekabetleri listesi: 313 entries
* Cengiz Han: 310 entries
* Triple H: 196 entries
* Lüleburgaz Muharebesi: 158 entries
* Zümrüdüanka Yoldaşlığı: 155 entries
* Shakespeare eserleri çevirileri listesi: 145 entries
* Kırkpınar Yağlı Güreşleri: 142 entries
* Sovyetler Birliği'nin askerî tarihi: 136 entries
* I. Baybars: 135 entries
* Dumbledore'un Ordusu: 126 entries
* Nicolaus Copernicus: 119 entries
* Ermenistan Sovyet Sosyalist Cumhuriyeti: 111 entries
* Boshin Savaşı: 99 entries
* Suvorov Harekâtı: 98 entries
* Gökhan Türkmen: 96 entries
* Wolfgang Amadeus Mozart: 95 entries
* Joachim von Ribbentrop: 95 entries
* Rumyantsev Harekâtı: 94 entries
* Hermann Göring: 93 entries
* Nâzım Hikmet: 90 entries
* Said Nursî: 90 entries
* Emîn: 88 entries
* Antonio Gramsci: 87 entries
* Gilles Deleuze: 86 entries
* Madagaskar: 86 entries
* Faşizm: 85 entries
* Mac OS X Snow Leopard: 85 entries
* Korsun-Şevçenkovski Taarruzu: 84 entries
* Soğuk Savaş: 84 entries
* Adolf Eichmann: 83 entries
* Niccolò Paganini: 83 entries
* II. Dünya Savaşı tankları: 81 entries
* Pergamon: 81 entries
* IV. Mihail: 80 entries
* Bolşeviklere karşı sol ayaklanmalar: 77 entries
* Osman Gazi: 77 entries
* V. Leon: 76 entries
* Ajda Pekkan: 75 entries
* Mehdi Savaşı: 75 entries
* Tsushima Muharebesi: 73 entries
* Mehdî (Abbâsî halifesi): 72 entries
* Franck Ribéry: 72 entries
* I. Basileios: 69 entries
* Antimon: 68 entries
* Kolomb öncesi Amerika: 68 entries
* Otto Skorzeny: 68 entries
* Kâzım Koyuncu: 68 entries
* İmamiye (Şiilik öğretisi): 66 entries
* Oscar Niemeyer: 66 entries
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v1 | ---
pretty_name: Evaluation run of jeonsworld/CarbonVillain-en-10.7B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jeonsworld/CarbonVillain-en-10.7B-v1](https://huggingface.co/jeonsworld/CarbonVillain-en-10.7B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T20:15:36.884484](https://huggingface.co/datasets/open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v1/blob/main/results_2023-12-29T20-15-36.884484.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6677851094474622,\n\
\ \"acc_stderr\": 0.031647346301320364,\n \"acc_norm\": 0.6687652386109932,\n\
\ \"acc_norm_stderr\": 0.032290288467975714,\n \"mc1\": 0.572827417380661,\n\
\ \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7197651592692368,\n\
\ \"mc2_stderr\": 0.014984462732010536\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n\
\ \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266125\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7148974307906791,\n\
\ \"acc_stderr\": 0.00450540617660685,\n \"acc_norm\": 0.8845847440748855,\n\
\ \"acc_norm_stderr\": 0.0031886940284536315\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n\
\ \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5026455026455027,\n \"acc_stderr\": 0.02575094967813038,\n \"\
acc_norm\": 0.5026455026455027,\n \"acc_norm_stderr\": 0.02575094967813038\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8258064516129032,\n \"acc_stderr\": 0.021576248184514587,\n \"\
acc_norm\": 0.8258064516129032,\n \"acc_norm_stderr\": 0.021576248184514587\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03011768892950357,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03011768892950357\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992005,\n\
\ \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992005\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n\
\ \"acc_stderr\": 0.016376966142610073,\n \"acc_norm\": 0.39888268156424583,\n\
\ \"acc_norm_stderr\": 0.016376966142610073\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.02301670564026219,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.02301670564026219\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4934810951760104,\n\
\ \"acc_stderr\": 0.012769150688867503,\n \"acc_norm\": 0.4934810951760104,\n\
\ \"acc_norm_stderr\": 0.012769150688867503\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.572827417380661,\n\
\ \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7197651592692368,\n\
\ \"mc2_stderr\": 0.014984462732010536\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6429112964366944,\n \
\ \"acc_stderr\": 0.013197931775445206\n }\n}\n```"
repo_url: https://huggingface.co/jeonsworld/CarbonVillain-en-10.7B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|arc:challenge|25_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|gsm8k|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hellaswag|10_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T20-15-36.884484.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T20-15-36.884484.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- '**/details_harness|winogrande|5_2023-12-29T20-15-36.884484.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T20-15-36.884484.parquet'
- config_name: results
data_files:
- split: 2023_12_29T20_15_36.884484
path:
- results_2023-12-29T20-15-36.884484.parquet
- split: latest
path:
- results_2023-12-29T20-15-36.884484.parquet
---
# Dataset Card for Evaluation run of jeonsworld/CarbonVillain-en-10.7B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jeonsworld/CarbonVillain-en-10.7B-v1](https://huggingface.co/jeonsworld/CarbonVillain-en-10.7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v1",
"harness_winogrande_5",
split="train")
```
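The aggregated results can be loaded in the same way from the `results` configuration (a minimal sketch based on the configuration and split names declared in the YAML header above; the `latest` split always points to the most recent run):
```python
from datasets import load_dataset

# "results" is the aggregated-results configuration; "latest" points to the
# most recent evaluation run (here 2023-12-29T20:15:36.884484).
results = load_dataset(
    "open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v1",
    "results",
    split="latest",
)
```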
## Latest results
These are the [latest results from run 2023-12-29T20:15:36.884484](https://huggingface.co/datasets/open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v1/blob/main/results_2023-12-29T20-15-36.884484.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6677851094474622,
"acc_stderr": 0.031647346301320364,
"acc_norm": 0.6687652386109932,
"acc_norm_stderr": 0.032290288467975714,
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7197651592692368,
"mc2_stderr": 0.014984462732010536
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266125
},
"harness|hellaswag|10": {
"acc": 0.7148974307906791,
"acc_stderr": 0.00450540617660685,
"acc_norm": 0.8845847440748855,
"acc_norm_stderr": 0.0031886940284536315
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5026455026455027,
"acc_stderr": 0.02575094967813038,
"acc_norm": 0.5026455026455027,
"acc_norm_stderr": 0.02575094967813038
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514587,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514587
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03011768892950357,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03011768892950357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992005,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992005
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39888268156424583,
"acc_stderr": 0.016376966142610073,
"acc_norm": 0.39888268156424583,
"acc_norm_stderr": 0.016376966142610073
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.02301670564026219,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.02301670564026219
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4934810951760104,
"acc_stderr": 0.012769150688867503,
"acc_norm": 0.4934810951760104,
"acc_norm_stderr": 0.012769150688867503
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103128,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103128
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7197651592692368,
"mc2_stderr": 0.014984462732010536
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828075
},
"harness|gsm8k|5": {
"acc": 0.6429112964366944,
"acc_stderr": 0.013197931775445206
}
}
```
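If you only need these aggregated numbers rather than the per-sample details, one option is to download the results file linked above directly; a minimal sketch (the exact key layout inside the JSON is an assumption to verify):
```python
import json
from huggingface_hub import hf_hub_download

# Download the latest aggregated results file from this dataset repository
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v1",
    repo_type="dataset",
    filename="results_2023-12-29T20-15-36.884484.json",
)
with open(results_path) as f:
    results = json.load(f)

# Inspect the top-level keys; the aggregated metrics shown above are nested under one of them (e.g. "results" -> "all")
print(list(results.keys()))
```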
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
karawalla/aqcommands | ---
license: bigcode-openrail-m
---
|
autoevaluate/autoeval-eval-lener_br-lener_br-b36dee-1776161641 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- lener_br
eval_info:
task: entity_extraction
model: Luciano/bertimbau-large-lener_br
metrics: []
dataset_name: lener_br
dataset_config: lener_br
dataset_split: validation
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: Luciano/bertimbau-large-lener_br
* Dataset: lener_br
* Config: lener_br
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
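To inspect the predictions themselves, a minimal sketch (assuming the prediction files in this repository can be auto-loaded by the `datasets` library) is:
```python
from datasets import load_dataset

# Load the AutoTrain-generated predictions stored in this repository
preds = load_dataset("autoevaluate/autoeval-eval-lener_br-lener_br-b36dee-1776161641")
print(preds)  # inspect the available splits and their columns
```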
## Contributions
Thanks to [@Luciano](https://huggingface.co/Luciano) for evaluating this model. |
CyberHarem/python_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of python/パイソン/蟒蛇 (Girls' Frontline)
This is the dataset of python/パイソン/蟒蛇 (Girls' Frontline), containing 43 images and their tags.
The core tags of this character are `black_hair, breasts, green_eyes, long_hair, mole, mole_under_eye, earrings, large_breasts, multicolored_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 43 | 54.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/python_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 43 | 31.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/python_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 115 | 66.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/python_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 43 | 48.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/python_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 115 | 94.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/python_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/python_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 31 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, smile, solo, jewelry, navel, black_gloves, simple_background, white_background, jacket, makeup, black_shirt, handgun, blush, holding_weapon, revolver, thighhighs, white_necktie |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | jewelry | navel | black_gloves | simple_background | white_background | jacket | makeup | black_shirt | handgun | blush | holding_weapon | revolver | thighhighs | white_necktie |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:----------|:--------|:---------------|:--------------------|:-------------------|:---------|:---------|:--------------|:----------|:--------|:-----------------|:-----------|:-------------|:----------------|
| 0 | 31 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Torsaan/NTNU_RED | ---
license: mit
---
Some data was collected on NTNU Campus Gjøvik with a Leica P40, then processed (normals, downsampling, etc.) and augmented.
Phillip Nerdum - Tor Henrik Øverby Olsen (2024)
Some data was collected from http://redwood-data.org/3dscan/, then processed (normals, downsampling, etc.) and augmented.
@article{Choi2016,
author = {Sungjoon Choi and Qian-Yi Zhou and Stephen Miller and Vladlen Koltun},
title = {A Large Dataset of Object Scans},
journal = {arXiv:1602.02481},
year = {2016},
}
Contains a filelist, a shapelist, and a test/train/val split.
Used for fine-tuning a model trained on the ModelNet40 dataset.
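For illustration, a minimal Open3D sketch of the kind of preprocessing described above (the file names, voxel size and normal-estimation parameters are placeholders, not the values used for this dataset):
```python
import open3d as o3d

# Load one raw scan (placeholder file name)
pcd = o3d.io.read_point_cloud("scan.ply")

# Downsample with a voxel grid (placeholder voxel size)
pcd_down = pcd.voxel_down_sample(voxel_size=0.02)

# Estimate per-point normals from a local neighbourhood
pcd_down.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.1, max_nn=30)
)

o3d.io.write_point_cloud("scan_processed.ply", pcd_down)
```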
Any part of the dataset can be used for any purpose with proper attribution. If you use any of the data, please cite
- Phillip Nerdrum - Tor Henrik Øverby Olsen NTNU 2024
- Bachelor thesis: TBD
@article{Choi2016,
author = {Sungjoon Choi and Qian-Yi Zhou and Stephen Miller and Vladlen Koltun},
title = {A Large Dataset of Object Scans},
journal = {arXiv:1602.02481},
year = {2016},
}
|
CyberHarem/senzaki_ema_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of senzaki_ema/仙崎恵磨 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of senzaki_ema/仙崎恵磨 (THE iDOLM@STER: Cinderella Girls), containing 59 images and their tags.
The core tags of this character are `short_hair, blonde_hair, earrings, very_short_hair, red_eyes, breasts, ear_piercing`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 59 | 54.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/senzaki_ema_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 59 | 37.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/senzaki_ema_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 127 | 71.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/senzaki_ema_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 59 | 49.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/senzaki_ema_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 127 | 89.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/senzaki_ema_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/senzaki_ema_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, jewelry, solo, card_(medium), character_name, sun_symbol, looking_at_viewer, open_mouth, grin |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | jewelry | solo | card_(medium) | character_name | sun_symbol | looking_at_viewer | open_mouth | grin |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-------|:----------------|:-----------------|:-------------|:--------------------|:-------------|:-------|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X |
|
Admin0805/Mytokens | ---
license: other
license_name: citibankdemobusiness
license_link: https://citibankdemobusiness.dev
---
|
skrishna/SeqSense_gen_8 | ---
dataset_info:
features:
- name: input
dtype: string
- name: answer
dtype: int64
splits:
- name: train
num_bytes: 25416
num_examples: 300
download_size: 8511
dataset_size: 25416
---
# Dataset Card for "SeqSense_gen_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/Cantonese_Dialect_Speech_Data_by_Mobile_Phone | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Cantonese_Dialect_Speech_Data_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/54?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
It collects speech from 4,888 speakers from Guangdong Province, recorded in a quiet indoor environment. The recorded content covers 500,000 commonly used spoken sentences, including high-frequency words from weico and everyday expressions. The average number of repetitions is 1.5 and the average sentence length is 12.5 words. Recording devices are mainstream Android phones and iPhones.
For more details, please refer to the link: https://www.nexdata.ai/datasets/54?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Cantonese
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
salma-remyx/ffmperative_refined_5.5k | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 3277970
num_examples: 5565
download_size: 1001170
dataset_size: 3277970
---
# Dataset Card for "ffmperative_refined_5.5k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
codys12/PRM800K | ---
license: mit
---
|
elvincth/durecdial-knowledge | ---
license: mit
---
|
open-llm-leaderboard/details_Isotonic__Hermes-2-Pro-Mixtral-4x7B | ---
pretty_name: Evaluation run of Isotonic/Hermes-2-Pro-Mixtral-4x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Isotonic/Hermes-2-Pro-Mixtral-4x7B](https://huggingface.co/Isotonic/Hermes-2-Pro-Mixtral-4x7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Isotonic__Hermes-2-Pro-Mixtral-4x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T02:39:47.890512](https://huggingface.co/datasets/open-llm-leaderboard/details_Isotonic__Hermes-2-Pro-Mixtral-4x7B/blob/main/results_2024-03-22T02-39-47.890512.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6247175996493741,\n\
\ \"acc_stderr\": 0.03257864192508729,\n \"acc_norm\": 0.6264034162110771,\n\
\ \"acc_norm_stderr\": 0.033228987439788436,\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.017270015284476848,\n \"mc2\": 0.5902365843525761,\n\
\ \"mc2_stderr\": 0.015835546003395855\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.014235872487909869,\n\
\ \"acc_norm\": 0.6424914675767918,\n \"acc_norm_stderr\": 0.014005494275916573\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6480780720971918,\n\
\ \"acc_stderr\": 0.004765937515197187,\n \"acc_norm\": 0.8270264887472615,\n\
\ \"acc_norm_stderr\": 0.003774513882615956\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119668,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119668\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.02503387058301518,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.02503387058301518\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110932,\n\
\ \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110932\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815642,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815642\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010344,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010344\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437416,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437416\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001506,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001506\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134128,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134128\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.01446589382985993,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.01446589382985993\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729487,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729487\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900922,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900922\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.01274197433389723,\n\
\ \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.01274197433389723\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"\
acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.630718954248366,\n \"acc_stderr\": 0.01952431674486635,\n \
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.01952431674486635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.017270015284476848,\n \"mc2\": 0.5902365843525761,\n\
\ \"mc2_stderr\": 0.015835546003395855\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7545382794001578,\n \"acc_stderr\": 0.012095272937183633\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.604245640636846,\n \
\ \"acc_stderr\": 0.013469823701048815\n }\n}\n```"
repo_url: https://huggingface.co/Isotonic/Hermes-2-Pro-Mixtral-4x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|arc:challenge|25_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|gsm8k|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hellaswag|10_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T02-39-47.890512.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T02-39-47.890512.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- '**/details_harness|winogrande|5_2024-03-22T02-39-47.890512.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T02-39-47.890512.parquet'
- config_name: results
data_files:
- split: 2024_03_22T02_39_47.890512
path:
- results_2024-03-22T02-39-47.890512.parquet
- split: latest
path:
- results_2024-03-22T02-39-47.890512.parquet
---
# Dataset Card for Evaluation run of Isotonic/Hermes-2-Pro-Mixtral-4x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Isotonic/Hermes-2-Pro-Mixtral-4x7B](https://huggingface.co/Isotonic/Hermes-2-Pro-Mixtral-4x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Isotonic__Hermes-2-Pro-Mixtral-4x7B",
"harness_winogrande_5",
split="train")
```
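To inspect the aggregated metrics instead of the per-sample details, a minimal sketch (following the "results" configuration and "latest" split declared in the YAML above) could look like this:
```python
from datasets import load_dataset

# Aggregated metrics for the run live in the "results" configuration;
# the "latest" split resolves to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Isotonic__Hermes-2-Pro-Mixtral-4x7B",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated scores
```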
## Latest results
These are the [latest results from run 2024-03-22T02:39:47.890512](https://huggingface.co/datasets/open-llm-leaderboard/details_Isotonic__Hermes-2-Pro-Mixtral-4x7B/blob/main/results_2024-03-22T02-39-47.890512.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6247175996493741,
"acc_stderr": 0.03257864192508729,
"acc_norm": 0.6264034162110771,
"acc_norm_stderr": 0.033228987439788436,
"mc1": 0.4186046511627907,
"mc1_stderr": 0.017270015284476848,
"mc2": 0.5902365843525761,
"mc2_stderr": 0.015835546003395855
},
"harness|arc:challenge|25": {
"acc": 0.6126279863481229,
"acc_stderr": 0.014235872487909869,
"acc_norm": 0.6424914675767918,
"acc_norm_stderr": 0.014005494275916573
},
"harness|hellaswag|10": {
"acc": 0.6480780720971918,
"acc_stderr": 0.004765937515197187,
"acc_norm": 0.8270264887472615,
"acc_norm_stderr": 0.003774513882615956
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119668,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119668
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.02503387058301518,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.02503387058301518
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110932,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815642,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815642
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010344,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010344
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437416,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437416
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001506,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001506
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.01446589382985993,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.01446589382985993
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729487,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729487
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900922,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.01274197433389723,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.01274197433389723
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.01952431674486635,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.01952431674486635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4186046511627907,
"mc1_stderr": 0.017270015284476848,
"mc2": 0.5902365843525761,
"mc2_stderr": 0.015835546003395855
},
"harness|winogrande|5": {
"acc": 0.7545382794001578,
"acc_stderr": 0.012095272937183633
},
"harness|gsm8k|5": {
"acc": 0.604245640636846,
"acc_stderr": 0.013469823701048815
}
}
```
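As a quick illustration of working with these numbers, here is a minimal sketch that assumes the JSON above has been saved locally as `results.json` (a hypothetical filename) and extracts one task's score:
```python
import json

# Assumption: the results JSON shown above was saved locally as results.json.
with open("results.json") as f:
    results = json.load(f)

# Pull a single task's accuracy and its standard error.
task = "harness|winogrande|5"
acc = results[task]["acc"]
stderr = results[task]["acc_stderr"]
print(f"{task}: acc = {acc:.4f} +/- {stderr:.4f}")
```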
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-responsibility/frameworks | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path:
- "catalog.json"
---
https://huggingface.co/datasets/open-responsibility/frameworks
|
hippocrates/qa_train_old | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 485067176
num_examples: 404269
- name: valid
num_bytes: 4491759
num_examples: 5505
download_size: 241040216
dataset_size: 489558935
---
# Dataset Card for "qa_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_132 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 20839438512.875
num_examples: 216969
download_size: 19299011373
dataset_size: 20839438512.875
---
# Dataset Card for "chunk_132"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
J-LAB/OpenHermes_PTBR | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1391804745
num_examples: 860176
download_size: 772872708
dataset_size: 1391804745
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
wisenut-nlp-team/aihub_admin_generated_answers | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: question
dtype: string
- name: context
dtype: string
- name: answer
dtype: string
- name: original_answer
dtype: string
- name: similar_contexts
sequence: string
splits:
- name: train
num_bytes: 5293612104
num_examples: 315745
download_size: 2662886163
dataset_size: 5293612104
---
# Dataset Card for "aihub_admin_generated_answers_last"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_context_v5_full_recite_ans_sent_last_permute_rerun | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 4850217.0
num_examples: 2385
- name: validation
num_bytes: 631113
num_examples: 300
download_size: 1204825
dataset_size: 5481330.0
---
# Dataset Card for "squad_qa_context_v5_full_recite_ans_sent_last_permute_rerun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GTZ22/vozmiguel | ---
license: openrail
---
|
ODD2903/ProjetA23 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a16 | ---
pretty_name: Evaluation run of BFauber/lora_llama2-13b_10e5_r32_a16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/lora_llama2-13b_10e5_r32_a16](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r32_a16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T00:35:29.195349](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a16/blob/main/results_2024-02-10T00-35-29.195349.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5560019624665314,\n\
\ \"acc_stderr\": 0.03364714043907589,\n \"acc_norm\": 0.5619565729235969,\n\
\ \"acc_norm_stderr\": 0.03436835615145709,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015025,\n \"mc2\": 0.38301200451667206,\n\
\ \"mc2_stderr\": 0.013767815310741604\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.560580204778157,\n \"acc_stderr\": 0.014503747823580123,\n\
\ \"acc_norm\": 0.5989761092150171,\n \"acc_norm_stderr\": 0.01432225579071987\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6163114917347142,\n\
\ \"acc_stderr\": 0.004852896681736758,\n \"acc_norm\": 0.8233419637522406,\n\
\ \"acc_norm_stderr\": 0.0038059961194403754\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.04060127035236397,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.04060127035236397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.03789401760283647,\n\
\ \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.03789401760283647\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n\
\ \"acc_stderr\": 0.04389869956808778,\n \"acc_norm\": 0.2647058823529412,\n\
\ \"acc_norm_stderr\": 0.04389869956808778\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.451063829787234,\n\
\ \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.451063829787234,\n\
\ \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n\
\ \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\"\
: 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.328042328042328,\n\
\ \"acc_stderr\": 0.02418049716437691,\n \"acc_norm\": 0.328042328042328,\n\
\ \"acc_norm_stderr\": 0.02418049716437691\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.6903225806451613,\n \"acc_stderr\": 0.026302774983517418,\n\
\ \"acc_norm\": 0.6903225806451613,\n \"acc_norm_stderr\": 0.026302774983517418\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.458128078817734,\n \"acc_stderr\": 0.03505630140785742,\n \"acc_norm\"\
: 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785742\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512566,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512566\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.032894773300986155,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.032894773300986155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860677,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860677\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028604,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028604\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n\
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7522935779816514,\n \"acc_stderr\": 0.01850814360254781,\n \"\
acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.01850814360254781\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n\
\ \"acc_stderr\": 0.03077855467869326,\n \"acc_norm\": 0.7401960784313726,\n\
\ \"acc_norm_stderr\": 0.03077855467869326\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n\
\ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.02581923325648372,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.02581923325648372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7458492975734355,\n\
\ \"acc_stderr\": 0.015569254692045757,\n \"acc_norm\": 0.7458492975734355,\n\
\ \"acc_norm_stderr\": 0.015569254692045757\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.025950054337654075,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.025950054337654075\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n\
\ \"acc_stderr\": 0.014987325439963539,\n \"acc_norm\": 0.2782122905027933,\n\
\ \"acc_norm_stderr\": 0.014987325439963539\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302895,\n\
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302895\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.026675611926037103,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.026675611926037103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.02923346574557308,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.02923346574557308\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\
\ \"acc_stderr\": 0.01260496081608737,\n \"acc_norm\": 0.4198174706649283,\n\
\ \"acc_norm_stderr\": 0.01260496081608737\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555026,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555026\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5555555555555556,\n \"acc_stderr\": 0.020102583895887188,\n \
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.020102583895887188\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789848,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789848\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015025,\n \"mc2\": 0.38301200451667206,\n\
\ \"mc2_stderr\": 0.013767815310741604\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838236\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2357846853677028,\n \
\ \"acc_stderr\": 0.011692515650666792\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/lora_llama2-13b_10e5_r32_a16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|arc:challenge|25_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|gsm8k|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hellaswag|10_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-35-29.195349.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T00-35-29.195349.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- '**/details_harness|winogrande|5_2024-02-10T00-35-29.195349.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T00-35-29.195349.parquet'
- config_name: results
data_files:
- split: 2024_02_10T00_35_29.195349
path:
- results_2024-02-10T00-35-29.195349.parquet
- split: latest
path:
- results_2024-02-10T00-35-29.195349.parquet
---
# Dataset Card for Evaluation run of BFauber/lora_llama2-13b_10e5_r32_a16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/lora_llama2-13b_10e5_r32_a16](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r32_a16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a16",
"harness_winogrande_5",
split="train")
```
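If you only need the aggregated metrics rather than the per-task details, you can load the "results" configuration instead. This is a minimal sketch based on the splits declared above (the `latest` split points to the most recent evaluation run):
```python
from datasets import load_dataset

# Load only the aggregated results of the run; "latest" always refers to the most recent eval
results = load_dataset("open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a16",
	"results",
	split="latest")
```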
## Latest results
These are the [latest results from run 2024-02-10T00:35:29.195349](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a16/blob/main/results_2024-02-10T00-35-29.195349.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5560019624665314,
"acc_stderr": 0.03364714043907589,
"acc_norm": 0.5619565729235969,
"acc_norm_stderr": 0.03436835615145709,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015025,
"mc2": 0.38301200451667206,
"mc2_stderr": 0.013767815310741604
},
"harness|arc:challenge|25": {
"acc": 0.560580204778157,
"acc_stderr": 0.014503747823580123,
"acc_norm": 0.5989761092150171,
"acc_norm_stderr": 0.01432225579071987
},
"harness|hellaswag|10": {
"acc": 0.6163114917347142,
"acc_stderr": 0.004852896681736758,
"acc_norm": 0.8233419637522406,
"acc_norm_stderr": 0.0038059961194403754
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.04060127035236397,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.04060127035236397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.02418049716437691,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.02418049716437691
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785742,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785742
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512566,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512566
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.032894773300986155,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.032894773300986155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860677,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860677
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028604,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028604
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.01850814360254781,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.01850814360254781
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.02581923325648372,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.02581923325648372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7458492975734355,
"acc_stderr": 0.015569254692045757,
"acc_norm": 0.7458492975734355,
"acc_norm_stderr": 0.015569254692045757
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.025950054337654075,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.025950054337654075
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2782122905027933,
"acc_stderr": 0.014987325439963539,
"acc_norm": 0.2782122905027933,
"acc_norm_stderr": 0.014987325439963539
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302895,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302895
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200865,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200865
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.026675611926037103,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.026675611926037103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.02923346574557308,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.02923346574557308
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.01260496081608737,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.01260496081608737
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.030332578094555026,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.030332578094555026
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.020102583895887188,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.020102583895887188
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789848,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789848
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015025,
"mc2": 0.38301200451667206,
"mc2_stderr": 0.013767815310741604
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838236
},
"harness|gsm8k|5": {
"acc": 0.2357846853677028,
"acc_stderr": 0.011692515650666792
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Barnie2/testDataset | ---
license: apache-2.0
---
|
AnuraSet/AnuraSet_v1.0.0 | ---
license: mit
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_140 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1147145696.0
num_examples: 223528
download_size: 1170538394
dataset_size: 1147145696.0
---
# Dataset Card for "chunk_140"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/DTD_parition1_test_facebook_opt_1.3b_Visclues_ns_1880_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 92562773.0
num_examples: 1880
- name: fewshot_3_bs_16
num_bytes: 93877652.0
num_examples: 1880
download_size: 182697216
dataset_size: 186440425.0
---
# Dataset Card for "DTD_parition1_test_facebook_opt_1.3b_Visclues_ns_1880_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jondurbin/airoboros-gpt4-1.4.1 | ---
license: cc-by-nc-4.0
---
The same as 1.4, but with coding updates:
- rosettacode instructions were removed, due to a few issues found when spot-checking examples
- limited the coding examples to fewer languages, to test if a more focused dataset would produce better results |
carnival13/xlmr_hard_curr_uda_ep3 | ---
dataset_info:
features:
- name: domain_label
dtype: int64
- name: pass_label
dtype: int64
- name: input
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 774087578
num_examples: 519240
download_size: 233619604
dataset_size: 774087578
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "xlmr_hard_curr_uda_ep3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
benayas/massive_chatgpt_20pct_v2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 750498
num_examples: 11514
download_size: 267643
dataset_size: 750498
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TrustLLM/TrustLLM-dataset | ---
license: apache-2.0
language:
- en
configs:
- config_name: safety
data_files: "safety/*json"
- config_name: ethics
data_files: "ethics/*json"
- config_name: fairness
data_files: "fairness/*json"
- config_name: robustness
data_files: "robustness/*json"
- config_name: privacy
data_files: "privacy/*json"
- config_name: truthfulness
data_files: "truthfulness/*json"
tags:
- llm
- trustworthy ai
- nlp
size_categories:
- 10K<n<100K
---
# Dataset Card for TrustLLM
## Dataset Summary
This repository provides datasets from the TrustLLM benchmark, including six aspects: truthfulness, safety, fairness, robustness, privacy, and machine ethics.
To find more details about TrustLLM, please visit the [project website](https://trustllmbenchmark.github.io/TrustLLM-Website/).
## Disclaimer
The dataset contains harmful content such as partial pornography, violence, bloodshed, or bias. The opinions expressed in the data do not reflect the views of the TrustLLM team. This dataset is strictly intended for research purposes and should not be used for any other illegal activities. We advocate for the responsible use of large language models.
### Download
Use the `trustllm` toolkit to download the dataset: [link](https://howiehwong.github.io/TrustLLM/#dataset-download).
Use the Hugging Face `datasets` library to download the dataset:
```python
from datasets import load_dataset
# Load all sections
dataset = load_dataset("TrustLLM/TrustLLM-dataset")
# Load one of the sections
dataset = load_dataset("TrustLLM/TrustLLM-dataset", data_dir="safety")
```
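Since each aspect is also declared as a named configuration in the YAML header of this repository (safety, ethics, fairness, robustness, privacy, truthfulness), loading a single section by config name should work as well. A short sketch, assuming those config names:
```python
from datasets import load_dataset

# Load only the "safety" section via its config name
# (other configs: ethics, fairness, robustness, privacy, truthfulness)
safety = load_dataset("TrustLLM/TrustLLM-dataset", "safety")
```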
## Contact
Contact Us: [trustllm.benchmark@gmail.com](mailto:trustllm.benchmark@gmail.com)
|
ophycare/icliniq-dataset-1 | ---
license: llama2
---
|
SantiagoPG/doc_qa | ---
language:
- en
--- |
moficodes/guanaco-gemma-500 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 842593
num_examples: 500
download_size: 478887
dataset_size: 842593
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Revankumar/fingpt | ---
license: mit
---
|
enzeberg/crowded_fishes | ---
license: cc-by-4.0
---
|
Kamyar-zeinalipour/ITA_CW | ---
dataset_info:
features:
- name: Clue
dtype: string
- name: Answer
dtype: string
- name: couple_occurencies
dtype: int64
splits:
- name: train
num_bytes: 5767721
num_examples: 125202
download_size: 3409199
dataset_size: 5767721
---
# Dataset Card for "ITA_CW"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibivibiv/alpaca_tasksource20 | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 135229939
num_examples: 253970
download_size: 76708825
dataset_size: 135229939
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|