Dataset Card for "super_glue"
Dataset Summary
SuperGLUE (https://super.gluebenchmark.com/) is a new benchmark styled after GLUE with a new set of more difficult language understanding tasks, improved resources, and a new public leaderboard.
BoolQ (Boolean Questions, Clark et al., 2019a) is a QA task where each example consists of a short passage and a yes/no question about the passage. The questions are anonymous, unsolicited queries from users of the Google search engine, each afterwards paired with a paragraph from a Wikipedia article containing the answer. Following the original work, we evaluate with accuracy.
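Each SuperGLUE task is exposed as a separate configuration of the dataset. A minimal sketch of loading the BoolQ task with the Hugging Face `datasets` library (assuming the library is installed):

```python
from datasets import load_dataset

# Each SuperGLUE task is a separate configuration of the "super_glue" dataset.
# Other tasks ("cb", "copa", "axb", "axg", ...) are selected by passing a
# different configuration name.
boolq = load_dataset("super_glue", "boolq")

print(boolq)  # DatasetDict with "train", "validation" and "test" splits
```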
Supported Tasks and Leaderboards
Languages
Dataset Structure
Data Instances
axb
- Size of downloaded dataset files: 0.03 MB
- Size of the generated dataset: 0.23 MB
- Total amount of disk used: 0.26 MB
An example of 'test' looks as follows.
axg
- Size of downloaded dataset files: 0.01 MB
- Size of the generated dataset: 0.05 MB
- Total amount of disk used: 0.06 MB
An example of 'test' looks as follows.
boolq
- Size of downloaded dataset files: 3.93 MB
- Size of the generated dataset: 9.92 MB
- Total amount of disk used: 13.85 MB
An example of 'train' looks as follows.
cb
- Size of downloaded dataset files: 0.07 MB
- Size of the generated dataset: 0.19 MB
- Total amount of disk used: 0.27 MB
An example of 'train' looks as follows.
copa
- Size of downloaded dataset files: 0.04 MB
- Size of the generated dataset: 0.12 MB
- Total amount of disk used: 0.16 MB
An example of 'train' looks as follows.
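To inspect an individual example, indexing a split returns a plain Python dict with the fields described in the next section. A small sketch using the COPA configuration:

```python
from datasets import load_dataset

copa = load_dataset("super_glue", "copa")

# Each row is a plain Python dict; for COPA it holds the premise, choice1,
# choice2, question, idx and label fields listed under "Data Fields" below.
print(copa["train"][0])
```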
Data Fields
The data fields are the same among all splits.
axb
- `sentence1`: a `string` feature.
- `sentence2`: a `string` feature.
- `idx`: an `int32` feature.
- `label`: a classification label, with possible values including `entailment` (0) and `not_entailment` (1).
axg
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `idx`: an `int32` feature.
- `label`: a classification label, with possible values including `entailment` (0) and `not_entailment` (1).
boolq
- `question`: a `string` feature.
- `passage`: a `string` feature.
- `idx`: an `int32` feature.
- `label`: a classification label, with possible values including `False` (0) and `True` (1).
cb
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `idx`: an `int32` feature.
- `label`: a classification label, with possible values including `entailment` (0), `contradiction` (1), and `neutral` (2).
copa
- `premise`: a `string` feature.
- `choice1`: a `string` feature.
- `choice2`: a `string` feature.
- `question`: a `string` feature.
- `idx`: an `int32` feature.
- `label`: a classification label, with possible values including `choice1` (0) and `choice2` (1).
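The integer label values listed above can be recovered programmatically from the `ClassLabel` feature. A minimal sketch, continuing with the COPA configuration loaded in the earlier example:

```python
features = copa["train"].features

# `label` is a ClassLabel feature; .names maps the integer ids listed above
# back to their string form (["choice1", "choice2"] for COPA).
print(features["label"].names)
print(features["label"].int2str(0))  # "choice1"
```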
Data Splits
axb
|     | test |
|-----|-----:|
| axb | 1104 |
axg
|     | test |
|-----|-----:|
| axg |  356 |
boolq
|       | train | validation | test |
|-------|------:|-----------:|-----:|
| boolq |  9427 |       3270 | 3245 |
cb
|    | train | validation | test |
|----|------:|-----------:|-----:|
| cb |   250 |         56 |  250 |
copa
|      | train | validation | test |
|------|------:|-----------:|-----:|
| copa |   400 |        100 |  500 |
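The split sizes in the tables above can be verified programmatically; a small sketch, again using the COPA configuration loaded earlier:

```python
# Iterate over the DatasetDict and print the number of rows per split;
# for COPA this should match the table above (train 400, validation 100, test 500).
for split_name, split in copa.items():
    print(split_name, split.num_rows)
```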
Dataset Creation
Curation Rationale
Source Data
Initial Data Collection and Normalization
Who are the source language producers?
Annotations
Annotation process
Who are the annotators?
Personal and Sensitive Information
Considerations for Using the Data
Social Impact of Dataset
Discussion of Biases
Other Known Limitations
Additional Information
Dataset Curators
Licensing Information
Citation Information
@inproceedings{clark2019boolq,
title={BoolQ: Exploring the Surprising Difficulty of Natural Yes/No Questions},
author={Clark, Christopher and Lee, Kenton and Chang, Ming-Wei and Kwiatkowski, Tom and Collins, Michael and Toutanova, Kristina},
booktitle={NAACL},
year={2019}
}
@article{wang2019superglue,
title={SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems},
author={Wang, Alex and Pruksachatkun, Yada and Nangia, Nikita and Singh, Amanpreet and Michael, Julian and Hill, Felix and Levy, Omer and Bowman, Samuel R},
journal={arXiv preprint arXiv:1905.00537},
year={2019}
}
Note that each SuperGLUE dataset has its own citation. Please see the source to
get the correct citation for each contained dataset.
Contributions
Thanks to @thomwolf, @lewtun, @patrickvonplaten for adding this dataset.