---
language:
- en
configs:
- config_name: BoolQ_train
data_files: BoolQ/train.jsonl
- config_name: BoolQ_unlabeled
data_files: BoolQ/unlabeled.jsonl
- config_name: CB_train
data_files: CB/train.jsonl
- config_name: CB_unlabeled
data_files: CB/unlabeled.jsonl
- config_name: COPA_train
data_files: COPA/train.jsonl
- config_name: COPA_unlabeled
data_files: COPA/unlabeled.jsonl
- config_name: MultiRC_train
data_files: MultiRC/train.jsonl
- config_name: MultiRC_unlabeled
data_files: MultiRC/unlabeled.jsonl
- config_name: RTE_train
data_files: RTE/train.jsonl
- config_name: RTE_unlabeled
data_files: RTE/unlabeled.jsonl
- config_name: ReCoRD_train
data_files: ReCoRD/train.jsonl
- config_name: ReCoRD_unlabeled
data_files: ReCoRD/unlabeled.jsonl
- config_name: WSC_train
data_files: WSC/train.jsonl
- config_name: WSC_unlabeled
data_files: WSC/unlabeled.jsonl
- config_name: WiC_train
data_files: WiC/train.jsonl
- config_name: WiC_unlabeled
data_files: WiC/unlabeled.jsonl
---
# [FewGLUE](https://arxiv.org/abs/2009.07118)
The FewGLUE dataset consists of a random selection of 32 training examples from the SuperGLUE training sets and up to 20,000 unlabeled examples for each SuperGLUE task.

Adapted from the [original repository](https://github.com/timoschick/fewglue).
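
Each task exposes two configs, `<task>_train` (the 32 labeled examples) and `<task>_unlabeled` (the unlabeled pool), as listed in the YAML header above. Below is a minimal loading sketch using the Hugging Face `datasets` library; the repository id is a placeholder, so substitute the actual dataset id hosting this card.

```python
from datasets import load_dataset

# "<namespace>/FewGLUE" is a placeholder repository id; replace it with
# the actual Hugging Face dataset id for this card.
boolq_train = load_dataset("<namespace>/FewGLUE", "BoolQ_train", split="train")
boolq_unlabeled = load_dataset("<namespace>/FewGLUE", "BoolQ_unlabeled", split="train")

print(len(boolq_train))   # expected: 32 labeled training examples
print(boolq_train[0])     # one BoolQ example as a Python dict
```

Because each config points to a single JSONL file, the loaded data lands in the default `train` split regardless of whether it is the labeled or the unlabeled config.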
## 📕 Citation
```bibtex
@article{schick2020small,
  title={It's Not Just Size That Matters: Small Language Models Are Also Few-Shot Learners},
  author={Timo Schick and Hinrich Schütze},
  journal={Computing Research Repository},
  volume={arXiv:2009.07118},
  url={http://arxiv.org/abs/2009.07118},
  year={2020}
}
```