Modalities: Text · Formats: parquet · Languages: English · Libraries: Datasets, Dask
VictorSanh committed
Commit 3444e31 (1 parent: 4f5bb3e)

update model card

Files changed (1): README.md +687 -10
README.md CHANGED
@@ -11,7 +11,7 @@ multilinguality:
  - monolingual
  pretty_name: P3
  size_categories:
- - 10M<n<100M
  task_categories:
  - other
  ---
@@ -66,7 +66,25 @@ The data in P3 are in English (BCP-47 `en`).
 
  An example of "train" looks as follows:
  ```bash
- TODO
  ```
 
  To check all the prompted examples, you can use the [Promptsource hosted tool](http://bigscience.huggingface.co/promptsource) and choose the `Prompted dataset viewer` mode in the left panel.
@@ -78,17 +96,676 @@ The data fields are the same among all splits:
  - `answer_choices`: the choices (in natural language) available to the model
  - `inputs_pretokenized`: the natural language input fed to the model
  - `targets_pretokenized`: the natural language target that the model has to generate
- - `inputs`: the tokenized input with T5's tokenizer
- - `targets`: the tokenized target with T5's tokenizer
- - `idx`: identifier of the (example, option) in the case of rank classification
- - `weight`: a weight for the example produced by seqio (always set to 1.0)
- - `is_correct`: whether the (example, option) is the correct one
 
  ### Data Splits
 
- |Data(sub)set|Split|Number of examples|
- |-|-|-|
- |WIP|WIP|WIP|
 
  ## Dataset Creation
 
 
  - monolingual
  pretty_name: P3
  size_categories:
+ - 100M<n<1B
  task_categories:
  - other
  ---
 
 
  An example of "train" looks as follows:
  ```bash
+ {
+ 'answer_choices': ['safe', 'trolley'],
+ 'inputs': [86, 8, 7142, 666, 6, 405, 8, 3, 834, 1518, 21, 1346, 42, 31682, 58, 37, 3, 929, 9, 3042, 63, 2765, 808, 8, 2045, 6448, 326, 13, 8, 31682, 11, 3, 24052, 135, 16, 8, 1346, 552, 8, 3, 834, 47, 6364, 5],
+ 'inputs_pretokenized': 'In the sentence below, does the _ stand for safe or trolley?\nThe treasury workers took the gold bars off of the trolley and stacked them in the safe until the _ was empty.',
+ 'targets': [31682, 1],
+ 'targets_pretokenized': '\ntrolley'
+ }
+ ```
+ 
+ In the case of rank classification (letting the model select as its prediction the option with the highest log-likelihood), an example looks as follows:
+ ```bash
+ {
+ 'idx': [5, 0],
+ 'inputs': [86, 8, 7142, 666, 6, 405, 8, 3, 834, 1518, 21, 19454, 42, 22227, 58, 19454, 744, 31, 17, 2112, 4553, 17742, 7, 12, 1953, 6, 298, 22227, 966, 373, 405, 5, 3, 834, 19, 72, 952, 12, 619, 16, 3, 9, 17742, 3298, 5],
+ 'inputs_pretokenized': "In the sentence below, does the _ stand for Kyle or Logan?\nKyle doesn't wear leg warmers to bed, while Logan almost always does. _ is more likely to live in a warmer climate.",
+ 'is_correct': True,
+ 'targets': [19454, 1],
+ 'targets_pretokenized': 'Kyle',
+ 'weight': 1.0
+ }
  ```
 
  To check all the prompted examples, you can use the [Promptsource hosted tool](http://bigscience.huggingface.co/promptsource) and choose the `Prompted dataset viewer` mode in the left panel.
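The rank-classification fields (`idx`, `is_correct`, `weight`) can be consumed as in this minimal sketch: group the (example, option) rows by example id and predict the highest-scoring option. The `log_likelihood` score and the toy rows are illustrative assumptions, not part of the dataset.

```python
from collections import defaultdict

def rank_classify(rows):
    """Group (example, option) rows by example id and predict, for each
    example, the option whose model score is highest."""
    by_example = defaultdict(list)
    for row in rows:
        example_id, option_id = row["idx"]  # idx = [example_id, answer_option_id]
        by_example[example_id].append((option_id, row))
    predictions = {}
    for example_id, options in by_example.items():
        # Pick the option with the highest log-likelihood under the model.
        best_option, best_row = max(options, key=lambda pair: pair[1]["log_likelihood"])
        predictions[example_id] = (best_option, best_row["is_correct"])
    return predictions

# Toy rows mirroring the P3 rank-classification fields; `log_likelihood`
# stands in for the score a model such as T5 would assign to each option.
rows = [
    {"idx": [5, 0], "is_correct": True,  "log_likelihood": -1.2},
    {"idx": [5, 1], "is_correct": False, "log_likelihood": -3.4},
]
preds = rank_classify(rows)
assert preds[5] == (0, True)  # the highest-scoring option is also the correct one
```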
 
  - `answer_choices`: the choices (in natural language) available to the model
  - `inputs_pretokenized`: the natural language input fed to the model
  - `targets_pretokenized`: the natural language target that the model has to generate
+ - `inputs`: the tokenized input with [T5](https://huggingface.co/google/t5-v1_1-base)'s tokenizer
+ - `targets`: the tokenized target with [T5](https://huggingface.co/google/t5-v1_1-base)'s tokenizer
+ - `idx`: identifier of the (example, answer_option_id) in the case of rank classification
+ - `weight`: a weight for the example produced by seqio (always set to 1.0 in practice)
+ - `is_correct`: whether the (example, answer_option_id) is the correct one
 
  ### Data Splits
 
+ |Data(sub)set|Number of examples per split|
+ |-|-|
+ |adversarial_qa_dbert_answer_the_following_q|{'train': 10000, 'validation': 1000}|
+ |adversarial_qa_dbert_based_on|{'train': 10000, 'validation': 1000}|
+ |adversarial_qa_dbert_generate_question|{'train': 10000, 'validation': 1000, 'test': 1000}|
+ |adversarial_qa_dbert_question_context_answer|{'train': 10000, 'validation': 1000}|
+ |adversarial_qa_dbert_tell_what_it_is|{'train': 10000, 'validation': 1000}|
+ |adversarial_qa_dbidaf_answer_the_following_q|{'train': 10000, 'validation': 1000}|
+ |adversarial_qa_dbidaf_based_on|{'train': 10000, 'validation': 1000}|
+ |adversarial_qa_dbidaf_generate_question|{'train': 10000, 'validation': 1000, 'test': 1000}|
+ |adversarial_qa_dbidaf_question_context_answer|{'train': 10000, 'validation': 1000}|
+ |adversarial_qa_dbidaf_tell_what_it_is|{'train': 10000, 'validation': 1000}|
+ |adversarial_qa_droberta_answer_the_following_q|{'train': 10000, 'validation': 1000}|
+ |adversarial_qa_droberta_based_on|{'train': 10000, 'validation': 1000}|
+ |adversarial_qa_droberta_generate_question|{'train': 10000, 'validation': 1000, 'test': 1000}|
+ |adversarial_qa_droberta_question_context_answer|{'train': 10000, 'validation': 1000}|
+ |adversarial_qa_droberta_tell_what_it_is|{'train': 10000, 'validation': 1000}|
+ |ag_news_classify|{'train': 120000, 'test': 7600}|
+ |ag_news_classify_question_first|{'train': 120000, 'test': 7600}|
+ |ag_news_classify_with_choices|{'train': 120000, 'test': 7600}|
+ |ag_news_classify_with_choices_question_first|{'train': 120000, 'test': 7600}|
+ |ag_news_recommend|{'train': 120000, 'test': 7600}|
+ |ag_news_which_section|{'train': 120000, 'test': 7600}|
+ |ag_news_which_section_choices|{'train': 120000, 'test': 7600}|
+ |ai2_arc_ARC_Challenge_heres_a_problem|{'train': 1119, 'validation': 299, 'test': 1172}|
+ |ai2_arc_ARC_Challenge_i_am_hesitating|{'train': 1119, 'validation': 299, 'test': 1172}|
+ |ai2_arc_ARC_Challenge_multiple_choice|{'train': 1119, 'validation': 299, 'test': 1172}|
+ |ai2_arc_ARC_Challenge_pick_false_options|{'train': 1119, 'validation': 299, 'test': 1172}|
+ |ai2_arc_ARC_Challenge_pick_the_most_correct_option|{'train': 1119, 'validation': 299, 'test': 1172}|
+ |ai2_arc_ARC_Challenge_qa_options|{'train': 1119, 'validation': 299, 'test': 1172}|
+ |ai2_arc_ARC_Easy_heres_a_problem|{'train': 2251, 'validation': 570, 'test': 2376}|
+ |ai2_arc_ARC_Easy_i_am_hesitating|{'train': 2251, 'validation': 570, 'test': 2376}|
+ |ai2_arc_ARC_Easy_multiple_choice|{'train': 2251, 'validation': 570, 'test': 2376}|
+ |ai2_arc_ARC_Easy_pick_false_options|{'train': 2251, 'validation': 570, 'test': 2376}|
+ |ai2_arc_ARC_Easy_pick_the_most_correct_option|{'train': 2251, 'validation': 570, 'test': 2376}|
+ |ai2_arc_ARC_Easy_qa_options|{'train': 2251, 'validation': 570, 'test': 2376}|
+ |amazon_polarity_Is_this_product_review_positive|{'train': 3600000, 'test': 400000}|
+ |amazon_polarity_Is_this_review|{'train': 3600000, 'test': 400000}|
+ |amazon_polarity_Is_this_review_negative|{'train': 3600000, 'test': 400000}|
+ |amazon_polarity_User_recommend_this_product|{'train': 3600000, 'test': 400000}|
+ |amazon_polarity_convey_negative_or_positive_sentiment|{'train': 3600000, 'test': 400000}|
+ |amazon_polarity_flattering_or_not|{'train': 3600000, 'test': 400000}|
+ |amazon_polarity_negative_or_positive_tone|{'train': 3600000, 'test': 400000}|
+ |amazon_polarity_user_satisfied|{'train': 3600000, 'test': 400000}|
+ |amazon_polarity_would_you_buy|{'train': 3600000, 'test': 400000}|
+ |anli_GPT_3_style_r1|{'train': 16946, 'validation': 1000, 'test': 1000}|
+ |anli_GPT_3_style_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}|
+ |anli_GPT_3_style_r2|{'train': 45460, 'validation': 1000, 'test': 1000}|
+ |anli_GPT_3_style_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}|
+ |anli_GPT_3_style_r3|{'train': 100459, 'validation': 1200, 'test': 1200}|
+ |anli_GPT_3_style_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}|
+ |anli_MNLI_crowdsource_r1|{'train': 16946, 'validation': 1000, 'test': 1000}|
+ |anli_MNLI_crowdsource_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}|
+ |anli_MNLI_crowdsource_r2|{'train': 45460, 'validation': 1000, 'test': 1000}|
+ |anli_MNLI_crowdsource_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}|
+ |anli_MNLI_crowdsource_r3|{'train': 100459, 'validation': 1200, 'test': 1200}|
+ |anli_MNLI_crowdsource_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}|
+ |anli_always_sometimes_never_r1|{'train': 16946, 'validation': 1000, 'test': 1000}|
+ |anli_always_sometimes_never_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}|
+ |anli_always_sometimes_never_r2|{'train': 45460, 'validation': 1000, 'test': 1000}|
+ |anli_always_sometimes_never_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}|
+ |anli_always_sometimes_never_r3|{'train': 100459, 'validation': 1200, 'test': 1200}|
+ |anli_always_sometimes_never_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}|
+ |anli_based_on_the_previous_passage_r1|{'train': 16946, 'validation': 1000, 'test': 1000}|
+ |anli_based_on_the_previous_passage_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}|
+ |anli_based_on_the_previous_passage_r2|{'train': 45460, 'validation': 1000, 'test': 1000}|
+ |anli_based_on_the_previous_passage_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}|
+ |anli_based_on_the_previous_passage_r3|{'train': 100459, 'validation': 1200, 'test': 1200}|
+ |anli_based_on_the_previous_passage_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}|
+ |anli_can_we_infer_r1|{'train': 16946, 'validation': 1000, 'test': 1000}|
+ |anli_can_we_infer_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}|
+ |anli_can_we_infer_r2|{'train': 45460, 'validation': 1000, 'test': 1000}|
+ |anli_can_we_infer_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}|
+ |anli_can_we_infer_r3|{'train': 100459, 'validation': 1200, 'test': 1200}|
+ |anli_can_we_infer_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}|
+ |anli_claim_true_false_inconclusive_r1|{'train': 16946, 'validation': 1000, 'test': 1000}|
+ |anli_claim_true_false_inconclusive_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}|
+ |anli_claim_true_false_inconclusive_r2|{'train': 45460, 'validation': 1000, 'test': 1000}|
+ |anli_claim_true_false_inconclusive_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}|
+ |anli_claim_true_false_inconclusive_r3|{'train': 100459, 'validation': 1200, 'test': 1200}|
+ |anli_claim_true_false_inconclusive_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}|
+ |anli_consider_always_sometimes_never_r1|{'train': 16946, 'validation': 1000, 'test': 1000}|
+ |anli_consider_always_sometimes_never_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}|
+ |anli_consider_always_sometimes_never_r2|{'train': 45460, 'validation': 1000, 'test': 1000}|
+ |anli_consider_always_sometimes_never_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}|
+ |anli_consider_always_sometimes_never_r3|{'train': 100459, 'validation': 1200, 'test': 1200}|
+ |anli_consider_always_sometimes_never_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}|
+ |anli_does_it_follow_that_r1|{'train': 16946, 'validation': 1000, 'test': 1000}|
+ |anli_does_it_follow_that_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}|
+ |anli_does_it_follow_that_r2|{'train': 45460, 'validation': 1000, 'test': 1000}|
+ |anli_does_it_follow_that_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}|
+ |anli_does_it_follow_that_r3|{'train': 100459, 'validation': 1200, 'test': 1200}|
+ |anli_does_it_follow_that_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}|
+ |anli_does_this_imply_r1|{'train': 16946, 'validation': 1000, 'test': 1000}|
+ |anli_does_this_imply_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}|
+ |anli_does_this_imply_r2|{'train': 45460, 'validation': 1000, 'test': 1000}|
+ |anli_does_this_imply_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}|
+ |anli_does_this_imply_r3|{'train': 100459, 'validation': 1200, 'test': 1200}|
+ |anli_does_this_imply_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}|
+ |anli_guaranteed_possible_impossible_r1|{'train': 16946, 'validation': 1000, 'test': 1000}|
+ |anli_guaranteed_possible_impossible_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}|
+ |anli_guaranteed_possible_impossible_r2|{'train': 45460, 'validation': 1000, 'test': 1000}|
+ |anli_guaranteed_possible_impossible_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}|
+ |anli_guaranteed_possible_impossible_r3|{'train': 100459, 'validation': 1200, 'test': 1200}|
+ |anli_guaranteed_possible_impossible_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}|
+ |anli_guaranteed_true_r1|{'train': 16946, 'validation': 1000, 'test': 1000}|
+ |anli_guaranteed_true_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}|
+ |anli_guaranteed_true_r2|{'train': 45460, 'validation': 1000, 'test': 1000}|
+ |anli_guaranteed_true_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}|
+ |anli_guaranteed_true_r3|{'train': 100459, 'validation': 1200, 'test': 1200}|
+ |anli_guaranteed_true_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}|
+ |anli_justified_in_saying_r1|{'train': 16946, 'validation': 1000, 'test': 1000}|
+ |anli_justified_in_saying_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}|
+ |anli_justified_in_saying_r2|{'train': 45460, 'validation': 1000, 'test': 1000}|
+ |anli_justified_in_saying_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}|
+ |anli_justified_in_saying_r3|{'train': 100459, 'validation': 1200, 'test': 1200}|
+ |anli_justified_in_saying_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}|
+ |anli_must_be_true_r1|{'train': 16946, 'validation': 1000, 'test': 1000}|
+ |anli_must_be_true_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}|
+ |anli_must_be_true_r2|{'train': 45460, 'validation': 1000, 'test': 1000}|
+ |anli_must_be_true_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}|
+ |anli_must_be_true_r3|{'train': 100459, 'validation': 1200, 'test': 1200}|
+ |anli_must_be_true_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}|
+ |anli_should_assume_r1|{'train': 16946, 'validation': 1000, 'test': 1000}|
+ |anli_should_assume_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}|
+ |anli_should_assume_r2|{'train': 45460, 'validation': 1000, 'test': 1000}|
+ |anli_should_assume_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}|
+ |anli_should_assume_r3|{'train': 100459, 'validation': 1200, 'test': 1200}|
+ |anli_should_assume_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}|
+ |anli_take_the_following_as_truth_r1|{'train': 16946, 'validation': 1000, 'test': 1000}|
+ |anli_take_the_following_as_truth_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}|
+ |anli_take_the_following_as_truth_r2|{'train': 45460, 'validation': 1000, 'test': 1000}|
+ |anli_take_the_following_as_truth_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}|
+ |anli_take_the_following_as_truth_r3|{'train': 100459, 'validation': 1200, 'test': 1200}|
+ |anli_take_the_following_as_truth_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}|
+ |app_reviews_categorize_rating_using_review|{'train': 288065}|
+ |app_reviews_convert_to_rating|{'train': 288065}|
+ |app_reviews_convert_to_star_rating|{'train': 288065}|
+ |app_reviews_generate_review|{'train': 288065}|
+ |cnn_dailymail_3.0.0_2_or_3_sentences|{'train': 287113, 'validation': 13368, 'test': 11490}|
+ |cnn_dailymail_3.0.0_generate_story|{'train': 287113, 'validation': 13368, 'test': 11490}|
+ |cnn_dailymail_3.0.0_news_card_view|{'train': 287113, 'validation': 13368, 'test': 11490}|
+ |cnn_dailymail_3.0.0_news_stock|{'train': 287113, 'validation': 13368, 'test': 11490}|
+ |cnn_dailymail_3.0.0_news_summary|{'train': 287113, 'validation': 13368, 'test': 11490}|
+ |cnn_dailymail_3.0.0_spice_up_story|{'train': 287113, 'validation': 13368, 'test': 11490}|
+ |cnn_dailymail_3.0.0_sum_in_brief|{'train': 287113, 'validation': 13368, 'test': 11490}|
+ |cnn_dailymail_3.0.0_tldr_summary|{'train': 287113, 'validation': 13368, 'test': 11490}|
+ |cnn_dailymail_3.0.0_write_an_outline|{'train': 287113, 'validation': 13368, 'test': 11490}|
+ |common_gen_Example_prompt|{'train': 67389, 'validation': 4018, 'test': 1497}|
+ |common_gen_Given_concepts_type_1|{'train': 67389, 'validation': 4018, 'test': 1497}|
+ |common_gen_Given_concepts_type_2|{'train': 67389, 'validation': 4018, 'test': 1497}|
+ |common_gen_Put_together|{'train': 67389, 'validation': 4018, 'test': 1497}|
+ |common_gen_choice_in_concept_centric_sentence_generation|{'train': 67389, 'validation': 4018, 'test': 1497}|
+ |common_gen_random_task_template_prompt|{'train': 67389, 'validation': 4018, 'test': 1497}|
+ |common_gen_sentence_to_concepts|{'train': 67389, 'validation': 4018, 'test': 1497}|
+ |common_gen_topic_to_sentence|{'train': 67389, 'validation': 4018, 'test': 1497}|
+ |common_gen_topics_from_the_sentence|{'train': 67389, 'validation': 4018, 'test': 1497}|
+ |cos_e_v1.11_aligned_with_common_sense|{'train': 9741, 'validation': 1221}|
+ |cos_e_v1.11_description_question_option_id|{'train': 9741, 'validation': 1221}|
+ |cos_e_v1.11_description_question_option_text|{'train': 9741, 'validation': 1221}|
+ |cos_e_v1.11_explain_why_human|{'train': 9741, 'validation': 1221}|
+ |cos_e_v1.11_generate_explanation_given_text|{'train': 9741, 'validation': 1221}|
+ |cos_e_v1.11_i_think|{'train': 9741, 'validation': 1221}|
+ |cos_e_v1.11_question_description_option_id|{'train': 9741, 'validation': 1221}|
+ |cos_e_v1.11_question_description_option_text|{'train': 9741, 'validation': 1221}|
+ |cos_e_v1.11_question_option_description_id|{'train': 9741, 'validation': 1221}|
+ |cos_e_v1.11_question_option_description_text|{'train': 9741, 'validation': 1221}|
+ |cos_e_v1.11_rationale|{'train': 9741, 'validation': 1221}|
+ |cosmos_qa_context_answer_to_question|{'train': 25262, 'validation': 2985, 'test': 6963}|
+ |cosmos_qa_context_description_question_answer_id|{'train': 25262, 'validation': 2985, 'test': 6963}|
+ |cosmos_qa_context_description_question_answer_text|{'train': 25262, 'validation': 2985, 'test': 6963}|
+ |cosmos_qa_context_description_question_text|{'train': 25262, 'validation': 2985, 'test': 6963}|
+ |cosmos_qa_context_question_description_answer_id|{'train': 25262, 'validation': 2985, 'test': 6963}|
+ |cosmos_qa_context_question_description_answer_text|{'train': 25262, 'validation': 2985, 'test': 6963}|
+ |cosmos_qa_context_question_description_text|{'train': 25262, 'validation': 2985, 'test': 6963}|
+ |cosmos_qa_description_context_question_answer_id|{'train': 25262, 'validation': 2985, 'test': 6963}|
+ |cosmos_qa_description_context_question_answer_text|{'train': 25262, 'validation': 2985, 'test': 6963}|
+ |cosmos_qa_description_context_question_text|{'train': 25262, 'validation': 2985, 'test': 6963}|
+ |cosmos_qa_no_prompt_id|{'train': 25262, 'validation': 2985, 'test': 6963}|
+ |cosmos_qa_no_prompt_text|{'train': 25262, 'validation': 2985, 'test': 6963}|
+ |cosmos_qa_only_question_answer|{'train': 25262, 'validation': 2985, 'test': 6963}|
+ |dbpedia_14_given_a_choice_of_categories_|{'train': 560000, 'test': 70000}|
+ |dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to|{'train': 560000, 'test': 70000}|
+ |dbpedia_14_given_list_what_category_does_the_paragraph_belong_to|{'train': 560000, 'test': 70000}|
+ |dbpedia_14_pick_one_category_for_the_following_text|{'train': 560000, 'test': 70000}|
+ |dream_answer_to_dialogue|{'train': 6116, 'validation': 2040, 'test': 2041}|
+ |dream_baseline|{'train': 6116, 'validation': 2040, 'test': 2041}|
+ |dream_generate_first_utterance|{'train': 6116, 'validation': 2040, 'test': 2041}|
+ |dream_generate_last_utterance|{'train': 6116, 'validation': 2040, 'test': 2041}|
+ |dream_read_the_following_conversation_and_answer_the_question|{'train': 6116, 'validation': 2040, 'test': 2041}|
+ |duorc_ParaphraseRC_answer_question|{'train': 69524, 'validation': 15591, 'test': 15857}|
+ |duorc_ParaphraseRC_build_story_around_qa|{'train': 58752, 'validation': 13111, 'test': 13449}|
+ |duorc_ParaphraseRC_decide_worth_it|{'train': 69524, 'validation': 15591, 'test': 15857}|
+ |duorc_ParaphraseRC_extract_answer|{'train': 69524, 'validation': 15591, 'test': 15857}|
+ |duorc_ParaphraseRC_generate_question|{'train': 69524, 'validation': 15591, 'test': 15857}|
+ |duorc_ParaphraseRC_generate_question_by_answer|{'train': 58752, 'validation': 13111, 'test': 13449}|
+ |duorc_ParaphraseRC_movie_director|{'train': 69524, 'validation': 15591, 'test': 15857}|
+ |duorc_ParaphraseRC_question_answering|{'train': 69524, 'validation': 15591, 'test': 15857}|
+ |duorc_ParaphraseRC_title_generation|{'train': 69524, 'validation': 15591, 'test': 15857}|
+ |duorc_SelfRC_answer_question|{'train': 60721, 'validation': 12961, 'test': 12559}|
+ |duorc_SelfRC_build_story_around_qa|{'train': 60094, 'validation': 12845, 'test': 12415}|
+ |duorc_SelfRC_decide_worth_it|{'train': 60721, 'validation': 12961, 'test': 12559}|
+ |duorc_SelfRC_extract_answer|{'train': 60721, 'validation': 12961, 'test': 12559}|
+ |duorc_SelfRC_generate_question|{'train': 60721, 'validation': 12961, 'test': 12559}|
+ |duorc_SelfRC_generate_question_by_answer|{'train': 60094, 'validation': 12845, 'test': 12415}|
+ |duorc_SelfRC_movie_director|{'train': 60721, 'validation': 12961, 'test': 12559}|
+ |duorc_SelfRC_question_answering|{'train': 60721, 'validation': 12961, 'test': 12559}|
+ |duorc_SelfRC_title_generation|{'train': 60721, 'validation': 12961, 'test': 12559}|
+ |gigaword_TLDR|{'train': 3803957, 'validation': 189651, 'test': 1951}|
+ |gigaword_first_sentence_title|{'train': 3803957, 'validation': 189651, 'test': 1951}|
+ |gigaword_generate_summary_for_this|{'train': 3803957, 'validation': 189651, 'test': 1951}|
+ |gigaword_in_a_nutshell|{'train': 3803957, 'validation': 189651, 'test': 1951}|
+ |gigaword_make_a_title|{'train': 3803957, 'validation': 189651, 'test': 1951}|
+ |gigaword_reverse_writing|{'train': 3803957, 'validation': 189651, 'test': 1951}|
+ |gigaword_write_a_title_for_this_sentence|{'train': 3803957, 'validation': 189651, 'test': 1951}|
+ |gigaword_write_an_article|{'train': 3803957, 'validation': 189651, 'test': 1951}|
+ |gigaword_write_its_sentence|{'train': 3803957, 'validation': 189651, 'test': 1951}|
+ |glue_mrpc_equivalent|{'train': 3668, 'validation': 408, 'test': 1725}|
+ |glue_mrpc_generate_paraphrase|{'train': 2474, 'validation': 279, 'test': 1147}|
+ |glue_mrpc_generate_sentence|{'train': 2474, 'validation': 279, 'test': 1147}|
+ |glue_mrpc_paraphrase|{'train': 3668, 'validation': 408, 'test': 1725}|
+ |glue_mrpc_replace|{'train': 3668, 'validation': 408, 'test': 1725}|
+ |glue_mrpc_same_thing|{'train': 3668, 'validation': 408, 'test': 1725}|
+ |glue_mrpc_want_to_know|{'train': 3668, 'validation': 408, 'test': 1725}|
+ |glue_qqp_answer|{'train': 363846, 'validation': 40430, 'test': 390965}|
+ |glue_qqp_duplicate|{'train': 363846, 'validation': 40430, 'test': 390965}|
+ |glue_qqp_duplicate_or_not|{'train': 363846, 'validation': 40430, 'test': 390965}|
+ |glue_qqp_meaning|{'train': 363846, 'validation': 40430, 'test': 390965}|
+ |glue_qqp_quora|{'train': 363846, 'validation': 40430, 'test': 390965}|
+ |glue_qqp_same_thing|{'train': 363846, 'validation': 40430, 'test': 390965}|
+ |hellaswag_Appropriate_continuation_Yes_or_No|{'train': 39905, 'validation': 10042, 'test': 10003}|
+ |hellaswag_Open_ended_completion|{'train': 39905, 'validation': 10042, 'test': 10003}|
+ |hellaswag_Open_ended_start|{'train': 39905, 'validation': 10042, 'test': 10003}|
+ |hellaswag_Predict_ending_with_hint|{'train': 39905, 'validation': 10042, 'test': 10003}|
+ |hellaswag_Predict_ending_with_hint_score_eval|{'train': 159620, 'validation': 40168, 'test': 40012}|
+ |hellaswag_Randomized_prompts_template|{'train': 39905, 'validation': 10042, 'test': 10003}|
+ |hellaswag_Randomized_prompts_template_score_eval|{'train': 159620, 'validation': 40168, 'test': 40012}|
+ |hellaswag_Reversed_appropriate_continuation_Yes_or_No|{'train': 39905, 'validation': 10042, 'test': 10003}|
+ |hellaswag_Topic_of_the_context|{'train': 39905, 'validation': 10042, 'test': 10003}|
+ |hellaswag_Topic_without_the_ending_answer|{'train': 39905, 'validation': 10042, 'test': 10003}|
+ |hellaswag_complete_first_then|{'train': 39905, 'validation': 10042, 'test': 10003}|
+ |hellaswag_complete_first_then_score_eval|{'train': 159620, 'validation': 40168, 'test': 40012}|
+ |hellaswag_how_ends|{'train': 39905, 'validation': 10042, 'test': 10003}|
+ |hellaswag_if_begins_how_continues|{'train': 39905, 'validation': 10042, 'test': 10003}|
+ |hellaswag_if_begins_how_continues_score_eval|{'train': 159620, 'validation': 40168, 'test': 40012}|
+ |imdb_Movie_Expressed_Sentiment|{'train': 25000, 'test': 25000, 'unsupervised': 50000}|
+ |imdb_Movie_Expressed_Sentiment_2|{'train': 25000, 'test': 25000, 'unsupervised': 50000}|
+ |imdb_Negation_template_for_positive_and_negative|{'train': 25000, 'test': 25000, 'unsupervised': 50000}|
+ |imdb_Reviewer_Enjoyment|{'train': 25000, 'test': 25000, 'unsupervised': 50000}|
+ |imdb_Reviewer_Enjoyment_Yes_No|{'train': 25000, 'test': 25000, 'unsupervised': 50000}|
+ |imdb_Reviewer_Expressed_Sentiment|{'train': 25000, 'test': 25000, 'unsupervised': 50000}|
+ |imdb_Reviewer_Opinion_bad_good_choices|{'train': 25000, 'test': 25000, 'unsupervised': 50000}|
+ |imdb_Reviewer_Sentiment_Feeling|{'train': 25000, 'test': 25000, 'unsupervised': 50000}|
+ |imdb_Sentiment_with_choices_|{'train': 25000, 'test': 25000, 'unsupervised': 50000}|
+ |imdb_Text_Expressed_Sentiment|{'train': 25000, 'test': 25000, 'unsupervised': 50000}|
+ |imdb_Writer_Expressed_Sentiment|{'train': 25000, 'test': 25000, 'unsupervised': 50000}|
+ |kilt_tasks_hotpotqa_combining_facts|{'train': 88869, 'validation': 5600}|
+ |kilt_tasks_hotpotqa_complex_question|{'train': 88869, 'validation': 5600}|
+ |kilt_tasks_hotpotqa_final_exam|{'train': 88869, 'validation': 5600}|
+ |kilt_tasks_hotpotqa_formulate|{'train': 88869, 'validation': 5600}|
+ |kilt_tasks_hotpotqa_straighforward_qa|{'train': 88869, 'validation': 5600}|
+ |multi_news_distill|{'train': 44972, 'validation': 5622, 'test': 5622}|
+ |multi_news_expand_reverse_task_|{'train': 44972, 'validation': 5622, 'test': 5622}|
+ |multi_news_summarize|{'train': 44972, 'validation': 5622, 'test': 5622}|
+ |multi_news_summary_scenario|{'train': 44972, 'validation': 5622, 'test': 5622}|
+ |multi_news_synthesize|{'train': 44972, 'validation': 5622, 'test': 5622}|
+ |multi_news_what_are_the_key_points|{'train': 44972, 'validation': 5622, 'test': 5622}|
+ |openbookqa_main_choices|{'train': 4957, 'validation': 500, 'test': 500}|
+ |openbookqa_main_choose_an_answer_with_options|{'train': 4957, 'validation': 500, 'test': 500}|
+ |openbookqa_main_only_options|{'train': 4957, 'validation': 500, 'test': 500}|
+ |openbookqa_main_pick_answer_with_options|{'train': 4957, 'validation': 500, 'test': 500}|
+ |openbookqa_main_pick_using_id|{'train': 4957, 'validation': 500, 'test': 500}|
+ |openbookqa_main_which_correct|{'train': 4957, 'validation': 500, 'test': 500}|
+ |openbookqa_main_which_correct_inverse|{'train': 4957, 'validation': 500, 'test': 500}|
+ |paws_labeled_final_Concatenation|{'train': 49401, 'validation': 8000, 'test': 8000}|
+ |paws_labeled_final_Concatenation_no_label|{'train': 49401, 'validation': 8000, 'test': 8000}|
+ |paws_labeled_final_Meaning|{'train': 49401, 'validation': 8000, 'test': 8000}|
+ |paws_labeled_final_Meaning_no_label|{'train': 49401, 'validation': 8000, 'test': 8000}|
+ |paws_labeled_final_PAWS_ANLI_GPT3|{'train': 49401, 'validation': 8000, 'test': 8000}|
+ |paws_labeled_final_PAWS_ANLI_GPT3_no_label|{'train': 49401, 'validation': 8000, 'test': 8000}|
+ |paws_labeled_final_Rewrite|{'train': 49401, 'validation': 8000, 'test': 8000}|
+ |paws_labeled_final_Rewrite_no_label|{'train': 49401, 'validation': 8000, 'test': 8000}|
+ |paws_labeled_final_context_question|{'train': 49401, 'validation': 8000, 'test': 8000}|
+ |paws_labeled_final_context_question_no_label|{'train': 49401, 'validation': 8000, 'test': 8000}|
+ |paws_labeled_final_paraphrase_task|{'train': 21829, 'validation': 3539, 'test': 3536}|
+ |paws_labeled_final_task_description_no_label|{'train': 49401, 'validation': 8000, 'test': 8000}|
+ |piqa_Correct_the_solution|{'train': 16113, 'validation': 1838, 'test': 3084}|
+ |piqa_Correct_the_solution_if_false_from_sol_1|{'train': 16113, 'validation': 1838, 'test': 3084}|
+ |piqa_Correct_the_solution_if_false_from_sol_2|{'train': 16113, 'validation': 1838, 'test': 3084}|
+ |piqa_Does_this_solution_make_sense_sol1|{'train': 16113, 'validation': 1838, 'test': 3084}|
+ |piqa_Does_this_solution_make_sense_sol2|{'train': 16113, 'validation': 1838, 'test': 3084}|
+ |piqa_choose_the_most_appropriate_solution|{'train': 16113, 'validation': 1838, 'test': 3084}|
+ |piqa_finish_sentence_with_correct_choice|{'train': 16113, 'validation': 1838, 'test': 3084}|
+ |piqa_no_prompt_needed|{'train': 16113, 'validation': 1838, 'test': 3084}|
+ |piqa_pick_correct_choice_index|{'train': 16113, 'validation': 1838, 'test': 3084}|
+ |piqa_pick_correct_choice_with_choice_given_before_goal|{'train': 16113, 'validation': 1838, 'test': 3084}|
+ |piqa_what_is_the_correct_ending|{'train': 16113, 'validation': 1838, 'test': 3084}|
+ |qasc_is_correct_1|{'train': 8134, 'validation': 926, 'test': 920}|
+ |qasc_is_correct_2|{'train': 8134, 'validation': 926, 'test': 920}|
+ |qasc_qa_with_combined_facts_1|{'train': 8134, 'validation': 926, 'test': 920}|
+ |qasc_qa_with_separated_facts_1|{'train': 8134, 'validation': 926, 'test': 920}|
+ |qasc_qa_with_separated_facts_2|{'train': 8134, 'validation': 926, 'test': 920}|
+ |qasc_qa_with_separated_facts_3|{'train': 8134, 'validation': 926, 'test': 920}|
+ |qasc_qa_with_separated_facts_4|{'train': 8134, 'validation': 926, 'test': 920}|
+ |qasc_qa_with_separated_facts_5|{'train': 8134, 'validation': 926, 'test': 920}|
+ |quail_context_description_question_answer_id|{'train': 10246, 'validation': 2164, 'challenge': 556}|
+ |quail_context_description_question_answer_text|{'train': 10246, 'validation': 2164, 'challenge': 556}|
+ |quail_context_description_question_text|{'train': 10246, 'validation': 2164, 'challenge': 556}|
+ |quail_context_question_answer_description_id|{'train': 10246, 'validation': 2164, 'challenge': 556}|
+ |quail_context_question_answer_description_text|{'train': 10246, 'validation': 2164, 'challenge': 556}|
+ |quail_context_question_description_answer_id|{'train': 10246, 'validation': 2164, 'challenge': 556}|
+ |quail_context_question_description_answer_text|{'train': 10246, 'validation': 2164, 'challenge': 556}|
+ |quail_context_question_description_text|{'train': 10246, 'validation': 2164, 'challenge': 556}|
420
+ |quail_description_context_question_answer_id|{'train': 10246, 'validation': 2164, 'challenge': 556}|
421
+ |quail_description_context_question_answer_text|{'train': 10246, 'validation': 2164, 'challenge': 556}|
422
+ |quail_description_context_question_text|{'train': 10246, 'validation': 2164, 'challenge': 556}|
423
+ |quail_no_prompt_id|{'train': 10246, 'validation': 2164, 'challenge': 556}|
424
+ |quail_no_prompt_text|{'train': 10246, 'validation': 2164, 'challenge': 556}|
425
+ |quarel_choose_between|{'train': 1941, 'validation': 278, 'test': 552}|
426
+ |quarel_do_not_use|{'train': 1941, 'validation': 278, 'test': 552}|
427
+ |quarel_heres_a_story|{'train': 1941, 'validation': 278, 'test': 552}|
428
|quarel_logic_test|{'train': 1941, 'validation': 278, 'test': 552}|
|quarel_testing_students|{'train': 1941, 'validation': 278, 'test': 552}|
|quartz_answer_question_based_on|{'train': 2696, 'validation': 384, 'test': 784}|
|quartz_answer_question_below|{'train': 2696, 'validation': 384, 'test': 784}|
|quartz_given_the_fact_answer_the_q|{'train': 2696, 'validation': 384, 'test': 784}|
|quartz_having_read_above_passage|{'train': 2696, 'validation': 384, 'test': 784}|
|quartz_paragraph_question_plain_concat|{'train': 2696, 'validation': 384, 'test': 784}|
|quartz_read_passage_below_choose|{'train': 2696, 'validation': 384, 'test': 784}|
|quartz_use_info_from_paragraph_question|{'train': 2696, 'validation': 384, 'test': 784}|
|quartz_use_info_from_question_paragraph|{'train': 2696, 'validation': 384, 'test': 784}|
|quoref_Answer_Friend_Question|{'train': 19399, 'validation': 2418}|
|quoref_Answer_Question_Given_Context|{'train': 19399, 'validation': 2418}|
|quoref_Answer_Test|{'train': 19399, 'validation': 2418}|
|quoref_Context_Contains_Answer|{'train': 19399, 'validation': 2418}|
|quoref_Find_Answer|{'train': 19399, 'validation': 2418}|
|quoref_Found_Context_Online|{'train': 19399, 'validation': 2418}|
|quoref_Given_Context_Answer_Question|{'train': 19399, 'validation': 2418}|
|quoref_Guess_Answer|{'train': 19399, 'validation': 2418}|
|quoref_Guess_Title_For_Context|{'train': 19399, 'validation': 2418}|
|quoref_Read_And_Extract_|{'train': 19399, 'validation': 2418}|
|quoref_What_Is_The_Answer|{'train': 19399, 'validation': 2418}|
|race_high_Is_this_the_right_answer|{'train': 62445, 'validation': 3451, 'test': 3498}|
|race_high_Read_the_article_and_answer_the_question_no_option_|{'train': 62445, 'validation': 3451, 'test': 3498}|
|race_high_Select_the_best_answer|{'train': 62445, 'validation': 3451, 'test': 3498}|
|race_high_Select_the_best_answer_generate_span_|{'train': 62445, 'validation': 3451, 'test': 3498}|
|race_high_Select_the_best_answer_no_instructions_|{'train': 62445, 'validation': 3451, 'test': 3498}|
|race_high_Taking_a_test|{'train': 62445, 'validation': 3451, 'test': 3498}|
|race_high_Write_a_multi_choice_question_for_the_following_article|{'train': 62445, 'validation': 3451, 'test': 3498}|
|race_high_Write_a_multi_choice_question_options_given_|{'train': 62445, 'validation': 3451, 'test': 3498}|
|race_middle_Is_this_the_right_answer|{'train': 25421, 'validation': 1436, 'test': 1436}|
|race_middle_Read_the_article_and_answer_the_question_no_option_|{'train': 25421, 'validation': 1436, 'test': 1436}|
|race_middle_Select_the_best_answer|{'train': 25421, 'validation': 1436, 'test': 1436}|
|race_middle_Select_the_best_answer_generate_span_|{'train': 25421, 'validation': 1436, 'test': 1436}|
|race_middle_Select_the_best_answer_no_instructions_|{'train': 25421, 'validation': 1436, 'test': 1436}|
|race_middle_Taking_a_test|{'train': 25421, 'validation': 1436, 'test': 1436}|
|race_middle_Write_a_multi_choice_question_for_the_following_article|{'train': 25421, 'validation': 1436, 'test': 1436}|
|race_middle_Write_a_multi_choice_question_options_given_|{'train': 25421, 'validation': 1436, 'test': 1436}|
|ropes_background_new_situation_answer|{'train': 10924, 'validation': 1688}|
|ropes_background_situation_middle|{'train': 10924, 'validation': 1688}|
|ropes_given_background_situation|{'train': 10924, 'validation': 1688}|
|ropes_new_situation_background_answer|{'train': 10924, 'validation': 1688}|
|ropes_plain_background_situation|{'train': 10924, 'validation': 1688}|
|ropes_plain_bottom_hint|{'train': 10924, 'validation': 1688}|
|ropes_plain_no_background|{'train': 10924, 'validation': 1688}|
|ropes_prompt_beginning|{'train': 10924, 'validation': 1688}|
|ropes_prompt_bottom_hint_beginning|{'train': 10924, 'validation': 1688}|
|ropes_prompt_bottom_no_hint|{'train': 10924, 'validation': 1688}|
|ropes_prompt_mix|{'train': 10924, 'validation': 1688}|
|ropes_read_background_situation|{'train': 10924, 'validation': 1688}|
|rotten_tomatoes_Movie_Expressed_Sentiment|{'train': 8530, 'validation': 1066, 'test': 1066}|
|rotten_tomatoes_Movie_Expressed_Sentiment_2|{'train': 8530, 'validation': 1066, 'test': 1066}|
|rotten_tomatoes_Reviewer_Enjoyment|{'train': 8530, 'validation': 1066, 'test': 1066}|
|rotten_tomatoes_Reviewer_Enjoyment_Yes_No|{'train': 8530, 'validation': 1066, 'test': 1066}|
|rotten_tomatoes_Reviewer_Expressed_Sentiment|{'train': 8530, 'validation': 1066, 'test': 1066}|
|rotten_tomatoes_Reviewer_Opinion_bad_good_choices|{'train': 8530, 'validation': 1066, 'test': 1066}|
|rotten_tomatoes_Reviewer_Sentiment_Feeling|{'train': 8530, 'validation': 1066, 'test': 1066}|
|rotten_tomatoes_Sentiment_with_choices_|{'train': 8530, 'validation': 1066, 'test': 1066}|
|rotten_tomatoes_Text_Expressed_Sentiment|{'train': 8530, 'validation': 1066, 'test': 1066}|
|rotten_tomatoes_Writer_Expressed_Sentiment|{'train': 8530, 'validation': 1066, 'test': 1066}|
|samsum_Generate_a_summary_for_this_dialogue|{'train': 14732, 'validation': 818, 'test': 819}|
|samsum_Given_the_above_dialogue_write_a_summary|{'train': 14732, 'validation': 818, 'test': 819}|
|samsum_Sum_up_the_following_dialogue|{'train': 14732, 'validation': 818, 'test': 819}|
|samsum_Summarize_|{'train': 14732, 'validation': 818, 'test': 819}|
|samsum_Summarize_this_dialogue_|{'train': 14732, 'validation': 818, 'test': 819}|
|samsum_To_sum_up_this_dialog|{'train': 14732, 'validation': 818, 'test': 819}|
|samsum_Write_a_dialogue_that_match_this_summary|{'train': 14732, 'validation': 818, 'test': 819}|
|sciq_Direct_Question|{'train': 11679, 'validation': 1000, 'test': 1000}|
|sciq_Direct_Question_Closed_Book_|{'train': 11679, 'validation': 1000, 'test': 1000}|
|sciq_Multiple_Choice|{'train': 11679, 'validation': 1000, 'test': 1000}|
|sciq_Multiple_Choice_Closed_Book_|{'train': 11679, 'validation': 1000, 'test': 1000}|
|sciq_Multiple_Choice_Question_First|{'train': 11679, 'validation': 1000, 'test': 1000}|
|social_i_qa_Check_if_a_random_answer_is_valid_or_not|{'train': 33410, 'validation': 1954}|
|social_i_qa_Generate_answer|{'train': 33410, 'validation': 1954}|
|social_i_qa_Generate_the_question_from_the_answer|{'train': 33410, 'validation': 1954}|
|social_i_qa_I_was_wondering|{'train': 33410, 'validation': 1954}|
|social_i_qa_Show_choices_and_generate_answer|{'train': 33410, 'validation': 1954}|
|social_i_qa_Show_choices_and_generate_index|{'train': 33410, 'validation': 1954}|
|squad_v2_Jeopardy_with_Context|{'train': 86821, 'validation': 5928}|
|squad_v2_Jeopardy_without_Context|{'train': 86821, 'validation': 5928}|
|squad_v2_Questions_with_Context|{'train': 130319, 'validation': 11873}|
|squad_v2_Questions_with_Context_Without_Prompt_Keywords|{'train': 130319, 'validation': 11873}|
|squad_v2_Questions_with_Context_Without_Prompt_Keywords_unanswerable|{'train': 130319, 'validation': 11873}|
|squad_v2_Questions_with_Context_unanswerable|{'train': 130319, 'validation': 11873}|
|squad_v2_Topic_Prediction_Context|{'train': 130319, 'validation': 11873}|
|squad_v2_Topic_Prediction_Context_with_randomized_prompt_options|{'train': 130319, 'validation': 11873}|
|squad_v2_Topic_Prediction_Context_with_randomized_prompt_options_placed_in_the_end|{'train': 130319, 'validation': 11873}|
|squad_v2_Topic_Prediction_Question_and_Answer_Pair|{'train': 86821, 'validation': 5928}|
|squad_v2_Trivia|{'train': 86821, 'validation': 5928}|
|squad_v2_Unanwerable_question|{'train': 130319, 'validation': 11873}|
|super_glue_boolq_GPT_3_Style|{'train': 9427, 'validation': 3270, 'test': 3245}|
|super_glue_boolq_I_wonder_|{'train': 9427, 'validation': 3270, 'test': 3245}|
|super_glue_boolq_after_reading|{'train': 9427, 'validation': 3270, 'test': 3245}|
|super_glue_boolq_based_on_the_following_passage|{'train': 9427, 'validation': 3270, 'test': 3245}|
|super_glue_boolq_based_on_the_previous_passage|{'train': 9427, 'validation': 3270, 'test': 3245}|
|super_glue_boolq_could_you_tell_me_|{'train': 9427, 'validation': 3270, 'test': 3245}|
|super_glue_boolq_exam|{'train': 9427, 'validation': 3270, 'test': 3245}|
|super_glue_boolq_exercise|{'train': 9427, 'validation': 3270, 'test': 3245}|
|super_glue_boolq_valid_binary|{'train': 9427, 'validation': 3270, 'test': 3245}|
|super_glue_boolq_yes_no_question|{'train': 9427, 'validation': 3270, 'test': 3245}|
|super_glue_cb_GPT_3_style|{'train': 250, 'validation': 56, 'test': 250}|
|super_glue_cb_GPT_3_style_score_eval|{'train': 750, 'validation': 168, 'test': 750}|
|super_glue_cb_MNLI_crowdsource|{'train': 250, 'validation': 56, 'test': 250}|
|super_glue_cb_MNLI_crowdsource_score_eval|{'train': 750, 'validation': 168, 'test': 750}|
|super_glue_cb_always_sometimes_never|{'train': 250, 'validation': 56, 'test': 250}|
|super_glue_cb_always_sometimes_never_score_eval|{'train': 750, 'validation': 168, 'test': 750}|
|super_glue_cb_based_on_the_previous_passage|{'train': 250, 'validation': 56, 'test': 250}|
|super_glue_cb_based_on_the_previous_passage_score_eval|{'train': 750, 'validation': 168, 'test': 750}|
|super_glue_cb_can_we_infer|{'train': 250, 'validation': 56, 'test': 250}|
|super_glue_cb_can_we_infer_score_eval|{'train': 750, 'validation': 168, 'test': 750}|
|super_glue_cb_claim_true_false_inconclusive|{'train': 250, 'validation': 56, 'test': 250}|
|super_glue_cb_claim_true_false_inconclusive_score_eval|{'train': 750, 'validation': 168, 'test': 750}|
|super_glue_cb_consider_always_sometimes_never|{'train': 250, 'validation': 56, 'test': 250}|
|super_glue_cb_consider_always_sometimes_never_score_eval|{'train': 750, 'validation': 168, 'test': 750}|
|super_glue_cb_does_it_follow_that|{'train': 250, 'validation': 56, 'test': 250}|
|super_glue_cb_does_it_follow_that_score_eval|{'train': 750, 'validation': 168, 'test': 750}|
|super_glue_cb_does_this_imply|{'train': 250, 'validation': 56, 'test': 250}|
|super_glue_cb_does_this_imply_score_eval|{'train': 750, 'validation': 168, 'test': 750}|
|super_glue_cb_guaranteed_possible_impossible|{'train': 250, 'validation': 56, 'test': 250}|
|super_glue_cb_guaranteed_possible_impossible_score_eval|{'train': 750, 'validation': 168, 'test': 750}|
|super_glue_cb_guaranteed_true|{'train': 250, 'validation': 56, 'test': 250}|
|super_glue_cb_guaranteed_true_score_eval|{'train': 750, 'validation': 168, 'test': 750}|
|super_glue_cb_justified_in_saying|{'train': 250, 'validation': 56, 'test': 250}|
|super_glue_cb_justified_in_saying_score_eval|{'train': 750, 'validation': 168, 'test': 750}|
|super_glue_cb_must_be_true|{'train': 250, 'validation': 56, 'test': 250}|
|super_glue_cb_must_be_true_score_eval|{'train': 750, 'validation': 168, 'test': 750}|
|super_glue_cb_should_assume|{'train': 250, 'validation': 56, 'test': 250}|
|super_glue_cb_should_assume_score_eval|{'train': 750, 'validation': 168, 'test': 750}|
|super_glue_cb_take_the_following_as_truth|{'train': 250, 'validation': 56, 'test': 250}|
|super_glue_cb_take_the_following_as_truth_score_eval|{'train': 750, 'validation': 168, 'test': 750}|
|super_glue_copa_C1_or_C2_premise_so_because_|{'train': 400, 'validation': 100, 'test': 500}|
|super_glue_copa_C1_or_C2_premise_so_because__score_eval|{'train': 800, 'validation': 200, 'test': 1000}|
|super_glue_copa__As_a_result_C1_or_C2_|{'train': 202, 'validation': 48, 'test': 250}|
|super_glue_copa__As_a_result_C1_or_C2__score_eval|{'train': 404, 'validation': 96, 'test': 500}|
|super_glue_copa__What_could_happen_next_C1_or_C2_|{'train': 202, 'validation': 48, 'test': 250}|
|super_glue_copa__What_could_happen_next_C1_or_C2__score_eval|{'train': 404, 'validation': 96, 'test': 500}|
|super_glue_copa__which_may_be_caused_by|{'train': 198, 'validation': 52, 'test': 250}|
|super_glue_copa__which_may_be_caused_by_score_eval|{'train': 396, 'validation': 104, 'test': 500}|
|super_glue_copa__why_C1_or_C2|{'train': 198, 'validation': 52, 'test': 250}|
|super_glue_copa__why_C1_or_C2_score_eval|{'train': 396, 'validation': 104, 'test': 500}|
|super_glue_copa_best_option|{'train': 400, 'validation': 100, 'test': 500}|
|super_glue_copa_best_option_score_eval|{'train': 800, 'validation': 200, 'test': 1000}|
|super_glue_copa_cause_effect|{'train': 400, 'validation': 100, 'test': 500}|
|super_glue_copa_cause_effect_score_eval|{'train': 800, 'validation': 200, 'test': 1000}|
|super_glue_copa_choose|{'train': 400, 'validation': 100, 'test': 500}|
|super_glue_copa_choose_score_eval|{'train': 800, 'validation': 200, 'test': 1000}|
|super_glue_copa_exercise|{'train': 400, 'validation': 100, 'test': 500}|
|super_glue_copa_exercise_score_eval|{'train': 800, 'validation': 200, 'test': 1000}|
|super_glue_copa_i_am_hesitating|{'train': 400, 'validation': 100, 'test': 500}|
|super_glue_copa_i_am_hesitating_score_eval|{'train': 800, 'validation': 200, 'test': 1000}|
|super_glue_copa_more_likely|{'train': 400, 'validation': 100, 'test': 500}|
|super_glue_copa_more_likely_score_eval|{'train': 800, 'validation': 200, 'test': 1000}|
|super_glue_copa_plausible_alternatives|{'train': 400, 'validation': 100, 'test': 500}|
|super_glue_copa_plausible_alternatives_score_eval|{'train': 800, 'validation': 200, 'test': 1000}|
|super_glue_multirc_I_was_going_to_say_|{'train': 27243, 'validation': 4848, 'test': 9693}|
|super_glue_multirc_Would_it_be_good_to_answer_|{'train': 27243, 'validation': 4848, 'test': 9693}|
|super_glue_multirc_confirm|{'train': 27243, 'validation': 4848, 'test': 9693}|
|super_glue_multirc_correct|{'train': 27243, 'validation': 4848, 'test': 9693}|
|super_glue_multirc_decide_valid|{'train': 27243, 'validation': 4848, 'test': 9693}|
|super_glue_multirc_found_this_answer|{'train': 27243, 'validation': 4848, 'test': 9693}|
|super_glue_multirc_grading|{'train': 27243, 'validation': 4848, 'test': 9693}|
|super_glue_multirc_is_a_correct_answer_|{'train': 27243, 'validation': 4848, 'test': 9693}|
|super_glue_multirc_is_the_correct_answer_|{'train': 27243, 'validation': 4848, 'test': 9693}|
|super_glue_multirc_paragraph_question_is_it_|{'train': 27243, 'validation': 4848, 'test': 9693}|
|super_glue_record_Add_sentence_after_after_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_Add_sentence_after_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_Can_you_figure_out_|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_GPT_3_style_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_GPT_3_style_summary_only_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_GPT_3_style_with_labels_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_GPT_3_style_with_labels_without_hyphens_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_GPT_3_style_without_hyphens_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_In_the_question_above_the_placeholder_stands_for|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_New_highlight_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_News_article_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_Summary_first_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_What_could_the_placeholder_be_|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_Which_one_is_the_placeholder_|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_choose_between|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_corrupted|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_exercise|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_pick_one_option|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_the_placeholder_refers_to_|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_record_trying_to_decide|{'train': 100730, 'validation': 10000, 'test': 10000}|
|super_glue_rte_GPT_3_style|{'train': 2490, 'validation': 277, 'test': 3000}|
|super_glue_rte_GPT_3_style_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}|
|super_glue_rte_MNLI_crowdsource|{'train': 2490, 'validation': 277, 'test': 3000}|
|super_glue_rte_MNLI_crowdsource_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}|
|super_glue_rte_based_on_the_previous_passage|{'train': 2490, 'validation': 277, 'test': 3000}|
|super_glue_rte_based_on_the_previous_passage_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}|
|super_glue_rte_can_we_infer|{'train': 2490, 'validation': 277, 'test': 3000}|
|super_glue_rte_can_we_infer_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}|
|super_glue_rte_does_it_follow_that|{'train': 2490, 'validation': 277, 'test': 3000}|
|super_glue_rte_does_it_follow_that_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}|
|super_glue_rte_does_this_imply|{'train': 2490, 'validation': 277, 'test': 3000}|
|super_glue_rte_does_this_imply_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}|
|super_glue_rte_guaranteed_true|{'train': 2490, 'validation': 277, 'test': 3000}|
|super_glue_rte_guaranteed_true_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}|
|super_glue_rte_justified_in_saying|{'train': 2490, 'validation': 277, 'test': 3000}|
|super_glue_rte_justified_in_saying_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}|
|super_glue_rte_must_be_true|{'train': 2490, 'validation': 277, 'test': 3000}|
|super_glue_rte_must_be_true_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}|
|super_glue_rte_should_assume|{'train': 2490, 'validation': 277, 'test': 3000}|
|super_glue_rte_should_assume_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}|
|super_glue_wic_GPT_3_prompt|{'train': 5428, 'validation': 638, 'test': 1400}|
|super_glue_wic_GPT_3_prompt_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}|
|super_glue_wic_GPT_3_prompt_with_label|{'train': 5428, 'validation': 638, 'test': 1400}|
|super_glue_wic_GPT_3_prompt_with_label_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}|
|super_glue_wic_affirmation_true_or_false|{'train': 5428, 'validation': 638, 'test': 1400}|
|super_glue_wic_affirmation_true_or_false_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}|
|super_glue_wic_grammar_homework|{'train': 5428, 'validation': 638, 'test': 1400}|
|super_glue_wic_grammar_homework_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}|
|super_glue_wic_polysemous|{'train': 5428, 'validation': 638, 'test': 1400}|
|super_glue_wic_polysemous_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}|
|super_glue_wic_question_context|{'train': 5428, 'validation': 638, 'test': 1400}|
|super_glue_wic_question_context_meaning|{'train': 5428, 'validation': 638, 'test': 1400}|
|super_glue_wic_question_context_meaning_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}|
|super_glue_wic_question_context_meaning_with_label|{'train': 5428, 'validation': 638, 'test': 1400}|
|super_glue_wic_question_context_meaning_with_label_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}|
|super_glue_wic_question_context_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}|
|super_glue_wic_same_sense|{'train': 5428, 'validation': 638, 'test': 1400}|
|super_glue_wic_same_sense_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}|
|super_glue_wic_similar_sense|{'train': 5428, 'validation': 638, 'test': 1400}|
|super_glue_wic_similar_sense_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}|
|super_glue_wsc.fixed_GPT_3_Style|{'train': 554, 'validation': 104, 'test': 146}|
|super_glue_wsc.fixed_GPT_3_Style_score_eval|{'train': 1108, 'validation': 208, 'test': 292}|
|super_glue_wsc.fixed_I_think_they_mean|{'train': 554, 'validation': 104, 'test': 146}|
|super_glue_wsc.fixed_I_think_they_mean_score_eval|{'train': 1108, 'validation': 208, 'test': 292}|
|super_glue_wsc.fixed_Who_or_what_is_are|{'train': 554, 'validation': 104, 'test': 146}|
|super_glue_wsc.fixed_Who_or_what_is_are_score_eval|{'train': 1108, 'validation': 208, 'test': 292}|
|super_glue_wsc.fixed_by_p_they_mean|{'train': 554, 'validation': 104, 'test': 146}|
|super_glue_wsc.fixed_by_p_they_mean_score_eval|{'train': 1108, 'validation': 208, 'test': 292}|
|super_glue_wsc.fixed_does_p_stand_for|{'train': 554, 'validation': 104, 'test': 146}|
|super_glue_wsc.fixed_does_p_stand_for_score_eval|{'train': 1108, 'validation': 208, 'test': 292}|
|super_glue_wsc.fixed_does_the_pronoun_refer_to|{'train': 554, 'validation': 104, 'test': 146}|
|super_glue_wsc.fixed_does_the_pronoun_refer_to_score_eval|{'train': 1108, 'validation': 208, 'test': 292}|
|super_glue_wsc.fixed_in_other_words|{'train': 554, 'validation': 104, 'test': 146}|
|super_glue_wsc.fixed_in_other_words_score_eval|{'train': 1108, 'validation': 208, 'test': 292}|
|super_glue_wsc.fixed_p_is_are_r|{'train': 554, 'validation': 104, 'test': 146}|
|super_glue_wsc.fixed_p_is_are_r_score_eval|{'train': 1108, 'validation': 208, 'test': 292}|
|super_glue_wsc.fixed_replaced_with|{'train': 554, 'validation': 104, 'test': 146}|
|super_glue_wsc.fixed_replaced_with_score_eval|{'train': 1108, 'validation': 208, 'test': 292}|
|super_glue_wsc.fixed_the_pronoun_refers_to|{'train': 554, 'validation': 104, 'test': 146}|
|super_glue_wsc.fixed_the_pronoun_refers_to_score_eval|{'train': 1108, 'validation': 208, 'test': 292}|
|trec_fine_grained_ABBR|{'train': 86, 'test': 9}|
|trec_fine_grained_ABBR_context_first|{'train': 86, 'test': 9}|
|trec_fine_grained_DESC|{'train': 1162, 'test': 138}|
|trec_fine_grained_DESC_context_first|{'train': 1162, 'test': 138}|
|trec_fine_grained_ENTY|{'train': 1250, 'test': 94}|
|trec_fine_grained_HUM|{'train': 1223, 'test': 65}|
|trec_fine_grained_HUM_context_first|{'train': 1223, 'test': 65}|
|trec_fine_grained_LOC|{'train': 835, 'test': 81}|
|trec_fine_grained_LOC_context_first|{'train': 835, 'test': 81}|
|trec_fine_grained_NUM|{'train': 896, 'test': 113}|
|trec_fine_grained_NUM_context_first|{'train': 896, 'test': 113}|
|trec_fine_grained_open|{'train': 5452, 'test': 500}|
|trec_fine_grained_open_context_first|{'train': 5452, 'test': 500}|
|trec_pick_the_best_descriptor|{'train': 5452, 'test': 500}|
|trec_trec1|{'train': 5452, 'test': 500}|
|trec_trec2|{'train': 5452, 'test': 500}|
|trec_what_category_best_describe|{'train': 5452, 'test': 500}|
|trec_which_category_best_describes|{'train': 5452, 'test': 500}|
|trivia_qa_unfiltered_first_person_context|{'train': 87622, 'validation': 11313, 'test': 10832}|
|trivia_qa_unfiltered_formal_description|{'train': 87622, 'validation': 11313, 'test': 10832}|
|trivia_qa_unfiltered_guess_question|{'train': 87622, 'validation': 11313}|
|trivia_qa_unfiltered_question_answer|{'train': 87622, 'validation': 11313, 'test': 10832}|
|trivia_qa_unfiltered_question_with_instruction|{'train': 87622, 'validation': 11313, 'test': 10832}|
|web_questions_get_the_answer|{'train': 3778, 'test': 2032}|
|web_questions_potential_correct_answer|{'train': 3778, 'test': 2032}|
|web_questions_question_answer|{'train': 3778, 'test': 2032}|
|web_questions_short_general_knowledge_q|{'train': 3778, 'test': 2032}|
|web_questions_whats_the_answer|{'train': 3778, 'test': 2032}|
|wiki_bio_comprehension|{'train': 582639, 'test': 72829, 'val': 72831}|
|wiki_bio_guess_person|{'train': 582639, 'test': 72829, 'val': 72831}|
|wiki_bio_key_content|{'train': 582639, 'test': 72829, 'val': 72831}|
|wiki_bio_what_content|{'train': 582639, 'test': 72829, 'val': 72831}|
|wiki_bio_who|{'train': 582639, 'test': 72829, 'val': 72831}|
|wiki_hop_original_choose_best_object_affirmative_1|{'train': 43738, 'validation': 5129}|
|wiki_hop_original_choose_best_object_affirmative_2|{'train': 43738, 'validation': 5129}|
|wiki_hop_original_choose_best_object_affirmative_3|{'train': 43738, 'validation': 5129}|
|wiki_hop_original_choose_best_object_interrogative_1|{'train': 43738, 'validation': 5129}|
|wiki_hop_original_choose_best_object_interrogative_2|{'train': 43738, 'validation': 5129}|
|wiki_hop_original_explain_relation|{'train': 43738, 'validation': 5129}|
|wiki_hop_original_generate_object|{'train': 43738, 'validation': 5129}|
|wiki_hop_original_generate_subject|{'train': 43738, 'validation': 5129}|
|wiki_hop_original_generate_subject_and_object|{'train': 43738, 'validation': 5129}|
|wiki_qa_Decide_good_answer|{'train': 20360, 'validation': 2733, 'test': 6165}|
|wiki_qa_Direct_Answer_to_Question|{'train': 1040, 'validation': 140, 'test': 293}|
|wiki_qa_Generate_Question_from_Topic|{'train': 1040, 'validation': 140, 'test': 293}|
|wiki_qa_Is_This_True_|{'train': 20360, 'validation': 2733, 'test': 6165}|
|wiki_qa_Jeopardy_style|{'train': 1040, 'validation': 140, 'test': 293}|
|wiki_qa_Topic_Prediction_Answer_Only|{'train': 1040, 'validation': 140, 'test': 293}|
|wiki_qa_Topic_Prediction_Question_Only|{'train': 1040, 'validation': 140, 'test': 293}|
|wiki_qa_Topic_Prediction_Question_and_Answer_Pair|{'train': 1040, 'validation': 140, 'test': 293}|
|wiki_qa_automatic_system|{'train': 20360, 'validation': 2733, 'test': 6165}|
|wiki_qa_exercise|{'train': 20360, 'validation': 2733, 'test': 6165}|
|wiki_qa_found_on_google|{'train': 20360, 'validation': 2733, 'test': 6165}|
|winogrande_winogrande_debiased_Replace|{'train': 9248, 'validation': 1267, 'test': 1767}|
|winogrande_winogrande_debiased_Replace_score_eval|{'train': 18496, 'validation': 2534, 'test': 3534}|
|winogrande_winogrande_debiased_does_underscore_refer_to|{'train': 9248, 'validation': 1267, 'test': 1767}|
|winogrande_winogrande_debiased_does_underscore_refer_to_score_eval|{'train': 18496, 'validation': 2534, 'test': 3534}|
|winogrande_winogrande_debiased_fill_in_the_blank|{'train': 9248, 'validation': 1267, 'test': 1767}|
|winogrande_winogrande_debiased_fill_in_the_blank_score_eval|{'train': 18496, 'validation': 2534, 'test': 3534}|
|winogrande_winogrande_debiased_stand_for|{'train': 9248, 'validation': 1267, 'test': 1767}|
|winogrande_winogrande_debiased_stand_for_score_eval|{'train': 18496, 'validation': 2534, 'test': 3534}|
|winogrande_winogrande_debiased_underscore_refer_to|{'train': 9248, 'validation': 1267, 'test': 1767}|
|winogrande_winogrande_debiased_underscore_refer_to_score_eval|{'train': 18496, 'validation': 2534, 'test': 3534}|
|winogrande_winogrande_xl_Replace|{'train': 40398, 'validation': 1267, 'test': 1767}|
|winogrande_winogrande_xl_Replace_score_eval|{'train': 80796, 'validation': 2534, 'test': 3534}|
|winogrande_winogrande_xl_does_underscore_refer_to|{'train': 40398, 'validation': 1267, 'test': 1767}|
|winogrande_winogrande_xl_does_underscore_refer_to_score_eval|{'train': 80796, 'validation': 2534, 'test': 3534}|
|winogrande_winogrande_xl_fill_in_the_blank|{'train': 40398, 'validation': 1267, 'test': 1767}|
|winogrande_winogrande_xl_fill_in_the_blank_score_eval|{'train': 80796, 'validation': 2534, 'test': 3534}|
|winogrande_winogrande_xl_stand_for|{'train': 40398, 'validation': 1267, 'test': 1767}|
|winogrande_winogrande_xl_stand_for_score_eval|{'train': 80796, 'validation': 2534, 'test': 3534}|
|winogrande_winogrande_xl_underscore_refer_to|{'train': 40398, 'validation': 1267, 'test': 1767}|
|winogrande_winogrande_xl_underscore_refer_to_score_eval|{'train': 80796, 'validation': 2534, 'test': 3534}|
|wiqa_does_the_supposed_perturbation_have_an_effect|{'train': 29808, 'validation': 6894, 'test': 3003}|
|wiqa_effect_with_label_answer|{'train': 29808, 'validation': 6894, 'test': 3003}|
|wiqa_effect_with_string_answer|{'train': 29808, 'validation': 6894, 'test': 3003}|
|wiqa_what_is_the_final_step_of_the_following_process|{'train': 29808, 'validation': 6894, 'test': 3003}|
|wiqa_what_is_the_missing_first_step|{'train': 29808, 'validation': 6894, 'test': 3003}|
|wiqa_what_might_be_the_first_step_of_the_process|{'train': 29808, 'validation': 6894, 'test': 3003}|
|wiqa_what_might_be_the_last_step_of_the_process|{'train': 29808, 'validation': 6894, 'test': 3003}|
|wiqa_which_of_the_following_is_the_supposed_perturbation|{'train': 29808, 'validation': 6894, 'test': 3003}|
|xsum_DOC_boils_down_to_simple_idea_that|{'train': 204045, 'validation': 11332, 'test': 11334}|
|xsum_DOC_given_above_write_one_sentence|{'train': 204045, 'validation': 11332, 'test': 11334}|
|xsum_DOC_how_would_you_rephrase_few_words|{'train': 204045, 'validation': 11332, 'test': 11334}|
|xsum_DOC_tldr|{'train': 204045, 'validation': 11332, 'test': 11334}|
|xsum_DOC_write_summary_of_above|{'train': 204045, 'validation': 11332, 'test': 11334}|
|xsum_article_DOC_summary|{'train': 204045, 'validation': 11332, 'test': 11334}|
|xsum_college_roommate_asked_DOC_so_I_recap|{'train': 204045, 'validation': 11332, 'test': 11334}|
|xsum_read_below_DOC_write_abstract|{'train': 204045, 'validation': 11332, 'test': 11334}|
|xsum_summarize_DOC|{'train': 204045, 'validation': 11332, 'test': 11334}|
|xsum_summarize_this_DOC_summary|{'train': 204045, 'validation': 11332, 'test': 11334}|
|yelp_review_full_based_on_that|{'train': 650000, 'test': 50000}|
|yelp_review_full_format_rating|{'train': 650000, 'test': 50000}|
764
+ |yelp_review_full_format_score|{'train': 650000, 'test': 50000}|
765
+ |yelp_review_full_format_star|{'train': 650000, 'test': 50000}|
766
+ |yelp_review_full_on_a_scale|{'train': 650000, 'test': 50000}|
767
+ |yelp_review_full_so_i_would|{'train': 650000, 'test': 50000}|
768
+ |yelp_review_full_this_place|{'train': 650000, 'test': 50000}|
769
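Note that each `*_score_eval` subset is the rank-classification counterpart of its base subset: it holds one row per (example, answer option) pair, as reflected by the `idx` and `is_correct` fields described above. This is consistent with the split sizes in the table, where the Winogrande `score_eval` subsets (two answer options per example) are exactly twice the size of their base subsets. A small illustrative check, using the numbers from the table:

```python
# Split sizes copied from the table above for one Winogrande prompt.
base = {"train": 9248, "validation": 1267, "test": 1767}        # winogrande_winogrande_debiased_Replace
score_eval = {"train": 18496, "validation": 2534, "test": 3534}  # winogrande_winogrande_debiased_Replace_score_eval

# Winogrande offers two answer options per example, and score_eval
# subsets expand every example into one row per answer option.
num_options = 2
expanded = {split: n * num_options for split, n in base.items()}

print(expanded)  # {'train': 18496, 'validation': 2534, 'test': 3534}
assert expanded == score_eval
```

The same 2x relationship holds for the `winogrande_winogrande_xl_*` subsets (e.g. 40398 train examples vs. 80796 score_eval rows).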
 
## Dataset Creation