Meaningless Paragraph #2
opened by keremturgutlu
The paragraphs provided for each example look like random text. I manually went over some examples and can verify that there is no coherence between the sentences in a paragraph, nor any relevance to the given question (see the snippet below for how a few rows can be inspected). Could you please provide more information in the datasheet on how the paragraphs were obtained, beyond listing the source as Wikipedia? Thanks!
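For anyone who wants to repeat the check, a minimal sketch along these lines works; note that `"<dataset-id>"` and the column names `"question"` and `"para"` are placeholders, not the dataset's actual repo id or schema:

```python
# Minimal sketch for eyeballing question/paragraph coherence.
# Assumptions: the dataset is on the Hugging Face Hub and rows expose
# question/paragraph-style fields; "<dataset-id>", "question", and "para"
# are placeholders, not the real repo id or column names.
from datasets import load_dataset

ds = load_dataset("<dataset-id>", split="train")  # placeholder repo id

# Print a handful of rows to see whether each paragraph relates to its question.
for row in ds.select(range(5)):
    print("Question: ", row.get("question"))
    print("Paragraph:", row.get("para"))
    print("-" * 60)
```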
@mhardalov I am also curious how these paragraphs were selected, and why there are four slightly different supporting paragraphs for each question.