Dataset Card for "qangaroo"
Dataset Summary
We have created two new Reading Comprehension datasets focusing on multi-hop (also known as multi-step) inference.
Several pieces of information often jointly imply another fact. In multi-hop inference, a new fact is derived by combining facts via a chain of multiple steps.
Our aim is to build Reading Comprehension methods that perform multi-hop inference on text, where individual facts are spread out across different documents.
The two QAngaroo datasets provide a training and evaluation resource for such methods.
Supported Tasks and Leaderboards
Languages
Dataset Structure
Data Instances
masked_medhop
- Size of downloaded dataset files: 339.84 MB
- Size of the generated dataset: 112.63 MB
- Total amount of disk used: 452.47 MB
An example of 'validation' looks as follows.
masked_wikihop
- Size of downloaded dataset files: 339.84 MB
- Size of the generated dataset: 391.98 MB
- Total amount of disk used: 731.82 MB
An example of 'validation' looks as follows.
medhop
- Size of downloaded dataset files: 339.84 MB
- Size of the generated dataset: 110.42 MB
- Total amount of disk used: 450.26 MB
An example of 'validation' looks as follows.
wikihop
- Size of downloaded dataset files: 339.84 MB
- Size of the generated dataset: 366.87 MB
- Total amount of disk used: 706.71 MB
An example of 'validation' looks as follows.
Data Fields
The data fields are the same among all splits.
masked_medhop
- query: a string feature.
- supports: a list of string features.
- candidates: a list of string features.
- answer: a string feature.
- id: a string feature.
masked_wikihop
- query: a string feature.
- supports: a list of string features.
- candidates: a list of string features.
- answer: a string feature.
- id: a string feature.
medhop
- query: a string feature.
- supports: a list of string features.
- candidates: a list of string features.
- answer: a string feature.
- id: a string feature.
wikihop
- query: a string feature.
- supports: a list of string features.
- candidates: a list of string features.
- answer: a string feature.
- id: a string feature.
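Every example in all four configurations follows the same shape. The sketch below shows a hypothetical record matching the field list above; all values are invented for illustration and are not taken from the dataset.

```python
# Hypothetical QAngaroo-style record; field names follow the schema above,
# field values are invented placeholders.
example = {
    "id": "WH_dev_0",
    "query": "country_of_origin some_entity",
    "supports": [
        "Text of the first supporting document ...",
        "Text of the second supporting document ...",
    ],
    "candidates": ["option_a", "option_b"],
    "answer": "option_a",
}

# The answer is always one of the candidate strings.
assert example["answer"] in example["candidates"]
```

The multi-hop setup is visible in the structure: the evidence needed to resolve the query is spread across the multiple documents in supports, and the task is to pick the correct entry from candidates.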
Data Splits
| name | train | validation |
|---|---|---|
| masked_medhop | 1620 | 342 |
| masked_wikihop | 43738 | 5129 |
| medhop | 1620 | 342 |
| wikihop | 43738 | 5129 |
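As the table shows, the masked and unmasked variants share identical split sizes. A quick tally of the totals per configuration:

```python
# Split sizes copied from the table above: (train, validation)
splits = {
    "masked_medhop": (1620, 342),
    "masked_wikihop": (43738, 5129),
    "medhop": (1620, 342),
    "wikihop": (43738, 5129),
}

# Total number of examples per configuration
totals = {name: train + dev for name, (train, dev) in splits.items()}
print(totals["medhop"])   # 1620 + 342 = 1962
print(totals["wikihop"])  # 43738 + 5129 = 48867
```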
Dataset Creation
Curation Rationale
Source Data
Initial Data Collection and Normalization
Who are the source language producers?
Annotations
Annotation process
Who are the annotators?
Personal and Sensitive Information
Considerations for Using the Data
Social Impact of Dataset
Discussion of Biases
Other Known Limitations
Additional Information
Dataset Curators
Licensing Information
Citation Information
Contributions
Thanks to @thomwolf, @jplu, @lewtun, @lhoestq, @mariamabarham for adding this dataset.