
#2
by Arij - opened

from datasets import load_dataset

dataset = load_dataset("natural_questions")

does not work directly; it gives the following error:

File ~/anaconda3/lib/python3.9/site-packages/datasets/builder.py:1879, in BeamBasedBuilder._download_and_prepare(self, dl_manager, verify_infos, **prepare_splits_kwargs)
   1877 if not beam_runner and not beam_options:
   1878     usage_example = f"load_dataset('{self.name}', '{self.config.name}', beam_runner='DirectRunner')"
→ 1879     raise MissingBeamOptions(
   1880         "Trying to generate a dataset using Apache Beam, yet no Beam Runner "
   1881         "or PipelineOptions() has been provided in load_dataset or in the "
   1882         "builder arguments. For big datasets it has to run on large-scale data "
   1883         "processing tools like Dataflow, Spark, etc. More information about "
   1884         "Apache Beam runners at "
   1885         "Apache Beam Capability Matrix"
   1886         "\nIf you really want to run it locally because you feel like the "
   1887         "Dataset is small enough, you can use the local beam runner called "
   1888         "DirectRunner (you may run out of memory). \nExample of usage: "
   1889         f"\n\t{usage_example}"
   1890     )
   1892 # Beam type checking assumes transforms multiple outputs are of same type,
   1893 # which is not our case. Plus it doesn't handle correctly all types, so we
   1894 # are better without it.
   1895 pipeline_options = {"pipeline_type_check": False}

MissingBeamOptions: Trying to generate a dataset using Apache Beam, yet no Beam Runner or PipelineOptions() has been provided in load_dataset or in the builder arguments. For big datasets it has to run on large-scale data processing tools like Dataflow, Spark, etc. More information about Apache Beam runners at Apache Beam Capability Matrix
If you really want to run it locally because you feel like the Dataset is small enough, you can use the local beam runner called DirectRunner (you may run out of memory).
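For reference, the error message itself points to a workaround: pass an Apache Beam runner explicitly to load_dataset. A minimal sketch of that call, assuming the config name "default" (Natural Questions is large, so the local DirectRunner may run out of memory, as the message warns):

from datasets import load_dataset

# Pass a Beam runner explicitly, as suggested by the error message.
# "default" is an assumed config name; DirectRunner processes the data
# locally and can exhaust memory on a dataset of this size.
dataset = load_dataset("natural_questions", "default", beam_runner="DirectRunner")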
