FileNotFoundError: Couldn't find file at https://dumps.wikimedia.org/yowiki/20220301/dumpstatus.json

#49
by Davlan - opened

I tried running:

from datasets import load_dataset

ds = load_dataset("wikipedia", "20220301.yo", beam_runner='DirectRunner')

It gives the following error:

Downloading and preparing dataset wikipedia/20220301.yo to /Users/dadelani/.cache/huggingface/datasets/wikipedia/20220301.yo/2.0.0/aa542ed919df55cc5d3347f42dd4521d05ca68751f50dbc32bae2a7f1e167559...
Downloading data files: 0%| | 0/1 [00:00<?, ?it/s]
Traceback (most recent call last):
File "/Users/dadelani/PycharmProjects/NaijaRep/preprocess.py", line 4, in
ds = load_dataset("wikipedia", "20220301.yo", beam_runner='DirectRunner')
File "/Users/dadelani/miniconda3/lib/python3.8/site-packages/datasets/load.py", line 1742, in load_dataset
builder_instance.download_and_prepare(
File "/Users/dadelani/miniconda3/lib/python3.8/site-packages/datasets/builder.py", line 814, in download_and_prepare
self._download_and_prepare(
File "/Users/dadelani/miniconda3/lib/python3.8/site-packages/datasets/builder.py", line 1647, in _download_and_prepare
super()._download_and_prepare(
File "/Users/dadelani/miniconda3/lib/python3.8/site-packages/datasets/builder.py", line 883, in _download_and_prepare
split_generators = self._split_generators(dl_manager, **split_generators_kwargs)
File "/Users/dadelani/.cache/huggingface/modules/datasets_modules/datasets/wikipedia/aa542ed919df55cc5d3347f42dd4521d05ca68751f50dbc32bae2a7f1e167559/wikipedia.py", line 945, in _split_generators
downloaded_files = dl_manager.download_and_extract({"info": info_url})
File "/Users/dadelani/miniconda3/lib/python3.8/site-packages/datasets/download/download_manager.py", line 433, in download_and_extract
return self.extract(self.download(url_or_urls))
File "/Users/dadelani/miniconda3/lib/python3.8/site-packages/datasets/download/download_manager.py", line 310, in download
downloaded_path_or_paths = map_nested(
File "/Users/dadelani/miniconda3/lib/python3.8/site-packages/datasets/utils/py_utils.py", line 429, in map_nested
mapped = [
File "/Users/dadelani/miniconda3/lib/python3.8/site-packages/datasets/utils/py_utils.py", line 430, in
_single_map_nested((function, obj, types, None, True, None))
File "/Users/dadelani/miniconda3/lib/python3.8/site-packages/datasets/utils/py_utils.py", line 331, in _single_map_nested
return function(data_struct)
File "/Users/dadelani/miniconda3/lib/python3.8/site-packages/datasets/download/download_manager.py", line 337, in _download
return cached_path(url_or_filename, download_config=download_config)
File "/Users/dadelani/miniconda3/lib/python3.8/site-packages/datasets/utils/file_utils.py", line 188, in cached_path
output_path = get_from_cache(
File "/Users/dadelani/miniconda3/lib/python3.8/site-packages/datasets/utils/file_utils.py", line 535, in get_from_cache
raise FileNotFoundError(f"Couldn't find file at {url}")
FileNotFoundError: Couldn't find file at https://dumps.wikimedia.org/yowiki/20220301/dumpstatus.json
Downloading data files: 0%| | 0/1 [00:00<?, ?it/s]

Can you please help?

Hi @Davlan.

You are trying to use the legacy "wikipedia" dataset (https://huggingface.co/datasets/wikipedia). Its loading script builds the data from the raw dumps at dumps.wikimedia.org, and Wikimedia only keeps the most recent dumps online, which is why the 20220301 dumpstatus.json file can no longer be found.

Please use the new "wikimedia/wikipedia" dataset instead: https://huggingface.co/datasets/wikimedia/wikipedia. For the moment only the 20231101 dump is available, and you can load it with this code:

ds = load_dataset("wikimedia/wikipedia", "20231101.yo")
albertvillanova changed discussion status to closed

Thank you for the reply.
