Dataset tags: Multilinguality: multilingual · Size: 100K&lt;n&lt;1M · Language creators: machine-generated · Annotation creators: expert-generated · Source datasets: original
Error While Using the Dataset

#2
by damnathews - opened

FileNotFoundError                         Traceback (most recent call last)

File ~/anaconda3/envs/hugging-face/lib/python3.9/site-packages/datasets/builder.py:1676, in GeneratorBasedBuilder._prepare_split_single(self, gen_kwargs, fpath, file_format, max_shard_size, split_info, check_duplicate_keys, job_id)
   1675 _time = time.time()
-> 1676 for key, record in generator:
   1677     if max_shard_size is not None and writer._num_bytes > max_shard_size:

File ~/.cache/huggingface/modules/datasets_modules/datasets/ai4bharat--kathbath/3baf116837b04bb852e9b4f24e45227491f87e34a6b53160283d519931104ae3/kathbath.py:161, in Kathbath._generate_examples(self, local_extracted_archive, audio_files, metadata_path, path_to_clips)
    160 id_ = 0
--> 161 for path in audio_files:
    162     print(path)

File ~/anaconda3/envs/hugging-face/lib/python3.9/site-packages/datasets/download/download_manager.py:158, in _IterableFromGenerator.__iter__(self)
    157 def __iter__(self):
--> 158     yield from self.generator(*self.args, **self.kwargs)

File ~/anaconda3/envs/hugging-face/lib/python3.9/site-packages/datasets/download/download_manager.py:206, in ArchiveIterable._iter_from_path(cls, urlpath)
    204 @classmethod
    205 def _iter_from_path(cls, urlpath: str) -> Generator[Tuple, None, None]:
--> 206     compression = _get_extraction_protocol(urlpath)
    207     with open(urlpath, "rb") as f:

File ~/anaconda3/envs/hugging-face/lib/python3.9/site-packages/datasets/download/download_manager.py:145, in _get_extraction_protocol(path)
    144     return None
--> 145 with open(path, "rb") as f:
...
   1711     e = e.__context__
-> 1712     raise DatasetGenerationError("An error occurred while generating the dataset") from e
   1714 yield job_id, True, (total_num_examples, total_num_bytes, writer._features, num_shards, shard_lengths)

DatasetGenerationError: An error occurred while generating the dataset
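For context, the innermost frame shows where the failure actually originates: before iterating any audio, `datasets` calls `_get_extraction_protocol`, which opens the archive path in binary mode to sniff its compression. If the path the loading script handed over does not exist on disk (e.g. a download or extraction step produced a wrong local path), that `open` raises the `FileNotFoundError`, which the builder then wraps in `DatasetGenerationError`. The sketch below is a simplified stand-in (not the real `datasets` implementation) that illustrates this failure mode; the function name and gzip magic-byte check are assumptions for illustration only.

```python
import os

def get_extraction_protocol(path):
    # Simplified stand-in for datasets' internal _get_extraction_protocol:
    # it must open the file to read its magic bytes, so a missing path
    # fails here, before any examples are generated.
    if not os.path.exists(path):
        raise FileNotFoundError(f"Archive not found: {path}")
    with open(path, "rb") as f:
        magic = f.read(2)
    # gzip files start with the bytes 0x1f 0x8b
    return "gzip" if magic == b"\x1f\x8b" else None
```

So the error message about "generating the dataset" is misleading; the root cause is that the loading script points at an archive path that was never downloaded or extracted to the expected location.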

Any news regarding this? I am also facing the same issue.
