Dataset Viewer issue

#1
by SaulLu - opened

The dataset viewer is not working.

Error details:

Error code:   StreamingRowsError
Exception:    ClientResponseError
Message:      429, message='Too Many Requests', url=URL('https://dl.fbaipublicfiles.com/textvqa/images/train_val_images.zip')
Traceback:    Traceback (most recent call last):
                File "/src/workers/datasets_based/src/datasets_based/workers/first_rows.py", line 484, in compute_first_rows_response
                  rows = get_rows(
                File "/src/workers/datasets_based/src/datasets_based/workers/first_rows.py", line 119, in decorator
                  return func(*args, **kwargs)
                File "/src/workers/datasets_based/src/datasets_based/workers/first_rows.py", line 175, in get_rows
                  rows_plus_one = list(itertools.islice(ds, rows_max_number + 1))
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/datasets/iterable_dataset.py", line 751, in __iter__
                  yield _apply_feature_types(example, self.features, token_per_repo_id=self._token_per_repo_id)
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/datasets/iterable_dataset.py", line 635, in _apply_feature_types
                  decoded_example = features.decode_example(encoded_example, token_per_repo_id=token_per_repo_id)
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/datasets/features/features.py", line 1794, in decode_example
                  return {
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/datasets/features/features.py", line 1795, in <dictcomp>
                  column_name: decode_nested_example(feature, value, token_per_repo_id=token_per_repo_id)
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/datasets/features/features.py", line 1262, in decode_nested_example
                  return schema.decode_example(obj, token_per_repo_id=token_per_repo_id) if obj is not None else None
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/datasets/features/image.py", line 148, in decode_example
                  with xopen(path, "rb", use_auth_token=use_auth_token) as f:
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/datasets/download/streaming_download_manager.py", line 469, in xopen
                  file_obj = fsspec.open(file, mode=mode, *args, **kwargs).open()
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/fsspec/core.py", line 135, in open
                  return self.__enter__()
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/fsspec/core.py", line 103, in __enter__
                  f = self.fs.open(self.path, mode=mode)
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/fsspec/spec.py", line 1106, in open
                  f = self._open(
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/fsspec/implementations/zip.py", line 106, in _open
                  out = self.zip.open(path, mode.strip("b"))
                File "/usr/local/lib/python3.9/zipfile.py", line 1527, in open
                  fheader = zef_file.read(sizeFileHeader)
                File "/usr/local/lib/python3.9/zipfile.py", line 744, in read
                  data = self._file.read(n)
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/fsspec/implementations/http.py", line 590, in read
                  return super().read(length)
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/fsspec/spec.py", line 1655, in read
                  out = self.cache._fetch(self.loc, self.loc + length)
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/fsspec/caching.py", line 381, in _fetch
                  self.cache = self.fetcher(start, bend)
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/fsspec/asyn.py", line 113, in wrapper
                  return sync(self.loop, func, *args, **kwargs)
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/fsspec/asyn.py", line 98, in sync
                  raise return_result
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/fsspec/asyn.py", line 53, in _runner
                  result[0] = await coro
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/fsspec/implementations/http.py", line 631, in async_fetch_range
                  r.raise_for_status()
                File "/src/workers/datasets_based/.venv/lib/python3.9/site-packages/aiohttp/client_reqrep.py", line 1005, in raise_for_status
                  raise ClientResponseError(
              aiohttp.client_exceptions.ClientResponseError: 429, message='Too Many Requests', url=URL('https://dl.fbaipublicfiles.com/textvqa/images/train_val_images.zip')

The error makes sense, but I don't know how to retry the data download.
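For what it's worth, a generic retry-with-exponential-backoff wrapper is one common way to work around transient 429 responses when downloading manually. This is only a sketch: `fetch_with_retry` and its parameters are illustrative (the `datasets` streaming path does not expose such a hook here), and the real exception in the traceback is `aiohttp.ClientResponseError`, which this generic wrapper matches only by its message text.

```python
import time


def fetch_with_retry(fetch, max_retries=5, base_delay=1.0):
    """Call `fetch()` and retry with exponential backoff on HTTP 429 errors.

    `fetch` is any zero-argument callable; an error is treated as rate
    limiting if its message contains "429". Other errors, and the final
    failed attempt, are re-raised.
    """
    for attempt in range(max_retries):
        try:
            return fetch()
        except Exception as err:
            if "429" not in str(err) or attempt == max_retries - 1:
                raise
            # back off: 1s, 2s, 4s, ... before the next attempt
            time.sleep(base_delay * 2 ** attempt)
```
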

cc @albertvillanova @lhoestq @severo .

Thanks for reporting, @SaulLu .

This error arises when the source data files are hosted on servers that enforce strict request-rate limits:

  • One possible solution is to optimize the loading script so that it makes as few filesystem requests as possible. However, I have had a look at your code and I think it is already optimized in this sense.
  • Therefore, the remaining possible solution is to host the data files on the Hub (under a data directory in this same repo), if the license allows it.
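As a rough illustration of the second option (all paths here are hypothetical; `data/train_val_images.zip` assumes the archive has been uploaded to a `data/` directory in this repo), the loading script would swap the external URL for a repo-relative path, so that streaming hits the Hub instead of the rate-limited external host:

```python
# Hypothetical excerpt from the dataset loading script.

# Before: archive fetched from an external server with strict rate limits.
_EXTERNAL_IMAGES_URL = (
    "https://dl.fbaipublicfiles.com/textvqa/images/train_val_images.zip"
)

# After: repo-relative path, assuming the archive is uploaded under data/
# in this repository; the download manager would then resolve it against
# the Hub rather than the external host.
_HUB_IMAGES_PATH = "data/train_val_images.zip"


def images_location(hosted_on_hub: bool) -> str:
    """Pick the archive location the script should download/stream from."""
    return _HUB_IMAGES_PATH if hosted_on_hub else _EXTERNAL_IMAGES_URL
```
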

Thanks a lot for the quick answer @albertvillanova! I really appreciate it! I must admit that there is a lot to do and check to get the viewer working, so for the moment we will do without it.

OK @SaulLu .

Just note that this issue is not exclusive to the viewer, but to streaming in general. There will be no problem as long as you do not use the dataset in streaming mode.

If you do want to use it in streaming mode, then I think you should host the data files in this repo as well, so that the error is fixed.