NonMatchingSplitsSizesError

#2
by yuyang-xue-ed - opened

Hello, there is an error when loading the data. I tried re-downloading the data, but the error is still the same:

>>> dataset = load_dataset("OPTML-Group/UnlearnCanvas")
Resolving data files: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 331/331 [00:00<00:00, 768.19it/s]
Downloading data: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 331/331 [00:00<00:00, 1601.78files/s]
Generating train split: 52745 examples [24:56, 35.25 examples/s]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "site-packages/datasets/load.py", line 2609, in load_dataset
    builder_instance.download_and_prepare(
  File "site-packages/datasets/builder.py", line 1027, in download_and_prepare
    self._download_and_prepare(
  File "/site-packages/datasets/builder.py", line 1140, in _download_and_prepare
    verify_splits(self.info.splits, split_dict)
  File "site-packages/datasets/utils/info_utils.py", line 101, in verify_splits
    raise NonMatchingSplitsSizesError(str(bad_splits))
datasets.utils.info_utils.NonMatchingSplitsSizesError: [{'expected': SplitInfo(name='train', num_bytes=76080381824.0, num_examples=24400, shard_lengths=None, dataset_name=None), 'recorded': SplitInfo(name='train', num_bytes=167271171931, num_examples=52745, shard_lengths=[160, 320, 320, 160, 160, 160, 320, 160, 160, 160, 320, 320, 320, 320, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 320, 320, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 320, 320, 320, 320, 320, 320, 320, 320, 320, 160, 160, 160, 160, 160, 160, 320, 320, 320, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 320, 320, 320, 160, 160, 160, 160, 160, 160, 160, 160, 160, 318, 318, 318, 159, 159, 159, 159, 159, 159, 159, 159, 159, 318, 318, 318, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 318, 318, 318, 159, 159, 159, 159, 159, 159, 159, 159, 159, 318, 318, 318, 318, 318, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 318, 318, 318, 318, 318, 318, 318, 318, 159, 159, 159, 159, 159, 318, 318, 318, 318, 318, 318, 318, 318, 159, 159, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 159, 159, 159, 159, 159, 318, 159, 159, 159, 159, 159, 159, 159, 159, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 159, 159, 159, 159], dataset_name='unlearn_canvas')}]

Does this mean the data on Hugging Face is incomplete? The recorded split metadata expects 24400 examples (~76 GB), but generating the train split actually produced 52745 examples (~167 GB), so the downloaded data seems to be larger than the recorded info, not smaller.
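If the files themselves downloaded correctly and only the split metadata recorded on the Hub is stale, one possible workaround is to skip the size verification. This is just a sketch using the verification_mode parameter (available in datasets >= 2.9); I have not confirmed it against this dataset:

>>> from datasets import load_dataset
>>> # "no_checks" skips the split size/checksum verification
>>> # that raises NonMatchingSplitsSizesError
>>> dataset = load_dataset("OPTML-Group/UnlearnCanvas", verification_mode="no_checks")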
