index 9287 (a mourning dove) is corrupted

#1
by chigozie - opened

dataset[9287]:

...
--> 266         raise OSError(msg)
    268 b = b + s
    269 n, err_code = decoder.decode(b)

OSError: image file is truncated (45 bytes not processed)

I redownloaded with:

from datasets import load_dataset, VerificationMode, DownloadMode
dataset = load_dataset(
    'sasha/birdsnap',
    split="train",
    verification_mode=VerificationMode.ALL_CHECKS,
    download_mode=DownloadMode.FORCE_REDOWNLOAD,
    streaming=True
)

to make sure the corruption was in the image on the repo itself rather than something that happened in transport, and I got the same error whether or not streaming=True was set. (With streaming=True you have to iterate all the way to that entry to hit it, and the error is raised from the generator, so you can't just put a try/except inside the for loop to catch it.)
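
To spell out why, this is roughly the pattern that fails (just a sketch, the loop body is a placeholder):

# The OSError is raised while the streaming generator produces the next row,
# i.e. in the for statement itself, before the body runs, so a try/except
# around the body never sees it.
for row in dataset:
    try:
        pass  # per-row work would go here
    except OSError:
        continue  # never reached: the exception escapes from the for line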

Doing something like:

it = iter(dataset)
for idx in range(39860):
    try:
        row = next(it)  # pull rows one at a time so a decode error can be caught per index
    except StopIteration:
        break
    except Exception as e:
        print('WARNING', idx, e)

adds 3 seconds per row compared to a plain "for row in dataset" loop! Not viable for a dataset this large. (Plus you have to know the size of the dataset in advance if streaming=True, since you can't call len() on an IterableDataset.)
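
For what it's worth, the split size can apparently be looked up from the repo metadata without downloading anything, assuming the dataset info on the Hub is populated (a sketch):

from datasets import load_dataset_builder

# Read the number of examples from the published split metadata
# instead of hard-coding 39860.
builder = load_dataset_builder('sasha/birdsnap')
print(builder.info.splits['train'].num_examples)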

I was eventually able to avoid it in this specific case with:

dataset = load_dataset('sasha/birdsnap', split='train[:9287]+train[9288:]', verification_mode=VerificationMode.ALL_CHECKS)

but that relies on knowing in advance which index is corrupted, and it doesn't work with streaming=True.
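
If there is a streaming equivalent, I'd guess it looks something like this, splicing around the bad index with IterableDataset.take/skip (a sketch; I haven't checked whether skip actually avoids decoding the corrupted image):

from itertools import chain
from datasets import load_dataset

streamed = load_dataset('sasha/birdsnap', split='train', streaming=True)

# Rough streaming analogue of train[:9287]+train[9288:]:
# yield everything before the bad row, then resume just after it.
clean_rows = chain(streamed.take(9287), streamed.skip(9288))

for row in clean_rows:
    pass  # per-row work would go here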

Is there a good way to catch these kinds of errors in general with Hugging Face datasets? And @sasha, could you fix this particular one?
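
One stopgap that might apply to this particular error, assuming it's Pillow's truncated-file check that raises it, is to let Pillow tolerate truncated files, though that papers over the corruption rather than catching it:

from PIL import ImageFile

# Let Pillow return a partially decoded image instead of raising
# "image file is truncated"; this hides the corruption, it doesn't fix it.
ImageFile.LOAD_TRUNCATED_IMAGES = True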

Thanks!
