TypeError when loading the dataset

#1
by teasgen - opened

I'm trying to load this dataset and save it to disk, but it fails with the following error:

Saving the dataset (0/122 shards):   0%|          | 0/3563 [00:24<?, ? examples/s]
Traceback (most recent call last):
  File "/home/teasgen/mmmu/script.py", line 4, in <module>
    dataset.save_to_disk("winogavil")
  File "/home/teasgen/miniconda3/envs/vlm_eval/lib/python3.10/site-packages/datasets/arrow_dataset.py", line 1530, in save_to_disk
    for job_id, done, content in Dataset._save_to_disk_single(**kwargs):
  File "/home/teasgen/miniconda3/envs/vlm_eval/lib/python3.10/site-packages/datasets/arrow_dataset.py", line 1563, in _save_to_disk_single
    writer.write_table(pa_table)
  File "/home/teasgen/miniconda3/envs/vlm_eval/lib/python3.10/site-packages/datasets/arrow_writer.py", line 575, in write_table
    pa_table = embed_table_storage(pa_table)
  File "/home/teasgen/miniconda3/envs/vlm_eval/lib/python3.10/site-packages/datasets/table.py", line 2310, in embed_table_storage
    arrays = [
  File "/home/teasgen/miniconda3/envs/vlm_eval/lib/python3.10/site-packages/datasets/table.py", line 2311, in <listcomp>
    embed_array_storage(table[name], feature) if require_storage_embed(feature) else table[name]
  File "/home/teasgen/miniconda3/envs/vlm_eval/lib/python3.10/site-packages/datasets/table.py", line 1834, in wrapper
    return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])
  File "/home/teasgen/miniconda3/envs/vlm_eval/lib/python3.10/site-packages/datasets/table.py", line 1834, in <listcomp>
    return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])
  File "/home/teasgen/miniconda3/envs/vlm_eval/lib/python3.10/site-packages/datasets/table.py", line 2200, in embed_array_storage
    return pa.ListArray.from_arrays(array.offsets, _e(array.values, feature[0]))
  File "/home/teasgen/miniconda3/envs/vlm_eval/lib/python3.10/site-packages/datasets/table.py", line 1836, in wrapper
    return func(array, *args, **kwargs)
  File "/home/teasgen/miniconda3/envs/vlm_eval/lib/python3.10/site-packages/datasets/table.py", line 2180, in embed_array_storage
    return feature.embed_storage(array)
  File "/home/teasgen/miniconda3/envs/vlm_eval/lib/python3.10/site-packages/datasets/features/image.py", line 276, in embed_storage
    storage = pa.StructArray.from_arrays([bytes_array, path_array], ["bytes", "path"], mask=bytes_array.is_null())
  File "pyarrow/array.pxi", line 3205, in pyarrow.lib.StructArray.from_arrays
  File "pyarrow/array.pxi", line 3645, in pyarrow.lib.c_mask_inverted_from_obj
TypeError: Mask must be a pyarrow.Array of type boolean
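For reference, the TypeError at the bottom of the trace comes from pyarrow itself: StructArray.from_arrays only accepts a plain boolean pyarrow.Array as its mask argument. A minimal illustration of that constraint (just a sketch of the pyarrow behaviour, not the actual datasets internals):

import pyarrow as pa

bytes_array = pa.array([b"abc", None])
path_array = pa.array(["a.png", "b.png"])

# The mask must be a plain boolean pa.Array; passing anything else
# (here a ChunkedArray of booleans) raises the same TypeError as above.
bad_mask = pa.chunked_array([bytes_array.is_null()])
pa.StructArray.from_arrays([bytes_array, path_array], ["bytes", "path"], mask=bad_mask)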

Code to reproduce:

import datasets
dataset = datasets.load_dataset("nlphuji/winogavil", split="test")
dataset.save_to_disk("winogavil")

How can I fix it?

Thanks for your interest. I see (and also tried it myself now) that the dataset loads fine, but "save_to_disk" seems to fail.
I think it's a limitation of Hugging Face datasets when saving a dataset that includes multiple images.
I suggest asking someone from the Hugging Face team. @severo, any idea who we can ask? Thank you.

also cc @lhoestq @albertvillanova (I'm off for one week)

Maybe this PR will fix the issue? cc @mariosasko

https://github.com/huggingface/datasets/pull/6283
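
Until that fix lands in a datasets release, one possible workaround is to drop the nested image column before saving. This is an untested sketch; "candidates_images" is a guess at the offending column name, so check dataset.features for the real one:

import datasets

dataset = datasets.load_dataset("nlphuji/winogavil", split="test")

# Dropping the nested image column sidesteps the embed_storage code path
# that raises the TypeError; "candidates_images" is an assumed column name.
dataset.remove_columns(["candidates_images"]).save_to_disk("winogavil_no_images")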
