Data files too large

#1
by albertvillanova (HF staff)

This discussion is a follow-up to this issue: https://github.com/huggingface/datasets-server/issues/1212

Currently the viewer shows a JobManagerCrashedError:

Job manager crashed while running this job (missing heartbeats).

Some of the data files are quite large (more than 1.5 GB).

One solution would be to split the data files into smaller ones (at most around 500 MB each).
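A minimal sketch of one way to do that split offline, assuming the data files are Parquet; the paths, file names, and the 10,000-row batch size below are illustrative placeholders, not taken from this dataset:

```python
# Hypothetical example: shard one large Parquet file into ~500 MB pieces.
# Paths and batch size are placeholders; adapt to the actual data layout.
import pyarrow as pa
import pyarrow.parquet as pq

SOURCE = "data/train.parquet"        # a file larger than 1.5 GB
TARGET_SHARD_BYTES = 500 * 1024**2   # ~500 MB per output shard (approximate)

reader = pq.ParquetFile(SOURCE)
writer, shard_idx, written = None, 0, 0

for batch in reader.iter_batches(batch_size=10_000):
    if writer is None:
        writer = pq.ParquetWriter(f"data/train-{shard_idx:05d}.parquet", batch.schema)
        written = 0
    writer.write_table(pa.Table.from_batches([batch]))
    written += batch.nbytes
    if written >= TARGET_SHARD_BYTES:  # size budget reached: close and rotate to the next shard
        writer.close()
        writer, shard_idx = None, shard_idx + 1

if writer is not None:
    writer.close()
```

Note that the threshold here is measured on the uncompressed Arrow batches, so the compressed Parquet shards will typically end up somewhat below 500 MB, which is fine for this purpose.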

Tarteel AI org

That was a hack to resolve the connection being dropped: https://github.com/huggingface/datasets/issues/4677

@msis please note that the issue here is not related to the one you pointed out:

  • The issue you pointed out is a 400 HTTP error generated by the huggingface_hub client, which is used under the hood when one pushes a local dataset to the Hub with push_to_hub (see the sketch after this list).
  • The issue here is that the operating system process crashed while trying to load the dataset (not while pushing local files to the Hub), because the files are too large and do not fit in RAM.
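For the push path, a hedged sketch: recent versions of the datasets library let push_to_hub cap the size of the uploaded shards directly. The repo id, local file name, and the 500 MB cap below are assumptions for illustration, not values from this thread:

```python
# Hypothetical usage: keep uploaded shards small when pushing with the datasets library.
# "username/my-dataset" and "data/train.csv" are placeholders; requires an authenticated login.
from datasets import load_dataset

ds = load_dataset("csv", data_files="data/train.csv")  # or however the local data is loaded

# max_shard_size asks push_to_hub to write shards no larger than ~500 MB each
ds.push_to_hub("username/my-dataset", max_shard_size="500MB")
```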

I am currently facing the same issue, even though each of my files is less than 500 MB.


Nobody can solve this

just renaming the dataset fixed it for me
