Possible size discrepancy for large subset

#17
by Bill2462 - opened

Hello,
I just downloaded DiffusionDB large and converted it to the webdataset format. The process involved unpacking the .zip files and moving the images into .tar files; there was no change in image format. I also deleted the images blurred out by the Stable Diffusion NSFW detector, which removed about 100k images.

What is strange is that the resulting dataset takes approximately 5.6 TB of disk space instead of the 6.5 TB given in the dataset card. It is unlikely that 100k blurred images would take nearly 1 TB of disk space, so a lot of space is unaccounted for.

I am currently checking for duplicate downloads. Maybe the filesystem deduplicated some files and this resulted in a smaller size. But in the meantime I wanted to ask: how was the size calculated? Maybe it was calculated before encoding to WebP?
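
For what it's worth, a simple way to check for byte-identical duplicates is to hash the files and group by digest (a sketch; the directory name is a placeholder):

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-256 of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Group files by content hash; any group with more than one path is a duplicate.
groups = defaultdict(list)
for p in Path("diffusiondb-webdataset").rglob("*.tar"):  # placeholder directory
    groups[file_digest(p)].append(p)

for digest, paths in groups.items():
    if len(paths) > 1:
        print(digest, [str(p) for p in paths])
```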

Polo Club of Data Science org
edited Oct 18, 2023

Hi! The zip files and the tarball files might use different compression strategies. What is the total size of your downloaded diffusiondb-large-part-1 and diffusiondb-large-part-2?
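
As a quick sanity check, you can inspect which compression method the zip entries actually use (the path below is just a placeholder):

```python
import zipfile
from collections import Counter

# Count the compression methods used inside one downloaded zip part (placeholder path).
counts = Counter()
with zipfile.ZipFile("diffusiondb-large-part-1/part-000001.zip") as zf:
    for info in zf.infolist():
        counts[info.compress_type] += 1

names = {zipfile.ZIP_STORED: "stored", zipfile.ZIP_DEFLATED: "deflated"}
for method, n in counts.items():
    print(names.get(method, str(method)), n)
```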

I just re-ran du -sh on my local copy and below are the numbers (~7 TB in total, including metadata-large.parquet):

4379650 diffusiondb-hugging/diffusiondb-large-part-1
1819257 diffusiondb-hugging/diffusiondb-large-part-2
802226  diffusiondb-hugging/metadata-large.parquet
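
If you want to rule out filesystem effects (deduplication, compression, block-size overhead), comparing apparent file sizes instead of du's on-disk usage can help; a small sketch, using the directory names from the listing above:

```python
from pathlib import Path

def apparent_size_bytes(root: str) -> int:
    """Sum of apparent file sizes, independent of filesystem dedup/compression."""
    return sum(p.stat().st_size for p in Path(root).rglob("*") if p.is_file())

for d in ["diffusiondb-hugging/diffusiondb-large-part-1",
          "diffusiondb-hugging/diffusiondb-large-part-2",
          "diffusiondb-hugging/metadata-large.parquet"]:
    path = Path(d)
    size = path.stat().st_size if path.is_file() else apparent_size_bytes(d)
    print(d, f"{size / 1e12:.2f} TB")
```

(GNU du -s --apparent-size reports the same kind of number from the shell.)
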
xiaohk changed discussion status to closed
