---
license: mit
task_categories:
  - image-classification
  - zero-shot-image-classification
  - text-to-image
language:
  - en
tags:
  - art
  - anime
  - not-for-all-audiences
size_categories:
  - 1M<n<10M
---

# Danbooru 2023 webp: A space-efficient version of Danbooru 2023

This dataset is a resized/re-encoded version of danbooru2023, with non-image/truncated files removed and all images resized to a smaller size.

This dataset has already been updated to latest_id = 7,832,883. Thanks to DeepGHS!

Notice: the contents of the updates folder and deepghs/danbooru_newest-webp-4Mpixel have been merged into 2000~2999.tar, so you can safely ignore everything in the updates folder!


## Details

This dataset employs a few methods to reduce its size and improve efficiency.

### Size and Format

All images with more than 2048x2048 pixels are resized to approximately 2048x2048 pixels with the bicubic algorithm, and any image whose longer edge still exceeds 16383 pixels after resizing is removed.
(One reason is that the webp format doesn't allow dimensions beyond 16383; another is that such an aspect ratio is too extreme.)
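A minimal sketch of this resize-and-filter step, assuming "more than 2048x2048 pixels" means the total pixel count exceeds 2048*2048; the function name and rounding details are illustrative, not the exact script used to build the dataset:

```python
from PIL import Image

TARGET_PIXELS = 2048 * 2048  # "near 2048x2048 pixels" as a total pixel budget
MAX_EDGE = 16383             # webp's maximum width/height

def shrink_or_drop(img: Image.Image):
    """Downscale images above the pixel budget; return None for images
    that would still exceed webp's edge limit (dropped from the dataset)."""
    w, h = img.size
    if w * h > TARGET_PIXELS:
        scale = (TARGET_PIXELS / (w * h)) ** 0.5
        img = img.resize((round(w * scale), round(h * scale)),
                         Image.Resampling.BICUBIC)
    if max(img.size) > MAX_EDGE:
        return None
    return img
```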

All images are encoded/saved as 90%-quality webp with the Pillow library in Python, which is half the size of 100%-quality lossy webp.
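The re-encoding itself is a single Pillow call; a minimal sketch (file names are illustrative):

```python
from PIL import Image

img = Image.open("input.png").convert("RGB")
img.save("output.webp", format="WEBP", quality=90)  # 90%-quality lossy webp
```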

The total size of this dataset is around 1.3~1.4 TB, which is less than 20% of the original file size.

### Webdataset

This dataset uses the webdataset library to write all the tar files, so you can use webdataset to load them easily as well. This is also the recommended way.
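For example, a minimal loading loop with webdataset; the shard path and brace pattern below are illustrative and should be adapted to the tar files you actually have locally:

```python
import webdataset as wds

# The brace pattern expands to the shard names, e.g. 0.tar ... 2999.tar.
shards = "danbooru2023-webp/{0..2999}.tar"

dataset = (
    wds.WebDataset(shards)
    .decode("pil")                # decode the .webp payloads into PIL images
    .to_tuple("__key__", "webp")  # (post id as a string, image)
)

for key, image in dataset:
    print(key, image.size)
    break
```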

The __key__ of each file is its id. You can use this id to query the metadata database easily.
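A hypothetical sketch of such a lookup, assuming you have already loaded the metadata into an id-keyed Python dict (the metadata database itself is distributed separately, and the field name below is illustrative):

```python
# Hypothetical stand-in for metadata loaded from the companion database.
metadata = {7832883: {"tag_string": "..."}}

for key, image in dataset:   # `dataset` from the snippet above
    post_id = int(key)       # __key__ is the Danbooru post id
    record = metadata.get(post_id)
    if record:
        print(post_id, record["tag_string"])
```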