ppbrown committed (verified)
Commit 50f308d · 1 parent: 706ff4b

Update README.md

Files changed (1):
1. README.md (+10 -8)
README.md CHANGED
@@ -1,23 +1,25 @@
 ---
 license: openrail
 ---
-This will host a collaborative effort to clean up the mess that is the danbooru dataset.
-The Danbooru dataset is a wealth of free anime style images.... The problem being that it was indiscriminately put together.
-Some are explicitly copyrighted and shouldnt be in it.
+This hosts a collaborative effort to clean up the mess that is the danbooru dataset.
+The Danbooru dataset is a wealth of (mostly) free anime style images.
+The problem being that it was indiscriminately put together.
+Some pics are explicitly copyrighted and shouldnt be in it.
 Some have watermarks.
 Some just frankly arent good.
 But the rest... are *really good*
 
-So, we have volunteers creating a child dataset, with just the clean, usable images.
+So, this is a public effort to pull out just the clean, AI-training-usable images.
 
 We are working from
 https://huggingface.co/datasets/animelover/danbooru2022/commits/main/data
 
-which has the advantage of already been somewhat appropriately sized images.
-All that is needed is to copy a block of them, and delete the "bad" images.
+which has the advantage of containing images that have already been somewhat appropriately resized.
+So now we copy a block of them, and delete the "bad" images. One zip file at a time.
 (See the "STANDARDS.txt") file, for suggested criteria for deleting images.
 
-There are literally hundreds of blocks of data, each block having perhaps 4000 images. So this is an organized, crowd-sourced volunteer effort.
+There are literally hundreds of blocks of data, each block having perhaps 4000-5000 images.
+So this is an organized, crowd-sourced volunteer effort.
 
 Currently spoken-for segments:
-* 0000-0010: ppbrown
+* 0000-0010: ppbrown (data-0001 complete)
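
For a volunteer picking up a segment, the workflow the README describes boils down to: fetch one zip block from the upstream dataset, unpack it, manually delete the "bad" images per STANDARDS.txt, and repack the survivors. Below is a minimal sketch of that loop using `huggingface_hub` and the standard library. It assumes the upstream blocks live at paths like `data/data-0001.zip` in the animelover/danbooru2022 dataset repo; the exact filenames are an assumption, so check the repo's file listing before running.

```python
# Hedged sketch of the per-block cleanup workflow; block path is assumed,
# not confirmed -- verify filenames in the animelover/danbooru2022 repo.
import pathlib
import zipfile

from huggingface_hub import hf_hub_download

BLOCK = "data-0001"  # the block this volunteer has claimed

# 1. Fetch one zip block from the upstream dataset.
zip_path = hf_hub_download(
    repo_id="animelover/danbooru2022",
    repo_type="dataset",
    filename=f"data/{BLOCK}.zip",  # assumed path layout
)

# 2. Unpack it into a working directory for manual review.
workdir = pathlib.Path(BLOCK)
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall(workdir)

# ... manually delete watermarked/copyrighted/low-quality images here,
#     following the criteria in STANDARDS.txt ...

# 3. Repack whatever survived the review, ready for upload to this repo.
with zipfile.ZipFile(f"{BLOCK}-cleaned.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for img in sorted(workdir.rglob("*")):
        if img.is_file():
            zf.write(img, img.relative_to(workdir))
```

The manual-review step is deliberately left as a comment: the whole point of the project is human judgment on each image, so only the mechanical download/repack parts are scripted.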