Birchlabs committed
Commit f421a01
1 Parent(s): 851947b

Update README.md

Files changed (1)
  1. README.md +4 -2
README.md CHANGED
@@ -2,6 +2,8 @@
 
 Dataset created using [this script](https://github.com/Birch-san/sdxl-diffusion-decoder/blob/main/script/make_sdxl_latent_dataset.py).
 
+VAE encoder used NATTEN attention, kernel size 17.
+
 Didn't bother saving mean & logvar, because variance is low enough it's not worth the doubling of filesize to retain.
 Sampled from diagonal gaussian distribution, saved the resulting latents.
 Also kept the original image.
@@ -15,8 +17,8 @@ Sample = TypedDict('Sample', {
   '__key__': str,
   '__url__': str,
   'cls.txt': bytes, # UTF-8 encoded class id from 0 to 9 inclusive
-  'img.png': bytes, # PIL image, serialized. 256*256px
-  'latent.pth': bytes, # FloatTensor, serialized. 32*32 latents
+  'img.png': bytes, # PIL image, serialized. 1024*1024px
+  'latent.pth': bytes, # FloatTensor, serialized. 128*128 latents
 })
 
 it: Iterator[Sample] = WebDataset('train/{00000..00004}.tar')
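Below is a minimal sketch of iterating the shards and deserializing one sample under the updated schema. The shard pattern and the `Sample` TypedDict come from the README; using `torch.load` for `latent.pth` and `PIL.Image.open` for `img.png` is an assumption about how those fields were serialized, not something the commit states.

```python
from io import BytesIO
from typing import Iterator, TypedDict

import torch
from PIL import Image
from torch import FloatTensor
from webdataset import WebDataset

Sample = TypedDict('Sample', {
    '__key__': str,
    '__url__': str,
    'cls.txt': bytes,     # UTF-8 encoded class id from 0 to 9 inclusive
    'img.png': bytes,      # PIL image, serialized. 1024*1024px
    'latent.pth': bytes,   # FloatTensor, serialized. 128*128 latents
})

# without a .decode() stage, WebDataset yields every field as raw bytes
it: Iterator[Sample] = WebDataset('train/{00000..00004}.tar')

for sample in it:
    cls_id = int(sample['cls.txt'].decode('utf-8'))
    img: Image.Image = Image.open(BytesIO(sample['img.png']))
    # assumption: latent.pth was written with torch.save. The latents were already
    # sampled from the VAE's diagonal Gaussian at dataset-creation time (mean/logvar
    # were not saved), so they can be fed to a decoder directly.
    latent: FloatTensor = torch.load(BytesIO(sample['latent.pth']), map_location='cpu')
    print(cls_id, img.size, latent.shape)
    break
```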