xyfJASON committed on
Commit
063f3cf
Parent: 7478aed

Upload transformer-vqvae-kmeans-celeba checkpoints and training logs

transformer-vqvae-kmeans-celeba/ckpt/step499999/meta.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:df99aa5f4769833bb82ff2e829751b4d07f62f0674714325cb3fc3ab673c2dc8
+ size 852
transformer-vqvae-kmeans-celeba/ckpt/step499999/model.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:295c9d2118dc425eab6dfaaebf8d1c91b5935dd33293df92971004e2db0702ef
+ size 191815406
transformer-vqvae-kmeans-celeba/ckpt/step499999/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:23681085acd321b74718907489fc4f6683eef06f9c9a61a1a615e12ef1fa8553
+ size 383681786
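Each checkpoint above is stored as a Git LFS pointer: the repository tracks only the spec version, the object's sha256 oid, and its byte size, while the binary itself lives in LFS storage. As a minimal sketch using only the standard library (paths and values taken from the pointers in this commit), one way to verify a downloaded checkpoint against its pointer:

```python
import hashlib
import os

def verify_lfs_object(local_path: str, expected_oid: str, expected_size: int) -> bool:
    """Check a downloaded LFS object against the pointer's size and sha256 oid."""
    if os.path.getsize(local_path) != expected_size:
        return False
    h = hashlib.sha256()
    with open(local_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
            h.update(chunk)
    return h.hexdigest() == expected_oid

# Values from the model.pt pointer above.
ok = verify_lfs_object(
    "transformer-vqvae-kmeans-celeba/ckpt/step499999/model.pt",
    "295c9d2118dc425eab6dfaaebf8d1c91b5935dd33293df92971004e2db0702ef",
    191815406,
)
print("checksum ok" if ok else "checksum mismatch")
```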
transformer-vqvae-kmeans-celeba/config-2024-02-03-09-29-17.yaml ADDED
@@ -0,0 +1,40 @@
+ seed: 8888
+ data:
+   target: datasets.celeba.CelebA
+   params:
+     root: ~/data/CelebA/
+     img_size: 64
+     split: train
+ dataloader:
+   num_workers: 16
+   pin_memory: true
+   prefetch_factor: 2
+ model:
+   target: models.transformer.Transformer
+   params:
+     codebook_num: 512
+     embed_dim: 512
+     n_heads: 8
+     n_layers: 12
+     max_tokens: 256
+ vqvae:
+   target: models.vqvae.VQVAE
+   params:
+     img_channels: 3
+     hidden_dim: 256
+     n_resblocks: 2
+     codebook_num: 512
+     codebook_dim: 64
+     codebook_update: kmeans
+     ema_decay: 0.99
+     pretrained: ./runs/vqvae-kmeans-celeba/ckpt/step499999/model.pt
+ train:
+   n_steps: 200000
+   batch_size: 64
+   print_freq: 100
+   sample_freq: 1000
+   save_freq: 10000
+   optim:
+     target: torch.optim.Adam
+     params:
+       lr: 0.0001
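The config follows the common target/params convention: `target` names a dotted import path and `params` supplies the keyword arguments for its constructor. A minimal sketch of how such a node can be instantiated, assuming only standard `importlib` (the helper name `instantiate_from_config` is illustrative; the repo may structure this differently):

```python
import importlib

def instantiate_from_config(node: dict):
    """Import node['target'] as module.ClassName and call it with node['params'].

    Hypothetical helper mirroring the config's target/params layout.
    """
    module_name, cls_name = node["target"].rsplit(".", 1)
    cls = getattr(importlib.import_module(module_name), cls_name)
    return cls(**node.get("params", {}))

# e.g. build the transformer prior from the 'model' node of this config:
# model = instantiate_from_config(cfg["model"])
```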
transformer-vqvae-kmeans-celeba/config-2024-02-03-16-30-34.yaml ADDED
@@ -0,0 +1,40 @@
+ seed: 8888
+ data:
+   target: datasets.celeba.CelebA
+   params:
+     root: ~/data/CelebA/
+     img_size: 64
+     split: train
+ dataloader:
+   num_workers: 16
+   pin_memory: true
+   prefetch_factor: 2
+ model:
+   target: models.transformer.Transformer
+   params:
+     codebook_num: 512
+     embed_dim: 512
+     n_heads: 8
+     n_layers: 12
+     max_tokens: 256
+ vqvae:
+   target: models.vqvae.VQVAE
+   params:
+     img_channels: 3
+     hidden_dim: 256
+     n_resblocks: 2
+     codebook_num: 512
+     codebook_dim: 64
+     codebook_update: kmeans
+     ema_decay: 0.99
+     pretrained: ./runs/vqvae-kmeans-celeba/ckpt/step499999/model.pt
+ train:
+   n_steps: 500000
+   batch_size: 64
+   print_freq: 100
+   sample_freq: 1000
+   save_freq: 10000
+   optim:
+     target: torch.optim.Adam
+     params:
+       lr: 0.0001
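The second config differs from the first only in `train.n_steps` (500000 vs. 200000), consistent with the later run extending the earlier one from its last checkpoint; the checkpoint directory stores model, optimizer, and meta state separately. A sketch of resuming under those assumptions (the `Transformer` constructor signature is inferred from the config's params keys, and the meta file's field names are guesses):

```python
import torch
from models.transformer import Transformer  # module path as named in the config's target

ckpt_dir = "transformer-vqvae-kmeans-celeba/ckpt/step499999"

# Rebuild the prior with the hyperparameters from the config above
# (assuming the constructor takes the params keys directly).
model = Transformer(codebook_num=512, embed_dim=512, n_heads=8,
                    n_layers=12, max_tokens=256)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Assumed checkpoint layout: separate state_dicts plus a small meta file
# (852 bytes above, so likely little more than the step counter).
model.load_state_dict(torch.load(f"{ckpt_dir}/model.pt", map_location="cpu"))
optimizer.load_state_dict(torch.load(f"{ckpt_dir}/optimizer.pt", map_location="cpu"))
meta = torch.load(f"{ckpt_dir}/meta.pt", map_location="cpu")
start_step = meta.get("step", 0) + 1  # field name is an assumption
```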
transformer-vqvae-kmeans-celeba/output-2024-02-03-09-29-17.log ADDED
The diff for this file is too large to render. See raw diff
 
transformer-vqvae-kmeans-celeba/output-2024-02-03-16-30-34.log ADDED
The diff for this file is too large to render. See raw diff
 
transformer-vqvae-kmeans-celeba/samples.zip ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:77920c384515f9aededb430c9c894fc1909d7ab754ad21101b7bfc3c9901351c
+ size 211348748
transformer-vqvae-kmeans-celeba/tensorboard/events.out.tfevents.1706923758.amax.708885.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8ce1e44bdb79f5ed05663d5c7d594da15ea367abe414e41f43d0e15eeb7f6f9d
+ size 19567060
transformer-vqvae-kmeans-celeba/tensorboard/events.out.tfevents.1706949035.amax.1111906.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:18891b6e445e0f3206af1e28326c245eebc2ea52132a445aafe3f92ac596ab12
+ size 29400088
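The two event files correspond to the two training runs above and can be inspected without launching the TensorBoard UI, using the `tensorboard` package's `EventAccumulator`. A sketch (the scalar tag `"loss"` is a guess; list `ea.Tags()` to see what was actually logged):

```python
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

ea = EventAccumulator(
    "transformer-vqvae-kmeans-celeba/tensorboard/"
    "events.out.tfevents.1706923758.amax.708885.0"
)
ea.Reload()                      # parse the event file
print(ea.Tags()["scalars"])      # discover the logged scalar tags
# events = ea.Scalars("loss")    # tag name is a guess; pick one from the list
# steps = [e.step for e in events]; values = [e.value for e in events]
```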