xyfJASON committed
Commit ff128dd
1 Parent(s): 4870d91

Upload lsgan_cifar10 checkpoints and training logs
lsgan_cifar10/ckpt/best/meta.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e623d38e9e2f8b5bd2c130bfca829cb4667f7dbbbb306d40347db1f6bcd6f94e
+ size 425
lsgan_cifar10/ckpt/best/model.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dbf0ebf9dee1578828a906d1ba7f66c913c076144d138985a9cc7f3634cc3b21
+ size 90682196
lsgan_cifar10/ckpt/best/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e32cb1e6b192b72fb27c0c67054ffeeb5c21fab1ca994ab9ec76367665ee8fb3
+ size 181310875
lsgan_cifar10/ckpt/step039999/meta.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0581c0a3ba4dd8beedbfcf85ca6fc9d8cf15026f285bde50d47f848dee46f481
+ size 425
lsgan_cifar10/ckpt/step039999/model.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f577e613834a49843b89b1049b5865a3d271feb53f5897d9e44651d0ea740f5e
+ size 90682196
lsgan_cifar10/ckpt/step039999/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:13563dc9fe3ee19a659f25e0a9a0d29cf9edaf02eae54e61c2ee91eaba5dd1f7
+ size 181310875
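Each checkpoint file above is stored as a Git LFS pointer: three text lines recording the spec version, a `sha256` object id, and the byte size of the real blob. A minimal sketch of parsing such a pointer and checking a downloaded blob against it (the helper names here are illustrative, not part of this repo):

```python
import hashlib


def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields


def verify_blob(pointer: dict, blob: bytes) -> bool:
    """Check a downloaded blob against the pointer's sha256 oid and size."""
    algo, _, digest = pointer["oid"].partition(":")
    if algo != "sha256":
        return False
    return (len(blob) == int(pointer["size"])
            and hashlib.sha256(blob).hexdigest() == digest)


pointer_text = """version https://git-lfs.github.com/spec/v1
oid sha256:e623d38e9e2f8b5bd2c130bfca829cb4667f7dbbbb306d40347db1f6bcd6f94e
size 425"""
p = parse_lfs_pointer(pointer_text)
print(p["size"])  # → 425
```

Cloning without `git lfs pull` leaves only these 3-line pointers on disk, which is why a "checkpoint" can appear to be a few hundred bytes.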
lsgan_cifar10/config-2023-07-01-15-00-09.yaml ADDED
@@ -0,0 +1,62 @@
+ seed: 5678
+ data:
+   name: CIFAR-10
+   dataroot: /data/CIFAR-10/
+   img_size: 32
+   img_channels: 3
+ dataloader:
+   num_workers: 4
+   pin_memory: true
+   prefetch_factor: 2
+ G:
+   target: models.simple_cnn.Generator
+   params:
+     z_dim: 100
+     dim: 256
+     dim_mults:
+       - 4
+       - 2
+       - 1
+     out_dim: 3
+     with_bn: true
+     with_tanh: true
+ D:
+   target: models.simple_cnn.Discriminator
+   params:
+     in_dim: 3
+     dim: 256
+     dim_mults:
+       - 1
+       - 2
+       - 4
+     with_bn: true
+ train:
+   n_steps: 40000
+   batch_size: 512
+   d_iters: 5
+   resume: null
+   print_freq: 100
+   save_freq: 10000
+   sample_freq: 1000
+   eval_freq: 1000
+   n_samples: 64
+ loss_fn:
+   target: losses.lsgan_loss.LSGANLoss
+   params:
+     a: 0
+     b: 1
+     c: 1
+ optim_G:
+   target: torch.optim.Adam
+   params:
+     lr: 0.0008
+     betas:
+       - 0.5
+       - 0.999
+ optim_D:
+   target: torch.optim.Adam
+   params:
+     lr: 0.0008
+     betas:
+       - 0.5
+       - 0.999
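The config selects `losses.lsgan_loss.LSGANLoss` with `a: 0, b: 1, c: 1`, i.e. the least-squares GAN objective of Mao et al. (2017): the discriminator regresses real scores toward `b` and fake scores toward `a`, while the generator pushes the discriminator's score on fakes toward `c`. A minimal PyTorch sketch of such a loss; the repo's actual class may be structured differently:

```python
import torch


class LSGANLoss:
    """Least-squares GAN loss (Mao et al., 2017).

    a: target label for fakes in the D loss
    b: target label for reals in the D loss
    c: label G wants D to assign to fakes
    """

    def __init__(self, a: float = 0.0, b: float = 1.0, c: float = 1.0):
        self.a, self.b, self.c = a, b, c

    def forward_D(self, d_real: torch.Tensor, d_fake: torch.Tensor) -> torch.Tensor:
        # D pushes real scores toward b and fake scores toward a
        return (0.5 * ((d_real - self.b) ** 2).mean()
                + 0.5 * ((d_fake - self.a) ** 2).mean())

    def forward_G(self, d_fake: torch.Tensor) -> torch.Tensor:
        # G pushes D's score on fakes toward c
        return 0.5 * ((d_fake - self.c) ** 2).mean()


loss_fn = LSGANLoss(a=0, b=1, c=1)
d_real = torch.ones(4)   # a perfect discriminator on real samples
d_fake = torch.zeros(4)  # a perfect discriminator on fakes
print(loss_fn.forward_D(d_real, d_fake).item())  # → 0.0
print(loss_fn.forward_G(d_fake).item())          # → 0.5
```

Unlike the sigmoid cross-entropy of the original GAN, this quadratic objective keeps gradients alive for fakes that D confidently rejects, which is the motivation for LSGAN.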
lsgan_cifar10/output-2023-07-01-15-00-09.log ADDED
@@ -0,0 +1,852 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ 2023-07-01 15:00:09,799 - INFO - Experiment directory: ./runs/lsgan_cifar10/
2
+ 2023-07-01 15:00:09,800 - INFO - Number of processes: 1
3
+ 2023-07-01 15:00:09,800 - INFO - Distributed type: DistributedType.NO
4
+ 2023-07-01 15:00:09,800 - INFO - Mixed precision: no
5
+ 2023-07-01 15:00:09,800 - INFO - ==============================
6
+ 2023-07-01 15:00:10,089 - INFO - Size of training set: 50000
7
+ 2023-07-01 15:00:10,089 - INFO - Batch size per process: 512
8
+ 2023-07-01 15:00:10,089 - INFO - Total batch size: 512
9
+ 2023-07-01 15:00:10,090 - INFO - ==============================
10
+ 2023-07-01 15:00:10,633 - INFO - Start training...
11
+ 2023-07-01 15:01:07,058 - INFO - [Train] step: 99, loss_D: 0.040776, lr_D: 0.000800
12
+ 2023-07-01 15:01:07,139 - INFO - [Train] step: 99, loss_G: 0.467325, lr_G: 0.000800
13
+ 2023-07-01 15:02:02,998 - INFO - [Train] step: 199, loss_D: 0.040615, lr_D: 0.000800
14
+ 2023-07-01 15:02:03,080 - INFO - [Train] step: 199, loss_G: 0.467046, lr_G: 0.000800
15
+ 2023-07-01 15:02:59,064 - INFO - [Train] step: 299, loss_D: 0.765039, lr_D: 0.000800
16
+ 2023-07-01 15:02:59,145 - INFO - [Train] step: 299, loss_G: 1.567808, lr_G: 0.000800
17
+ 2023-07-01 15:03:55,318 - INFO - [Train] step: 399, loss_D: 0.437440, lr_D: 0.000800
18
+ 2023-07-01 15:03:55,399 - INFO - [Train] step: 399, loss_G: 1.201215, lr_G: 0.000800
19
+ 2023-07-01 15:04:51,573 - INFO - [Train] step: 499, loss_D: 0.724107, lr_D: 0.000800
20
+ 2023-07-01 15:04:51,654 - INFO - [Train] step: 499, loss_G: 1.377260, lr_G: 0.000800
21
+ 2023-07-01 15:05:47,858 - INFO - [Train] step: 599, loss_D: 0.804712, lr_D: 0.000800
22
+ 2023-07-01 15:05:47,940 - INFO - [Train] step: 599, loss_G: 1.749799, lr_G: 0.000800
23
+ 2023-07-01 15:06:44,297 - INFO - [Train] step: 699, loss_D: 0.098723, lr_D: 0.000800
24
+ 2023-07-01 15:06:44,379 - INFO - [Train] step: 699, loss_G: 0.767797, lr_G: 0.000800
25
+ 2023-07-01 15:07:40,618 - INFO - [Train] step: 799, loss_D: 0.077412, lr_D: 0.000800
26
+ 2023-07-01 15:07:40,700 - INFO - [Train] step: 799, loss_G: 0.650187, lr_G: 0.000800
27
+ 2023-07-01 15:08:36,893 - INFO - [Train] step: 899, loss_D: 0.232728, lr_D: 0.000800
28
+ 2023-07-01 15:08:36,974 - INFO - [Train] step: 899, loss_G: 0.196605, lr_G: 0.000800
29
+ 2023-07-01 15:09:33,168 - INFO - [Train] step: 999, loss_D: 0.379194, lr_D: 0.000800
30
+ 2023-07-01 15:09:33,250 - INFO - [Train] step: 999, loss_G: 0.122714, lr_G: 0.000800
31
+ 2023-07-01 15:10:37,746 - INFO - [Eval] step: 999, fid: 204.688629
32
+ 2023-07-01 15:11:33,900 - INFO - [Train] step: 1099, loss_D: 1.031090, lr_D: 0.000800
33
+ 2023-07-01 15:11:33,982 - INFO - [Train] step: 1099, loss_G: 0.149993, lr_G: 0.000800
34
+ 2023-07-01 15:12:30,236 - INFO - [Train] step: 1199, loss_D: 0.073962, lr_D: 0.000800
35
+ 2023-07-01 15:12:30,317 - INFO - [Train] step: 1199, loss_G: 0.366868, lr_G: 0.000800
36
+ 2023-07-01 15:13:26,801 - INFO - [Train] step: 1299, loss_D: 0.066455, lr_D: 0.000800
37
+ 2023-07-01 15:13:26,883 - INFO - [Train] step: 1299, loss_G: 0.631458, lr_G: 0.000800
38
+ 2023-07-01 15:14:23,158 - INFO - [Train] step: 1399, loss_D: 1.648886, lr_D: 0.000800
39
+ 2023-07-01 15:14:23,240 - INFO - [Train] step: 1399, loss_G: 2.199118, lr_G: 0.000800
40
+ 2023-07-01 15:15:19,538 - INFO - [Train] step: 1499, loss_D: 0.059617, lr_D: 0.000800
41
+ 2023-07-01 15:15:19,620 - INFO - [Train] step: 1499, loss_G: 0.638867, lr_G: 0.000800
42
+ 2023-07-01 15:16:15,921 - INFO - [Train] step: 1599, loss_D: 0.019324, lr_D: 0.000800
43
+ 2023-07-01 15:16:16,002 - INFO - [Train] step: 1599, loss_G: 0.514553, lr_G: 0.000800
44
+ 2023-07-01 15:17:12,276 - INFO - [Train] step: 1699, loss_D: 0.239341, lr_D: 0.000800
45
+ 2023-07-01 15:17:12,357 - INFO - [Train] step: 1699, loss_G: 0.284433, lr_G: 0.000800
46
+ 2023-07-01 15:18:08,625 - INFO - [Train] step: 1799, loss_D: 0.244401, lr_D: 0.000800
47
+ 2023-07-01 15:18:08,707 - INFO - [Train] step: 1799, loss_G: 0.255333, lr_G: 0.000800
48
+ 2023-07-01 15:19:05,019 - INFO - [Train] step: 1899, loss_D: 0.113152, lr_D: 0.000800
49
+ 2023-07-01 15:19:05,100 - INFO - [Train] step: 1899, loss_G: 0.335814, lr_G: 0.000800
50
+ 2023-07-01 15:20:01,560 - INFO - [Train] step: 1999, loss_D: 0.082486, lr_D: 0.000800
51
+ 2023-07-01 15:20:01,641 - INFO - [Train] step: 1999, loss_G: 0.431042, lr_G: 0.000800
52
+ 2023-07-01 15:21:06,142 - INFO - [Eval] step: 1999, fid: 171.175541
53
+ 2023-07-01 15:22:02,570 - INFO - [Train] step: 2099, loss_D: 0.091768, lr_D: 0.000800
54
+ 2023-07-01 15:22:02,652 - INFO - [Train] step: 2099, loss_G: 0.503902, lr_G: 0.000800
55
+ 2023-07-01 15:22:58,937 - INFO - [Train] step: 2199, loss_D: 0.017427, lr_D: 0.000800
56
+ 2023-07-01 15:22:59,019 - INFO - [Train] step: 2199, loss_G: 0.486828, lr_G: 0.000800
57
+ 2023-07-01 15:23:55,303 - INFO - [Train] step: 2299, loss_D: 0.022969, lr_D: 0.000800
58
+ 2023-07-01 15:23:55,384 - INFO - [Train] step: 2299, loss_G: 0.518168, lr_G: 0.000800
59
+ 2023-07-01 15:24:51,668 - INFO - [Train] step: 2399, loss_D: 0.014084, lr_D: 0.000800
60
+ 2023-07-01 15:24:51,749 - INFO - [Train] step: 2399, loss_G: 0.438347, lr_G: 0.000800
61
+ 2023-07-01 15:25:48,011 - INFO - [Train] step: 2499, loss_D: 0.045845, lr_D: 0.000800
62
+ 2023-07-01 15:25:48,092 - INFO - [Train] step: 2499, loss_G: 0.370210, lr_G: 0.000800
63
+ 2023-07-01 15:26:44,519 - INFO - [Train] step: 2599, loss_D: 0.022699, lr_D: 0.000800
64
+ 2023-07-01 15:26:44,600 - INFO - [Train] step: 2599, loss_G: 0.495508, lr_G: 0.000800
65
+ 2023-07-01 15:27:40,898 - INFO - [Train] step: 2699, loss_D: 0.082826, lr_D: 0.000800
66
+ 2023-07-01 15:27:40,980 - INFO - [Train] step: 2699, loss_G: 0.400814, lr_G: 0.000800
67
+ 2023-07-01 15:28:37,275 - INFO - [Train] step: 2799, loss_D: 0.160345, lr_D: 0.000800
68
+ 2023-07-01 15:28:37,356 - INFO - [Train] step: 2799, loss_G: 0.302427, lr_G: 0.000800
69
+ 2023-07-01 15:29:33,632 - INFO - [Train] step: 2899, loss_D: 0.066747, lr_D: 0.000800
70
+ 2023-07-01 15:29:33,714 - INFO - [Train] step: 2899, loss_G: 0.363609, lr_G: 0.000800
71
+ 2023-07-01 15:30:29,989 - INFO - [Train] step: 2999, loss_D: 0.048972, lr_D: 0.000800
72
+ 2023-07-01 15:30:30,071 - INFO - [Train] step: 2999, loss_G: 0.446368, lr_G: 0.000800
73
+ 2023-07-01 15:31:34,357 - INFO - [Eval] step: 2999, fid: 176.967672
74
+ 2023-07-01 15:32:30,442 - INFO - [Train] step: 3099, loss_D: 0.024519, lr_D: 0.000800
75
+ 2023-07-01 15:32:30,524 - INFO - [Train] step: 3099, loss_G: 0.513590, lr_G: 0.000800
76
+ 2023-07-01 15:33:26,846 - INFO - [Train] step: 3199, loss_D: 0.040996, lr_D: 0.000800
77
+ 2023-07-01 15:33:26,927 - INFO - [Train] step: 3199, loss_G: 0.401883, lr_G: 0.000800
78
+ 2023-07-01 15:34:23,361 - INFO - [Train] step: 3299, loss_D: 0.023857, lr_D: 0.000800
79
+ 2023-07-01 15:34:23,442 - INFO - [Train] step: 3299, loss_G: 0.565304, lr_G: 0.000800
80
+ 2023-07-01 15:35:19,724 - INFO - [Train] step: 3399, loss_D: 0.111633, lr_D: 0.000800
81
+ 2023-07-01 15:35:19,805 - INFO - [Train] step: 3399, loss_G: 0.776083, lr_G: 0.000800
82
+ 2023-07-01 15:36:16,086 - INFO - [Train] step: 3499, loss_D: 0.034294, lr_D: 0.000800
83
+ 2023-07-01 15:36:16,167 - INFO - [Train] step: 3499, loss_G: 0.402164, lr_G: 0.000800
84
+ 2023-07-01 15:37:12,439 - INFO - [Train] step: 3599, loss_D: 0.022949, lr_D: 0.000800
85
+ 2023-07-01 15:37:12,521 - INFO - [Train] step: 3599, loss_G: 0.441130, lr_G: 0.000800
86
+ 2023-07-01 15:38:08,811 - INFO - [Train] step: 3699, loss_D: 0.012377, lr_D: 0.000800
87
+ 2023-07-01 15:38:08,892 - INFO - [Train] step: 3699, loss_G: 0.462345, lr_G: 0.000800
88
+ 2023-07-01 15:39:05,195 - INFO - [Train] step: 3799, loss_D: 0.023728, lr_D: 0.000800
89
+ 2023-07-01 15:39:05,277 - INFO - [Train] step: 3799, loss_G: 0.450673, lr_G: 0.000800
90
+ 2023-07-01 15:40:01,722 - INFO - [Train] step: 3899, loss_D: 0.019619, lr_D: 0.000800
91
+ 2023-07-01 15:40:01,804 - INFO - [Train] step: 3899, loss_G: 0.572972, lr_G: 0.000800
92
+ 2023-07-01 15:40:58,117 - INFO - [Train] step: 3999, loss_D: 0.046678, lr_D: 0.000800
93
+ 2023-07-01 15:40:58,198 - INFO - [Train] step: 3999, loss_G: 0.483426, lr_G: 0.000800
94
+ 2023-07-01 15:42:02,133 - INFO - [Eval] step: 3999, fid: 110.422782
95
+ 2023-07-01 15:42:58,550 - INFO - [Train] step: 4099, loss_D: 0.126218, lr_D: 0.000800
96
+ 2023-07-01 15:42:58,632 - INFO - [Train] step: 4099, loss_G: 0.291913, lr_G: 0.000800
97
+ 2023-07-01 15:43:54,935 - INFO - [Train] step: 4199, loss_D: 0.035849, lr_D: 0.000800
98
+ 2023-07-01 15:43:55,017 - INFO - [Train] step: 4199, loss_G: 0.523266, lr_G: 0.000800
99
+ 2023-07-01 15:44:51,309 - INFO - [Train] step: 4299, loss_D: 0.037320, lr_D: 0.000800
100
+ 2023-07-01 15:44:51,391 - INFO - [Train] step: 4299, loss_G: 0.458659, lr_G: 0.000800
101
+ 2023-07-01 15:45:47,676 - INFO - [Train] step: 4399, loss_D: 0.011471, lr_D: 0.000800
102
+ 2023-07-01 15:45:47,758 - INFO - [Train] step: 4399, loss_G: 0.508021, lr_G: 0.000800
103
+ 2023-07-01 15:46:44,053 - INFO - [Train] step: 4499, loss_D: 0.089249, lr_D: 0.000800
104
+ 2023-07-01 15:46:44,134 - INFO - [Train] step: 4499, loss_G: 0.694754, lr_G: 0.000800
105
+ 2023-07-01 15:47:40,575 - INFO - [Train] step: 4599, loss_D: 0.114922, lr_D: 0.000800
106
+ 2023-07-01 15:47:40,657 - INFO - [Train] step: 4599, loss_G: 0.099662, lr_G: 0.000800
107
+ 2023-07-01 15:48:36,973 - INFO - [Train] step: 4699, loss_D: 0.075923, lr_D: 0.000800
108
+ 2023-07-01 15:48:37,055 - INFO - [Train] step: 4699, loss_G: 0.317860, lr_G: 0.000800
109
+ 2023-07-01 15:49:33,327 - INFO - [Train] step: 4799, loss_D: 0.025180, lr_D: 0.000800
110
+ 2023-07-01 15:49:33,409 - INFO - [Train] step: 4799, loss_G: 0.552436, lr_G: 0.000800
111
+ 2023-07-01 15:50:29,667 - INFO - [Train] step: 4899, loss_D: 0.020879, lr_D: 0.000800
112
+ 2023-07-01 15:50:29,749 - INFO - [Train] step: 4899, loss_G: 0.466652, lr_G: 0.000800
113
+ 2023-07-01 15:51:26,027 - INFO - [Train] step: 4999, loss_D: 0.025203, lr_D: 0.000800
114
+ 2023-07-01 15:51:26,109 - INFO - [Train] step: 4999, loss_G: 0.398335, lr_G: 0.000800
115
+ 2023-07-01 15:52:30,087 - INFO - [Eval] step: 4999, fid: 118.414793
116
+ 2023-07-01 15:53:26,158 - INFO - [Train] step: 5099, loss_D: 0.048037, lr_D: 0.000800
117
+ 2023-07-01 15:53:26,240 - INFO - [Train] step: 5099, loss_G: 0.530245, lr_G: 0.000800
118
+ 2023-07-01 15:54:22,652 - INFO - [Train] step: 5199, loss_D: 0.021130, lr_D: 0.000800
119
+ 2023-07-01 15:54:22,733 - INFO - [Train] step: 5199, loss_G: 0.541339, lr_G: 0.000800
120
+ 2023-07-01 15:55:19,015 - INFO - [Train] step: 5299, loss_D: 0.041751, lr_D: 0.000800
121
+ 2023-07-01 15:55:19,097 - INFO - [Train] step: 5299, loss_G: 0.454968, lr_G: 0.000800
122
+ 2023-07-01 15:56:15,395 - INFO - [Train] step: 5399, loss_D: 0.023214, lr_D: 0.000800
123
+ 2023-07-01 15:56:15,477 - INFO - [Train] step: 5399, loss_G: 0.482839, lr_G: 0.000800
124
+ 2023-07-01 15:57:11,742 - INFO - [Train] step: 5499, loss_D: 0.019600, lr_D: 0.000800
125
+ 2023-07-01 15:57:11,823 - INFO - [Train] step: 5499, loss_G: 0.506283, lr_G: 0.000800
126
+ 2023-07-01 15:58:08,120 - INFO - [Train] step: 5599, loss_D: 0.014186, lr_D: 0.000800
127
+ 2023-07-01 15:58:08,201 - INFO - [Train] step: 5599, loss_G: 0.557838, lr_G: 0.000800
128
+ 2023-07-01 15:59:04,493 - INFO - [Train] step: 5699, loss_D: 0.221925, lr_D: 0.000800
129
+ 2023-07-01 15:59:04,574 - INFO - [Train] step: 5699, loss_G: 0.242728, lr_G: 0.000800
130
+ 2023-07-01 16:00:00,834 - INFO - [Train] step: 5799, loss_D: 0.165530, lr_D: 0.000800
131
+ 2023-07-01 16:00:00,915 - INFO - [Train] step: 5799, loss_G: 0.227829, lr_G: 0.000800
132
+ 2023-07-01 16:00:57,358 - INFO - [Train] step: 5899, loss_D: 0.115643, lr_D: 0.000800
133
+ 2023-07-01 16:00:57,440 - INFO - [Train] step: 5899, loss_G: 0.317124, lr_G: 0.000800
134
+ 2023-07-01 16:01:53,721 - INFO - [Train] step: 5999, loss_D: 0.103261, lr_D: 0.000800
135
+ 2023-07-01 16:01:53,803 - INFO - [Train] step: 5999, loss_G: 0.264031, lr_G: 0.000800
136
+ 2023-07-01 16:02:57,825 - INFO - [Eval] step: 5999, fid: 80.259056
137
+ 2023-07-01 16:03:54,238 - INFO - [Train] step: 6099, loss_D: 0.051908, lr_D: 0.000800
138
+ 2023-07-01 16:03:54,319 - INFO - [Train] step: 6099, loss_G: 0.461725, lr_G: 0.000800
139
+ 2023-07-01 16:04:50,585 - INFO - [Train] step: 6199, loss_D: 0.077694, lr_D: 0.000800
140
+ 2023-07-01 16:04:50,667 - INFO - [Train] step: 6199, loss_G: 0.340203, lr_G: 0.000800
141
+ 2023-07-01 16:05:46,954 - INFO - [Train] step: 6299, loss_D: 0.042244, lr_D: 0.000800
142
+ 2023-07-01 16:05:47,035 - INFO - [Train] step: 6299, loss_G: 0.457580, lr_G: 0.000800
143
+ 2023-07-01 16:06:43,308 - INFO - [Train] step: 6399, loss_D: 0.212540, lr_D: 0.000800
144
+ 2023-07-01 16:06:43,390 - INFO - [Train] step: 6399, loss_G: 0.311328, lr_G: 0.000800
145
+ 2023-07-01 16:07:39,849 - INFO - [Train] step: 6499, loss_D: 0.024245, lr_D: 0.000800
146
+ 2023-07-01 16:07:39,931 - INFO - [Train] step: 6499, loss_G: 0.470824, lr_G: 0.000800
147
+ 2023-07-01 16:08:36,216 - INFO - [Train] step: 6599, loss_D: 0.047844, lr_D: 0.000800
148
+ 2023-07-01 16:08:36,298 - INFO - [Train] step: 6599, loss_G: 0.415272, lr_G: 0.000800
149
+ 2023-07-01 16:09:32,579 - INFO - [Train] step: 6699, loss_D: 0.033318, lr_D: 0.000800
150
+ 2023-07-01 16:09:32,661 - INFO - [Train] step: 6699, loss_G: 0.465461, lr_G: 0.000800
151
+ 2023-07-01 16:10:28,943 - INFO - [Train] step: 6799, loss_D: 0.148448, lr_D: 0.000800
152
+ 2023-07-01 16:10:29,025 - INFO - [Train] step: 6799, loss_G: 0.209431, lr_G: 0.000800
153
+ 2023-07-01 16:11:25,292 - INFO - [Train] step: 6899, loss_D: 0.105897, lr_D: 0.000800
154
+ 2023-07-01 16:11:25,373 - INFO - [Train] step: 6899, loss_G: 0.281653, lr_G: 0.000800
155
+ 2023-07-01 16:12:21,643 - INFO - [Train] step: 6999, loss_D: 0.062976, lr_D: 0.000800
156
+ 2023-07-01 16:12:21,725 - INFO - [Train] step: 6999, loss_G: 0.289516, lr_G: 0.000800
157
+ 2023-07-01 16:13:25,882 - INFO - [Eval] step: 6999, fid: 83.707575
158
+ 2023-07-01 16:14:21,910 - INFO - [Train] step: 7099, loss_D: 0.150858, lr_D: 0.000800
159
+ 2023-07-01 16:14:21,991 - INFO - [Train] step: 7099, loss_G: 0.699395, lr_G: 0.000800
160
+ 2023-07-01 16:15:18,400 - INFO - [Train] step: 7199, loss_D: 0.020312, lr_D: 0.000800
161
+ 2023-07-01 16:15:18,481 - INFO - [Train] step: 7199, loss_G: 0.449259, lr_G: 0.000800
162
+ 2023-07-01 16:16:14,742 - INFO - [Train] step: 7299, loss_D: 0.031448, lr_D: 0.000800
163
+ 2023-07-01 16:16:14,824 - INFO - [Train] step: 7299, loss_G: 0.395463, lr_G: 0.000800
164
+ 2023-07-01 16:17:11,071 - INFO - [Train] step: 7399, loss_D: 0.352383, lr_D: 0.000800
165
+ 2023-07-01 16:17:11,153 - INFO - [Train] step: 7399, loss_G: 0.527658, lr_G: 0.000800
166
+ 2023-07-01 16:18:07,400 - INFO - [Train] step: 7499, loss_D: 0.019191, lr_D: 0.000800
167
+ 2023-07-01 16:18:07,481 - INFO - [Train] step: 7499, loss_G: 0.497877, lr_G: 0.000800
168
+ 2023-07-01 16:19:03,731 - INFO - [Train] step: 7599, loss_D: 0.079302, lr_D: 0.000800
169
+ 2023-07-01 16:19:03,812 - INFO - [Train] step: 7599, loss_G: 0.396555, lr_G: 0.000800
170
+ 2023-07-01 16:20:00,088 - INFO - [Train] step: 7699, loss_D: 0.118297, lr_D: 0.000800
171
+ 2023-07-01 16:20:00,170 - INFO - [Train] step: 7699, loss_G: 0.232735, lr_G: 0.000800
172
+ 2023-07-01 16:20:56,583 - INFO - [Train] step: 7799, loss_D: 0.080086, lr_D: 0.000800
173
+ 2023-07-01 16:20:56,664 - INFO - [Train] step: 7799, loss_G: 0.405607, lr_G: 0.000800
174
+ 2023-07-01 16:21:52,884 - INFO - [Train] step: 7899, loss_D: 0.035206, lr_D: 0.000800
175
+ 2023-07-01 16:21:52,966 - INFO - [Train] step: 7899, loss_G: 0.521183, lr_G: 0.000800
176
+ 2023-07-01 16:22:49,232 - INFO - [Train] step: 7999, loss_D: 0.042690, lr_D: 0.000800
177
+ 2023-07-01 16:22:49,314 - INFO - [Train] step: 7999, loss_G: 0.628162, lr_G: 0.000800
178
+ 2023-07-01 16:23:53,332 - INFO - [Eval] step: 7999, fid: 63.824590
179
+ 2023-07-01 16:24:49,626 - INFO - [Train] step: 8099, loss_D: 0.045911, lr_D: 0.000800
180
+ 2023-07-01 16:24:49,707 - INFO - [Train] step: 8099, loss_G: 0.295043, lr_G: 0.000800
181
+ 2023-07-01 16:25:45,949 - INFO - [Train] step: 8199, loss_D: 0.033123, lr_D: 0.000800
182
+ 2023-07-01 16:25:46,031 - INFO - [Train] step: 8199, loss_G: 0.386580, lr_G: 0.000800
183
+ 2023-07-01 16:26:42,281 - INFO - [Train] step: 8299, loss_D: 0.025258, lr_D: 0.000800
184
+ 2023-07-01 16:26:42,363 - INFO - [Train] step: 8299, loss_G: 0.459850, lr_G: 0.000800
185
+ 2023-07-01 16:27:38,620 - INFO - [Train] step: 8399, loss_D: 0.015947, lr_D: 0.000800
186
+ 2023-07-01 16:27:38,701 - INFO - [Train] step: 8399, loss_G: 0.478456, lr_G: 0.000800
187
+ 2023-07-01 16:28:35,096 - INFO - [Train] step: 8499, loss_D: 0.041746, lr_D: 0.000800
188
+ 2023-07-01 16:28:35,177 - INFO - [Train] step: 8499, loss_G: 0.391286, lr_G: 0.000800
189
+ 2023-07-01 16:29:31,402 - INFO - [Train] step: 8599, loss_D: 0.041732, lr_D: 0.000800
190
+ 2023-07-01 16:29:31,483 - INFO - [Train] step: 8599, loss_G: 0.573958, lr_G: 0.000800
191
+ 2023-07-01 16:30:27,710 - INFO - [Train] step: 8699, loss_D: 0.024475, lr_D: 0.000800
192
+ 2023-07-01 16:30:27,792 - INFO - [Train] step: 8699, loss_G: 0.445166, lr_G: 0.000800
193
+ 2023-07-01 16:31:24,034 - INFO - [Train] step: 8799, loss_D: 0.069513, lr_D: 0.000800
194
+ 2023-07-01 16:31:24,116 - INFO - [Train] step: 8799, loss_G: 0.564701, lr_G: 0.000800
195
+ 2023-07-01 16:32:20,329 - INFO - [Train] step: 8899, loss_D: 0.030355, lr_D: 0.000800
196
+ 2023-07-01 16:32:20,410 - INFO - [Train] step: 8899, loss_G: 0.398739, lr_G: 0.000800
197
+ 2023-07-01 16:33:16,666 - INFO - [Train] step: 8999, loss_D: 0.054898, lr_D: 0.000800
198
+ 2023-07-01 16:33:16,747 - INFO - [Train] step: 8999, loss_G: 0.355975, lr_G: 0.000800
199
+ 2023-07-01 16:34:21,042 - INFO - [Eval] step: 8999, fid: 51.856425
200
+ 2023-07-01 16:35:17,557 - INFO - [Train] step: 9099, loss_D: 0.345732, lr_D: 0.000800
201
+ 2023-07-01 16:35:17,639 - INFO - [Train] step: 9099, loss_G: 1.304249, lr_G: 0.000800
202
+ 2023-07-01 16:36:13,902 - INFO - [Train] step: 9199, loss_D: 0.061331, lr_D: 0.000800
203
+ 2023-07-01 16:36:13,983 - INFO - [Train] step: 9199, loss_G: 0.478601, lr_G: 0.000800
204
+ 2023-07-01 16:37:10,210 - INFO - [Train] step: 9299, loss_D: 0.017195, lr_D: 0.000800
205
+ 2023-07-01 16:37:10,292 - INFO - [Train] step: 9299, loss_G: 0.445164, lr_G: 0.000800
206
+ 2023-07-01 16:38:06,527 - INFO - [Train] step: 9399, loss_D: 0.028449, lr_D: 0.000800
207
+ 2023-07-01 16:38:06,609 - INFO - [Train] step: 9399, loss_G: 0.422484, lr_G: 0.000800
208
+ 2023-07-01 16:39:02,889 - INFO - [Train] step: 9499, loss_D: 0.122286, lr_D: 0.000800
209
+ 2023-07-01 16:39:02,970 - INFO - [Train] step: 9499, loss_G: 0.339955, lr_G: 0.000800
210
+ 2023-07-01 16:39:59,192 - INFO - [Train] step: 9599, loss_D: 0.031580, lr_D: 0.000800
211
+ 2023-07-01 16:39:59,273 - INFO - [Train] step: 9599, loss_G: 0.414887, lr_G: 0.000800
212
+ 2023-07-01 16:40:55,534 - INFO - [Train] step: 9699, loss_D: 0.055920, lr_D: 0.000800
213
+ 2023-07-01 16:40:55,615 - INFO - [Train] step: 9699, loss_G: 0.380711, lr_G: 0.000800
214
+ 2023-07-01 16:41:51,983 - INFO - [Train] step: 9799, loss_D: 0.042303, lr_D: 0.000800
215
+ 2023-07-01 16:41:52,065 - INFO - [Train] step: 9799, loss_G: 0.411310, lr_G: 0.000800
216
+ 2023-07-01 16:42:48,305 - INFO - [Train] step: 9899, loss_D: 0.015790, lr_D: 0.000800
217
+ 2023-07-01 16:42:48,387 - INFO - [Train] step: 9899, loss_G: 0.513790, lr_G: 0.000800
218
+ 2023-07-01 16:43:44,627 - INFO - [Train] step: 9999, loss_D: 0.031156, lr_D: 0.000800
219
+ 2023-07-01 16:43:44,709 - INFO - [Train] step: 9999, loss_G: 0.435210, lr_G: 0.000800
220
+ 2023-07-01 16:44:48,944 - INFO - [Eval] step: 9999, fid: 44.175232
221
+ 2023-07-01 16:45:45,437 - INFO - [Train] step: 10099, loss_D: 0.048197, lr_D: 0.000800
222
+ 2023-07-01 16:45:45,519 - INFO - [Train] step: 10099, loss_G: 0.563195, lr_G: 0.000800
223
+ 2023-07-01 16:46:41,735 - INFO - [Train] step: 10199, loss_D: 0.015835, lr_D: 0.000800
224
+ 2023-07-01 16:46:41,816 - INFO - [Train] step: 10199, loss_G: 0.494992, lr_G: 0.000800
225
+ 2023-07-01 16:47:38,028 - INFO - [Train] step: 10299, loss_D: 0.020491, lr_D: 0.000800
226
+ 2023-07-01 16:47:38,109 - INFO - [Train] step: 10299, loss_G: 0.500630, lr_G: 0.000800
227
+ 2023-07-01 16:48:34,465 - INFO - [Train] step: 10399, loss_D: 0.031950, lr_D: 0.000800
228
+ 2023-07-01 16:48:34,547 - INFO - [Train] step: 10399, loss_G: 0.362065, lr_G: 0.000800
229
+ 2023-07-01 16:49:30,770 - INFO - [Train] step: 10499, loss_D: 0.012799, lr_D: 0.000800
230
+ 2023-07-01 16:49:30,852 - INFO - [Train] step: 10499, loss_G: 0.464469, lr_G: 0.000800
231
+ 2023-07-01 16:50:27,066 - INFO - [Train] step: 10599, loss_D: 0.015617, lr_D: 0.000800
232
+ 2023-07-01 16:50:27,148 - INFO - [Train] step: 10599, loss_G: 0.473956, lr_G: 0.000800
233
+ 2023-07-01 16:51:23,381 - INFO - [Train] step: 10699, loss_D: 0.026475, lr_D: 0.000800
234
+ 2023-07-01 16:51:23,463 - INFO - [Train] step: 10699, loss_G: 0.485552, lr_G: 0.000800
235
+ 2023-07-01 16:52:19,674 - INFO - [Train] step: 10799, loss_D: 0.014157, lr_D: 0.000800
236
+ 2023-07-01 16:52:19,755 - INFO - [Train] step: 10799, loss_G: 0.476442, lr_G: 0.000800
237
+ 2023-07-01 16:53:15,958 - INFO - [Train] step: 10899, loss_D: 0.028922, lr_D: 0.000800
238
+ 2023-07-01 16:53:16,040 - INFO - [Train] step: 10899, loss_G: 0.452689, lr_G: 0.000800
239
+ 2023-07-01 16:54:12,409 - INFO - [Train] step: 10999, loss_D: 0.019099, lr_D: 0.000800
240
+ 2023-07-01 16:54:12,491 - INFO - [Train] step: 10999, loss_G: 0.541219, lr_G: 0.000800
241
+ 2023-07-01 16:55:16,605 - INFO - [Eval] step: 10999, fid: 41.510723
242
+ 2023-07-01 16:56:12,939 - INFO - [Train] step: 11099, loss_D: 0.426106, lr_D: 0.000800
243
+ 2023-07-01 16:56:13,020 - INFO - [Train] step: 11099, loss_G: 0.712436, lr_G: 0.000800
244
+ 2023-07-01 16:57:09,286 - INFO - [Train] step: 11199, loss_D: 0.013547, lr_D: 0.000800
245
+ 2023-07-01 16:57:09,368 - INFO - [Train] step: 11199, loss_G: 0.533168, lr_G: 0.000800
246
+ 2023-07-01 16:58:05,595 - INFO - [Train] step: 11299, loss_D: 0.025288, lr_D: 0.000800
247
+ 2023-07-01 16:58:05,676 - INFO - [Train] step: 11299, loss_G: 0.395842, lr_G: 0.000800
248
+ 2023-07-01 16:59:01,888 - INFO - [Train] step: 11399, loss_D: 0.046030, lr_D: 0.000800
249
+ 2023-07-01 16:59:01,969 - INFO - [Train] step: 11399, loss_G: 0.272756, lr_G: 0.000800
250
+ 2023-07-01 16:59:58,217 - INFO - [Train] step: 11499, loss_D: 0.021802, lr_D: 0.000800
251
+ 2023-07-01 16:59:58,299 - INFO - [Train] step: 11499, loss_G: 0.524531, lr_G: 0.000800
252
+ 2023-07-01 17:00:54,516 - INFO - [Train] step: 11599, loss_D: 0.102124, lr_D: 0.000800
253
+ 2023-07-01 17:00:54,597 - INFO - [Train] step: 11599, loss_G: 0.250981, lr_G: 0.000800
254
+ 2023-07-01 17:01:50,982 - INFO - [Train] step: 11699, loss_D: 0.016419, lr_D: 0.000800
255
+ 2023-07-01 17:01:51,064 - INFO - [Train] step: 11699, loss_G: 0.416956, lr_G: 0.000800
256
+ 2023-07-01 17:02:47,278 - INFO - [Train] step: 11799, loss_D: 0.022939, lr_D: 0.000800
257
+ 2023-07-01 17:02:47,360 - INFO - [Train] step: 11799, loss_G: 0.444405, lr_G: 0.000800
258
+ 2023-07-01 17:03:43,572 - INFO - [Train] step: 11899, loss_D: 0.028677, lr_D: 0.000800
259
+ 2023-07-01 17:03:43,654 - INFO - [Train] step: 11899, loss_G: 0.505162, lr_G: 0.000800
260
+ 2023-07-01 17:04:39,891 - INFO - [Train] step: 11999, loss_D: 0.193232, lr_D: 0.000800
261
+ 2023-07-01 17:04:39,973 - INFO - [Train] step: 11999, loss_G: 0.713976, lr_G: 0.000800
262
+ 2023-07-01 17:05:44,158 - INFO - [Eval] step: 11999, fid: 37.335719
263
+ 2023-07-01 17:06:40,488 - INFO - [Train] step: 12099, loss_D: 0.041622, lr_D: 0.000800
264
+ 2023-07-01 17:06:40,569 - INFO - [Train] step: 12099, loss_G: 0.354398, lr_G: 0.000800
265
+ 2023-07-01 17:07:36,806 - INFO - [Train] step: 12199, loss_D: 0.018744, lr_D: 0.000800
266
+ 2023-07-01 17:07:36,887 - INFO - [Train] step: 12199, loss_G: 0.524607, lr_G: 0.000800
267
+ 2023-07-01 17:08:33,267 - INFO - [Train] step: 12299, loss_D: 0.040381, lr_D: 0.000800
268
+ 2023-07-01 17:08:33,349 - INFO - [Train] step: 12299, loss_G: 0.378312, lr_G: 0.000800
269
+ 2023-07-01 17:09:29,602 - INFO - [Train] step: 12399, loss_D: 0.036392, lr_D: 0.000800
270
+ 2023-07-01 17:09:29,683 - INFO - [Train] step: 12399, loss_G: 0.327528, lr_G: 0.000800
271
+ 2023-07-01 17:10:25,908 - INFO - [Train] step: 12499, loss_D: 0.022661, lr_D: 0.000800
272
+ 2023-07-01 17:10:25,989 - INFO - [Train] step: 12499, loss_G: 0.503605, lr_G: 0.000800
273
+ 2023-07-01 17:11:22,227 - INFO - [Train] step: 12599, loss_D: 0.160112, lr_D: 0.000800
274
+ 2023-07-01 17:11:22,309 - INFO - [Train] step: 12599, loss_G: 0.704796, lr_G: 0.000800
275
+ 2023-07-01 17:12:18,564 - INFO - [Train] step: 12699, loss_D: 0.042283, lr_D: 0.000800
276
+ 2023-07-01 17:12:18,646 - INFO - [Train] step: 12699, loss_G: 0.419944, lr_G: 0.000800
277
+ 2023-07-01 17:13:14,892 - INFO - [Train] step: 12799, loss_D: 0.015350, lr_D: 0.000800
278
+ 2023-07-01 17:13:14,974 - INFO - [Train] step: 12799, loss_G: 0.590415, lr_G: 0.000800
279
+ 2023-07-01 17:14:11,201 - INFO - [Train] step: 12899, loss_D: 0.028026, lr_D: 0.000800
280
+ 2023-07-01 17:14:11,282 - INFO - [Train] step: 12899, loss_G: 0.602759, lr_G: 0.000800
281
+ 2023-07-01 17:15:07,686 - INFO - [Train] step: 12999, loss_D: 0.031418, lr_D: 0.000800
282
+ 2023-07-01 17:15:07,767 - INFO - [Train] step: 12999, loss_G: 0.387043, lr_G: 0.000800
283
+ 2023-07-01 17:16:12,106 - INFO - [Eval] step: 12999, fid: 36.465563
284
+ 2023-07-01 17:17:08,377 - INFO - [Train] step: 13099, loss_D: 0.019731, lr_D: 0.000800
285
+ 2023-07-01 17:17:08,458 - INFO - [Train] step: 13099, loss_G: 0.472084, lr_G: 0.000800
+ 2023-07-01 17:18:04,480 - INFO - [Train] step: 13199, loss_D: 0.012040, lr_D: 0.000800
+ 2023-07-01 17:18:04,561 - INFO - [Train] step: 13199, loss_G: 0.529425, lr_G: 0.000800
+ 2023-07-01 17:19:00,770 - INFO - [Train] step: 13299, loss_D: 0.015204, lr_D: 0.000800
+ 2023-07-01 17:19:00,852 - INFO - [Train] step: 13299, loss_G: 0.592233, lr_G: 0.000800
+ 2023-07-01 17:19:57,077 - INFO - [Train] step: 13399, loss_D: 0.035052, lr_D: 0.000800
+ 2023-07-01 17:19:57,159 - INFO - [Train] step: 13399, loss_G: 0.287030, lr_G: 0.000800
+ 2023-07-01 17:20:53,407 - INFO - [Train] step: 13499, loss_D: 0.031691, lr_D: 0.000800
+ 2023-07-01 17:20:53,488 - INFO - [Train] step: 13499, loss_G: 0.541036, lr_G: 0.000800
+ 2023-07-01 17:21:49,845 - INFO - [Train] step: 13599, loss_D: 0.015474, lr_D: 0.000800
+ 2023-07-01 17:21:49,927 - INFO - [Train] step: 13599, loss_G: 0.515333, lr_G: 0.000800
+ 2023-07-01 17:22:46,157 - INFO - [Train] step: 13699, loss_D: 0.024264, lr_D: 0.000800
+ 2023-07-01 17:22:46,239 - INFO - [Train] step: 13699, loss_G: 0.358656, lr_G: 0.000800
+ 2023-07-01 17:23:42,463 - INFO - [Train] step: 13799, loss_D: 0.018509, lr_D: 0.000800
+ 2023-07-01 17:23:42,544 - INFO - [Train] step: 13799, loss_G: 0.556743, lr_G: 0.000800
+ 2023-07-01 17:24:38,784 - INFO - [Train] step: 13899, loss_D: 0.016108, lr_D: 0.000800
+ 2023-07-01 17:24:38,866 - INFO - [Train] step: 13899, loss_G: 0.540080, lr_G: 0.000800
+ 2023-07-01 17:25:35,095 - INFO - [Train] step: 13999, loss_D: 0.185542, lr_D: 0.000800
+ 2023-07-01 17:25:35,176 - INFO - [Train] step: 13999, loss_G: 0.206313, lr_G: 0.000800
+ 2023-07-01 17:26:39,437 - INFO - [Eval] step: 13999, fid: 34.038329
+ 2023-07-01 17:27:35,694 - INFO - [Train] step: 14099, loss_D: 0.025367, lr_D: 0.000800
+ 2023-07-01 17:27:35,775 - INFO - [Train] step: 14099, loss_G: 0.423853, lr_G: 0.000800
+ 2023-07-01 17:28:31,852 - INFO - [Train] step: 14199, loss_D: 0.042458, lr_D: 0.000800
+ 2023-07-01 17:28:31,934 - INFO - [Train] step: 14199, loss_G: 0.339424, lr_G: 0.000800
+ 2023-07-01 17:29:28,330 - INFO - [Train] step: 14299, loss_D: 0.022700, lr_D: 0.000800
+ 2023-07-01 17:29:28,411 - INFO - [Train] step: 14299, loss_G: 0.386404, lr_G: 0.000800
+ 2023-07-01 17:30:24,670 - INFO - [Train] step: 14399, loss_D: 0.016297, lr_D: 0.000800
+ 2023-07-01 17:30:24,752 - INFO - [Train] step: 14399, loss_G: 0.453159, lr_G: 0.000800
+ 2023-07-01 17:31:20,985 - INFO - [Train] step: 14499, loss_D: 0.030546, lr_D: 0.000800
+ 2023-07-01 17:31:21,066 - INFO - [Train] step: 14499, loss_G: 0.428695, lr_G: 0.000800
+ 2023-07-01 17:32:17,281 - INFO - [Train] step: 14599, loss_D: 0.013512, lr_D: 0.000800
+ 2023-07-01 17:32:17,362 - INFO - [Train] step: 14599, loss_G: 0.463028, lr_G: 0.000800
+ 2023-07-01 17:33:13,608 - INFO - [Train] step: 14699, loss_D: 0.019674, lr_D: 0.000800
+ 2023-07-01 17:33:13,690 - INFO - [Train] step: 14699, loss_G: 0.452179, lr_G: 0.000800
+ 2023-07-01 17:34:09,910 - INFO - [Train] step: 14799, loss_D: 0.025598, lr_D: 0.000800
+ 2023-07-01 17:34:09,992 - INFO - [Train] step: 14799, loss_G: 0.347240, lr_G: 0.000800
+ 2023-07-01 17:35:06,380 - INFO - [Train] step: 14899, loss_D: 0.014152, lr_D: 0.000800
+ 2023-07-01 17:35:06,461 - INFO - [Train] step: 14899, loss_G: 0.567665, lr_G: 0.000800
+ 2023-07-01 17:36:02,698 - INFO - [Train] step: 14999, loss_D: 0.010963, lr_D: 0.000800
+ 2023-07-01 17:36:02,780 - INFO - [Train] step: 14999, loss_G: 0.465192, lr_G: 0.000800
+ 2023-07-01 17:37:06,934 - INFO - [Eval] step: 14999, fid: 32.735350
+ 2023-07-01 17:38:03,239 - INFO - [Train] step: 15099, loss_D: 0.020491, lr_D: 0.000800
+ 2023-07-01 17:38:03,320 - INFO - [Train] step: 15099, loss_G: 0.384963, lr_G: 0.000800
+ 2023-07-01 17:38:59,470 - INFO - [Train] step: 15199, loss_D: 0.018280, lr_D: 0.000800
+ 2023-07-01 17:38:59,552 - INFO - [Train] step: 15199, loss_G: 0.589998, lr_G: 0.000800
+ 2023-07-01 17:39:55,773 - INFO - [Train] step: 15299, loss_D: 0.015946, lr_D: 0.000800
+ 2023-07-01 17:39:55,854 - INFO - [Train] step: 15299, loss_G: 0.492813, lr_G: 0.000800
+ 2023-07-01 17:40:52,077 - INFO - [Train] step: 15399, loss_D: 0.026006, lr_D: 0.000800
+ 2023-07-01 17:40:52,159 - INFO - [Train] step: 15399, loss_G: 0.457885, lr_G: 0.000800
+ 2023-07-01 17:41:48,413 - INFO - [Train] step: 15499, loss_D: 0.047703, lr_D: 0.000800
+ 2023-07-01 17:41:48,494 - INFO - [Train] step: 15499, loss_G: 0.299725, lr_G: 0.000800
+ 2023-07-01 17:42:44,883 - INFO - [Train] step: 15599, loss_D: 0.014168, lr_D: 0.000800
+ 2023-07-01 17:42:44,965 - INFO - [Train] step: 15599, loss_G: 0.514779, lr_G: 0.000800
+ 2023-07-01 17:43:41,181 - INFO - [Train] step: 15699, loss_D: 0.021297, lr_D: 0.000800
+ 2023-07-01 17:43:41,263 - INFO - [Train] step: 15699, loss_G: 0.432311, lr_G: 0.000800
+ 2023-07-01 17:44:37,493 - INFO - [Train] step: 15799, loss_D: 0.029429, lr_D: 0.000800
+ 2023-07-01 17:44:37,575 - INFO - [Train] step: 15799, loss_G: 0.407868, lr_G: 0.000800
+ 2023-07-01 17:45:33,799 - INFO - [Train] step: 15899, loss_D: 0.041180, lr_D: 0.000800
+ 2023-07-01 17:45:33,881 - INFO - [Train] step: 15899, loss_G: 0.669948, lr_G: 0.000800
+ 2023-07-01 17:46:30,117 - INFO - [Train] step: 15999, loss_D: 0.055078, lr_D: 0.000800
+ 2023-07-01 17:46:30,199 - INFO - [Train] step: 15999, loss_G: 0.672064, lr_G: 0.000800
+ 2023-07-01 17:47:34,364 - INFO - [Eval] step: 15999, fid: 32.581077
+ 2023-07-01 17:48:30,631 - INFO - [Train] step: 16099, loss_D: 0.014624, lr_D: 0.000800
+ 2023-07-01 17:48:30,712 - INFO - [Train] step: 16099, loss_G: 0.541963, lr_G: 0.000800
+ 2023-07-01 17:49:26,885 - INFO - [Train] step: 16199, loss_D: 0.015709, lr_D: 0.000800
+ 2023-07-01 17:49:26,966 - INFO - [Train] step: 16199, loss_G: 0.470793, lr_G: 0.000800
+ 2023-07-01 17:50:23,135 - INFO - [Train] step: 16299, loss_D: 0.015616, lr_D: 0.000800
+ 2023-07-01 17:50:23,216 - INFO - [Train] step: 16299, loss_G: 0.435368, lr_G: 0.000800
+ 2023-07-01 17:51:19,433 - INFO - [Train] step: 16399, loss_D: 0.016343, lr_D: 0.000800
+ 2023-07-01 17:51:19,514 - INFO - [Train] step: 16399, loss_G: 0.415708, lr_G: 0.000800
+ 2023-07-01 17:52:15,754 - INFO - [Train] step: 16499, loss_D: 0.011294, lr_D: 0.000800
+ 2023-07-01 17:52:15,836 - INFO - [Train] step: 16499, loss_G: 0.482849, lr_G: 0.000800
+ 2023-07-01 17:53:12,051 - INFO - [Train] step: 16599, loss_D: 0.019565, lr_D: 0.000800
+ 2023-07-01 17:53:12,133 - INFO - [Train] step: 16599, loss_G: 0.615770, lr_G: 0.000800
+ 2023-07-01 17:54:08,413 - INFO - [Train] step: 16699, loss_D: 0.014137, lr_D: 0.000800
+ 2023-07-01 17:54:08,495 - INFO - [Train] step: 16699, loss_G: 0.564877, lr_G: 0.000800
+ 2023-07-01 17:55:04,714 - INFO - [Train] step: 16799, loss_D: 0.057961, lr_D: 0.000800
+ 2023-07-01 17:55:04,795 - INFO - [Train] step: 16799, loss_G: 0.453062, lr_G: 0.000800
+ 2023-07-01 17:56:01,182 - INFO - [Train] step: 16899, loss_D: 0.020187, lr_D: 0.000800
+ 2023-07-01 17:56:01,263 - INFO - [Train] step: 16899, loss_G: 0.353994, lr_G: 0.000800
+ 2023-07-01 17:56:57,505 - INFO - [Train] step: 16999, loss_D: 0.012622, lr_D: 0.000800
+ 2023-07-01 17:56:57,587 - INFO - [Train] step: 16999, loss_G: 0.529359, lr_G: 0.000800
+ 2023-07-01 17:58:01,832 - INFO - [Eval] step: 16999, fid: 31.229778
+ 2023-07-01 17:58:58,123 - INFO - [Train] step: 17099, loss_D: 0.030559, lr_D: 0.000800
+ 2023-07-01 17:58:58,204 - INFO - [Train] step: 17099, loss_G: 0.422545, lr_G: 0.000800
+ 2023-07-01 17:59:54,344 - INFO - [Train] step: 17199, loss_D: 0.043773, lr_D: 0.000800
+ 2023-07-01 17:59:54,425 - INFO - [Train] step: 17199, loss_G: 0.540815, lr_G: 0.000800
+ 2023-07-01 18:00:50,692 - INFO - [Train] step: 17299, loss_D: 0.011587, lr_D: 0.000800
+ 2023-07-01 18:00:50,773 - INFO - [Train] step: 17299, loss_G: 0.519015, lr_G: 0.000800
+ 2023-07-01 18:01:47,007 - INFO - [Train] step: 17399, loss_D: 0.015168, lr_D: 0.000800
+ 2023-07-01 18:01:47,089 - INFO - [Train] step: 17399, loss_G: 0.517359, lr_G: 0.000800
+ 2023-07-01 18:02:43,479 - INFO - [Train] step: 17499, loss_D: 0.009712, lr_D: 0.000800
+ 2023-07-01 18:02:43,561 - INFO - [Train] step: 17499, loss_G: 0.571477, lr_G: 0.000800
+ 2023-07-01 18:03:39,809 - INFO - [Train] step: 17599, loss_D: 0.029833, lr_D: 0.000800
+ 2023-07-01 18:03:39,891 - INFO - [Train] step: 17599, loss_G: 0.441681, lr_G: 0.000800
+ 2023-07-01 18:04:36,148 - INFO - [Train] step: 17699, loss_D: 0.023777, lr_D: 0.000800
+ 2023-07-01 18:04:36,229 - INFO - [Train] step: 17699, loss_G: 0.495252, lr_G: 0.000800
+ 2023-07-01 18:05:32,458 - INFO - [Train] step: 17799, loss_D: 0.011950, lr_D: 0.000800
+ 2023-07-01 18:05:32,539 - INFO - [Train] step: 17799, loss_G: 0.433190, lr_G: 0.000800
+ 2023-07-01 18:06:28,769 - INFO - [Train] step: 17899, loss_D: 0.011834, lr_D: 0.000800
+ 2023-07-01 18:06:28,850 - INFO - [Train] step: 17899, loss_G: 0.614211, lr_G: 0.000800
+ 2023-07-01 18:07:25,079 - INFO - [Train] step: 17999, loss_D: 0.011349, lr_D: 0.000800
+ 2023-07-01 18:07:25,161 - INFO - [Train] step: 17999, loss_G: 0.452987, lr_G: 0.000800
+ 2023-07-01 18:08:29,368 - INFO - [Eval] step: 17999, fid: 30.362633
+ 2023-07-01 18:09:25,638 - INFO - [Train] step: 18099, loss_D: 0.009980, lr_D: 0.000800
+ 2023-07-01 18:09:25,719 - INFO - [Train] step: 18099, loss_G: 0.427755, lr_G: 0.000800
+ 2023-07-01 18:10:22,059 - INFO - [Train] step: 18199, loss_D: 0.024319, lr_D: 0.000800
+ 2023-07-01 18:10:22,141 - INFO - [Train] step: 18199, loss_G: 0.401313, lr_G: 0.000800
+ 2023-07-01 18:11:18,394 - INFO - [Train] step: 18299, loss_D: 0.015745, lr_D: 0.000800
+ 2023-07-01 18:11:18,476 - INFO - [Train] step: 18299, loss_G: 0.375731, lr_G: 0.000800
+ 2023-07-01 18:12:14,735 - INFO - [Train] step: 18399, loss_D: 0.011599, lr_D: 0.000800
+ 2023-07-01 18:12:14,816 - INFO - [Train] step: 18399, loss_G: 0.519474, lr_G: 0.000800
+ 2023-07-01 18:13:11,096 - INFO - [Train] step: 18499, loss_D: 0.021131, lr_D: 0.000800
+ 2023-07-01 18:13:11,177 - INFO - [Train] step: 18499, loss_G: 0.422406, lr_G: 0.000800
+ 2023-07-01 18:14:07,434 - INFO - [Train] step: 18599, loss_D: 0.034770, lr_D: 0.000800
+ 2023-07-01 18:14:07,516 - INFO - [Train] step: 18599, loss_G: 0.503398, lr_G: 0.000800
+ 2023-07-01 18:15:03,752 - INFO - [Train] step: 18699, loss_D: 0.010583, lr_D: 0.000800
+ 2023-07-01 18:15:03,833 - INFO - [Train] step: 18699, loss_G: 0.469879, lr_G: 0.000800
+ 2023-07-01 18:16:00,236 - INFO - [Train] step: 18799, loss_D: 0.011668, lr_D: 0.000800
+ 2023-07-01 18:16:00,318 - INFO - [Train] step: 18799, loss_G: 0.603311, lr_G: 0.000800
+ 2023-07-01 18:16:56,560 - INFO - [Train] step: 18899, loss_D: 0.030451, lr_D: 0.000800
+ 2023-07-01 18:16:56,641 - INFO - [Train] step: 18899, loss_G: 0.477881, lr_G: 0.000800
+ 2023-07-01 18:17:52,894 - INFO - [Train] step: 18999, loss_D: 0.037809, lr_D: 0.000800
+ 2023-07-01 18:17:52,975 - INFO - [Train] step: 18999, loss_G: 0.316230, lr_G: 0.000800
+ 2023-07-01 18:18:57,240 - INFO - [Eval] step: 18999, fid: 29.796473
+ 2023-07-01 18:19:53,528 - INFO - [Train] step: 19099, loss_D: 0.007894, lr_D: 0.000800
+ 2023-07-01 18:19:53,609 - INFO - [Train] step: 19099, loss_G: 0.468824, lr_G: 0.000800
+ 2023-07-01 18:20:49,872 - INFO - [Train] step: 19199, loss_D: 0.046049, lr_D: 0.000800
+ 2023-07-01 18:20:49,953 - INFO - [Train] step: 19199, loss_G: 0.385633, lr_G: 0.000800
+ 2023-07-01 18:21:46,205 - INFO - [Train] step: 19299, loss_D: 0.014116, lr_D: 0.000800
+ 2023-07-01 18:21:46,286 - INFO - [Train] step: 19299, loss_G: 0.507936, lr_G: 0.000800
+ 2023-07-01 18:22:42,578 - INFO - [Train] step: 19399, loss_D: 0.014901, lr_D: 0.000800
+ 2023-07-01 18:22:42,660 - INFO - [Train] step: 19399, loss_G: 0.515274, lr_G: 0.000800
+ 2023-07-01 18:23:39,066 - INFO - [Train] step: 19499, loss_D: 0.021365, lr_D: 0.000800
+ 2023-07-01 18:23:39,147 - INFO - [Train] step: 19499, loss_G: 0.399002, lr_G: 0.000800
+ 2023-07-01 18:24:35,373 - INFO - [Train] step: 19599, loss_D: 0.011417, lr_D: 0.000800
+ 2023-07-01 18:24:35,455 - INFO - [Train] step: 19599, loss_G: 0.471099, lr_G: 0.000800
+ 2023-07-01 18:25:31,709 - INFO - [Train] step: 19699, loss_D: 0.007876, lr_D: 0.000800
+ 2023-07-01 18:25:31,791 - INFO - [Train] step: 19699, loss_G: 0.545653, lr_G: 0.000800
+ 2023-07-01 18:26:28,052 - INFO - [Train] step: 19799, loss_D: 0.009936, lr_D: 0.000800
+ 2023-07-01 18:26:28,133 - INFO - [Train] step: 19799, loss_G: 0.574504, lr_G: 0.000800
+ 2023-07-01 18:27:24,355 - INFO - [Train] step: 19899, loss_D: 0.013347, lr_D: 0.000800
+ 2023-07-01 18:27:24,436 - INFO - [Train] step: 19899, loss_G: 0.600867, lr_G: 0.000800
+ 2023-07-01 18:28:20,702 - INFO - [Train] step: 19999, loss_D: 0.025694, lr_D: 0.000800
+ 2023-07-01 18:28:20,783 - INFO - [Train] step: 19999, loss_G: 0.346467, lr_G: 0.000800
+ 2023-07-01 18:29:24,970 - INFO - [Eval] step: 19999, fid: 30.423091
+ 2023-07-01 18:30:21,286 - INFO - [Train] step: 20099, loss_D: 0.022139, lr_D: 0.000800
+ 2023-07-01 18:30:21,367 - INFO - [Train] step: 20099, loss_G: 0.393186, lr_G: 0.000800
+ 2023-07-01 18:31:17,554 - INFO - [Train] step: 20199, loss_D: 0.012953, lr_D: 0.000800
+ 2023-07-01 18:31:17,636 - INFO - [Train] step: 20199, loss_G: 0.466040, lr_G: 0.000800
+ 2023-07-01 18:32:13,856 - INFO - [Train] step: 20299, loss_D: 0.006963, lr_D: 0.000800
+ 2023-07-01 18:32:13,938 - INFO - [Train] step: 20299, loss_G: 0.542140, lr_G: 0.000800
+ 2023-07-01 18:33:10,183 - INFO - [Train] step: 20399, loss_D: 0.056166, lr_D: 0.000800
+ 2023-07-01 18:33:10,264 - INFO - [Train] step: 20399, loss_G: 0.334981, lr_G: 0.000800
+ 2023-07-01 18:34:06,496 - INFO - [Train] step: 20499, loss_D: 0.009354, lr_D: 0.000800
+ 2023-07-01 18:34:06,578 - INFO - [Train] step: 20499, loss_G: 0.494651, lr_G: 0.000800
+ 2023-07-01 18:35:02,841 - INFO - [Train] step: 20599, loss_D: 0.024258, lr_D: 0.000800
+ 2023-07-01 18:35:02,922 - INFO - [Train] step: 20599, loss_G: 0.423513, lr_G: 0.000800
+ 2023-07-01 18:35:59,343 - INFO - [Train] step: 20699, loss_D: 0.009255, lr_D: 0.000800
+ 2023-07-01 18:35:59,425 - INFO - [Train] step: 20699, loss_G: 0.485088, lr_G: 0.000800
+ 2023-07-01 18:36:55,665 - INFO - [Train] step: 20799, loss_D: 0.025261, lr_D: 0.000800
+ 2023-07-01 18:36:55,746 - INFO - [Train] step: 20799, loss_G: 0.543806, lr_G: 0.000800
+ 2023-07-01 18:37:51,995 - INFO - [Train] step: 20899, loss_D: 0.007658, lr_D: 0.000800
+ 2023-07-01 18:37:52,076 - INFO - [Train] step: 20899, loss_G: 0.520707, lr_G: 0.000800
+ 2023-07-01 18:38:48,335 - INFO - [Train] step: 20999, loss_D: 0.153177, lr_D: 0.000800
+ 2023-07-01 18:38:48,417 - INFO - [Train] step: 20999, loss_G: 0.215366, lr_G: 0.000800
+ 2023-07-01 18:39:52,815 - INFO - [Eval] step: 20999, fid: 29.908982
+ 2023-07-01 18:40:48,807 - INFO - [Train] step: 21099, loss_D: 0.007064, lr_D: 0.000800
+ 2023-07-01 18:40:48,889 - INFO - [Train] step: 21099, loss_G: 0.555538, lr_G: 0.000800
+ 2023-07-01 18:41:45,083 - INFO - [Train] step: 21199, loss_D: 0.011253, lr_D: 0.000800
+ 2023-07-01 18:41:45,164 - INFO - [Train] step: 21199, loss_G: 0.575167, lr_G: 0.000800
+ 2023-07-01 18:42:41,395 - INFO - [Train] step: 21299, loss_D: 0.008744, lr_D: 0.000800
+ 2023-07-01 18:42:41,476 - INFO - [Train] step: 21299, loss_G: 0.515165, lr_G: 0.000800
+ 2023-07-01 18:43:37,874 - INFO - [Train] step: 21399, loss_D: 0.008760, lr_D: 0.000800
+ 2023-07-01 18:43:37,955 - INFO - [Train] step: 21399, loss_G: 0.422557, lr_G: 0.000800
+ 2023-07-01 18:44:34,223 - INFO - [Train] step: 21499, loss_D: 0.007670, lr_D: 0.000800
+ 2023-07-01 18:44:34,305 - INFO - [Train] step: 21499, loss_G: 0.504641, lr_G: 0.000800
+ 2023-07-01 18:45:30,533 - INFO - [Train] step: 21599, loss_D: 0.009109, lr_D: 0.000800
+ 2023-07-01 18:45:30,615 - INFO - [Train] step: 21599, loss_G: 0.550982, lr_G: 0.000800
+ 2023-07-01 18:46:26,841 - INFO - [Train] step: 21699, loss_D: 0.014999, lr_D: 0.000800
+ 2023-07-01 18:46:26,923 - INFO - [Train] step: 21699, loss_G: 0.476287, lr_G: 0.000800
+ 2023-07-01 18:47:23,172 - INFO - [Train] step: 21799, loss_D: 0.011643, lr_D: 0.000800
+ 2023-07-01 18:47:23,254 - INFO - [Train] step: 21799, loss_G: 0.562363, lr_G: 0.000800
+ 2023-07-01 18:48:19,471 - INFO - [Train] step: 21899, loss_D: 0.007825, lr_D: 0.000800
+ 2023-07-01 18:48:19,553 - INFO - [Train] step: 21899, loss_G: 0.461671, lr_G: 0.000800
+ 2023-07-01 18:49:15,956 - INFO - [Train] step: 21999, loss_D: 0.008241, lr_D: 0.000800
+ 2023-07-01 18:49:16,038 - INFO - [Train] step: 21999, loss_G: 0.489334, lr_G: 0.000800
+ 2023-07-01 18:50:20,209 - INFO - [Eval] step: 21999, fid: 29.695085
+ 2023-07-01 18:51:16,513 - INFO - [Train] step: 22099, loss_D: 0.027289, lr_D: 0.000800
+ 2023-07-01 18:51:16,595 - INFO - [Train] step: 22099, loss_G: 0.421344, lr_G: 0.000800
+ 2023-07-01 18:52:12,821 - INFO - [Train] step: 22199, loss_D: 0.011550, lr_D: 0.000800
+ 2023-07-01 18:52:12,903 - INFO - [Train] step: 22199, loss_G: 0.429288, lr_G: 0.000800
+ 2023-07-01 18:53:09,153 - INFO - [Train] step: 22299, loss_D: 0.024233, lr_D: 0.000800
+ 2023-07-01 18:53:09,235 - INFO - [Train] step: 22299, loss_G: 0.286523, lr_G: 0.000800
+ 2023-07-01 18:54:05,488 - INFO - [Train] step: 22399, loss_D: 0.034346, lr_D: 0.000800
+ 2023-07-01 18:54:05,569 - INFO - [Train] step: 22399, loss_G: 0.445654, lr_G: 0.000800
+ 2023-07-01 18:55:01,793 - INFO - [Train] step: 22499, loss_D: 0.012056, lr_D: 0.000800
+ 2023-07-01 18:55:01,875 - INFO - [Train] step: 22499, loss_G: 0.430716, lr_G: 0.000800
+ 2023-07-01 18:55:58,117 - INFO - [Train] step: 22599, loss_D: 0.017314, lr_D: 0.000800
+ 2023-07-01 18:55:58,197 - INFO - [Train] step: 22599, loss_G: 0.401118, lr_G: 0.000800
+ 2023-07-01 18:56:54,573 - INFO - [Train] step: 22699, loss_D: 0.006963, lr_D: 0.000800
+ 2023-07-01 18:56:54,655 - INFO - [Train] step: 22699, loss_G: 0.511922, lr_G: 0.000800
+ 2023-07-01 18:57:50,896 - INFO - [Train] step: 22799, loss_D: 0.007882, lr_D: 0.000800
+ 2023-07-01 18:57:50,978 - INFO - [Train] step: 22799, loss_G: 0.539776, lr_G: 0.000800
+ 2023-07-01 18:58:47,253 - INFO - [Train] step: 22899, loss_D: 0.007652, lr_D: 0.000800
+ 2023-07-01 18:58:47,334 - INFO - [Train] step: 22899, loss_G: 0.553684, lr_G: 0.000800
+ 2023-07-01 18:59:43,570 - INFO - [Train] step: 22999, loss_D: 0.011942, lr_D: 0.000800
+ 2023-07-01 18:59:43,651 - INFO - [Train] step: 22999, loss_G: 0.381835, lr_G: 0.000800
+ 2023-07-01 19:00:47,913 - INFO - [Eval] step: 22999, fid: 29.447679
+ 2023-07-01 19:01:44,203 - INFO - [Train] step: 23099, loss_D: 0.008218, lr_D: 0.000800
+ 2023-07-01 19:01:44,285 - INFO - [Train] step: 23099, loss_G: 0.537277, lr_G: 0.000800
+ 2023-07-01 19:02:40,471 - INFO - [Train] step: 23199, loss_D: 0.012256, lr_D: 0.000800
+ 2023-07-01 19:02:40,552 - INFO - [Train] step: 23199, loss_G: 0.600389, lr_G: 0.000800
+ 2023-07-01 19:03:36,958 - INFO - [Train] step: 23299, loss_D: 0.027411, lr_D: 0.000800
+ 2023-07-01 19:03:37,040 - INFO - [Train] step: 23299, loss_G: 0.384273, lr_G: 0.000800
+ 2023-07-01 19:04:33,265 - INFO - [Train] step: 23399, loss_D: 0.008528, lr_D: 0.000800
+ 2023-07-01 19:04:33,347 - INFO - [Train] step: 23399, loss_G: 0.534337, lr_G: 0.000800
+ 2023-07-01 19:05:29,582 - INFO - [Train] step: 23499, loss_D: 0.014829, lr_D: 0.000800
+ 2023-07-01 19:05:29,664 - INFO - [Train] step: 23499, loss_G: 0.630182, lr_G: 0.000800
+ 2023-07-01 19:06:25,897 - INFO - [Train] step: 23599, loss_D: 0.008179, lr_D: 0.000800
+ 2023-07-01 19:06:25,978 - INFO - [Train] step: 23599, loss_G: 0.508672, lr_G: 0.000800
+ 2023-07-01 19:07:22,206 - INFO - [Train] step: 23699, loss_D: 0.006598, lr_D: 0.000800
+ 2023-07-01 19:07:22,288 - INFO - [Train] step: 23699, loss_G: 0.390657, lr_G: 0.000800
+ 2023-07-01 19:08:18,522 - INFO - [Train] step: 23799, loss_D: 0.006603, lr_D: 0.000800
+ 2023-07-01 19:08:18,604 - INFO - [Train] step: 23799, loss_G: 0.485955, lr_G: 0.000800
+ 2023-07-01 19:09:14,834 - INFO - [Train] step: 23899, loss_D: 0.021518, lr_D: 0.000800
+ 2023-07-01 19:09:14,915 - INFO - [Train] step: 23899, loss_G: 0.436500, lr_G: 0.000800
+ 2023-07-01 19:10:11,323 - INFO - [Train] step: 23999, loss_D: 0.021096, lr_D: 0.000800
+ 2023-07-01 19:10:11,405 - INFO - [Train] step: 23999, loss_G: 0.515012, lr_G: 0.000800
+ 2023-07-01 19:11:15,668 - INFO - [Eval] step: 23999, fid: 29.460222
+ 2023-07-01 19:12:11,728 - INFO - [Train] step: 24099, loss_D: 0.011518, lr_D: 0.000800
+ 2023-07-01 19:12:11,810 - INFO - [Train] step: 24099, loss_G: 0.429125, lr_G: 0.000800
+ 2023-07-01 19:13:08,011 - INFO - [Train] step: 24199, loss_D: 0.005955, lr_D: 0.000800
+ 2023-07-01 19:13:08,093 - INFO - [Train] step: 24199, loss_G: 0.494475, lr_G: 0.000800
+ 2023-07-01 19:14:04,308 - INFO - [Train] step: 24299, loss_D: 0.019255, lr_D: 0.000800
+ 2023-07-01 19:14:04,390 - INFO - [Train] step: 24299, loss_G: 0.434792, lr_G: 0.000800
+ 2023-07-01 19:15:00,634 - INFO - [Train] step: 24399, loss_D: 0.017019, lr_D: 0.000800
+ 2023-07-01 19:15:00,715 - INFO - [Train] step: 24399, loss_G: 0.598647, lr_G: 0.000800
+ 2023-07-01 19:15:56,962 - INFO - [Train] step: 24499, loss_D: 0.011056, lr_D: 0.000800
+ 2023-07-01 19:15:57,044 - INFO - [Train] step: 24499, loss_G: 0.476328, lr_G: 0.000800
+ 2023-07-01 19:16:53,412 - INFO - [Train] step: 24599, loss_D: 0.007062, lr_D: 0.000800
+ 2023-07-01 19:16:53,493 - INFO - [Train] step: 24599, loss_G: 0.562907, lr_G: 0.000800
+ 2023-07-01 19:17:49,703 - INFO - [Train] step: 24699, loss_D: 0.007729, lr_D: 0.000800
+ 2023-07-01 19:17:49,785 - INFO - [Train] step: 24699, loss_G: 0.429338, lr_G: 0.000800
+ 2023-07-01 19:18:45,995 - INFO - [Train] step: 24799, loss_D: 0.006395, lr_D: 0.000800
+ 2023-07-01 19:18:46,076 - INFO - [Train] step: 24799, loss_G: 0.507643, lr_G: 0.000800
+ 2023-07-01 19:19:42,271 - INFO - [Train] step: 24899, loss_D: 0.007473, lr_D: 0.000800
+ 2023-07-01 19:19:42,352 - INFO - [Train] step: 24899, loss_G: 0.470858, lr_G: 0.000800
+ 2023-07-01 19:20:38,582 - INFO - [Train] step: 24999, loss_D: 0.026236, lr_D: 0.000800
+ 2023-07-01 19:20:38,664 - INFO - [Train] step: 24999, loss_G: 0.478788, lr_G: 0.000800
+ 2023-07-01 19:21:42,893 - INFO - [Eval] step: 24999, fid: 29.629865
+ 2023-07-01 19:22:38,940 - INFO - [Train] step: 25099, loss_D: 0.013720, lr_D: 0.000800
+ 2023-07-01 19:22:39,022 - INFO - [Train] step: 25099, loss_G: 0.491596, lr_G: 0.000800
+ 2023-07-01 19:23:35,210 - INFO - [Train] step: 25199, loss_D: 0.007071, lr_D: 0.000800
+ 2023-07-01 19:23:35,291 - INFO - [Train] step: 25199, loss_G: 0.494100, lr_G: 0.000800
+ 2023-07-01 19:24:31,672 - INFO - [Train] step: 25299, loss_D: 0.015746, lr_D: 0.000800
+ 2023-07-01 19:24:31,753 - INFO - [Train] step: 25299, loss_G: 0.441014, lr_G: 0.000800
+ 2023-07-01 19:25:27,969 - INFO - [Train] step: 25399, loss_D: 0.007127, lr_D: 0.000800
+ 2023-07-01 19:25:28,050 - INFO - [Train] step: 25399, loss_G: 0.528152, lr_G: 0.000800
+ 2023-07-01 19:26:24,257 - INFO - [Train] step: 25499, loss_D: 0.010578, lr_D: 0.000800
+ 2023-07-01 19:26:24,339 - INFO - [Train] step: 25499, loss_G: 0.574309, lr_G: 0.000800
+ 2023-07-01 19:27:20,552 - INFO - [Train] step: 25599, loss_D: 0.009386, lr_D: 0.000800
+ 2023-07-01 19:27:20,633 - INFO - [Train] step: 25599, loss_G: 0.428291, lr_G: 0.000800
+ 2023-07-01 19:28:16,858 - INFO - [Train] step: 25699, loss_D: 0.013623, lr_D: 0.000800
+ 2023-07-01 19:28:16,939 - INFO - [Train] step: 25699, loss_G: 0.512218, lr_G: 0.000800
+ 2023-07-01 19:29:13,161 - INFO - [Train] step: 25799, loss_D: 0.009861, lr_D: 0.000800
+ 2023-07-01 19:29:13,242 - INFO - [Train] step: 25799, loss_G: 0.500618, lr_G: 0.000800
+ 2023-07-01 19:30:09,619 - INFO - [Train] step: 25899, loss_D: 0.019092, lr_D: 0.000800
+ 2023-07-01 19:30:09,701 - INFO - [Train] step: 25899, loss_G: 0.402772, lr_G: 0.000800
+ 2023-07-01 19:31:05,920 - INFO - [Train] step: 25999, loss_D: 0.005325, lr_D: 0.000800
+ 2023-07-01 19:31:06,002 - INFO - [Train] step: 25999, loss_G: 0.579325, lr_G: 0.000800
+ 2023-07-01 19:32:10,282 - INFO - [Eval] step: 25999, fid: 29.136528
+ 2023-07-01 19:33:06,600 - INFO - [Train] step: 26099, loss_D: 0.011602, lr_D: 0.000800
+ 2023-07-01 19:33:06,681 - INFO - [Train] step: 26099, loss_G: 0.547868, lr_G: 0.000800
+ 2023-07-01 19:34:02,897 - INFO - [Train] step: 26199, loss_D: 0.029338, lr_D: 0.000800
+ 2023-07-01 19:34:02,979 - INFO - [Train] step: 26199, loss_G: 0.247423, lr_G: 0.000800
+ 2023-07-01 19:34:59,195 - INFO - [Train] step: 26299, loss_D: 0.012751, lr_D: 0.000800
+ 2023-07-01 19:34:59,276 - INFO - [Train] step: 26299, loss_G: 0.473507, lr_G: 0.000800
+ 2023-07-01 19:35:55,494 - INFO - [Train] step: 26399, loss_D: 0.007472, lr_D: 0.000800
+ 2023-07-01 19:35:55,576 - INFO - [Train] step: 26399, loss_G: 0.391524, lr_G: 0.000800
+ 2023-07-01 19:36:51,770 - INFO - [Train] step: 26499, loss_D: 0.007753, lr_D: 0.000800
+ 2023-07-01 19:36:51,851 - INFO - [Train] step: 26499, loss_G: 0.507710, lr_G: 0.000800
+ 2023-07-01 19:37:48,217 - INFO - [Train] step: 26599, loss_D: 0.008571, lr_D: 0.000800
+ 2023-07-01 19:37:48,299 - INFO - [Train] step: 26599, loss_G: 0.578436, lr_G: 0.000800
+ 2023-07-01 19:38:44,497 - INFO - [Train] step: 26699, loss_D: 0.011326, lr_D: 0.000800
+ 2023-07-01 19:38:44,579 - INFO - [Train] step: 26699, loss_G: 0.419997, lr_G: 0.000800
+ 2023-07-01 19:39:40,778 - INFO - [Train] step: 26799, loss_D: 0.017305, lr_D: 0.000800
+ 2023-07-01 19:39:40,860 - INFO - [Train] step: 26799, loss_G: 0.346003, lr_G: 0.000800
+ 2023-07-01 19:40:37,063 - INFO - [Train] step: 26899, loss_D: 0.008035, lr_D: 0.000800
+ 2023-07-01 19:40:37,144 - INFO - [Train] step: 26899, loss_G: 0.499777, lr_G: 0.000800
+ 2023-07-01 19:41:33,336 - INFO - [Train] step: 26999, loss_D: 0.011537, lr_D: 0.000800
+ 2023-07-01 19:41:33,417 - INFO - [Train] step: 26999, loss_G: 0.458730, lr_G: 0.000800
+ 2023-07-01 19:42:37,585 - INFO - [Eval] step: 26999, fid: 28.942655
+ 2023-07-01 19:43:33,919 - INFO - [Train] step: 27099, loss_D: 0.006187, lr_D: 0.000800
+ 2023-07-01 19:43:34,000 - INFO - [Train] step: 27099, loss_G: 0.490163, lr_G: 0.000800
+ 2023-07-01 19:44:30,366 - INFO - [Train] step: 27199, loss_D: 0.026210, lr_D: 0.000800
+ 2023-07-01 19:44:30,447 - INFO - [Train] step: 27199, loss_G: 0.390455, lr_G: 0.000800
+ 2023-07-01 19:45:26,665 - INFO - [Train] step: 27299, loss_D: 0.011224, lr_D: 0.000800
+ 2023-07-01 19:45:26,746 - INFO - [Train] step: 27299, loss_G: 0.457716, lr_G: 0.000800
+ 2023-07-01 19:46:22,972 - INFO - [Train] step: 27399, loss_D: 0.006952, lr_D: 0.000800
+ 2023-07-01 19:46:23,053 - INFO - [Train] step: 27399, loss_G: 0.552276, lr_G: 0.000800
+ 2023-07-01 19:47:19,258 - INFO - [Train] step: 27499, loss_D: 0.014164, lr_D: 0.000800
+ 2023-07-01 19:47:19,339 - INFO - [Train] step: 27499, loss_G: 0.496477, lr_G: 0.000800
+ 2023-07-01 19:48:15,563 - INFO - [Train] step: 27599, loss_D: 0.015689, lr_D: 0.000800
+ 2023-07-01 19:48:15,644 - INFO - [Train] step: 27599, loss_G: 0.387336, lr_G: 0.000800
+ 2023-07-01 19:49:11,838 - INFO - [Train] step: 27699, loss_D: 0.011430, lr_D: 0.000800
+ 2023-07-01 19:49:11,919 - INFO - [Train] step: 27699, loss_G: 0.475477, lr_G: 0.000800
+ 2023-07-01 19:50:08,134 - INFO - [Train] step: 27799, loss_D: 0.008779, lr_D: 0.000800
+ 2023-07-01 19:50:08,216 - INFO - [Train] step: 27799, loss_G: 0.550584, lr_G: 0.000800
+ 2023-07-01 19:51:04,583 - INFO - [Train] step: 27899, loss_D: 0.007873, lr_D: 0.000800
+ 2023-07-01 19:51:04,664 - INFO - [Train] step: 27899, loss_G: 0.609806, lr_G: 0.000800
+ 2023-07-01 19:52:00,886 - INFO - [Train] step: 27999, loss_D: 0.006971, lr_D: 0.000800
+ 2023-07-01 19:52:00,967 - INFO - [Train] step: 27999, loss_G: 0.538687, lr_G: 0.000800
+ 2023-07-01 19:53:05,171 - INFO - [Eval] step: 27999, fid: 29.974261
+ 2023-07-01 19:54:01,186 - INFO - [Train] step: 28099, loss_D: 0.008715, lr_D: 0.000800
+ 2023-07-01 19:54:01,267 - INFO - [Train] step: 28099, loss_G: 0.536489, lr_G: 0.000800
+ 2023-07-01 19:54:57,458 - INFO - [Train] step: 28199, loss_D: 0.008299, lr_D: 0.000800
+ 2023-07-01 19:54:57,540 - INFO - [Train] step: 28199, loss_G: 0.577131, lr_G: 0.000800
+ 2023-07-01 19:55:53,751 - INFO - [Train] step: 28299, loss_D: 0.017663, lr_D: 0.000800
+ 2023-07-01 19:55:53,833 - INFO - [Train] step: 28299, loss_G: 0.365824, lr_G: 0.000800
+ 2023-07-01 19:56:50,058 - INFO - [Train] step: 28399, loss_D: 0.006118, lr_D: 0.000800
+ 2023-07-01 19:56:50,138 - INFO - [Train] step: 28399, loss_G: 0.454250, lr_G: 0.000800
+ 2023-07-01 19:57:46,505 - INFO - [Train] step: 28499, loss_D: 0.094912, lr_D: 0.000800
+ 2023-07-01 19:57:46,586 - INFO - [Train] step: 28499, loss_G: 0.314954, lr_G: 0.000800
+ 2023-07-01 19:58:42,788 - INFO - [Train] step: 28599, loss_D: 0.007507, lr_D: 0.000800
+ 2023-07-01 19:58:42,870 - INFO - [Train] step: 28599, loss_G: 0.507665, lr_G: 0.000800
+ 2023-07-01 19:59:39,065 - INFO - [Train] step: 28699, loss_D: 0.009753, lr_D: 0.000800
+ 2023-07-01 19:59:39,146 - INFO - [Train] step: 28699, loss_G: 0.405413, lr_G: 0.000800
+ 2023-07-01 20:00:35,349 - INFO - [Train] step: 28799, loss_D: 0.013898, lr_D: 0.000800
+ 2023-07-01 20:00:35,430 - INFO - [Train] step: 28799, loss_G: 0.588386, lr_G: 0.000800
+ 2023-07-01 20:01:31,640 - INFO - [Train] step: 28899, loss_D: 0.004988, lr_D: 0.000800
+ 2023-07-01 20:01:31,721 - INFO - [Train] step: 28899, loss_G: 0.530986, lr_G: 0.000800
+ 2023-07-01 20:02:27,912 - INFO - [Train] step: 28999, loss_D: 0.005864, lr_D: 0.000800
+ 2023-07-01 20:02:27,993 - INFO - [Train] step: 28999, loss_G: 0.531886, lr_G: 0.000800
+ 2023-07-01 20:03:32,247 - INFO - [Eval] step: 28999, fid: 29.637611
620
+ 2023-07-01 20:04:28,335 - INFO - [Train] step: 29099, loss_D: 0.006993, lr_D: 0.000800
621
+ 2023-07-01 20:04:28,416 - INFO - [Train] step: 29099, loss_G: 0.542859, lr_G: 0.000800
622
+ 2023-07-01 20:05:24,753 - INFO - [Train] step: 29199, loss_D: 0.008590, lr_D: 0.000800
623
+ 2023-07-01 20:05:24,835 - INFO - [Train] step: 29199, loss_G: 0.436323, lr_G: 0.000800
624
+ 2023-07-01 20:06:21,039 - INFO - [Train] step: 29299, loss_D: 0.012584, lr_D: 0.000800
625
+ 2023-07-01 20:06:21,121 - INFO - [Train] step: 29299, loss_G: 0.512135, lr_G: 0.000800
626
+ 2023-07-01 20:07:17,337 - INFO - [Train] step: 29399, loss_D: 0.015635, lr_D: 0.000800
627
+ 2023-07-01 20:07:17,419 - INFO - [Train] step: 29399, loss_G: 0.409561, lr_G: 0.000800
628
+ 2023-07-01 20:08:13,631 - INFO - [Train] step: 29499, loss_D: 0.007856, lr_D: 0.000800
629
+ 2023-07-01 20:08:13,713 - INFO - [Train] step: 29499, loss_G: 0.509219, lr_G: 0.000800
630
+ 2023-07-01 20:09:09,941 - INFO - [Train] step: 29599, loss_D: 0.006401, lr_D: 0.000800
631
+ 2023-07-01 20:09:10,022 - INFO - [Train] step: 29599, loss_G: 0.540470, lr_G: 0.000800
632
+ 2023-07-01 20:10:06,214 - INFO - [Train] step: 29699, loss_D: 0.006845, lr_D: 0.000800
633
+ 2023-07-01 20:10:06,295 - INFO - [Train] step: 29699, loss_G: 0.538688, lr_G: 0.000800
634
+ 2023-07-01 20:11:02,640 - INFO - [Train] step: 29799, loss_D: 0.008562, lr_D: 0.000800
635
+ 2023-07-01 20:11:02,722 - INFO - [Train] step: 29799, loss_G: 0.471855, lr_G: 0.000800
636
+ 2023-07-01 20:11:58,939 - INFO - [Train] step: 29899, loss_D: 0.010060, lr_D: 0.000800
637
+ 2023-07-01 20:11:59,020 - INFO - [Train] step: 29899, loss_G: 0.419619, lr_G: 0.000800
638
+ 2023-07-01 20:12:55,241 - INFO - [Train] step: 29999, loss_D: 0.009867, lr_D: 0.000800
639
+ 2023-07-01 20:12:55,322 - INFO - [Train] step: 29999, loss_G: 0.467563, lr_G: 0.000800
640
+ 2023-07-01 20:13:59,545 - INFO - [Eval] step: 29999, fid: 28.438987
641
+ 2023-07-01 20:14:56,071 - INFO - [Train] step: 30099, loss_D: 0.006585, lr_D: 0.000800
642
+ 2023-07-01 20:14:56,153 - INFO - [Train] step: 30099, loss_G: 0.525810, lr_G: 0.000800
643
+ 2023-07-01 20:15:52,365 - INFO - [Train] step: 30199, loss_D: 0.006839, lr_D: 0.000800
644
+ 2023-07-01 20:15:52,446 - INFO - [Train] step: 30199, loss_G: 0.585401, lr_G: 0.000800
645
+ 2023-07-01 20:16:48,648 - INFO - [Train] step: 30299, loss_D: 0.005126, lr_D: 0.000800
646
+ 2023-07-01 20:16:48,729 - INFO - [Train] step: 30299, loss_G: 0.509553, lr_G: 0.000800
647
+ 2023-07-01 20:17:45,091 - INFO - [Train] step: 30399, loss_D: 0.015950, lr_D: 0.000800
648
+ 2023-07-01 20:17:45,172 - INFO - [Train] step: 30399, loss_G: 0.473224, lr_G: 0.000800
649
+ 2023-07-01 20:18:41,373 - INFO - [Train] step: 30499, loss_D: 0.014676, lr_D: 0.000800
650
+ 2023-07-01 20:18:41,455 - INFO - [Train] step: 30499, loss_G: 0.588311, lr_G: 0.000800
651
+ 2023-07-01 20:19:37,673 - INFO - [Train] step: 30599, loss_D: 0.006327, lr_D: 0.000800
652
+ 2023-07-01 20:19:37,754 - INFO - [Train] step: 30599, loss_G: 0.516392, lr_G: 0.000800
653
+ 2023-07-01 20:20:33,965 - INFO - [Train] step: 30699, loss_D: 0.013700, lr_D: 0.000800
654
+ 2023-07-01 20:20:34,046 - INFO - [Train] step: 30699, loss_G: 0.444895, lr_G: 0.000800
+ 2023-07-01 20:21:30,232 - INFO - [Train] step: 30799, loss_D: 0.021089, lr_D: 0.000800
+ 2023-07-01 20:21:30,313 - INFO - [Train] step: 30799, loss_G: 0.410392, lr_G: 0.000800
+ 2023-07-01 20:22:26,513 - INFO - [Train] step: 30899, loss_D: 0.005164, lr_D: 0.000800
+ 2023-07-01 20:22:26,595 - INFO - [Train] step: 30899, loss_G: 0.486042, lr_G: 0.000800
+ 2023-07-01 20:23:22,778 - INFO - [Train] step: 30999, loss_D: 0.006826, lr_D: 0.000800
+ 2023-07-01 20:23:22,859 - INFO - [Train] step: 30999, loss_G: 0.473277, lr_G: 0.000800
+ 2023-07-01 20:24:27,079 - INFO - [Eval] step: 30999, fid: 29.204131
+ 2023-07-01 20:25:23,280 - INFO - [Train] step: 31099, loss_D: 0.005944, lr_D: 0.000800
+ 2023-07-01 20:25:23,362 - INFO - [Train] step: 31099, loss_G: 0.498785, lr_G: 0.000800
+ 2023-07-01 20:26:19,543 - INFO - [Train] step: 31199, loss_D: 0.006456, lr_D: 0.000800
+ 2023-07-01 20:26:19,625 - INFO - [Train] step: 31199, loss_G: 0.518566, lr_G: 0.000800
+ 2023-07-01 20:27:15,821 - INFO - [Train] step: 31299, loss_D: 0.009163, lr_D: 0.000800
+ 2023-07-01 20:27:15,903 - INFO - [Train] step: 31299, loss_G: 0.493884, lr_G: 0.000800
+ 2023-07-01 20:28:12,117 - INFO - [Train] step: 31399, loss_D: 0.005674, lr_D: 0.000800
+ 2023-07-01 20:28:12,198 - INFO - [Train] step: 31399, loss_G: 0.486798, lr_G: 0.000800
+ 2023-07-01 20:29:08,439 - INFO - [Train] step: 31499, loss_D: 0.016643, lr_D: 0.000800
+ 2023-07-01 20:29:08,520 - INFO - [Train] step: 31499, loss_G: 0.306968, lr_G: 0.000800
+ 2023-07-01 20:30:04,710 - INFO - [Train] step: 31599, loss_D: 0.005448, lr_D: 0.000800
+ 2023-07-01 20:30:04,792 - INFO - [Train] step: 31599, loss_G: 0.532557, lr_G: 0.000800
+ 2023-07-01 20:31:01,162 - INFO - [Train] step: 31699, loss_D: 0.008905, lr_D: 0.000800
+ 2023-07-01 20:31:01,244 - INFO - [Train] step: 31699, loss_G: 0.529348, lr_G: 0.000800
+ 2023-07-01 20:31:57,439 - INFO - [Train] step: 31799, loss_D: 0.008976, lr_D: 0.000800
+ 2023-07-01 20:31:57,520 - INFO - [Train] step: 31799, loss_G: 0.471150, lr_G: 0.000800
+ 2023-07-01 20:32:53,726 - INFO - [Train] step: 31899, loss_D: 0.005551, lr_D: 0.000800
+ 2023-07-01 20:32:53,808 - INFO - [Train] step: 31899, loss_G: 0.525096, lr_G: 0.000800
+ 2023-07-01 20:33:49,986 - INFO - [Train] step: 31999, loss_D: 0.021764, lr_D: 0.000800
+ 2023-07-01 20:33:50,068 - INFO - [Train] step: 31999, loss_G: 0.399989, lr_G: 0.000800
+ 2023-07-01 20:34:54,299 - INFO - [Eval] step: 31999, fid: 28.903355
+ 2023-07-01 20:35:50,358 - INFO - [Train] step: 32099, loss_D: 0.007599, lr_D: 0.000800
+ 2023-07-01 20:35:50,440 - INFO - [Train] step: 32099, loss_G: 0.505525, lr_G: 0.000800
+ 2023-07-01 20:36:46,632 - INFO - [Train] step: 32199, loss_D: 0.005625, lr_D: 0.000800
+ 2023-07-01 20:36:46,713 - INFO - [Train] step: 32199, loss_G: 0.535835, lr_G: 0.000800
+ 2023-07-01 20:37:42,934 - INFO - [Train] step: 32299, loss_D: 0.014649, lr_D: 0.000800
+ 2023-07-01 20:37:43,015 - INFO - [Train] step: 32299, loss_G: 0.492300, lr_G: 0.000800
+ 2023-07-01 20:38:39,368 - INFO - [Train] step: 32399, loss_D: 0.006051, lr_D: 0.000800
+ 2023-07-01 20:38:39,449 - INFO - [Train] step: 32399, loss_G: 0.485706, lr_G: 0.000800
+ 2023-07-01 20:39:35,661 - INFO - [Train] step: 32499, loss_D: 0.009181, lr_D: 0.000800
+ 2023-07-01 20:39:35,742 - INFO - [Train] step: 32499, loss_G: 0.412587, lr_G: 0.000800
+ 2023-07-01 20:40:31,954 - INFO - [Train] step: 32599, loss_D: 0.008016, lr_D: 0.000800
+ 2023-07-01 20:40:32,036 - INFO - [Train] step: 32599, loss_G: 0.477234, lr_G: 0.000800
+ 2023-07-01 20:41:28,242 - INFO - [Train] step: 32699, loss_D: 0.014378, lr_D: 0.000800
+ 2023-07-01 20:41:28,323 - INFO - [Train] step: 32699, loss_G: 0.445840, lr_G: 0.000800
+ 2023-07-01 20:42:24,555 - INFO - [Train] step: 32799, loss_D: 0.007808, lr_D: 0.000800
+ 2023-07-01 20:42:24,636 - INFO - [Train] step: 32799, loss_G: 0.578918, lr_G: 0.000800
+ 2023-07-01 20:43:20,839 - INFO - [Train] step: 32899, loss_D: 0.009094, lr_D: 0.000800
+ 2023-07-01 20:43:20,921 - INFO - [Train] step: 32899, loss_G: 0.496615, lr_G: 0.000800
+ 2023-07-01 20:44:17,294 - INFO - [Train] step: 32999, loss_D: 0.006956, lr_D: 0.000800
+ 2023-07-01 20:44:17,376 - INFO - [Train] step: 32999, loss_G: 0.498804, lr_G: 0.000800
+ 2023-07-01 20:45:21,581 - INFO - [Eval] step: 32999, fid: 29.391535
+ 2023-07-01 20:46:17,643 - INFO - [Train] step: 33099, loss_D: 0.007565, lr_D: 0.000800
+ 2023-07-01 20:46:17,724 - INFO - [Train] step: 33099, loss_G: 0.506936, lr_G: 0.000800
+ 2023-07-01 20:47:13,942 - INFO - [Train] step: 33199, loss_D: 0.009538, lr_D: 0.000800
+ 2023-07-01 20:47:14,024 - INFO - [Train] step: 33199, loss_G: 0.497423, lr_G: 0.000800
+ 2023-07-01 20:48:10,246 - INFO - [Train] step: 33299, loss_D: 0.004492, lr_D: 0.000800
+ 2023-07-01 20:48:10,327 - INFO - [Train] step: 33299, loss_G: 0.453382, lr_G: 0.000800
+ 2023-07-01 20:49:06,542 - INFO - [Train] step: 33399, loss_D: 0.018868, lr_D: 0.000800
+ 2023-07-01 20:49:06,623 - INFO - [Train] step: 33399, loss_G: 0.485676, lr_G: 0.000800
+ 2023-07-01 20:50:02,827 - INFO - [Train] step: 33499, loss_D: 0.005363, lr_D: 0.000800
+ 2023-07-01 20:50:02,909 - INFO - [Train] step: 33499, loss_G: 0.544964, lr_G: 0.000800
+ 2023-07-01 20:50:59,097 - INFO - [Train] step: 33599, loss_D: 0.009065, lr_D: 0.000800
+ 2023-07-01 20:50:59,178 - INFO - [Train] step: 33599, loss_G: 0.461471, lr_G: 0.000800
+ 2023-07-01 20:51:55,554 - INFO - [Train] step: 33699, loss_D: 0.012356, lr_D: 0.000800
+ 2023-07-01 20:51:55,636 - INFO - [Train] step: 33699, loss_G: 0.485596, lr_G: 0.000800
+ 2023-07-01 20:52:51,848 - INFO - [Train] step: 33799, loss_D: 0.006972, lr_D: 0.000800
+ 2023-07-01 20:52:51,930 - INFO - [Train] step: 33799, loss_G: 0.453154, lr_G: 0.000800
+ 2023-07-01 20:53:48,142 - INFO - [Train] step: 33899, loss_D: 0.005063, lr_D: 0.000800
+ 2023-07-01 20:53:48,223 - INFO - [Train] step: 33899, loss_G: 0.498531, lr_G: 0.000800
+ 2023-07-01 20:54:44,424 - INFO - [Train] step: 33999, loss_D: 0.011498, lr_D: 0.000800
+ 2023-07-01 20:54:44,506 - INFO - [Train] step: 33999, loss_G: 0.470103, lr_G: 0.000800
+ 2023-07-01 20:55:48,720 - INFO - [Eval] step: 33999, fid: 28.693078
+ 2023-07-01 20:56:44,782 - INFO - [Train] step: 34099, loss_D: 0.006173, lr_D: 0.000800
+ 2023-07-01 20:56:44,864 - INFO - [Train] step: 34099, loss_G: 0.462020, lr_G: 0.000800
+ 2023-07-01 20:57:41,075 - INFO - [Train] step: 34199, loss_D: 0.042533, lr_D: 0.000800
+ 2023-07-01 20:57:41,157 - INFO - [Train] step: 34199, loss_G: 0.347429, lr_G: 0.000800
+ 2023-07-01 20:58:37,542 - INFO - [Train] step: 34299, loss_D: 0.008412, lr_D: 0.000800
+ 2023-07-01 20:58:37,623 - INFO - [Train] step: 34299, loss_G: 0.427577, lr_G: 0.000800
+ 2023-07-01 20:59:33,861 - INFO - [Train] step: 34399, loss_D: 0.007444, lr_D: 0.000800
+ 2023-07-01 20:59:33,943 - INFO - [Train] step: 34399, loss_G: 0.491827, lr_G: 0.000800
+ 2023-07-01 21:00:30,162 - INFO - [Train] step: 34499, loss_D: 0.006357, lr_D: 0.000800
+ 2023-07-01 21:00:30,244 - INFO - [Train] step: 34499, loss_G: 0.475162, lr_G: 0.000800
+ 2023-07-01 21:01:26,490 - INFO - [Train] step: 34599, loss_D: 0.005610, lr_D: 0.000800
+ 2023-07-01 21:01:26,571 - INFO - [Train] step: 34599, loss_G: 0.442595, lr_G: 0.000800
+ 2023-07-01 21:02:22,798 - INFO - [Train] step: 34699, loss_D: 0.007770, lr_D: 0.000800
+ 2023-07-01 21:02:22,879 - INFO - [Train] step: 34699, loss_G: 0.499187, lr_G: 0.000800
+ 2023-07-01 21:03:19,093 - INFO - [Train] step: 34799, loss_D: 0.009583, lr_D: 0.000800
+ 2023-07-01 21:03:19,174 - INFO - [Train] step: 34799, loss_G: 0.466947, lr_G: 0.000800
+ 2023-07-01 21:04:15,743 - INFO - [Train] step: 34899, loss_D: 0.018555, lr_D: 0.000800
+ 2023-07-01 21:04:15,824 - INFO - [Train] step: 34899, loss_G: 0.455649, lr_G: 0.000800
+ 2023-07-01 21:05:12,632 - INFO - [Train] step: 34999, loss_D: 0.017899, lr_D: 0.000800
+ 2023-07-01 21:05:12,713 - INFO - [Train] step: 34999, loss_G: 0.392863, lr_G: 0.000800
+ 2023-07-01 21:06:16,973 - INFO - [Eval] step: 34999, fid: 29.209337
+ 2023-07-01 21:07:13,058 - INFO - [Train] step: 35099, loss_D: 0.014280, lr_D: 0.000800
+ 2023-07-01 21:07:13,140 - INFO - [Train] step: 35099, loss_G: 0.413777, lr_G: 0.000800
+ 2023-07-01 21:08:09,362 - INFO - [Train] step: 35199, loss_D: 0.005024, lr_D: 0.000800
+ 2023-07-01 21:08:09,444 - INFO - [Train] step: 35199, loss_G: 0.485391, lr_G: 0.000800
+ 2023-07-01 21:09:05,671 - INFO - [Train] step: 35299, loss_D: 0.006565, lr_D: 0.000800
+ 2023-07-01 21:09:05,752 - INFO - [Train] step: 35299, loss_G: 0.528535, lr_G: 0.000800
+ 2023-07-01 21:10:01,961 - INFO - [Train] step: 35399, loss_D: 0.004606, lr_D: 0.000800
+ 2023-07-01 21:10:02,042 - INFO - [Train] step: 35399, loss_G: 0.421638, lr_G: 0.000800
+ 2023-07-01 21:10:58,247 - INFO - [Train] step: 35499, loss_D: 0.006037, lr_D: 0.000800
+ 2023-07-01 21:10:58,328 - INFO - [Train] step: 35499, loss_G: 0.519085, lr_G: 0.000800
+ 2023-07-01 21:11:54,683 - INFO - [Train] step: 35599, loss_D: 0.006184, lr_D: 0.000800
+ 2023-07-01 21:11:54,765 - INFO - [Train] step: 35599, loss_G: 0.482145, lr_G: 0.000800
+ 2023-07-01 21:12:50,972 - INFO - [Train] step: 35699, loss_D: 0.009600, lr_D: 0.000800
+ 2023-07-01 21:12:51,054 - INFO - [Train] step: 35699, loss_G: 0.522247, lr_G: 0.000800
+ 2023-07-01 21:13:47,263 - INFO - [Train] step: 35799, loss_D: 0.009911, lr_D: 0.000800
+ 2023-07-01 21:13:47,344 - INFO - [Train] step: 35799, loss_G: 0.536509, lr_G: 0.000800
+ 2023-07-01 21:14:43,542 - INFO - [Train] step: 35899, loss_D: 0.006834, lr_D: 0.000800
+ 2023-07-01 21:14:43,624 - INFO - [Train] step: 35899, loss_G: 0.499934, lr_G: 0.000800
+ 2023-07-01 21:15:39,818 - INFO - [Train] step: 35999, loss_D: 0.007034, lr_D: 0.000800
+ 2023-07-01 21:15:39,900 - INFO - [Train] step: 35999, loss_G: 0.427329, lr_G: 0.000800
+ 2023-07-01 21:16:44,109 - INFO - [Eval] step: 35999, fid: 29.273343
+ 2023-07-01 21:17:40,130 - INFO - [Train] step: 36099, loss_D: 0.011253, lr_D: 0.000800
+ 2023-07-01 21:17:40,212 - INFO - [Train] step: 36099, loss_G: 0.464884, lr_G: 0.000800
+ 2023-07-01 21:18:36,402 - INFO - [Train] step: 36199, loss_D: 0.015299, lr_D: 0.000800
+ 2023-07-01 21:18:36,483 - INFO - [Train] step: 36199, loss_G: 0.444111, lr_G: 0.000800
+ 2023-07-01 21:19:32,855 - INFO - [Train] step: 36299, loss_D: 0.004250, lr_D: 0.000800
+ 2023-07-01 21:19:32,937 - INFO - [Train] step: 36299, loss_G: 0.540237, lr_G: 0.000800
+ 2023-07-01 21:20:29,150 - INFO - [Train] step: 36399, loss_D: 0.004625, lr_D: 0.000800
+ 2023-07-01 21:20:29,231 - INFO - [Train] step: 36399, loss_G: 0.500283, lr_G: 0.000800
+ 2023-07-01 21:21:25,454 - INFO - [Train] step: 36499, loss_D: 0.004593, lr_D: 0.000800
+ 2023-07-01 21:21:25,536 - INFO - [Train] step: 36499, loss_G: 0.512707, lr_G: 0.000800
+ 2023-07-01 21:22:21,738 - INFO - [Train] step: 36599, loss_D: 0.006763, lr_D: 0.000800
+ 2023-07-01 21:22:21,820 - INFO - [Train] step: 36599, loss_G: 0.462912, lr_G: 0.000800
+ 2023-07-01 21:23:18,030 - INFO - [Train] step: 36699, loss_D: 0.006047, lr_D: 0.000800
+ 2023-07-01 21:23:18,111 - INFO - [Train] step: 36699, loss_G: 0.542303, lr_G: 0.000800
+ 2023-07-01 21:24:14,291 - INFO - [Train] step: 36799, loss_D: 0.009533, lr_D: 0.000800
+ 2023-07-01 21:24:14,372 - INFO - [Train] step: 36799, loss_G: 0.454194, lr_G: 0.000800
+ 2023-07-01 21:25:10,748 - INFO - [Train] step: 36899, loss_D: 0.009707, lr_D: 0.000800
+ 2023-07-01 21:25:10,830 - INFO - [Train] step: 36899, loss_G: 0.526882, lr_G: 0.000800
+ 2023-07-01 21:26:07,043 - INFO - [Train] step: 36999, loss_D: 0.006530, lr_D: 0.000800
+ 2023-07-01 21:26:07,124 - INFO - [Train] step: 36999, loss_G: 0.490091, lr_G: 0.000800
+ 2023-07-01 21:27:11,304 - INFO - [Eval] step: 36999, fid: 29.200974
+ 2023-07-01 21:28:07,295 - INFO - [Train] step: 37099, loss_D: 0.010660, lr_D: 0.000800
+ 2023-07-01 21:28:07,377 - INFO - [Train] step: 37099, loss_G: 0.527466, lr_G: 0.000800
+ 2023-07-01 21:29:03,595 - INFO - [Train] step: 37199, loss_D: 0.008211, lr_D: 0.000800
+ 2023-07-01 21:29:03,676 - INFO - [Train] step: 37199, loss_G: 0.557356, lr_G: 0.000800
+ 2023-07-01 21:29:59,909 - INFO - [Train] step: 37299, loss_D: 0.004410, lr_D: 0.000800
+ 2023-07-01 21:29:59,998 - INFO - [Train] step: 37299, loss_G: 0.478788, lr_G: 0.000800
+ 2023-07-01 21:30:56,223 - INFO - [Train] step: 37399, loss_D: 0.021454, lr_D: 0.000800
+ 2023-07-01 21:30:56,305 - INFO - [Train] step: 37399, loss_G: 0.386928, lr_G: 0.000800
+ 2023-07-01 21:31:52,538 - INFO - [Train] step: 37499, loss_D: 0.004321, lr_D: 0.000800
+ 2023-07-01 21:31:52,619 - INFO - [Train] step: 37499, loss_G: 0.470156, lr_G: 0.000800
+ 2023-07-01 21:32:49,001 - INFO - [Train] step: 37599, loss_D: 0.008819, lr_D: 0.000800
+ 2023-07-01 21:32:49,083 - INFO - [Train] step: 37599, loss_G: 0.453485, lr_G: 0.000800
+ 2023-07-01 21:33:45,278 - INFO - [Train] step: 37699, loss_D: 0.062676, lr_D: 0.000800
+ 2023-07-01 21:33:45,360 - INFO - [Train] step: 37699, loss_G: 0.274167, lr_G: 0.000800
+ 2023-07-01 21:34:41,560 - INFO - [Train] step: 37799, loss_D: 0.029416, lr_D: 0.000800
+ 2023-07-01 21:34:41,641 - INFO - [Train] step: 37799, loss_G: 0.397284, lr_G: 0.000800
+ 2023-07-01 21:35:37,876 - INFO - [Train] step: 37899, loss_D: 0.003817, lr_D: 0.000800
+ 2023-07-01 21:35:37,957 - INFO - [Train] step: 37899, loss_G: 0.515849, lr_G: 0.000800
+ 2023-07-01 21:36:34,175 - INFO - [Train] step: 37999, loss_D: 0.023767, lr_D: 0.000800
+ 2023-07-01 21:36:34,257 - INFO - [Train] step: 37999, loss_G: 0.394301, lr_G: 0.000800
+ 2023-07-01 21:37:38,499 - INFO - [Eval] step: 37999, fid: 30.114388
+ 2023-07-01 21:38:34,514 - INFO - [Train] step: 38099, loss_D: 0.004758, lr_D: 0.000800
+ 2023-07-01 21:38:34,595 - INFO - [Train] step: 38099, loss_G: 0.452437, lr_G: 0.000800
+ 2023-07-01 21:39:30,978 - INFO - [Train] step: 38199, loss_D: 0.005971, lr_D: 0.000800
+ 2023-07-01 21:39:31,060 - INFO - [Train] step: 38199, loss_G: 0.523271, lr_G: 0.000800
+ 2023-07-01 21:40:27,269 - INFO - [Train] step: 38299, loss_D: 0.009606, lr_D: 0.000800
+ 2023-07-01 21:40:27,351 - INFO - [Train] step: 38299, loss_G: 0.629234, lr_G: 0.000800
+ 2023-07-01 21:41:23,558 - INFO - [Train] step: 38399, loss_D: 0.017154, lr_D: 0.000800
+ 2023-07-01 21:41:23,639 - INFO - [Train] step: 38399, loss_G: 0.646959, lr_G: 0.000800
+ 2023-07-01 21:42:19,845 - INFO - [Train] step: 38499, loss_D: 0.010298, lr_D: 0.000800
+ 2023-07-01 21:42:19,926 - INFO - [Train] step: 38499, loss_G: 0.485315, lr_G: 0.000800
+ 2023-07-01 21:43:16,136 - INFO - [Train] step: 38599, loss_D: 0.012583, lr_D: 0.000800
+ 2023-07-01 21:43:16,218 - INFO - [Train] step: 38599, loss_G: 0.430766, lr_G: 0.000800
+ 2023-07-01 21:44:12,437 - INFO - [Train] step: 38699, loss_D: 0.006646, lr_D: 0.000800
+ 2023-07-01 21:44:12,518 - INFO - [Train] step: 38699, loss_G: 0.462981, lr_G: 0.000800
+ 2023-07-01 21:45:08,736 - INFO - [Train] step: 38799, loss_D: 0.005173, lr_D: 0.000800
+ 2023-07-01 21:45:08,817 - INFO - [Train] step: 38799, loss_G: 0.547105, lr_G: 0.000800
+ 2023-07-01 21:46:05,160 - INFO - [Train] step: 38899, loss_D: 1.011272, lr_D: 0.000800
+ 2023-07-01 21:46:05,242 - INFO - [Train] step: 38899, loss_G: 0.359154, lr_G: 0.000800
+ 2023-07-01 21:47:01,454 - INFO - [Train] step: 38999, loss_D: 0.010683, lr_D: 0.000800
+ 2023-07-01 21:47:01,536 - INFO - [Train] step: 38999, loss_G: 0.491258, lr_G: 0.000800
+ 2023-07-01 21:48:05,715 - INFO - [Eval] step: 38999, fid: 28.675869
+ 2023-07-01 21:49:01,750 - INFO - [Train] step: 39099, loss_D: 0.009661, lr_D: 0.000800
+ 2023-07-01 21:49:01,832 - INFO - [Train] step: 39099, loss_G: 0.450137, lr_G: 0.000800
+ 2023-07-01 21:49:58,041 - INFO - [Train] step: 39199, loss_D: 0.008570, lr_D: 0.000800
+ 2023-07-01 21:49:58,122 - INFO - [Train] step: 39199, loss_G: 0.497657, lr_G: 0.000800
+ 2023-07-01 21:50:54,331 - INFO - [Train] step: 39299, loss_D: 0.006551, lr_D: 0.000800
+ 2023-07-01 21:50:54,412 - INFO - [Train] step: 39299, loss_G: 0.544580, lr_G: 0.000800
+ 2023-07-01 21:51:50,639 - INFO - [Train] step: 39399, loss_D: 0.006657, lr_D: 0.000800
+ 2023-07-01 21:51:50,720 - INFO - [Train] step: 39399, loss_G: 0.631172, lr_G: 0.000800
+ 2023-07-01 21:52:47,106 - INFO - [Train] step: 39499, loss_D: 0.005097, lr_D: 0.000800
+ 2023-07-01 21:52:47,187 - INFO - [Train] step: 39499, loss_G: 0.485837, lr_G: 0.000800
+ 2023-07-01 21:53:43,409 - INFO - [Train] step: 39599, loss_D: 0.005782, lr_D: 0.000800
+ 2023-07-01 21:53:43,491 - INFO - [Train] step: 39599, loss_G: 0.513047, lr_G: 0.000800
+ 2023-07-01 21:54:39,730 - INFO - [Train] step: 39699, loss_D: 0.008062, lr_D: 0.000800
+ 2023-07-01 21:54:39,812 - INFO - [Train] step: 39699, loss_G: 0.589866, lr_G: 0.000800
+ 2023-07-01 21:55:36,021 - INFO - [Train] step: 39799, loss_D: 0.007721, lr_D: 0.000800
+ 2023-07-01 21:55:36,103 - INFO - [Train] step: 39799, loss_G: 0.510992, lr_G: 0.000800
+ 2023-07-01 21:56:32,323 - INFO - [Train] step: 39899, loss_D: 0.006672, lr_D: 0.000800
+ 2023-07-01 21:56:32,404 - INFO - [Train] step: 39899, loss_G: 0.589983, lr_G: 0.000800
+ 2023-07-01 21:57:28,604 - INFO - [Train] step: 39999, loss_D: 0.010124, lr_D: 0.000800
+ 2023-07-01 21:57:28,685 - INFO - [Train] step: 39999, loss_G: 0.580782, lr_G: 0.000800
+ 2023-07-01 21:58:32,933 - INFO - [Eval] step: 39999, fid: 28.713172
+ 2023-07-01 21:58:33,102 - INFO - Best FID score: 28.43898727768476
+ 2023-07-01 21:58:33,103 - INFO - End of training
lsgan_cifar10/samples.zip ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:cff58c2aa52d2cd3a54154e5a701de72854dbe60e883b7d4e51a1ed7cf51bebd
+ size 6026103
lsgan_cifar10/tensorboard/events.out.tfevents.1688223609.jason-system.50111.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:63c11a9cd18bd91a9dda95469840732652842ff55ef84912c2bfb878f4e500df
+ size 8095936