xyfJASON committed on
Commit 90f4182
1 Parent(s): 30793b7

Upload cgan_cifar10 checkpoints and training logs

cgan_cifar10/ckpt/best/meta.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:20e4a9b19e429a022c2088af237ca6b9c9fffadf121322a8824f5475ee537259
+ size 425
cgan_cifar10/ckpt/best/model.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:05caae18d843a3ff46d423eab1723299ef7a92ef48fc52903c64544b4812fc52
+ size 91501396
cgan_cifar10/ckpt/best/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c3586dfb911af88ab5b171fdd30fe4c85acfa10c02b1861bd923f44fc1c2b5d9
+ size 182949275
cgan_cifar10/ckpt/step039999/meta.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6e139d496cdf583e12752c444a68d71d1f0b365f6ee923d2a713602c40feceb5
+ size 425
cgan_cifar10/ckpt/step039999/model.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7942cb74346d64a2ad7b3235b7eb8a9954a88023c010ad2fcb48450698b6e0d2
+ size 91501396
cgan_cifar10/ckpt/step039999/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f5731cfbc84a7d6e2ce8aae185994cb26eea8d5b0e725f5a6326eb93563bb606
+ size 182949275
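Each checkpoint file above is stored as a Git LFS pointer rather than the raw tensor data: the repository holds only the `version`/`oid`/`size` fields, and the actual bytes live in LFS storage. A minimal sketch of parsing such a pointer (the helper name `parse_lfs_pointer` is illustrative, not part of this repo):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    # oid is written as "<hash-algo>:<hex-digest>"
    algo, _, digest = fields["oid"].partition(":")
    return {
        "version": fields["version"],
        "algo": algo,
        "oid": digest,
        "size": int(fields["size"]),
    }

# The best/meta.pt pointer from the diff above:
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:20e4a9b19e429a022c2088af237ca6b9c9fffadf121322a8824f5475ee537259
size 425"""
info = parse_lfs_pointer(pointer)
print(info["algo"], info["size"])  # sha256 425
```

The `size` field explains the file sizes shown in the diff: `model.pt` is ~91 MB of generator/discriminator weights, while `optimizer.pt` is roughly double that because Adam keeps two moment buffers per parameter.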
cgan_cifar10/config-2023-07-11-11-32-39.yaml ADDED
@@ -0,0 +1,63 @@
+ seed: 5678
+ data:
+   name: CIFAR-10
+   dataroot: /data/CIFAR-10/
+   img_size: 32
+   img_channels: 3
+   n_classes: 10
+ dataloader:
+   num_workers: 4
+   pin_memory: true
+   prefetch_factor: 2
+ G:
+   target: models.simple_cnn_cond.GeneratorConditional
+   params:
+     z_dim: 100
+     n_classes: 10
+     dim: 256
+     dim_mults:
+       - 4
+       - 2
+       - 1
+     out_dim: 3
+     with_bn: true
+     with_tanh: true
+ D:
+   target: models.simple_cnn_cond.DiscriminatorConditional
+   params:
+     n_classes: 10
+     in_dim: 3
+     dim: 256
+     dim_mults:
+       - 1
+       - 2
+       - 4
+     with_bn: true
+ train:
+   n_steps: 40000
+   batch_size: 512
+   d_iters: 5
+   resume: null
+   print_freq: 100
+   save_freq: 10000
+   sample_freq: 1000
+   eval_freq: 1000
+   n_samples_per_class: 6
+   loss_fn:
+     target: losses.vanilla_gan_loss.VanillaGANLoss
+     params:
+       lambda_r1_reg: 0.0
+   optim_G:
+     target: torch.optim.Adam
+     params:
+       lr: 0.0008
+       betas:
+         - 0.5
+         - 0.999
+   optim_D:
+     target: torch.optim.Adam
+     params:
+       lr: 0.0008
+       betas:
+         - 0.5
+         - 0.999
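The `target:` entries in the config are dotted import paths (e.g. `torch.optim.Adam`, `models.simple_cnn_cond.GeneratorConditional`), which suggests the trainer builds objects by resolving those paths and calling them with the adjacent `params` dict. A hedged sketch of such a resolver (`instantiate_from_config` is an assumed helper name, and a standard-library class stands in for the repo's own modules, which aren't importable here):

```python
import importlib

def instantiate_from_config(target: str, params: dict):
    """Resolve a dotted import path like 'torch.optim.Adam' and call it with params."""
    module_path, _, attr_name = target.rpartition(".")
    cls = getattr(importlib.import_module(module_path), attr_name)
    return cls(**params)

# Demonstrated with a stdlib target; in the config this would be
# e.g. models.simple_cnn_cond.GeneratorConditional with its params dict.
d = instantiate_from_config("datetime.date", {"year": 2023, "month": 7, "day": 11})
print(d)  # 2023-07-11
```

This pattern keeps the config self-describing: swapping the generator architecture or the GAN loss only requires editing `target` and `params`, not the training script.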
cgan_cifar10/output-2023-07-11-11-32-39.log ADDED
@@ -0,0 +1,851 @@
+ 2023-07-11 11:32:41,232 - INFO - Experiment directory: ./runs/cgan_cifar10/
+ 2023-07-11 11:32:41,232 - INFO - Number of processes: 1
+ 2023-07-11 11:32:41,232 - INFO - Distributed type: DistributedType.NO
+ 2023-07-11 11:32:41,232 - INFO - Mixed precision: no
+ 2023-07-11 11:32:41,232 - INFO - ==============================
+ 2023-07-11 11:32:41,737 - INFO - Size of training set: 50000
+ 2023-07-11 11:32:41,738 - INFO - Batch size per process: 512
+ 2023-07-11 11:32:41,738 - INFO - Total batch size: 512
+ 2023-07-11 11:32:41,738 - INFO - ==============================
+ 2023-07-11 11:32:42,715 - INFO - Start training...
+ 2023-07-11 11:33:40,187 - INFO - [Train] step: 99, loss_D: 0.001627, lr_D: 0.000800
+ 2023-07-11 11:33:40,271 - INFO - [Train] step: 99, loss_G: 8.997356, lr_G: 0.000800
+ 2023-07-11 11:34:36,377 - INFO - [Train] step: 199, loss_D: 0.032951, lr_D: 0.000800
+ 2023-07-11 11:34:36,461 - INFO - [Train] step: 199, loss_G: 4.610237, lr_G: 0.000800
+ 2023-07-11 11:35:32,676 - INFO - [Train] step: 299, loss_D: 0.014280, lr_D: 0.000800
+ 2023-07-11 11:35:32,759 - INFO - [Train] step: 299, loss_G: 5.337371, lr_G: 0.000800
+ 2023-07-11 11:36:28,994 - INFO - [Train] step: 399, loss_D: 0.009693, lr_D: 0.000800
+ 2023-07-11 11:36:29,078 - INFO - [Train] step: 399, loss_G: 5.491062, lr_G: 0.000800
+ 2023-07-11 11:37:25,366 - INFO - [Train] step: 499, loss_D: 0.033736, lr_D: 0.000800
+ 2023-07-11 11:37:25,450 - INFO - [Train] step: 499, loss_G: 4.679213, lr_G: 0.000800
+ 2023-07-11 11:38:21,736 - INFO - [Train] step: 599, loss_D: 0.011892, lr_D: 0.000800
+ 2023-07-11 11:38:21,820 - INFO - [Train] step: 599, loss_G: 4.816560, lr_G: 0.000800
+ 2023-07-11 11:39:18,235 - INFO - [Train] step: 699, loss_D: 0.031111, lr_D: 0.000800
+ 2023-07-11 11:39:18,319 - INFO - [Train] step: 699, loss_G: 5.084354, lr_G: 0.000800
+ 2023-07-11 11:40:14,587 - INFO - [Train] step: 799, loss_D: 0.023191, lr_D: 0.000800
+ 2023-07-11 11:40:14,671 - INFO - [Train] step: 799, loss_G: 4.651308, lr_G: 0.000800
+ 2023-07-11 11:41:10,961 - INFO - [Train] step: 899, loss_D: 0.015397, lr_D: 0.000800
+ 2023-07-11 11:41:11,045 - INFO - [Train] step: 899, loss_G: 5.959859, lr_G: 0.000800
+ 2023-07-11 11:42:07,312 - INFO - [Train] step: 999, loss_D: 0.026454, lr_D: 0.000800
+ 2023-07-11 11:42:07,396 - INFO - [Train] step: 999, loss_G: 4.833085, lr_G: 0.000800
+ 2023-07-11 11:43:11,563 - INFO - [Eval] step: 999, fid: 128.158505
+ 2023-07-11 11:44:08,003 - INFO - [Train] step: 1099, loss_D: 0.008656, lr_D: 0.000800
+ 2023-07-11 11:44:08,087 - INFO - [Train] step: 1099, loss_G: 6.271304, lr_G: 0.000800
+ 2023-07-11 11:45:04,400 - INFO - [Train] step: 1199, loss_D: 0.021059, lr_D: 0.000800
+ 2023-07-11 11:45:04,484 - INFO - [Train] step: 1199, loss_G: 5.619766, lr_G: 0.000800
+ 2023-07-11 11:46:00,938 - INFO - [Train] step: 1299, loss_D: 0.014379, lr_D: 0.000800
+ 2023-07-11 11:46:01,022 - INFO - [Train] step: 1299, loss_G: 5.458647, lr_G: 0.000800
+ 2023-07-11 11:46:57,347 - INFO - [Train] step: 1399, loss_D: 0.026529, lr_D: 0.000800
+ 2023-07-11 11:46:57,431 - INFO - [Train] step: 1399, loss_G: 5.784266, lr_G: 0.000800
+ 2023-07-11 11:47:53,788 - INFO - [Train] step: 1499, loss_D: 0.013564, lr_D: 0.000800
+ 2023-07-11 11:47:53,872 - INFO - [Train] step: 1499, loss_G: 5.812286, lr_G: 0.000800
+ 2023-07-11 11:48:50,200 - INFO - [Train] step: 1599, loss_D: 0.032264, lr_D: 0.000800
+ 2023-07-11 11:48:50,284 - INFO - [Train] step: 1599, loss_G: 5.074686, lr_G: 0.000800
+ 2023-07-11 11:49:46,612 - INFO - [Train] step: 1699, loss_D: 0.008307, lr_D: 0.000800
+ 2023-07-11 11:49:46,696 - INFO - [Train] step: 1699, loss_G: 6.551968, lr_G: 0.000800
+ 2023-07-11 11:50:43,011 - INFO - [Train] step: 1799, loss_D: 0.026635, lr_D: 0.000800
+ 2023-07-11 11:50:43,095 - INFO - [Train] step: 1799, loss_G: 5.197975, lr_G: 0.000800
+ 2023-07-11 11:51:39,396 - INFO - [Train] step: 1899, loss_D: 0.063450, lr_D: 0.000800
+ 2023-07-11 11:51:39,479 - INFO - [Train] step: 1899, loss_G: 3.990579, lr_G: 0.000800
+ 2023-07-11 11:52:35,925 - INFO - [Train] step: 1999, loss_D: 0.029807, lr_D: 0.000800
+ 2023-07-11 11:52:36,009 - INFO - [Train] step: 1999, loss_G: 5.723417, lr_G: 0.000800
+ 2023-07-11 11:53:39,937 - INFO - [Eval] step: 1999, fid: 79.083606
+ 2023-07-11 11:54:36,480 - INFO - [Train] step: 2099, loss_D: 0.028993, lr_D: 0.000800
+ 2023-07-11 11:54:36,565 - INFO - [Train] step: 2099, loss_G: 4.681474, lr_G: 0.000800
+ 2023-07-11 11:55:32,907 - INFO - [Train] step: 2199, loss_D: 0.022322, lr_D: 0.000800
+ 2023-07-11 11:55:32,991 - INFO - [Train] step: 2199, loss_G: 4.906097, lr_G: 0.000800
+ 2023-07-11 11:56:29,327 - INFO - [Train] step: 2299, loss_D: 0.115781, lr_D: 0.000800
+ 2023-07-11 11:56:29,411 - INFO - [Train] step: 2299, loss_G: 5.437107, lr_G: 0.000800
+ 2023-07-11 11:57:25,744 - INFO - [Train] step: 2399, loss_D: 0.032049, lr_D: 0.000800
+ 2023-07-11 11:57:25,828 - INFO - [Train] step: 2399, loss_G: 5.646049, lr_G: 0.000800
+ 2023-07-11 11:58:22,156 - INFO - [Train] step: 2499, loss_D: 0.016577, lr_D: 0.000800
+ 2023-07-11 11:58:22,240 - INFO - [Train] step: 2499, loss_G: 5.342746, lr_G: 0.000800
+ 2023-07-11 11:59:18,711 - INFO - [Train] step: 2599, loss_D: 0.024346, lr_D: 0.000800
+ 2023-07-11 11:59:18,795 - INFO - [Train] step: 2599, loss_G: 5.324674, lr_G: 0.000800
+ 2023-07-11 12:00:15,116 - INFO - [Train] step: 2699, loss_D: 0.065964, lr_D: 0.000800
+ 2023-07-11 12:00:15,200 - INFO - [Train] step: 2699, loss_G: 4.916304, lr_G: 0.000800
+ 2023-07-11 12:01:11,518 - INFO - [Train] step: 2799, loss_D: 0.075717, lr_D: 0.000800
+ 2023-07-11 12:01:11,603 - INFO - [Train] step: 2799, loss_G: 4.272255, lr_G: 0.000800
+ 2023-07-11 12:02:07,921 - INFO - [Train] step: 2899, loss_D: 0.041682, lr_D: 0.000800
+ 2023-07-11 12:02:08,006 - INFO - [Train] step: 2899, loss_G: 4.634606, lr_G: 0.000800
+ 2023-07-11 12:03:04,320 - INFO - [Train] step: 2999, loss_D: 0.085065, lr_D: 0.000800
+ 2023-07-11 12:03:04,404 - INFO - [Train] step: 2999, loss_G: 4.004360, lr_G: 0.000800
+ 2023-07-11 12:04:08,493 - INFO - [Eval] step: 2999, fid: 58.172546
+ 2023-07-11 12:05:05,031 - INFO - [Train] step: 3099, loss_D: 0.159297, lr_D: 0.000800
+ 2023-07-11 12:05:05,115 - INFO - [Train] step: 3099, loss_G: 4.403972, lr_G: 0.000800
+ 2023-07-11 12:06:01,508 - INFO - [Train] step: 3199, loss_D: 0.029541, lr_D: 0.000800
+ 2023-07-11 12:06:01,593 - INFO - [Train] step: 3199, loss_G: 4.741822, lr_G: 0.000800
+ 2023-07-11 12:06:58,227 - INFO - [Train] step: 3299, loss_D: 0.054262, lr_D: 0.000800
+ 2023-07-11 12:06:58,312 - INFO - [Train] step: 3299, loss_G: 4.842682, lr_G: 0.000800
+ 2023-07-11 12:07:54,800 - INFO - [Train] step: 3399, loss_D: 0.030436, lr_D: 0.000800
+ 2023-07-11 12:07:54,884 - INFO - [Train] step: 3399, loss_G: 5.227071, lr_G: 0.000800
+ 2023-07-11 12:08:51,363 - INFO - [Train] step: 3499, loss_D: 0.037491, lr_D: 0.000800
+ 2023-07-11 12:08:51,447 - INFO - [Train] step: 3499, loss_G: 5.755805, lr_G: 0.000800
+ 2023-07-11 12:09:47,921 - INFO - [Train] step: 3599, loss_D: 0.047046, lr_D: 0.000800
+ 2023-07-11 12:09:48,005 - INFO - [Train] step: 3599, loss_G: 4.720422, lr_G: 0.000800
+ 2023-07-11 12:10:44,467 - INFO - [Train] step: 3699, loss_D: 0.055668, lr_D: 0.000800
+ 2023-07-11 12:10:44,551 - INFO - [Train] step: 3699, loss_G: 3.561963, lr_G: 0.000800
+ 2023-07-11 12:11:41,006 - INFO - [Train] step: 3799, loss_D: 0.056802, lr_D: 0.000800
+ 2023-07-11 12:11:41,091 - INFO - [Train] step: 3799, loss_G: 5.189616, lr_G: 0.000800
+ 2023-07-11 12:12:37,681 - INFO - [Train] step: 3899, loss_D: 0.032559, lr_D: 0.000800
+ 2023-07-11 12:12:37,765 - INFO - [Train] step: 3899, loss_G: 6.290652, lr_G: 0.000800
+ 2023-07-11 12:13:34,217 - INFO - [Train] step: 3999, loss_D: 0.039090, lr_D: 0.000800
+ 2023-07-11 12:13:34,301 - INFO - [Train] step: 3999, loss_G: 4.379457, lr_G: 0.000800
+ 2023-07-11 12:14:38,292 - INFO - [Eval] step: 3999, fid: 44.029579
+ 2023-07-11 12:15:34,705 - INFO - [Train] step: 4099, loss_D: 0.050695, lr_D: 0.000800
+ 2023-07-11 12:15:34,789 - INFO - [Train] step: 4099, loss_G: 4.415603, lr_G: 0.000800
+ 2023-07-11 12:16:30,916 - INFO - [Train] step: 4199, loss_D: 0.073558, lr_D: 0.000800
+ 2023-07-11 12:16:30,999 - INFO - [Train] step: 4199, loss_G: 4.393678, lr_G: 0.000800
+ 2023-07-11 12:17:27,145 - INFO - [Train] step: 4299, loss_D: 0.001833, lr_D: 0.000800
+ 2023-07-11 12:17:27,229 - INFO - [Train] step: 4299, loss_G: 7.804952, lr_G: 0.000800
+ 2023-07-11 12:18:23,353 - INFO - [Train] step: 4399, loss_D: 0.079716, lr_D: 0.000800
+ 2023-07-11 12:18:23,436 - INFO - [Train] step: 4399, loss_G: 4.749404, lr_G: 0.000800
+ 2023-07-11 12:19:19,549 - INFO - [Train] step: 4499, loss_D: 0.129455, lr_D: 0.000800
+ 2023-07-11 12:19:19,632 - INFO - [Train] step: 4499, loss_G: 4.564599, lr_G: 0.000800
+ 2023-07-11 12:20:15,908 - INFO - [Train] step: 4599, loss_D: 0.001934, lr_D: 0.000800
+ 2023-07-11 12:20:15,992 - INFO - [Train] step: 4599, loss_G: 8.201621, lr_G: 0.000800
+ 2023-07-11 12:21:12,108 - INFO - [Train] step: 4699, loss_D: 0.097809, lr_D: 0.000800
+ 2023-07-11 12:21:12,192 - INFO - [Train] step: 4699, loss_G: 3.315424, lr_G: 0.000800
+ 2023-07-11 12:22:08,337 - INFO - [Train] step: 4799, loss_D: 0.009425, lr_D: 0.000800
+ 2023-07-11 12:22:08,421 - INFO - [Train] step: 4799, loss_G: 6.000487, lr_G: 0.000800
+ 2023-07-11 12:23:04,554 - INFO - [Train] step: 4899, loss_D: 0.010454, lr_D: 0.000800
+ 2023-07-11 12:23:04,637 - INFO - [Train] step: 4899, loss_G: 5.965707, lr_G: 0.000800
+ 2023-07-11 12:24:00,754 - INFO - [Train] step: 4999, loss_D: 0.017259, lr_D: 0.000800
+ 2023-07-11 12:24:00,837 - INFO - [Train] step: 4999, loss_G: 6.603471, lr_G: 0.000800
+ 2023-07-11 12:25:04,819 - INFO - [Eval] step: 4999, fid: 43.723356
+ 2023-07-11 12:26:01,289 - INFO - [Train] step: 5099, loss_D: 0.064753, lr_D: 0.000800
+ 2023-07-11 12:26:01,373 - INFO - [Train] step: 5099, loss_G: 4.521230, lr_G: 0.000800
+ 2023-07-11 12:26:57,690 - INFO - [Train] step: 5199, loss_D: 0.075535, lr_D: 0.000800
+ 2023-07-11 12:26:57,774 - INFO - [Train] step: 5199, loss_G: 4.297753, lr_G: 0.000800
+ 2023-07-11 12:27:53,955 - INFO - [Train] step: 5299, loss_D: 0.058116, lr_D: 0.000800
+ 2023-07-11 12:27:54,039 - INFO - [Train] step: 5299, loss_G: 5.717802, lr_G: 0.000800
+ 2023-07-11 12:28:50,193 - INFO - [Train] step: 5399, loss_D: 0.025380, lr_D: 0.000800
+ 2023-07-11 12:28:50,277 - INFO - [Train] step: 5399, loss_G: 5.295789, lr_G: 0.000800
+ 2023-07-11 12:29:46,411 - INFO - [Train] step: 5499, loss_D: 0.056128, lr_D: 0.000800
+ 2023-07-11 12:29:46,494 - INFO - [Train] step: 5499, loss_G: 6.103138, lr_G: 0.000800
+ 2023-07-11 12:30:42,646 - INFO - [Train] step: 5599, loss_D: 0.098828, lr_D: 0.000800
+ 2023-07-11 12:30:42,729 - INFO - [Train] step: 5599, loss_G: 4.553205, lr_G: 0.000800
+ 2023-07-11 12:31:38,895 - INFO - [Train] step: 5699, loss_D: 0.020657, lr_D: 0.000800
+ 2023-07-11 12:31:38,979 - INFO - [Train] step: 5699, loss_G: 5.729069, lr_G: 0.000800
+ 2023-07-11 12:32:35,107 - INFO - [Train] step: 5799, loss_D: 0.001277, lr_D: 0.000800
+ 2023-07-11 12:32:35,191 - INFO - [Train] step: 5799, loss_G: 9.115376, lr_G: 0.000800
+ 2023-07-11 12:33:31,470 - INFO - [Train] step: 5899, loss_D: 0.048693, lr_D: 0.000800
+ 2023-07-11 12:33:31,554 - INFO - [Train] step: 5899, loss_G: 4.729529, lr_G: 0.000800
+ 2023-07-11 12:34:27,700 - INFO - [Train] step: 5999, loss_D: 0.704835, lr_D: 0.000800
+ 2023-07-11 12:34:27,783 - INFO - [Train] step: 5999, loss_G: 1.363551, lr_G: 0.000800
+ 2023-07-11 12:35:31,857 - INFO - [Eval] step: 5999, fid: 34.022448
+ 2023-07-11 12:36:28,276 - INFO - [Train] step: 6099, loss_D: 0.210027, lr_D: 0.000800
+ 2023-07-11 12:36:28,359 - INFO - [Train] step: 6099, loss_G: 3.310365, lr_G: 0.000800
+ 2023-07-11 12:37:24,510 - INFO - [Train] step: 6199, loss_D: 0.020091, lr_D: 0.000800
+ 2023-07-11 12:37:24,593 - INFO - [Train] step: 6199, loss_G: 4.991518, lr_G: 0.000800
+ 2023-07-11 12:38:20,780 - INFO - [Train] step: 6299, loss_D: 0.056295, lr_D: 0.000800
+ 2023-07-11 12:38:20,864 - INFO - [Train] step: 6299, loss_G: 5.282824, lr_G: 0.000800
+ 2023-07-11 12:39:17,031 - INFO - [Train] step: 6399, loss_D: 0.051391, lr_D: 0.000800
+ 2023-07-11 12:39:17,115 - INFO - [Train] step: 6399, loss_G: 4.308142, lr_G: 0.000800
+ 2023-07-11 12:40:13,412 - INFO - [Train] step: 6499, loss_D: 0.011647, lr_D: 0.000800
+ 2023-07-11 12:40:13,496 - INFO - [Train] step: 6499, loss_G: 6.623148, lr_G: 0.000800
+ 2023-07-11 12:41:09,676 - INFO - [Train] step: 6599, loss_D: 0.032993, lr_D: 0.000800
+ 2023-07-11 12:41:09,759 - INFO - [Train] step: 6599, loss_G: 5.757629, lr_G: 0.000800
+ 2023-07-11 12:42:05,905 - INFO - [Train] step: 6699, loss_D: 0.007489, lr_D: 0.000800
+ 2023-07-11 12:42:05,989 - INFO - [Train] step: 6699, loss_G: 7.138266, lr_G: 0.000800
+ 2023-07-11 12:43:02,151 - INFO - [Train] step: 6799, loss_D: 0.010770, lr_D: 0.000800
+ 2023-07-11 12:43:02,235 - INFO - [Train] step: 6799, loss_G: 5.752389, lr_G: 0.000800
+ 2023-07-11 12:43:58,402 - INFO - [Train] step: 6899, loss_D: 0.038809, lr_D: 0.000800
+ 2023-07-11 12:43:58,486 - INFO - [Train] step: 6899, loss_G: 5.210541, lr_G: 0.000800
+ 2023-07-11 12:44:54,656 - INFO - [Train] step: 6999, loss_D: 0.017666, lr_D: 0.000800
+ 2023-07-11 12:44:54,740 - INFO - [Train] step: 6999, loss_G: 5.626320, lr_G: 0.000800
+ 2023-07-11 12:45:58,784 - INFO - [Eval] step: 6999, fid: 33.058625
+ 2023-07-11 12:46:55,267 - INFO - [Train] step: 7099, loss_D: 0.091532, lr_D: 0.000800
+ 2023-07-11 12:46:55,350 - INFO - [Train] step: 7099, loss_G: 5.154208, lr_G: 0.000800
+ 2023-07-11 12:47:51,675 - INFO - [Train] step: 7199, loss_D: 0.490369, lr_D: 0.000800
+ 2023-07-11 12:47:51,758 - INFO - [Train] step: 7199, loss_G: 2.006269, lr_G: 0.000800
+ 2023-07-11 12:48:47,931 - INFO - [Train] step: 7299, loss_D: 0.014261, lr_D: 0.000800
+ 2023-07-11 12:48:48,014 - INFO - [Train] step: 7299, loss_G: 5.799419, lr_G: 0.000800
+ 2023-07-11 12:49:44,218 - INFO - [Train] step: 7399, loss_D: 0.032790, lr_D: 0.000800
+ 2023-07-11 12:49:44,302 - INFO - [Train] step: 7399, loss_G: 5.115887, lr_G: 0.000800
+ 2023-07-11 12:50:40,481 - INFO - [Train] step: 7499, loss_D: 0.024623, lr_D: 0.000800
+ 2023-07-11 12:50:40,564 - INFO - [Train] step: 7499, loss_G: 5.416479, lr_G: 0.000800
+ 2023-07-11 12:51:36,754 - INFO - [Train] step: 7599, loss_D: 0.022694, lr_D: 0.000800
+ 2023-07-11 12:51:36,838 - INFO - [Train] step: 7599, loss_G: 4.639852, lr_G: 0.000800
+ 2023-07-11 12:52:32,991 - INFO - [Train] step: 7699, loss_D: 0.020744, lr_D: 0.000800
+ 2023-07-11 12:52:33,075 - INFO - [Train] step: 7699, loss_G: 6.460749, lr_G: 0.000800
+ 2023-07-11 12:53:29,378 - INFO - [Train] step: 7799, loss_D: 0.013265, lr_D: 0.000800
+ 2023-07-11 12:53:29,462 - INFO - [Train] step: 7799, loss_G: 6.499502, lr_G: 0.000800
+ 2023-07-11 12:54:25,638 - INFO - [Train] step: 7899, loss_D: 0.010489, lr_D: 0.000800
+ 2023-07-11 12:54:25,722 - INFO - [Train] step: 7899, loss_G: 6.309391, lr_G: 0.000800
+ 2023-07-11 12:55:21,889 - INFO - [Train] step: 7999, loss_D: 0.011140, lr_D: 0.000800
+ 2023-07-11 12:55:21,973 - INFO - [Train] step: 7999, loss_G: 5.999728, lr_G: 0.000800
+ 2023-07-11 12:56:25,962 - INFO - [Eval] step: 7999, fid: 35.348561
+ 2023-07-11 12:57:22,101 - INFO - [Train] step: 8099, loss_D: 0.003056, lr_D: 0.000800
+ 2023-07-11 12:57:22,185 - INFO - [Train] step: 8099, loss_G: 7.496294, lr_G: 0.000800
+ 2023-07-11 12:58:18,348 - INFO - [Train] step: 8199, loss_D: 0.000705, lr_D: 0.000800
+ 2023-07-11 12:58:18,432 - INFO - [Train] step: 8199, loss_G: 11.359093, lr_G: 0.000800
+ 2023-07-11 12:59:14,608 - INFO - [Train] step: 8299, loss_D: 0.025060, lr_D: 0.000800
+ 2023-07-11 12:59:14,692 - INFO - [Train] step: 8299, loss_G: 6.100769, lr_G: 0.000800
+ 2023-07-11 13:00:10,868 - INFO - [Train] step: 8399, loss_D: 0.029371, lr_D: 0.000800
+ 2023-07-11 13:00:10,951 - INFO - [Train] step: 8399, loss_G: 5.231078, lr_G: 0.000800
+ 2023-07-11 13:01:07,253 - INFO - [Train] step: 8499, loss_D: 0.019515, lr_D: 0.000800
+ 2023-07-11 13:01:07,337 - INFO - [Train] step: 8499, loss_G: 6.361640, lr_G: 0.000800
+ 2023-07-11 13:02:03,498 - INFO - [Train] step: 8599, loss_D: 0.071161, lr_D: 0.000800
+ 2023-07-11 13:02:03,581 - INFO - [Train] step: 8599, loss_G: 4.338478, lr_G: 0.000800
+ 2023-07-11 13:02:59,709 - INFO - [Train] step: 8699, loss_D: 0.040499, lr_D: 0.000800
+ 2023-07-11 13:02:59,793 - INFO - [Train] step: 8699, loss_G: 5.633650, lr_G: 0.000800
+ 2023-07-11 13:03:55,911 - INFO - [Train] step: 8799, loss_D: 0.011353, lr_D: 0.000800
+ 2023-07-11 13:03:55,995 - INFO - [Train] step: 8799, loss_G: 5.696527, lr_G: 0.000800
+ 2023-07-11 13:04:52,159 - INFO - [Train] step: 8899, loss_D: 0.065517, lr_D: 0.000800
+ 2023-07-11 13:04:52,243 - INFO - [Train] step: 8899, loss_G: 4.653852, lr_G: 0.000800
+ 2023-07-11 13:05:48,380 - INFO - [Train] step: 8999, loss_D: 0.018624, lr_D: 0.000800
+ 2023-07-11 13:05:48,463 - INFO - [Train] step: 8999, loss_G: 5.674169, lr_G: 0.000800
+ 2023-07-11 13:06:52,552 - INFO - [Eval] step: 8999, fid: 30.621362
+ 2023-07-11 13:07:49,109 - INFO - [Train] step: 9099, loss_D: 0.028935, lr_D: 0.000800
+ 2023-07-11 13:07:49,193 - INFO - [Train] step: 9099, loss_G: 5.719067, lr_G: 0.000800
+ 2023-07-11 13:08:45,357 - INFO - [Train] step: 9199, loss_D: 0.020407, lr_D: 0.000800
+ 2023-07-11 13:08:45,440 - INFO - [Train] step: 9199, loss_G: 5.857404, lr_G: 0.000800
+ 2023-07-11 13:09:41,616 - INFO - [Train] step: 9299, loss_D: 0.044054, lr_D: 0.000800
+ 2023-07-11 13:09:41,700 - INFO - [Train] step: 9299, loss_G: 4.781527, lr_G: 0.000800
+ 2023-07-11 13:10:37,882 - INFO - [Train] step: 9399, loss_D: 0.023854, lr_D: 0.000800
+ 2023-07-11 13:10:37,966 - INFO - [Train] step: 9399, loss_G: 4.744816, lr_G: 0.000800
+ 2023-07-11 13:11:34,145 - INFO - [Train] step: 9499, loss_D: 0.023837, lr_D: 0.000800
+ 2023-07-11 13:11:34,229 - INFO - [Train] step: 9499, loss_G: 6.172730, lr_G: 0.000800
+ 2023-07-11 13:12:30,406 - INFO - [Train] step: 9599, loss_D: 0.468906, lr_D: 0.000800
+ 2023-07-11 13:12:30,490 - INFO - [Train] step: 9599, loss_G: 4.899601, lr_G: 0.000800
+ 2023-07-11 13:13:26,673 - INFO - [Train] step: 9699, loss_D: 0.001909, lr_D: 0.000800
+ 2023-07-11 13:13:26,757 - INFO - [Train] step: 9699, loss_G: 8.229082, lr_G: 0.000800
+ 2023-07-11 13:14:23,049 - INFO - [Train] step: 9799, loss_D: 0.058375, lr_D: 0.000800
+ 2023-07-11 13:14:23,133 - INFO - [Train] step: 9799, loss_G: 4.454073, lr_G: 0.000800
+ 2023-07-11 13:15:19,297 - INFO - [Train] step: 9899, loss_D: 0.020404, lr_D: 0.000800
+ 2023-07-11 13:15:19,381 - INFO - [Train] step: 9899, loss_G: 6.260936, lr_G: 0.000800
+ 2023-07-11 13:16:15,525 - INFO - [Train] step: 9999, loss_D: 0.022226, lr_D: 0.000800
+ 2023-07-11 13:16:15,609 - INFO - [Train] step: 9999, loss_G: 6.552267, lr_G: 0.000800
+ 2023-07-11 13:17:19,649 - INFO - [Eval] step: 9999, fid: 31.728809
+ 2023-07-11 13:18:15,942 - INFO - [Train] step: 10099, loss_D: 0.013560, lr_D: 0.000800
+ 2023-07-11 13:18:16,026 - INFO - [Train] step: 10099, loss_G: 6.183954, lr_G: 0.000800
+ 2023-07-11 13:19:12,202 - INFO - [Train] step: 10199, loss_D: 0.301225, lr_D: 0.000800
+ 2023-07-11 13:19:12,286 - INFO - [Train] step: 10199, loss_G: 3.235008, lr_G: 0.000800
+ 2023-07-11 13:20:08,467 - INFO - [Train] step: 10299, loss_D: 0.015755, lr_D: 0.000800
+ 2023-07-11 13:20:08,550 - INFO - [Train] step: 10299, loss_G: 5.884558, lr_G: 0.000800
+ 2023-07-11 13:21:04,888 - INFO - [Train] step: 10399, loss_D: 0.033889, lr_D: 0.000800
+ 2023-07-11 13:21:04,972 - INFO - [Train] step: 10399, loss_G: 6.869841, lr_G: 0.000800
+ 2023-07-11 13:22:01,153 - INFO - [Train] step: 10499, loss_D: 0.017708, lr_D: 0.000800
+ 2023-07-11 13:22:01,236 - INFO - [Train] step: 10499, loss_G: 7.041650, lr_G: 0.000800
+ 2023-07-11 13:22:57,410 - INFO - [Train] step: 10599, loss_D: 0.017894, lr_D: 0.000800
+ 2023-07-11 13:22:57,494 - INFO - [Train] step: 10599, loss_G: 6.538893, lr_G: 0.000800
+ 2023-07-11 13:23:53,722 - INFO - [Train] step: 10699, loss_D: 0.001114, lr_D: 0.000800
+ 2023-07-11 13:23:53,806 - INFO - [Train] step: 10699, loss_G: 8.781487, lr_G: 0.000800
+ 2023-07-11 13:24:49,988 - INFO - [Train] step: 10799, loss_D: 0.027659, lr_D: 0.000800
+ 2023-07-11 13:24:50,072 - INFO - [Train] step: 10799, loss_G: 6.549875, lr_G: 0.000800
+ 2023-07-11 13:25:46,269 - INFO - [Train] step: 10899, loss_D: 0.031984, lr_D: 0.000800
+ 2023-07-11 13:25:46,353 - INFO - [Train] step: 10899, loss_G: 6.759466, lr_G: 0.000800
+ 2023-07-11 13:26:42,669 - INFO - [Train] step: 10999, loss_D: 0.006145, lr_D: 0.000800
+ 2023-07-11 13:26:42,753 - INFO - [Train] step: 10999, loss_G: 7.112816, lr_G: 0.000800
+ 2023-07-11 13:27:46,758 - INFO - [Eval] step: 10999, fid: 30.327368
+ 2023-07-11 13:28:43,254 - INFO - [Train] step: 11099, loss_D: 0.003850, lr_D: 0.000800
+ 2023-07-11 13:28:43,337 - INFO - [Train] step: 11099, loss_G: 7.746220, lr_G: 0.000800
+ 2023-07-11 13:29:39,518 - INFO - [Train] step: 11199, loss_D: 0.067815, lr_D: 0.000800
+ 2023-07-11 13:29:39,602 - INFO - [Train] step: 11199, loss_G: 5.223897, lr_G: 0.000800
+ 2023-07-11 13:30:35,813 - INFO - [Train] step: 11299, loss_D: 0.002763, lr_D: 0.000800
+ 2023-07-11 13:30:35,896 - INFO - [Train] step: 11299, loss_G: 8.084093, lr_G: 0.000800
+ 2023-07-11 13:31:32,110 - INFO - [Train] step: 11399, loss_D: 0.012797, lr_D: 0.000800
+ 2023-07-11 13:31:32,193 - INFO - [Train] step: 11399, loss_G: 6.798376, lr_G: 0.000800
+ 2023-07-11 13:32:28,392 - INFO - [Train] step: 11499, loss_D: 0.009886, lr_D: 0.000800
+ 2023-07-11 13:32:28,476 - INFO - [Train] step: 11499, loss_G: 7.411889, lr_G: 0.000800
+ 2023-07-11 13:33:24,688 - INFO - [Train] step: 11599, loss_D: 0.011459, lr_D: 0.000800
+ 2023-07-11 13:33:24,771 - INFO - [Train] step: 11599, loss_G: 6.962460, lr_G: 0.000800
+ 2023-07-11 13:34:21,090 - INFO - [Train] step: 11699, loss_D: 0.024009, lr_D: 0.000800
+ 2023-07-11 13:34:21,174 - INFO - [Train] step: 11699, loss_G: 7.555803, lr_G: 0.000800
+ 2023-07-11 13:35:17,359 - INFO - [Train] step: 11799, loss_D: 0.015610, lr_D: 0.000800
+ 2023-07-11 13:35:17,443 - INFO - [Train] step: 11799, loss_G: 6.682260, lr_G: 0.000800
+ 2023-07-11 13:36:13,643 - INFO - [Train] step: 11899, loss_D: 3.467544, lr_D: 0.000800
+ 2023-07-11 13:36:13,727 - INFO - [Train] step: 11899, loss_G: 3.764105, lr_G: 0.000800
+ 2023-07-11 13:37:09,918 - INFO - [Train] step: 11999, loss_D: 0.004966, lr_D: 0.000800
+ 2023-07-11 13:37:10,002 - INFO - [Train] step: 11999, loss_G: 8.321743, lr_G: 0.000800
+ 2023-07-11 13:38:14,082 - INFO - [Eval] step: 11999, fid: 29.305782
+ 2023-07-11 13:39:10,580 - INFO - [Train] step: 12099, loss_D: 0.009038, lr_D: 0.000800
+ 2023-07-11 13:39:10,664 - INFO - [Train] step: 12099, loss_G: 8.330870, lr_G: 0.000800
+ 2023-07-11 13:40:06,889 - INFO - [Train] step: 12199, loss_D: 0.023681, lr_D: 0.000800
+ 2023-07-11 13:40:06,973 - INFO - [Train] step: 12199, loss_G: 6.975578, lr_G: 0.000800
+ 2023-07-11 13:41:03,360 - INFO - [Train] step: 12299, loss_D: 0.004188, lr_D: 0.000800
+ 2023-07-11 13:41:03,444 - INFO - [Train] step: 12299, loss_G: 7.406260, lr_G: 0.000800
+ 2023-07-11 13:41:59,658 - INFO - [Train] step: 12399, loss_D: 0.086447, lr_D: 0.000800
+ 2023-07-11 13:41:59,742 - INFO - [Train] step: 12399, loss_G: 5.287652, lr_G: 0.000800
+ 2023-07-11 13:42:55,956 - INFO - [Train] step: 12499, loss_D: 0.000366, lr_D: 0.000800
+ 2023-07-11 13:42:56,040 - INFO - [Train] step: 12499, loss_G: 11.014597, lr_G: 0.000800
+ 2023-07-11 13:43:52,269 - INFO - [Train] step: 12599, loss_D: 0.013249, lr_D: 0.000800
+ 2023-07-11 13:43:52,352 - INFO - [Train] step: 12599, loss_G: 7.486347, lr_G: 0.000800
+ 2023-07-11 13:44:48,578 - INFO - [Train] step: 12699, loss_D: 0.008541, lr_D: 0.000800
+ 2023-07-11 13:44:48,662 - INFO - [Train] step: 12699, loss_G: 7.364436, lr_G: 0.000800
+ 2023-07-11 13:45:44,859 - INFO - [Train] step: 12799, loss_D: 0.101281, lr_D: 0.000800
+ 2023-07-11 13:45:44,943 - INFO - [Train] step: 12799, loss_G: 4.887404, lr_G: 0.000800
+ 2023-07-11 13:46:41,150 - INFO - [Train] step: 12899, loss_D: 0.046161, lr_D: 0.000800
+ 2023-07-11 13:46:41,234 - INFO - [Train] step: 12899, loss_G: 6.781134, lr_G: 0.000800
+ 2023-07-11 13:47:37,593 - INFO - [Train] step: 12999, loss_D: 0.003869, lr_D: 0.000800
+ 2023-07-11 13:47:37,677 - INFO - [Train] step: 12999, loss_G: 7.521938, lr_G: 0.000800
+ 2023-07-11 13:48:41,726 - INFO - [Eval] step: 12999, fid: 31.145196
+ 2023-07-11 13:49:37,899 - INFO - [Train] step: 13099, loss_D: 0.010063, lr_D: 0.000800
+ 2023-07-11 13:49:37,983 - INFO - [Train] step: 13099, loss_G: 6.120701, lr_G: 0.000800
+ 2023-07-11 13:50:34,174 - INFO - [Train] step: 13199, loss_D: 0.003973, lr_D: 0.000800
+ 2023-07-11 13:50:34,258 - INFO - [Train] step: 13199, loss_G: 7.335663, lr_G: 0.000800
+ 2023-07-11 13:51:30,410 - INFO - [Train] step: 13299, loss_D: 0.105317, lr_D: 0.000800
+ 2023-07-11 13:51:30,494 - INFO - [Train] step: 13299, loss_G: 3.828239, lr_G: 0.000800
+ 2023-07-11 13:52:26,637 - INFO - [Train] step: 13399, loss_D: 0.010208, lr_D: 0.000800
+ 2023-07-11 13:52:26,721 - INFO - [Train] step: 13399, loss_G: 8.042526, lr_G: 0.000800
+ 2023-07-11 13:53:22,877 - INFO - [Train] step: 13499, loss_D: 0.027225, lr_D: 0.000800
+ 2023-07-11 13:53:22,960 - INFO - [Train] step: 13499, loss_G: 6.426303, lr_G: 0.000800
+ 2023-07-11 13:54:19,215 - INFO - [Train] step: 13599, loss_D: 0.007277, lr_D: 0.000800
+ 2023-07-11 13:54:19,300 - INFO - [Train] step: 13599, loss_G: 7.899753, lr_G: 0.000800
+ 2023-07-11 13:55:15,450 - INFO - [Train] step: 13699, loss_D: 0.002837, lr_D: 0.000800
+ 2023-07-11 13:55:15,534 - INFO - [Train] step: 13699, loss_G: 8.021205, lr_G: 0.000800
+ 2023-07-11 13:56:11,641 - INFO - [Train] step: 13799, loss_D: 0.240107, lr_D: 0.000800
+ 2023-07-11 13:56:11,725 - INFO - [Train] step: 13799, loss_G: 3.456906, lr_G: 0.000800
+ 2023-07-11 13:57:07,841 - INFO - [Train] step: 13899, loss_D: 0.003996, lr_D: 0.000800
+ 2023-07-11 13:57:07,925 - INFO - [Train] step: 13899, loss_G: 8.493683, lr_G: 0.000800
+ 2023-07-11 13:58:04,072 - INFO - [Train] step: 13999, loss_D: 0.014825, lr_D: 0.000800
+ 2023-07-11 13:58:04,156 - INFO - [Train] step: 13999, loss_G: 6.566381, lr_G: 0.000800
+ 2023-07-11 13:59:08,169 - INFO - [Eval] step: 13999, fid: 29.862665
+ 2023-07-11 14:00:04,339 - INFO - [Train] step: 14099, loss_D: 0.007222, lr_D: 0.000800
+ 2023-07-11 14:00:04,423 - INFO - [Train] step: 14099, loss_G: 7.093710, lr_G: 0.000800
+ 2023-07-11 14:01:00,637 - INFO - [Train] step: 14199, loss_D: 0.007936, lr_D: 0.000800
+ 2023-07-11 14:01:00,721 - INFO - [Train] step: 14199, loss_G: 7.119807, lr_G: 0.000800
+ 2023-07-11 14:01:57,085 - INFO - [Train] step: 14299, loss_D: 0.003573, lr_D: 0.000800
+ 2023-07-11 14:01:57,169 - INFO - [Train] step: 14299, loss_G: 9.360214, lr_G: 0.000800
+ 2023-07-11 14:02:53,362 - INFO - [Train] step: 14399, loss_D: 0.051941, lr_D: 0.000800
+ 2023-07-11 14:02:53,446 - INFO - [Train] step: 14399, loss_G: 6.090578, lr_G: 0.000800
+ 2023-07-11 14:03:49,602 - INFO - [Train] step: 14499, loss_D: 0.007884, lr_D: 0.000800
+ 2023-07-11 14:03:49,686 - INFO - [Train] step: 14499, loss_G: 6.830848, lr_G: 0.000800
+ 2023-07-11 14:04:45,814 - INFO - [Train] step: 14599, loss_D: 0.008319, lr_D: 0.000800
+ 2023-07-11 14:04:45,898 - INFO - [Train] step: 14599, loss_G: 7.291912, lr_G: 0.000800
+ 2023-07-11 14:05:42,016 - INFO - [Train] step: 14699, loss_D: 0.003675, lr_D: 0.000800
+ 2023-07-11 14:05:42,100 - INFO - [Train] step: 14699, loss_G: 8.325859, lr_G: 0.000800
+ 2023-07-11 14:06:38,219 - INFO - [Train] step: 14799, loss_D: 0.028460, lr_D: 0.000800
+ 2023-07-11 14:06:38,303 - INFO - [Train] step: 14799, loss_G: 6.904002, lr_G: 0.000800
+ 2023-07-11 14:07:34,540 - INFO - [Train] step: 14899, loss_D: 0.010624, lr_D: 0.000800
+ 2023-07-11 14:07:34,624 - INFO - [Train] step: 14899, loss_G: 7.824240, lr_G: 0.000800
+ 2023-07-11 14:08:30,729 - INFO - [Train] step: 14999, loss_D: 0.000340, lr_D: 0.000800
+ 2023-07-11 14:08:30,813 - INFO - [Train] step: 14999, loss_G: 10.208112, lr_G: 0.000800
+ 2023-07-11 14:09:34,758 - INFO - [Eval] step: 14999, fid: 29.454847
+ 2023-07-11 14:10:30,878 - INFO - [Train] step: 15099, loss_D: 0.001534, lr_D: 0.000800
+ 2023-07-11 14:10:30,961 - INFO - [Train] step: 15099, loss_G: 8.764018, lr_G: 0.000800
328
+ 2023-07-11 14:11:27,105 - INFO - [Train] step: 15199, loss_D: 0.011419, lr_D: 0.000800
329
+ 2023-07-11 14:11:27,189 - INFO - [Train] step: 15199, loss_G: 7.759914, lr_G: 0.000800
330
+ 2023-07-11 14:12:23,394 - INFO - [Train] step: 15299, loss_D: 0.008819, lr_D: 0.000800
331
+ 2023-07-11 14:12:23,478 - INFO - [Train] step: 15299, loss_G: 8.962471, lr_G: 0.000800
332
+ 2023-07-11 14:13:19,693 - INFO - [Train] step: 15399, loss_D: 0.020425, lr_D: 0.000800
333
+ 2023-07-11 14:13:19,777 - INFO - [Train] step: 15399, loss_G: 7.879280, lr_G: 0.000800
334
+ 2023-07-11 14:14:15,986 - INFO - [Train] step: 15499, loss_D: 0.006072, lr_D: 0.000800
335
+ 2023-07-11 14:14:16,069 - INFO - [Train] step: 15499, loss_G: 7.211442, lr_G: 0.000800
336
+ 2023-07-11 14:15:12,433 - INFO - [Train] step: 15599, loss_D: 0.002150, lr_D: 0.000800
337
+ 2023-07-11 14:15:12,516 - INFO - [Train] step: 15599, loss_G: 8.549648, lr_G: 0.000800
338
+ 2023-07-11 14:16:08,707 - INFO - [Train] step: 15699, loss_D: 0.187069, lr_D: 0.000800
339
+ 2023-07-11 14:16:08,790 - INFO - [Train] step: 15699, loss_G: 4.555290, lr_G: 0.000800
340
+ 2023-07-11 14:17:04,922 - INFO - [Train] step: 15799, loss_D: 0.006582, lr_D: 0.000800
341
+ 2023-07-11 14:17:05,005 - INFO - [Train] step: 15799, loss_G: 7.604278, lr_G: 0.000800
342
+ 2023-07-11 14:18:01,177 - INFO - [Train] step: 15899, loss_D: 0.007547, lr_D: 0.000800
343
+ 2023-07-11 14:18:01,261 - INFO - [Train] step: 15899, loss_G: 7.642211, lr_G: 0.000800
344
+ 2023-07-11 14:18:57,369 - INFO - [Train] step: 15999, loss_D: 2.618884, lr_D: 0.000800
345
+ 2023-07-11 14:18:57,453 - INFO - [Train] step: 15999, loss_G: 5.931014, lr_G: 0.000800
346
+ 2023-07-11 14:20:01,404 - INFO - [Eval] step: 15999, fid: 30.130104
347
+ 2023-07-11 14:20:57,507 - INFO - [Train] step: 16099, loss_D: 0.007411, lr_D: 0.000800
348
+ 2023-07-11 14:20:57,590 - INFO - [Train] step: 16099, loss_G: 8.274845, lr_G: 0.000800
349
+ 2023-07-11 14:21:53,873 - INFO - [Train] step: 16199, loss_D: 0.042619, lr_D: 0.000800
350
+ 2023-07-11 14:21:53,957 - INFO - [Train] step: 16199, loss_G: 6.934038, lr_G: 0.000800
351
+ 2023-07-11 14:22:50,088 - INFO - [Train] step: 16299, loss_D: 0.000987, lr_D: 0.000800
352
+ 2023-07-11 14:22:50,172 - INFO - [Train] step: 16299, loss_G: 9.771923, lr_G: 0.000800
353
+ 2023-07-11 14:23:46,302 - INFO - [Train] step: 16399, loss_D: 0.009464, lr_D: 0.000800
354
+ 2023-07-11 14:23:46,386 - INFO - [Train] step: 16399, loss_G: 7.184815, lr_G: 0.000800
355
+ 2023-07-11 14:24:42,507 - INFO - [Train] step: 16499, loss_D: 0.005335, lr_D: 0.000800
356
+ 2023-07-11 14:24:42,591 - INFO - [Train] step: 16499, loss_G: 8.571962, lr_G: 0.000800
357
+ 2023-07-11 14:25:38,697 - INFO - [Train] step: 16599, loss_D: 2.028174, lr_D: 0.000800
358
+ 2023-07-11 14:25:38,780 - INFO - [Train] step: 16599, loss_G: 3.879996, lr_G: 0.000800
359
+ 2023-07-11 14:26:34,941 - INFO - [Train] step: 16699, loss_D: 0.007578, lr_D: 0.000800
360
+ 2023-07-11 14:26:35,025 - INFO - [Train] step: 16699, loss_G: 6.618587, lr_G: 0.000800
361
+ 2023-07-11 14:27:31,231 - INFO - [Train] step: 16799, loss_D: 0.011151, lr_D: 0.000800
362
+ 2023-07-11 14:27:31,314 - INFO - [Train] step: 16799, loss_G: 7.938923, lr_G: 0.000800
363
+ 2023-07-11 14:28:27,679 - INFO - [Train] step: 16899, loss_D: 0.007307, lr_D: 0.000800
364
+ 2023-07-11 14:28:27,763 - INFO - [Train] step: 16899, loss_G: 8.276512, lr_G: 0.000800
365
+ 2023-07-11 14:29:23,964 - INFO - [Train] step: 16999, loss_D: 0.112136, lr_D: 0.000800
366
+ 2023-07-11 14:29:24,048 - INFO - [Train] step: 16999, loss_G: 4.869758, lr_G: 0.000800
367
+ 2023-07-11 14:30:28,192 - INFO - [Eval] step: 16999, fid: 28.441528
368
+ 2023-07-11 14:31:24,660 - INFO - [Train] step: 17099, loss_D: 0.011750, lr_D: 0.000800
369
+ 2023-07-11 14:31:24,744 - INFO - [Train] step: 17099, loss_G: 7.757559, lr_G: 0.000800
370
+ 2023-07-11 14:32:20,933 - INFO - [Train] step: 17199, loss_D: 0.003757, lr_D: 0.000800
371
+ 2023-07-11 14:32:21,017 - INFO - [Train] step: 17199, loss_G: 7.643621, lr_G: 0.000800
372
+ 2023-07-11 14:33:17,206 - INFO - [Train] step: 17299, loss_D: 0.006172, lr_D: 0.000800
373
+ 2023-07-11 14:33:17,289 - INFO - [Train] step: 17299, loss_G: 8.143820, lr_G: 0.000800
374
+ 2023-07-11 14:34:13,464 - INFO - [Train] step: 17399, loss_D: 0.045059, lr_D: 0.000800
375
+ 2023-07-11 14:34:13,548 - INFO - [Train] step: 17399, loss_G: 7.102817, lr_G: 0.000800
376
+ 2023-07-11 14:35:09,859 - INFO - [Train] step: 17499, loss_D: 0.008678, lr_D: 0.000800
377
+ 2023-07-11 14:35:09,943 - INFO - [Train] step: 17499, loss_G: 8.260544, lr_G: 0.000800
378
+ 2023-07-11 14:36:06,086 - INFO - [Train] step: 17599, loss_D: 0.001129, lr_D: 0.000800
379
+ 2023-07-11 14:36:06,169 - INFO - [Train] step: 17599, loss_G: 9.633171, lr_G: 0.000800
380
+ 2023-07-11 14:37:02,313 - INFO - [Train] step: 17699, loss_D: 0.002065, lr_D: 0.000800
381
+ 2023-07-11 14:37:02,397 - INFO - [Train] step: 17699, loss_G: 8.164239, lr_G: 0.000800
382
+ 2023-07-11 14:37:58,535 - INFO - [Train] step: 17799, loss_D: 0.003847, lr_D: 0.000800
383
+ 2023-07-11 14:37:58,619 - INFO - [Train] step: 17799, loss_G: 10.379612, lr_G: 0.000800
384
+ 2023-07-11 14:38:54,750 - INFO - [Train] step: 17899, loss_D: 0.029135, lr_D: 0.000800
385
+ 2023-07-11 14:38:54,834 - INFO - [Train] step: 17899, loss_G: 8.965570, lr_G: 0.000800
386
+ 2023-07-11 14:39:50,977 - INFO - [Train] step: 17999, loss_D: 0.016164, lr_D: 0.000800
387
+ 2023-07-11 14:39:51,061 - INFO - [Train] step: 17999, loss_G: 7.357069, lr_G: 0.000800
+ 2023-07-11 14:40:55,161 - INFO - [Eval] step: 17999, fid: 27.078599
+ 2023-07-11 14:41:51,551 - INFO - [Train] step: 18099, loss_D: 0.001106, lr_D: 0.000800
+ 2023-07-11 14:41:51,634 - INFO - [Train] step: 18099, loss_G: 10.010357, lr_G: 0.000800
+ 2023-07-11 14:42:47,921 - INFO - [Train] step: 18199, loss_D: 0.233529, lr_D: 0.000800
+ 2023-07-11 14:42:48,005 - INFO - [Train] step: 18199, loss_G: 3.818181, lr_G: 0.000800
+ 2023-07-11 14:43:44,156 - INFO - [Train] step: 18299, loss_D: 0.004615, lr_D: 0.000800
+ 2023-07-11 14:43:44,240 - INFO - [Train] step: 18299, loss_G: 9.297379, lr_G: 0.000800
+ 2023-07-11 14:44:40,425 - INFO - [Train] step: 18399, loss_D: 0.001716, lr_D: 0.000800
+ 2023-07-11 14:44:40,509 - INFO - [Train] step: 18399, loss_G: 9.390348, lr_G: 0.000800
+ 2023-07-11 14:45:36,726 - INFO - [Train] step: 18499, loss_D: 0.003523, lr_D: 0.000800
+ 2023-07-11 14:45:36,810 - INFO - [Train] step: 18499, loss_G: 9.848138, lr_G: 0.000800
+ 2023-07-11 14:46:33,029 - INFO - [Train] step: 18599, loss_D: 0.018795, lr_D: 0.000800
+ 2023-07-11 14:46:33,113 - INFO - [Train] step: 18599, loss_G: 8.085789, lr_G: 0.000800
+ 2023-07-11 14:47:29,340 - INFO - [Train] step: 18699, loss_D: 0.145551, lr_D: 0.000800
+ 2023-07-11 14:47:29,423 - INFO - [Train] step: 18699, loss_G: 6.390501, lr_G: 0.000800
+ 2023-07-11 14:48:25,761 - INFO - [Train] step: 18799, loss_D: 0.004858, lr_D: 0.000800
+ 2023-07-11 14:48:25,844 - INFO - [Train] step: 18799, loss_G: 8.706554, lr_G: 0.000800
+ 2023-07-11 14:49:22,023 - INFO - [Train] step: 18899, loss_D: 0.000915, lr_D: 0.000800
+ 2023-07-11 14:49:22,107 - INFO - [Train] step: 18899, loss_G: 11.438213, lr_G: 0.000800
+ 2023-07-11 14:50:18,280 - INFO - [Train] step: 18999, loss_D: 0.005635, lr_D: 0.000800
+ 2023-07-11 14:50:18,364 - INFO - [Train] step: 18999, loss_G: 8.079836, lr_G: 0.000800
+ 2023-07-11 14:51:22,382 - INFO - [Eval] step: 18999, fid: 30.342733
+ 2023-07-11 14:52:18,512 - INFO - [Train] step: 19099, loss_D: 0.495687, lr_D: 0.000800
+ 2023-07-11 14:52:18,596 - INFO - [Train] step: 19099, loss_G: 2.701498, lr_G: 0.000800
+ 2023-07-11 14:53:14,777 - INFO - [Train] step: 19199, loss_D: 0.010718, lr_D: 0.000800
+ 2023-07-11 14:53:14,861 - INFO - [Train] step: 19199, loss_G: 8.037912, lr_G: 0.000800
+ 2023-07-11 14:54:11,025 - INFO - [Train] step: 19299, loss_D: 0.001784, lr_D: 0.000800
+ 2023-07-11 14:54:11,109 - INFO - [Train] step: 19299, loss_G: 9.724527, lr_G: 0.000800
+ 2023-07-11 14:55:07,284 - INFO - [Train] step: 19399, loss_D: 0.000590, lr_D: 0.000800
+ 2023-07-11 14:55:07,368 - INFO - [Train] step: 19399, loss_G: 10.502163, lr_G: 0.000800
+ 2023-07-11 14:56:03,628 - INFO - [Train] step: 19499, loss_D: 0.012816, lr_D: 0.000800
+ 2023-07-11 14:56:03,711 - INFO - [Train] step: 19499, loss_G: 7.751922, lr_G: 0.000800
+ 2023-07-11 14:56:59,856 - INFO - [Train] step: 19599, loss_D: 0.005400, lr_D: 0.000800
+ 2023-07-11 14:56:59,939 - INFO - [Train] step: 19599, loss_G: 8.535320, lr_G: 0.000800
+ 2023-07-11 14:57:56,065 - INFO - [Train] step: 19699, loss_D: 0.002196, lr_D: 0.000800
+ 2023-07-11 14:57:56,149 - INFO - [Train] step: 19699, loss_G: 10.263439, lr_G: 0.000800
+ 2023-07-11 14:58:52,317 - INFO - [Train] step: 19799, loss_D: 0.021812, lr_D: 0.000800
+ 2023-07-11 14:58:52,401 - INFO - [Train] step: 19799, loss_G: 7.788321, lr_G: 0.000800
+ 2023-07-11 14:59:48,604 - INFO - [Train] step: 19899, loss_D: 0.003188, lr_D: 0.000800
+ 2023-07-11 14:59:48,688 - INFO - [Train] step: 19899, loss_G: 10.278534, lr_G: 0.000800
+ 2023-07-11 15:00:44,920 - INFO - [Train] step: 19999, loss_D: 0.360845, lr_D: 0.000800
+ 2023-07-11 15:00:45,004 - INFO - [Train] step: 19999, loss_G: 4.274442, lr_G: 0.000800
+ 2023-07-11 15:01:49,053 - INFO - [Eval] step: 19999, fid: 27.008773
+ 2023-07-11 15:02:45,811 - INFO - [Train] step: 20099, loss_D: 0.006600, lr_D: 0.000800
+ 2023-07-11 15:02:45,895 - INFO - [Train] step: 20099, loss_G: 8.291388, lr_G: 0.000800
+ 2023-07-11 15:03:42,117 - INFO - [Train] step: 20199, loss_D: 0.004649, lr_D: 0.000800
+ 2023-07-11 15:03:42,200 - INFO - [Train] step: 20199, loss_G: 8.462478, lr_G: 0.000800
+ 2023-07-11 15:04:38,385 - INFO - [Train] step: 20299, loss_D: 0.002957, lr_D: 0.000800
+ 2023-07-11 15:04:38,468 - INFO - [Train] step: 20299, loss_G: 9.305570, lr_G: 0.000800
+ 2023-07-11 15:05:34,636 - INFO - [Train] step: 20399, loss_D: 0.158959, lr_D: 0.000800
+ 2023-07-11 15:05:34,720 - INFO - [Train] step: 20399, loss_G: 11.202217, lr_G: 0.000800
+ 2023-07-11 15:06:30,882 - INFO - [Train] step: 20499, loss_D: 1.208866, lr_D: 0.000800
+ 2023-07-11 15:06:30,966 - INFO - [Train] step: 20499, loss_G: 9.777437, lr_G: 0.000800
+ 2023-07-11 15:07:27,109 - INFO - [Train] step: 20599, loss_D: 0.000840, lr_D: 0.000800
+ 2023-07-11 15:07:27,193 - INFO - [Train] step: 20599, loss_G: 11.558846, lr_G: 0.000800
+ 2023-07-11 15:08:23,454 - INFO - [Train] step: 20699, loss_D: 0.003777, lr_D: 0.000800
+ 2023-07-11 15:08:23,538 - INFO - [Train] step: 20699, loss_G: 9.691971, lr_G: 0.000800
+ 2023-07-11 15:09:19,662 - INFO - [Train] step: 20799, loss_D: 0.003992, lr_D: 0.000800
+ 2023-07-11 15:09:19,746 - INFO - [Train] step: 20799, loss_G: 7.911158, lr_G: 0.000800
+ 2023-07-11 15:10:15,899 - INFO - [Train] step: 20899, loss_D: 0.005836, lr_D: 0.000800
+ 2023-07-11 15:10:15,982 - INFO - [Train] step: 20899, loss_G: 7.689766, lr_G: 0.000800
+ 2023-07-11 15:11:12,147 - INFO - [Train] step: 20999, loss_D: 0.003656, lr_D: 0.000800
+ 2023-07-11 15:11:12,230 - INFO - [Train] step: 20999, loss_G: 9.950825, lr_G: 0.000800
+ 2023-07-11 15:12:16,243 - INFO - [Eval] step: 20999, fid: 25.323420
+ 2023-07-11 15:13:12,716 - INFO - [Train] step: 21099, loss_D: 0.002893, lr_D: 0.000800
+ 2023-07-11 15:13:12,800 - INFO - [Train] step: 21099, loss_G: 11.599640, lr_G: 0.000800
+ 2023-07-11 15:14:08,986 - INFO - [Train] step: 21199, loss_D: 0.049744, lr_D: 0.000800
+ 2023-07-11 15:14:09,070 - INFO - [Train] step: 21199, loss_G: 6.993627, lr_G: 0.000800
+ 2023-07-11 15:15:05,248 - INFO - [Train] step: 21299, loss_D: 0.008271, lr_D: 0.000800
+ 2023-07-11 15:15:05,332 - INFO - [Train] step: 21299, loss_G: 8.385923, lr_G: 0.000800
+ 2023-07-11 15:16:01,663 - INFO - [Train] step: 21399, loss_D: 0.002594, lr_D: 0.000800
+ 2023-07-11 15:16:01,746 - INFO - [Train] step: 21399, loss_G: 8.507057, lr_G: 0.000800
+ 2023-07-11 15:16:57,916 - INFO - [Train] step: 21499, loss_D: 0.002587, lr_D: 0.000800
+ 2023-07-11 15:16:58,000 - INFO - [Train] step: 21499, loss_G: 10.397579, lr_G: 0.000800
+ 2023-07-11 15:17:54,167 - INFO - [Train] step: 21599, loss_D: 0.015398, lr_D: 0.000800
+ 2023-07-11 15:17:54,251 - INFO - [Train] step: 21599, loss_G: 7.068629, lr_G: 0.000800
+ 2023-07-11 15:18:50,386 - INFO - [Train] step: 21699, loss_D: 0.005399, lr_D: 0.000800
+ 2023-07-11 15:18:50,470 - INFO - [Train] step: 21699, loss_G: 7.438874, lr_G: 0.000800
+ 2023-07-11 15:19:46,641 - INFO - [Train] step: 21799, loss_D: 0.003675, lr_D: 0.000800
+ 2023-07-11 15:19:46,725 - INFO - [Train] step: 21799, loss_G: 9.521066, lr_G: 0.000800
+ 2023-07-11 15:20:42,865 - INFO - [Train] step: 21899, loss_D: 0.006019, lr_D: 0.000800
+ 2023-07-11 15:20:42,949 - INFO - [Train] step: 21899, loss_G: 8.693350, lr_G: 0.000800
+ 2023-07-11 15:21:39,236 - INFO - [Train] step: 21999, loss_D: 0.007369, lr_D: 0.000800
+ 2023-07-11 15:21:39,319 - INFO - [Train] step: 21999, loss_G: 9.730040, lr_G: 0.000800
+ 2023-07-11 15:22:43,392 - INFO - [Eval] step: 21999, fid: 27.574415
+ 2023-07-11 15:23:39,529 - INFO - [Train] step: 22099, loss_D: 0.007360, lr_D: 0.000800
+ 2023-07-11 15:23:39,613 - INFO - [Train] step: 22099, loss_G: 7.748243, lr_G: 0.000800
+ 2023-07-11 15:24:35,779 - INFO - [Train] step: 22199, loss_D: 0.000811, lr_D: 0.000800
+ 2023-07-11 15:24:35,862 - INFO - [Train] step: 22199, loss_G: 10.671335, lr_G: 0.000800
+ 2023-07-11 15:25:32,048 - INFO - [Train] step: 22299, loss_D: 0.001396, lr_D: 0.000800
+ 2023-07-11 15:25:32,132 - INFO - [Train] step: 22299, loss_G: 10.386802, lr_G: 0.000800
+ 2023-07-11 15:26:28,291 - INFO - [Train] step: 22399, loss_D: 0.288791, lr_D: 0.000800
+ 2023-07-11 15:26:28,374 - INFO - [Train] step: 22399, loss_G: 3.395542, lr_G: 0.000800
+ 2023-07-11 15:27:24,528 - INFO - [Train] step: 22499, loss_D: 0.011659, lr_D: 0.000800
+ 2023-07-11 15:27:24,613 - INFO - [Train] step: 22499, loss_G: 8.573141, lr_G: 0.000800
+ 2023-07-11 15:28:20,788 - INFO - [Train] step: 22599, loss_D: 0.003066, lr_D: 0.000800
+ 2023-07-11 15:28:20,872 - INFO - [Train] step: 22599, loss_G: 8.549353, lr_G: 0.000800
+ 2023-07-11 15:29:17,190 - INFO - [Train] step: 22699, loss_D: 0.004410, lr_D: 0.000800
+ 2023-07-11 15:29:17,274 - INFO - [Train] step: 22699, loss_G: 9.382833, lr_G: 0.000800
+ 2023-07-11 15:30:13,405 - INFO - [Train] step: 22799, loss_D: 0.037453, lr_D: 0.000800
+ 2023-07-11 15:30:13,489 - INFO - [Train] step: 22799, loss_G: 7.477693, lr_G: 0.000800
+ 2023-07-11 15:31:09,624 - INFO - [Train] step: 22899, loss_D: 0.109032, lr_D: 0.000800
+ 2023-07-11 15:31:09,708 - INFO - [Train] step: 22899, loss_G: 5.265463, lr_G: 0.000800
+ 2023-07-11 15:32:05,851 - INFO - [Train] step: 22999, loss_D: 0.003763, lr_D: 0.000800
+ 2023-07-11 15:32:05,934 - INFO - [Train] step: 22999, loss_G: 9.396175, lr_G: 0.000800
+ 2023-07-11 15:33:09,889 - INFO - [Eval] step: 22999, fid: 29.516297
+ 2023-07-11 15:34:06,058 - INFO - [Train] step: 23099, loss_D: 0.003253, lr_D: 0.000800
+ 2023-07-11 15:34:06,142 - INFO - [Train] step: 23099, loss_G: 9.109964, lr_G: 0.000800
+ 2023-07-11 15:35:02,365 - INFO - [Train] step: 23199, loss_D: 0.000366, lr_D: 0.000800
+ 2023-07-11 15:35:02,449 - INFO - [Train] step: 23199, loss_G: 11.528555, lr_G: 0.000800
+ 2023-07-11 15:35:58,768 - INFO - [Train] step: 23299, loss_D: 0.079994, lr_D: 0.000800
+ 2023-07-11 15:35:58,852 - INFO - [Train] step: 23299, loss_G: 5.743853, lr_G: 0.000800
+ 2023-07-11 15:36:54,983 - INFO - [Train] step: 23399, loss_D: 0.002688, lr_D: 0.000800
+ 2023-07-11 15:36:55,067 - INFO - [Train] step: 23399, loss_G: 11.234402, lr_G: 0.000800
+ 2023-07-11 15:37:51,260 - INFO - [Train] step: 23499, loss_D: 0.148203, lr_D: 0.000800
+ 2023-07-11 15:37:51,344 - INFO - [Train] step: 23499, loss_G: 4.712931, lr_G: 0.000800
+ 2023-07-11 15:38:47,518 - INFO - [Train] step: 23599, loss_D: 0.001541, lr_D: 0.000800
+ 2023-07-11 15:38:47,602 - INFO - [Train] step: 23599, loss_G: 9.591396, lr_G: 0.000800
+ 2023-07-11 15:39:43,792 - INFO - [Train] step: 23699, loss_D: 0.002986, lr_D: 0.000800
+ 2023-07-11 15:39:43,877 - INFO - [Train] step: 23699, loss_G: 9.700461, lr_G: 0.000800
+ 2023-07-11 15:40:40,038 - INFO - [Train] step: 23799, loss_D: 0.010304, lr_D: 0.000800
+ 2023-07-11 15:40:40,122 - INFO - [Train] step: 23799, loss_G: 8.965643, lr_G: 0.000800
+ 2023-07-11 15:41:36,319 - INFO - [Train] step: 23899, loss_D: 0.001316, lr_D: 0.000800
+ 2023-07-11 15:41:36,402 - INFO - [Train] step: 23899, loss_G: 9.987736, lr_G: 0.000800
+ 2023-07-11 15:42:32,766 - INFO - [Train] step: 23999, loss_D: 0.001019, lr_D: 0.000800
+ 2023-07-11 15:42:32,850 - INFO - [Train] step: 23999, loss_G: 11.542675, lr_G: 0.000800
+ 2023-07-11 15:43:36,944 - INFO - [Eval] step: 23999, fid: 28.743679
+ 2023-07-11 15:44:33,120 - INFO - [Train] step: 24099, loss_D: 0.209719, lr_D: 0.000800
+ 2023-07-11 15:44:33,204 - INFO - [Train] step: 24099, loss_G: 4.494919, lr_G: 0.000800
+ 2023-07-11 15:45:29,426 - INFO - [Train] step: 24199, loss_D: 0.007436, lr_D: 0.000800
+ 2023-07-11 15:45:29,510 - INFO - [Train] step: 24199, loss_G: 9.161270, lr_G: 0.000800
+ 2023-07-11 15:46:25,708 - INFO - [Train] step: 24299, loss_D: 0.005351, lr_D: 0.000800
+ 2023-07-11 15:46:25,792 - INFO - [Train] step: 24299, loss_G: 9.379765, lr_G: 0.000800
+ 2023-07-11 15:47:22,009 - INFO - [Train] step: 24399, loss_D: 0.002128, lr_D: 0.000800
+ 2023-07-11 15:47:22,092 - INFO - [Train] step: 24399, loss_G: 9.378587, lr_G: 0.000800
+ 2023-07-11 15:48:18,317 - INFO - [Train] step: 24499, loss_D: 0.003422, lr_D: 0.000800
+ 2023-07-11 15:48:18,401 - INFO - [Train] step: 24499, loss_G: 9.513985, lr_G: 0.000800
+ 2023-07-11 15:49:14,735 - INFO - [Train] step: 24599, loss_D: 0.006654, lr_D: 0.000800
+ 2023-07-11 15:49:14,819 - INFO - [Train] step: 24599, loss_G: 9.272728, lr_G: 0.000800
+ 2023-07-11 15:50:11,028 - INFO - [Train] step: 24699, loss_D: 0.002792, lr_D: 0.000800
+ 2023-07-11 15:50:11,111 - INFO - [Train] step: 24699, loss_G: 9.233520, lr_G: 0.000800
+ 2023-07-11 15:51:07,306 - INFO - [Train] step: 24799, loss_D: 0.004219, lr_D: 0.000800
+ 2023-07-11 15:51:07,389 - INFO - [Train] step: 24799, loss_G: 9.143127, lr_G: 0.000800
+ 2023-07-11 15:52:03,599 - INFO - [Train] step: 24899, loss_D: 0.160338, lr_D: 0.000800
+ 2023-07-11 15:52:03,683 - INFO - [Train] step: 24899, loss_G: 4.610417, lr_G: 0.000800
+ 2023-07-11 15:52:59,895 - INFO - [Train] step: 24999, loss_D: 0.015072, lr_D: 0.000800
+ 2023-07-11 15:52:59,979 - INFO - [Train] step: 24999, loss_G: 8.884357, lr_G: 0.000800
+ 2023-07-11 15:54:04,042 - INFO - [Eval] step: 24999, fid: 28.120965
+ 2023-07-11 15:55:00,217 - INFO - [Train] step: 25099, loss_D: 0.003968, lr_D: 0.000800
+ 2023-07-11 15:55:00,301 - INFO - [Train] step: 25099, loss_G: 10.388786, lr_G: 0.000800
+ 2023-07-11 15:55:56,534 - INFO - [Train] step: 25199, loss_D: 0.203878, lr_D: 0.000800
+ 2023-07-11 15:55:56,618 - INFO - [Train] step: 25199, loss_G: 5.433864, lr_G: 0.000800
+ 2023-07-11 15:56:52,988 - INFO - [Train] step: 25299, loss_D: 0.001827, lr_D: 0.000800
+ 2023-07-11 15:56:53,072 - INFO - [Train] step: 25299, loss_G: 10.073749, lr_G: 0.000800
+ 2023-07-11 15:57:49,282 - INFO - [Train] step: 25399, loss_D: 0.001059, lr_D: 0.000800
+ 2023-07-11 15:57:49,366 - INFO - [Train] step: 25399, loss_G: 11.256536, lr_G: 0.000800
+ 2023-07-11 15:58:45,566 - INFO - [Train] step: 25499, loss_D: 0.028165, lr_D: 0.000800
+ 2023-07-11 15:58:45,650 - INFO - [Train] step: 25499, loss_G: 7.335596, lr_G: 0.000800
+ 2023-07-11 15:59:41,867 - INFO - [Train] step: 25599, loss_D: 0.013606, lr_D: 0.000800
+ 2023-07-11 15:59:41,950 - INFO - [Train] step: 25599, loss_G: 8.838924, lr_G: 0.000800
+ 2023-07-11 16:00:38,189 - INFO - [Train] step: 25699, loss_D: 0.010900, lr_D: 0.000800
+ 2023-07-11 16:00:38,272 - INFO - [Train] step: 25699, loss_G: 7.900687, lr_G: 0.000800
+ 2023-07-11 16:01:34,495 - INFO - [Train] step: 25799, loss_D: 0.001805, lr_D: 0.000800
+ 2023-07-11 16:01:34,579 - INFO - [Train] step: 25799, loss_G: 8.660160, lr_G: 0.000800
+ 2023-07-11 16:02:30,939 - INFO - [Train] step: 25899, loss_D: 0.001893, lr_D: 0.000800
+ 2023-07-11 16:02:31,023 - INFO - [Train] step: 25899, loss_G: 11.859098, lr_G: 0.000800
+ 2023-07-11 16:03:27,226 - INFO - [Train] step: 25999, loss_D: 0.003861, lr_D: 0.000800
+ 2023-07-11 16:03:27,310 - INFO - [Train] step: 25999, loss_G: 10.326664, lr_G: 0.000800
+ 2023-07-11 16:04:31,317 - INFO - [Eval] step: 25999, fid: 29.586841
+ 2023-07-11 16:05:27,502 - INFO - [Train] step: 26099, loss_D: 0.012158, lr_D: 0.000800
+ 2023-07-11 16:05:27,586 - INFO - [Train] step: 26099, loss_G: 8.987256, lr_G: 0.000800
+ 2023-07-11 16:06:23,733 - INFO - [Train] step: 26199, loss_D: 0.002671, lr_D: 0.000800
+ 2023-07-11 16:06:23,817 - INFO - [Train] step: 26199, loss_G: 11.424615, lr_G: 0.000800
+ 2023-07-11 16:07:20,007 - INFO - [Train] step: 26299, loss_D: 0.015238, lr_D: 0.000800
+ 2023-07-11 16:07:20,091 - INFO - [Train] step: 26299, loss_G: 8.505151, lr_G: 0.000800
+ 2023-07-11 16:08:16,254 - INFO - [Train] step: 26399, loss_D: 0.004004, lr_D: 0.000800
+ 2023-07-11 16:08:16,338 - INFO - [Train] step: 26399, loss_G: 9.692331, lr_G: 0.000800
+ 2023-07-11 16:09:12,517 - INFO - [Train] step: 26499, loss_D: 0.000488, lr_D: 0.000800
+ 2023-07-11 16:09:12,600 - INFO - [Train] step: 26499, loss_G: 13.278828, lr_G: 0.000800
+ 2023-07-11 16:10:08,957 - INFO - [Train] step: 26599, loss_D: 0.123925, lr_D: 0.000800
+ 2023-07-11 16:10:09,041 - INFO - [Train] step: 26599, loss_G: 6.979785, lr_G: 0.000800
+ 2023-07-11 16:11:05,249 - INFO - [Train] step: 26699, loss_D: 0.007197, lr_D: 0.000800
+ 2023-07-11 16:11:05,333 - INFO - [Train] step: 26699, loss_G: 9.265303, lr_G: 0.000800
+ 2023-07-11 16:12:01,561 - INFO - [Train] step: 26799, loss_D: 0.002805, lr_D: 0.000800
+ 2023-07-11 16:12:01,646 - INFO - [Train] step: 26799, loss_G: 10.261986, lr_G: 0.000800
+ 2023-07-11 16:12:57,866 - INFO - [Train] step: 26899, loss_D: 0.002603, lr_D: 0.000800
+ 2023-07-11 16:12:57,950 - INFO - [Train] step: 26899, loss_G: 9.968462, lr_G: 0.000800
+ 2023-07-11 16:13:54,138 - INFO - [Train] step: 26999, loss_D: 0.270974, lr_D: 0.000800
+ 2023-07-11 16:13:54,222 - INFO - [Train] step: 26999, loss_G: 4.476009, lr_G: 0.000800
+ 2023-07-11 16:14:58,262 - INFO - [Eval] step: 26999, fid: 28.261105
+ 2023-07-11 16:15:54,401 - INFO - [Train] step: 27099, loss_D: 0.005396, lr_D: 0.000800
+ 2023-07-11 16:15:54,485 - INFO - [Train] step: 27099, loss_G: 9.020653, lr_G: 0.000800
+ 2023-07-11 16:16:50,800 - INFO - [Train] step: 27199, loss_D: 0.002239, lr_D: 0.000800
+ 2023-07-11 16:16:50,884 - INFO - [Train] step: 27199, loss_G: 10.092320, lr_G: 0.000800
+ 2023-07-11 16:17:47,071 - INFO - [Train] step: 27299, loss_D: 0.001601, lr_D: 0.000800
+ 2023-07-11 16:17:47,154 - INFO - [Train] step: 27299, loss_G: 10.914949, lr_G: 0.000800
+ 2023-07-11 16:18:43,343 - INFO - [Train] step: 27399, loss_D: 0.022859, lr_D: 0.000800
+ 2023-07-11 16:18:43,426 - INFO - [Train] step: 27399, loss_G: 9.357937, lr_G: 0.000800
+ 2023-07-11 16:19:39,639 - INFO - [Train] step: 27499, loss_D: 0.003744, lr_D: 0.000800
+ 2023-07-11 16:19:39,723 - INFO - [Train] step: 27499, loss_G: 9.579683, lr_G: 0.000800
+ 2023-07-11 16:20:35,927 - INFO - [Train] step: 27599, loss_D: 0.008932, lr_D: 0.000800
+ 2023-07-11 16:20:36,010 - INFO - [Train] step: 27599, loss_G: 9.413516, lr_G: 0.000800
+ 2023-07-11 16:21:32,233 - INFO - [Train] step: 27699, loss_D: 0.002887, lr_D: 0.000800
+ 2023-07-11 16:21:32,316 - INFO - [Train] step: 27699, loss_G: 9.666740, lr_G: 0.000800
+ 2023-07-11 16:22:28,504 - INFO - [Train] step: 27799, loss_D: 0.076950, lr_D: 0.000800
+ 2023-07-11 16:22:28,587 - INFO - [Train] step: 27799, loss_G: 7.720177, lr_G: 0.000800
+ 2023-07-11 16:23:24,915 - INFO - [Train] step: 27899, loss_D: 0.003379, lr_D: 0.000800
+ 2023-07-11 16:23:24,999 - INFO - [Train] step: 27899, loss_G: 9.764106, lr_G: 0.000800
+ 2023-07-11 16:24:21,160 - INFO - [Train] step: 27999, loss_D: 0.002163, lr_D: 0.000800
+ 2023-07-11 16:24:21,244 - INFO - [Train] step: 27999, loss_G: 9.900660, lr_G: 0.000800
+ 2023-07-11 16:25:25,168 - INFO - [Eval] step: 27999, fid: 31.858242
+ 2023-07-11 16:26:21,310 - INFO - [Train] step: 28099, loss_D: 0.005257, lr_D: 0.000800
+ 2023-07-11 16:26:21,394 - INFO - [Train] step: 28099, loss_G: 10.851337, lr_G: 0.000800
+ 2023-07-11 16:27:17,562 - INFO - [Train] step: 28199, loss_D: 0.017695, lr_D: 0.000800
+ 2023-07-11 16:27:17,646 - INFO - [Train] step: 28199, loss_G: 9.165024, lr_G: 0.000800
+ 2023-07-11 16:28:13,841 - INFO - [Train] step: 28299, loss_D: 0.003889, lr_D: 0.000800
+ 2023-07-11 16:28:13,925 - INFO - [Train] step: 28299, loss_G: 9.393435, lr_G: 0.000800
+ 2023-07-11 16:29:10,157 - INFO - [Train] step: 28399, loss_D: 0.005450, lr_D: 0.000800
+ 2023-07-11 16:29:10,241 - INFO - [Train] step: 28399, loss_G: 10.037657, lr_G: 0.000800
+ 2023-07-11 16:30:06,672 - INFO - [Train] step: 28499, loss_D: 0.081876, lr_D: 0.000800
+ 2023-07-11 16:30:06,756 - INFO - [Train] step: 28499, loss_G: 6.137455, lr_G: 0.000800
+ 2023-07-11 16:31:03,178 - INFO - [Train] step: 28599, loss_D: 0.009347, lr_D: 0.000800
+ 2023-07-11 16:31:03,262 - INFO - [Train] step: 28599, loss_G: 8.486110, lr_G: 0.000800
+ 2023-07-11 16:31:59,783 - INFO - [Train] step: 28699, loss_D: 0.003453, lr_D: 0.000800
+ 2023-07-11 16:31:59,867 - INFO - [Train] step: 28699, loss_G: 9.374587, lr_G: 0.000800
+ 2023-07-11 16:32:56,372 - INFO - [Train] step: 28799, loss_D: 0.007726, lr_D: 0.000800
+ 2023-07-11 16:32:56,457 - INFO - [Train] step: 28799, loss_G: 9.320904, lr_G: 0.000800
+ 2023-07-11 16:33:52,947 - INFO - [Train] step: 28899, loss_D: 2.428608, lr_D: 0.000800
+ 2023-07-11 16:33:53,031 - INFO - [Train] step: 28899, loss_G: 16.081322, lr_G: 0.000800
+ 2023-07-11 16:34:49,510 - INFO - [Train] step: 28999, loss_D: 0.006730, lr_D: 0.000800
+ 2023-07-11 16:34:49,594 - INFO - [Train] step: 28999, loss_G: 10.405189, lr_G: 0.000800
+ 2023-07-11 16:35:53,637 - INFO - [Eval] step: 28999, fid: 28.514210
+ 2023-07-11 16:36:50,081 - INFO - [Train] step: 29099, loss_D: 0.000476, lr_D: 0.000800
+ 2023-07-11 16:36:50,165 - INFO - [Train] step: 29099, loss_G: 11.495661, lr_G: 0.000800
+ 2023-07-11 16:37:46,916 - INFO - [Train] step: 29199, loss_D: 0.021919, lr_D: 0.000800
+ 2023-07-11 16:37:47,001 - INFO - [Train] step: 29199, loss_G: 7.978640, lr_G: 0.000800
+ 2023-07-11 16:38:43,615 - INFO - [Train] step: 29299, loss_D: 0.005424, lr_D: 0.000800
+ 2023-07-11 16:38:43,700 - INFO - [Train] step: 29299, loss_G: 10.110363, lr_G: 0.000800
+ 2023-07-11 16:39:40,317 - INFO - [Train] step: 29399, loss_D: 0.001903, lr_D: 0.000800
+ 2023-07-11 16:39:40,401 - INFO - [Train] step: 29399, loss_G: 10.540430, lr_G: 0.000800
+ 2023-07-11 16:40:37,009 - INFO - [Train] step: 29499, loss_D: 0.001573, lr_D: 0.000800
+ 2023-07-11 16:40:37,094 - INFO - [Train] step: 29499, loss_G: 10.988358, lr_G: 0.000800
+ 2023-07-11 16:41:33,715 - INFO - [Train] step: 29599, loss_D: 0.006324, lr_D: 0.000800
+ 2023-07-11 16:41:33,799 - INFO - [Train] step: 29599, loss_G: 8.904610, lr_G: 0.000800
+ 2023-07-11 16:42:30,448 - INFO - [Train] step: 29699, loss_D: 0.002145, lr_D: 0.000800
+ 2023-07-11 16:42:30,532 - INFO - [Train] step: 29699, loss_G: 10.345118, lr_G: 0.000800
+ 2023-07-11 16:43:27,268 - INFO - [Train] step: 29799, loss_D: 0.001737, lr_D: 0.000800
+ 2023-07-11 16:43:27,352 - INFO - [Train] step: 29799, loss_G: 10.607989, lr_G: 0.000800
+ 2023-07-11 16:44:23,956 - INFO - [Train] step: 29899, loss_D: 0.014888, lr_D: 0.000800
+ 2023-07-11 16:44:24,041 - INFO - [Train] step: 29899, loss_G: 8.812881, lr_G: 0.000800
+ 2023-07-11 16:45:20,649 - INFO - [Train] step: 29999, loss_D: 0.003554, lr_D: 0.000800
+ 2023-07-11 16:45:20,733 - INFO - [Train] step: 29999, loss_G: 10.038866, lr_G: 0.000800
+ 2023-07-11 16:46:24,833 - INFO - [Eval] step: 29999, fid: 31.425861
+ 2023-07-11 16:47:21,431 - INFO - [Train] step: 30099, loss_D: 0.001823, lr_D: 0.000800
+ 2023-07-11 16:47:21,516 - INFO - [Train] step: 30099, loss_G: 10.117667, lr_G: 0.000800
+ 2023-07-11 16:48:18,135 - INFO - [Train] step: 30199, loss_D: 0.009171, lr_D: 0.000800
+ 2023-07-11 16:48:18,220 - INFO - [Train] step: 30199, loss_G: 11.614449, lr_G: 0.000800
+ 2023-07-11 16:49:14,819 - INFO - [Train] step: 30299, loss_D: 0.002240, lr_D: 0.000800
+ 2023-07-11 16:49:14,903 - INFO - [Train] step: 30299, loss_G: 10.939562, lr_G: 0.000800
+ 2023-07-11 16:50:11,654 - INFO - [Train] step: 30399, loss_D: 0.001206, lr_D: 0.000800
+ 2023-07-11 16:50:11,739 - INFO - [Train] step: 30399, loss_G: 11.431597, lr_G: 0.000800
+ 2023-07-11 16:51:08,375 - INFO - [Train] step: 30499, loss_D: 0.002355, lr_D: 0.000800
+ 2023-07-11 16:51:08,459 - INFO - [Train] step: 30499, loss_G: 10.659374, lr_G: 0.000800
+ 2023-07-11 16:52:05,088 - INFO - [Train] step: 30599, loss_D: 0.001289, lr_D: 0.000800
+ 2023-07-11 16:52:05,172 - INFO - [Train] step: 30599, loss_G: 10.584681, lr_G: 0.000800
+ 2023-07-11 16:53:01,794 - INFO - [Train] step: 30699, loss_D: 0.018748, lr_D: 0.000800
+ 2023-07-11 16:53:01,878 - INFO - [Train] step: 30699, loss_G: 11.520538, lr_G: 0.000800
+ 2023-07-11 16:53:58,477 - INFO - [Train] step: 30799, loss_D: 0.000794, lr_D: 0.000800
+ 2023-07-11 16:53:58,561 - INFO - [Train] step: 30799, loss_G: 12.048307, lr_G: 0.000800
+ 2023-07-11 16:54:55,161 - INFO - [Train] step: 30899, loss_D: 0.017922, lr_D: 0.000800
+ 2023-07-11 16:54:55,245 - INFO - [Train] step: 30899, loss_G: 8.961665, lr_G: 0.000800
+ 2023-07-11 16:55:51,858 - INFO - [Train] step: 30999, loss_D: 0.003480, lr_D: 0.000800
+ 2023-07-11 16:55:51,942 - INFO - [Train] step: 30999, loss_G: 10.417709, lr_G: 0.000800
+ 2023-07-11 16:56:55,954 - INFO - [Eval] step: 30999, fid: 31.124594
+ 2023-07-11 16:57:52,496 - INFO - [Train] step: 31099, loss_D: 0.004688, lr_D: 0.000800
+ 2023-07-11 16:57:52,581 - INFO - [Train] step: 31099, loss_G: 11.631746, lr_G: 0.000800
+ 2023-07-11 16:58:49,209 - INFO - [Train] step: 31199, loss_D: 0.005057, lr_D: 0.000800
+ 2023-07-11 16:58:49,294 - INFO - [Train] step: 31199, loss_G: 12.383636, lr_G: 0.000800
+ 2023-07-11 16:59:45,910 - INFO - [Train] step: 31299, loss_D: 0.018138, lr_D: 0.000800
+ 2023-07-11 16:59:45,995 - INFO - [Train] step: 31299, loss_G: 9.527775, lr_G: 0.000800
+ 2023-07-11 17:00:42,623 - INFO - [Train] step: 31399, loss_D: 0.008496, lr_D: 0.000800
+ 2023-07-11 17:00:42,707 - INFO - [Train] step: 31399, loss_G: 8.838847, lr_G: 0.000800
+ 2023-07-11 17:01:39,307 - INFO - [Train] step: 31499, loss_D: 0.000398, lr_D: 0.000800
+ 2023-07-11 17:01:39,391 - INFO - [Train] step: 31499, loss_G: 11.916531, lr_G: 0.000800
+ 2023-07-11 17:02:36,018 - INFO - [Train] step: 31599, loss_D: 0.000577, lr_D: 0.000800
+ 2023-07-11 17:02:36,102 - INFO - [Train] step: 31599, loss_G: 12.267941, lr_G: 0.000800
+ 2023-07-11 17:03:32,842 - INFO - [Train] step: 31699, loss_D: 0.001245, lr_D: 0.000800
675
+ 2023-07-11 17:03:32,926 - INFO - [Train] step: 31699, loss_G: 11.535707, lr_G: 0.000800
676
+ 2023-07-11 17:04:29,560 - INFO - [Train] step: 31799, loss_D: 0.007009, lr_D: 0.000800
677
+ 2023-07-11 17:04:29,644 - INFO - [Train] step: 31799, loss_G: 9.988715, lr_G: 0.000800
678
+ 2023-07-11 17:05:26,260 - INFO - [Train] step: 31899, loss_D: 0.002843, lr_D: 0.000800
679
+ 2023-07-11 17:05:26,344 - INFO - [Train] step: 31899, loss_G: 11.165958, lr_G: 0.000800
680
+ 2023-07-11 17:06:22,959 - INFO - [Train] step: 31999, loss_D: 0.001144, lr_D: 0.000800
681
+ 2023-07-11 17:06:23,044 - INFO - [Train] step: 31999, loss_G: 11.805136, lr_G: 0.000800
682
+ 2023-07-11 17:07:27,156 - INFO - [Eval] step: 31999, fid: 28.932226
683
+ 2023-07-11 17:08:23,529 - INFO - [Train] step: 32099, loss_D: 0.015821, lr_D: 0.000800
684
+ 2023-07-11 17:08:23,613 - INFO - [Train] step: 32099, loss_G: 9.272867, lr_G: 0.000800
685
+ 2023-07-11 17:09:20,136 - INFO - [Train] step: 32199, loss_D: 0.004023, lr_D: 0.000800
686
+ 2023-07-11 17:09:20,221 - INFO - [Train] step: 32199, loss_G: 9.855824, lr_G: 0.000800
687
+ 2023-07-11 17:10:16,768 - INFO - [Train] step: 32299, loss_D: 0.002352, lr_D: 0.000800
688
+ 2023-07-11 17:10:16,852 - INFO - [Train] step: 32299, loss_G: 10.211768, lr_G: 0.000800
689
+ 2023-07-11 17:11:13,525 - INFO - [Train] step: 32399, loss_D: 0.002465, lr_D: 0.000800
690
+ 2023-07-11 17:11:13,609 - INFO - [Train] step: 32399, loss_G: 10.366522, lr_G: 0.000800
691
+ 2023-07-11 17:12:10,126 - INFO - [Train] step: 32499, loss_D: 0.040789, lr_D: 0.000800
692
+ 2023-07-11 17:12:10,210 - INFO - [Train] step: 32499, loss_G: 6.815432, lr_G: 0.000800
693
+ 2023-07-11 17:13:06,741 - INFO - [Train] step: 32599, loss_D: 0.004047, lr_D: 0.000800
694
+ 2023-07-11 17:13:06,825 - INFO - [Train] step: 32599, loss_G: 10.256836, lr_G: 0.000800
695
+ 2023-07-11 17:14:03,348 - INFO - [Train] step: 32699, loss_D: 0.002375, lr_D: 0.000800
696
+ 2023-07-11 17:14:03,433 - INFO - [Train] step: 32699, loss_G: 10.991968, lr_G: 0.000800
697
+ 2023-07-11 17:14:59,968 - INFO - [Train] step: 32799, loss_D: 0.196997, lr_D: 0.000800
698
+ 2023-07-11 17:15:00,053 - INFO - [Train] step: 32799, loss_G: 5.590190, lr_G: 0.000800
699
+ 2023-07-11 17:15:56,560 - INFO - [Train] step: 32899, loss_D: 0.003315, lr_D: 0.000800
700
+ 2023-07-11 17:15:56,644 - INFO - [Train] step: 32899, loss_G: 10.307394, lr_G: 0.000800
701
+ 2023-07-11 17:16:53,330 - INFO - [Train] step: 32999, loss_D: 0.003419, lr_D: 0.000800
702
+ 2023-07-11 17:16:53,414 - INFO - [Train] step: 32999, loss_G: 10.800374, lr_G: 0.000800
703
+ 2023-07-11 17:17:57,613 - INFO - [Eval] step: 32999, fid: 34.970089
704
+ 2023-07-11 17:18:54,095 - INFO - [Train] step: 33099, loss_D: 0.001734, lr_D: 0.000800
705
+ 2023-07-11 17:18:54,179 - INFO - [Train] step: 33099, loss_G: 11.038864, lr_G: 0.000800
706
+ 2023-07-11 17:19:50,801 - INFO - [Train] step: 33199, loss_D: 0.068180, lr_D: 0.000800
707
+ 2023-07-11 17:19:50,885 - INFO - [Train] step: 33199, loss_G: 8.782141, lr_G: 0.000800
708
+ 2023-07-11 17:20:47,517 - INFO - [Train] step: 33299, loss_D: 0.007126, lr_D: 0.000800
709
+ 2023-07-11 17:20:47,601 - INFO - [Train] step: 33299, loss_G: 10.125813, lr_G: 0.000800
710
+ 2023-07-11 17:21:44,231 - INFO - [Train] step: 33399, loss_D: 0.002722, lr_D: 0.000800
711
+ 2023-07-11 17:21:44,315 - INFO - [Train] step: 33399, loss_G: 10.206334, lr_G: 0.000800
712
+ 2023-07-11 17:22:40,947 - INFO - [Train] step: 33499, loss_D: 0.001919, lr_D: 0.000800
713
+ 2023-07-11 17:22:41,032 - INFO - [Train] step: 33499, loss_G: 11.129574, lr_G: 0.000800
714
+ 2023-07-11 17:23:37,660 - INFO - [Train] step: 33599, loss_D: 0.010751, lr_D: 0.000800
715
+ 2023-07-11 17:23:37,744 - INFO - [Train] step: 33599, loss_G: 9.334590, lr_G: 0.000800
716
+ 2023-07-11 17:24:34,507 - INFO - [Train] step: 33699, loss_D: 0.003594, lr_D: 0.000800
717
+ 2023-07-11 17:24:34,592 - INFO - [Train] step: 33699, loss_G: 10.852534, lr_G: 0.000800
718
+ 2023-07-11 17:25:31,200 - INFO - [Train] step: 33799, loss_D: 0.002941, lr_D: 0.000800
719
+ 2023-07-11 17:25:31,285 - INFO - [Train] step: 33799, loss_G: 10.506541, lr_G: 0.000800
720
+ 2023-07-11 17:26:27,895 - INFO - [Train] step: 33899, loss_D: 0.001559, lr_D: 0.000800
721
+ 2023-07-11 17:26:27,979 - INFO - [Train] step: 33899, loss_G: 11.605881, lr_G: 0.000800
722
+ 2023-07-11 17:27:24,603 - INFO - [Train] step: 33999, loss_D: 0.010180, lr_D: 0.000800
723
+ 2023-07-11 17:27:24,687 - INFO - [Train] step: 33999, loss_G: 9.430072, lr_G: 0.000800
724
+ 2023-07-11 17:28:28,800 - INFO - [Eval] step: 33999, fid: 26.479848
725
+ 2023-07-11 17:29:25,216 - INFO - [Train] step: 34099, loss_D: 0.088297, lr_D: 0.000800
726
+ 2023-07-11 17:29:25,301 - INFO - [Train] step: 34099, loss_G: 6.721412, lr_G: 0.000800
727
+ 2023-07-11 17:30:21,940 - INFO - [Train] step: 34199, loss_D: 0.001062, lr_D: 0.000800
728
+ 2023-07-11 17:30:22,024 - INFO - [Train] step: 34199, loss_G: 11.332446, lr_G: 0.000800
729
+ 2023-07-11 17:31:18,784 - INFO - [Train] step: 34299, loss_D: 0.001985, lr_D: 0.000800
730
+ 2023-07-11 17:31:18,869 - INFO - [Train] step: 34299, loss_G: 12.944086, lr_G: 0.000800
731
+ 2023-07-11 17:32:15,489 - INFO - [Train] step: 34399, loss_D: 0.001062, lr_D: 0.000800
732
+ 2023-07-11 17:32:15,574 - INFO - [Train] step: 34399, loss_G: 11.664156, lr_G: 0.000800
733
+ 2023-07-11 17:33:12,188 - INFO - [Train] step: 34499, loss_D: 0.052018, lr_D: 0.000800
734
+ 2023-07-11 17:33:12,273 - INFO - [Train] step: 34499, loss_G: 6.532789, lr_G: 0.000800
735
+ 2023-07-11 17:34:08,895 - INFO - [Train] step: 34599, loss_D: 0.007955, lr_D: 0.000800
736
+ 2023-07-11 17:34:08,979 - INFO - [Train] step: 34599, loss_G: 11.164508, lr_G: 0.000800
737
+ 2023-07-11 17:35:05,598 - INFO - [Train] step: 34699, loss_D: 0.003598, lr_D: 0.000800
738
+ 2023-07-11 17:35:05,682 - INFO - [Train] step: 34699, loss_G: 9.783045, lr_G: 0.000800
739
+ 2023-07-11 17:36:02,320 - INFO - [Train] step: 34799, loss_D: 0.001351, lr_D: 0.000800
740
+ 2023-07-11 17:36:02,404 - INFO - [Train] step: 34799, loss_G: 10.909458, lr_G: 0.000800
741
+ 2023-07-11 17:36:58,998 - INFO - [Train] step: 34899, loss_D: 0.027341, lr_D: 0.000800
742
+ 2023-07-11 17:36:59,082 - INFO - [Train] step: 34899, loss_G: 8.103909, lr_G: 0.000800
743
+ 2023-07-11 17:37:55,873 - INFO - [Train] step: 34999, loss_D: 0.003608, lr_D: 0.000800
744
+ 2023-07-11 17:37:55,957 - INFO - [Train] step: 34999, loss_G: 10.162874, lr_G: 0.000800
745
+ 2023-07-11 17:39:00,132 - INFO - [Eval] step: 34999, fid: 30.250333
746
+ 2023-07-11 17:39:56,520 - INFO - [Train] step: 35099, loss_D: 0.002374, lr_D: 0.000800
747
+ 2023-07-11 17:39:56,605 - INFO - [Train] step: 35099, loss_G: 12.336600, lr_G: 0.000800
748
+ 2023-07-11 17:40:53,151 - INFO - [Train] step: 35199, loss_D: 0.000961, lr_D: 0.000800
749
+ 2023-07-11 17:40:53,235 - INFO - [Train] step: 35199, loss_G: 12.600614, lr_G: 0.000800
750
+ 2023-07-11 17:41:49,767 - INFO - [Train] step: 35299, loss_D: 0.014972, lr_D: 0.000800
751
+ 2023-07-11 17:41:49,851 - INFO - [Train] step: 35299, loss_G: 9.577575, lr_G: 0.000800
752
+ 2023-07-11 17:42:46,374 - INFO - [Train] step: 35399, loss_D: 0.006320, lr_D: 0.000800
753
+ 2023-07-11 17:42:46,458 - INFO - [Train] step: 35399, loss_G: 10.768517, lr_G: 0.000800
754
+ 2023-07-11 17:43:43,005 - INFO - [Train] step: 35499, loss_D: 0.001321, lr_D: 0.000800
755
+ 2023-07-11 17:43:43,089 - INFO - [Train] step: 35499, loss_G: 10.944354, lr_G: 0.000800
756
+ 2023-07-11 17:44:39,774 - INFO - [Train] step: 35599, loss_D: 0.001155, lr_D: 0.000800
757
+ 2023-07-11 17:44:39,858 - INFO - [Train] step: 35599, loss_G: 11.848622, lr_G: 0.000800
758
+ 2023-07-11 17:45:36,395 - INFO - [Train] step: 35699, loss_D: 0.004498, lr_D: 0.000800
759
+ 2023-07-11 17:45:36,479 - INFO - [Train] step: 35699, loss_G: 10.615336, lr_G: 0.000800
760
+ 2023-07-11 17:46:33,017 - INFO - [Train] step: 35799, loss_D: 0.002211, lr_D: 0.000800
761
+ 2023-07-11 17:46:33,102 - INFO - [Train] step: 35799, loss_G: 11.308196, lr_G: 0.000800
762
+ 2023-07-11 17:47:29,620 - INFO - [Train] step: 35899, loss_D: 0.112851, lr_D: 0.000800
763
+ 2023-07-11 17:47:29,705 - INFO - [Train] step: 35899, loss_G: 5.648012, lr_G: 0.000800
764
+ 2023-07-11 17:48:26,224 - INFO - [Train] step: 35999, loss_D: 0.003291, lr_D: 0.000800
765
+ 2023-07-11 17:48:26,309 - INFO - [Train] step: 35999, loss_G: 11.379570, lr_G: 0.000800
766
+ 2023-07-11 17:49:30,385 - INFO - [Eval] step: 35999, fid: 29.220229
767
+ 2023-07-11 17:50:26,725 - INFO - [Train] step: 36099, loss_D: 0.002197, lr_D: 0.000800
768
+ 2023-07-11 17:50:26,809 - INFO - [Train] step: 36099, loss_G: 12.284069, lr_G: 0.000800
769
+ 2023-07-11 17:51:23,326 - INFO - [Train] step: 36199, loss_D: 0.001456, lr_D: 0.000800
770
+ 2023-07-11 17:51:23,410 - INFO - [Train] step: 36199, loss_G: 13.320007, lr_G: 0.000800
771
+ 2023-07-11 17:52:20,128 - INFO - [Train] step: 36299, loss_D: 0.014802, lr_D: 0.000800
772
+ 2023-07-11 17:52:20,213 - INFO - [Train] step: 36299, loss_G: 9.526157, lr_G: 0.000800
773
+ 2023-07-11 17:53:16,737 - INFO - [Train] step: 36399, loss_D: 0.004870, lr_D: 0.000800
774
+ 2023-07-11 17:53:16,822 - INFO - [Train] step: 36399, loss_G: 10.647831, lr_G: 0.000800
775
+ 2023-07-11 17:54:13,356 - INFO - [Train] step: 36499, loss_D: 0.002279, lr_D: 0.000800
776
+ 2023-07-11 17:54:13,441 - INFO - [Train] step: 36499, loss_G: 10.133535, lr_G: 0.000800
777
+ 2023-07-11 17:55:09,987 - INFO - [Train] step: 36599, loss_D: 0.003005, lr_D: 0.000800
778
+ 2023-07-11 17:55:10,071 - INFO - [Train] step: 36599, loss_G: 11.771160, lr_G: 0.000800
779
+ 2023-07-11 17:56:06,598 - INFO - [Train] step: 36699, loss_D: 0.034414, lr_D: 0.000800
780
+ 2023-07-11 17:56:06,682 - INFO - [Train] step: 36699, loss_G: 8.730915, lr_G: 0.000800
781
+ 2023-07-11 17:57:03,224 - INFO - [Train] step: 36799, loss_D: 0.001324, lr_D: 0.000800
782
+ 2023-07-11 17:57:03,308 - INFO - [Train] step: 36799, loss_G: 11.915476, lr_G: 0.000800
783
+ 2023-07-11 17:58:00,002 - INFO - [Train] step: 36899, loss_D: 0.001386, lr_D: 0.000800
784
+ 2023-07-11 17:58:00,086 - INFO - [Train] step: 36899, loss_G: 11.080511, lr_G: 0.000800
785
+ 2023-07-11 17:58:56,607 - INFO - [Train] step: 36999, loss_D: 0.002617, lr_D: 0.000800
786
+ 2023-07-11 17:58:56,691 - INFO - [Train] step: 36999, loss_G: 12.768784, lr_G: 0.000800
787
+ 2023-07-11 18:00:00,808 - INFO - [Eval] step: 36999, fid: 29.393289
788
+ 2023-07-11 18:00:57,169 - INFO - [Train] step: 37099, loss_D: 0.005409, lr_D: 0.000800
789
+ 2023-07-11 18:00:57,254 - INFO - [Train] step: 37099, loss_G: 10.404194, lr_G: 0.000800
790
+ 2023-07-11 18:01:53,793 - INFO - [Train] step: 37199, loss_D: 0.000680, lr_D: 0.000800
791
+ 2023-07-11 18:01:53,877 - INFO - [Train] step: 37199, loss_G: 11.750727, lr_G: 0.000800
792
+ 2023-07-11 18:02:50,408 - INFO - [Train] step: 37299, loss_D: 0.001456, lr_D: 0.000800
793
+ 2023-07-11 18:02:50,492 - INFO - [Train] step: 37299, loss_G: 11.308887, lr_G: 0.000800
794
+ 2023-07-11 18:03:47,008 - INFO - [Train] step: 37399, loss_D: 2.423407, lr_D: 0.000800
795
+ 2023-07-11 18:03:47,092 - INFO - [Train] step: 37399, loss_G: 9.273409, lr_G: 0.000800
796
+ 2023-07-11 18:04:43,623 - INFO - [Train] step: 37499, loss_D: 0.002194, lr_D: 0.000800
797
+ 2023-07-11 18:04:43,707 - INFO - [Train] step: 37499, loss_G: 10.400314, lr_G: 0.000800
798
+ 2023-07-11 18:05:40,382 - INFO - [Train] step: 37599, loss_D: 0.001529, lr_D: 0.000800
799
+ 2023-07-11 18:05:40,466 - INFO - [Train] step: 37599, loss_G: 11.470114, lr_G: 0.000800
800
+ 2023-07-11 18:06:36,980 - INFO - [Train] step: 37699, loss_D: 0.000489, lr_D: 0.000800
801
+ 2023-07-11 18:06:37,064 - INFO - [Train] step: 37699, loss_G: 13.643311, lr_G: 0.000800
802
+ 2023-07-11 18:07:33,588 - INFO - [Train] step: 37799, loss_D: 0.018753, lr_D: 0.000800
803
+ 2023-07-11 18:07:33,672 - INFO - [Train] step: 37799, loss_G: 10.013208, lr_G: 0.000800
804
+ 2023-07-11 18:08:30,190 - INFO - [Train] step: 37899, loss_D: 0.000898, lr_D: 0.000800
805
+ 2023-07-11 18:08:30,275 - INFO - [Train] step: 37899, loss_G: 11.948532, lr_G: 0.000800
806
+ 2023-07-11 18:09:26,780 - INFO - [Train] step: 37999, loss_D: 0.000186, lr_D: 0.000800
807
+ 2023-07-11 18:09:26,865 - INFO - [Train] step: 37999, loss_G: 12.791275, lr_G: 0.000800
808
+ 2023-07-11 18:10:30,975 - INFO - [Eval] step: 37999, fid: 32.256753
809
+ 2023-07-11 18:11:27,328 - INFO - [Train] step: 38099, loss_D: 0.001841, lr_D: 0.000800
810
+ 2023-07-11 18:11:27,411 - INFO - [Train] step: 38099, loss_G: 11.418566, lr_G: 0.000800
811
+ 2023-07-11 18:12:24,071 - INFO - [Train] step: 38199, loss_D: 0.001069, lr_D: 0.000800
812
+ 2023-07-11 18:12:24,155 - INFO - [Train] step: 38199, loss_G: 13.107159, lr_G: 0.000800
813
+ 2023-07-11 18:13:20,709 - INFO - [Train] step: 38299, loss_D: 0.004591, lr_D: 0.000800
814
+ 2023-07-11 18:13:20,793 - INFO - [Train] step: 38299, loss_G: 14.307920, lr_G: 0.000800
815
+ 2023-07-11 18:14:17,324 - INFO - [Train] step: 38399, loss_D: 0.015494, lr_D: 0.000800
816
+ 2023-07-11 18:14:17,409 - INFO - [Train] step: 38399, loss_G: 9.397242, lr_G: 0.000800
817
+ 2023-07-11 18:15:13,933 - INFO - [Train] step: 38499, loss_D: 0.001532, lr_D: 0.000800
818
+ 2023-07-11 18:15:14,017 - INFO - [Train] step: 38499, loss_G: 11.354094, lr_G: 0.000800
819
+ 2023-07-11 18:16:10,539 - INFO - [Train] step: 38599, loss_D: 0.002174, lr_D: 0.000800
820
+ 2023-07-11 18:16:10,623 - INFO - [Train] step: 38599, loss_G: 11.296826, lr_G: 0.000800
821
+ 2023-07-11 18:17:07,146 - INFO - [Train] step: 38699, loss_D: 0.122292, lr_D: 0.000800
822
+ 2023-07-11 18:17:07,231 - INFO - [Train] step: 38699, loss_G: 6.858611, lr_G: 0.000800
823
+ 2023-07-11 18:18:03,759 - INFO - [Train] step: 38799, loss_D: 0.002347, lr_D: 0.000800
824
+ 2023-07-11 18:18:03,843 - INFO - [Train] step: 38799, loss_G: 11.651567, lr_G: 0.000800
825
+ 2023-07-11 18:19:00,491 - INFO - [Train] step: 38899, loss_D: 0.000883, lr_D: 0.000800
826
+ 2023-07-11 18:19:00,576 - INFO - [Train] step: 38899, loss_G: 11.204956, lr_G: 0.000800
827
+ 2023-07-11 18:19:57,079 - INFO - [Train] step: 38999, loss_D: 0.005888, lr_D: 0.000800
828
+ 2023-07-11 18:19:57,163 - INFO - [Train] step: 38999, loss_G: 11.321920, lr_G: 0.000800
829
+ 2023-07-11 18:21:01,292 - INFO - [Eval] step: 38999, fid: 30.338257
830
+ 2023-07-11 18:21:57,630 - INFO - [Train] step: 39099, loss_D: 0.009130, lr_D: 0.000800
831
+ 2023-07-11 18:21:57,713 - INFO - [Train] step: 39099, loss_G: 11.101431, lr_G: 0.000800
832
+ 2023-07-11 18:22:54,062 - INFO - [Train] step: 39199, loss_D: 0.001463, lr_D: 0.000800
833
+ 2023-07-11 18:22:54,146 - INFO - [Train] step: 39199, loss_G: 11.452006, lr_G: 0.000800
834
+ 2023-07-11 18:23:50,514 - INFO - [Train] step: 39299, loss_D: 0.000898, lr_D: 0.000800
835
+ 2023-07-11 18:23:50,598 - INFO - [Train] step: 39299, loss_G: 11.766135, lr_G: 0.000800
836
+ 2023-07-11 18:24:46,944 - INFO - [Train] step: 39399, loss_D: 0.031015, lr_D: 0.000800
837
+ 2023-07-11 18:24:47,028 - INFO - [Train] step: 39399, loss_G: 8.246071, lr_G: 0.000800
838
+ 2023-07-11 18:25:43,511 - INFO - [Train] step: 39499, loss_D: 0.006617, lr_D: 0.000800
839
+ 2023-07-11 18:25:43,596 - INFO - [Train] step: 39499, loss_G: 10.835472, lr_G: 0.000800
840
+ 2023-07-11 18:26:39,936 - INFO - [Train] step: 39599, loss_D: 0.000876, lr_D: 0.000800
841
+ 2023-07-11 18:26:40,020 - INFO - [Train] step: 39599, loss_G: 10.974045, lr_G: 0.000800
842
+ 2023-07-11 18:27:36,382 - INFO - [Train] step: 39699, loss_D: 0.000632, lr_D: 0.000800
843
+ 2023-07-11 18:27:36,466 - INFO - [Train] step: 39699, loss_G: 12.644368, lr_G: 0.000800
844
+ 2023-07-11 18:28:32,806 - INFO - [Train] step: 39799, loss_D: 0.027200, lr_D: 0.000800
845
+ 2023-07-11 18:28:32,890 - INFO - [Train] step: 39799, loss_G: 11.607603, lr_G: 0.000800
846
+ 2023-07-11 18:29:29,246 - INFO - [Train] step: 39899, loss_D: 0.004701, lr_D: 0.000800
847
+ 2023-07-11 18:29:29,330 - INFO - [Train] step: 39899, loss_G: 11.546965, lr_G: 0.000800
848
+ 2023-07-11 18:30:25,704 - INFO - [Train] step: 39999, loss_D: 0.002906, lr_D: 0.000800
849
+ 2023-07-11 18:30:25,788 - INFO - [Train] step: 39999, loss_G: 10.704437, lr_G: 0.000800
850
+ 2023-07-11 18:31:29,906 - INFO - [Eval] step: 39999, fid: 34.076071
851
+ 2023-07-11 18:31:30,079 - INFO - End of training
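The `[Eval]` lines interleaved above report FID at each evaluation step; the `ckpt/best` checkpoint presumably corresponds to the lowest FID observed (26.479848 at step 33999 in this excerpt). A minimal sketch of recovering that step from log lines in this format — the regex and the `best_eval` helper are assumptions for illustration, not part of the repository:

```python
import re

# Matches the [Eval] log format shown above, e.g.
# "2023-07-11 17:28:28,800 - INFO - [Eval] step: 33999, fid: 26.479848"
EVAL_RE = re.compile(r"\[Eval\] step: (\d+), fid: ([0-9.]+)")

def best_eval(lines):
    """Return (step, fid) for the lowest-FID evaluation found in `lines`."""
    evals = []
    for line in lines:
        m = EVAL_RE.search(line)
        if m:
            evals.append((int(m.group(1)), float(m.group(2))))
    return min(evals, key=lambda sf: sf[1])

# Sample lines copied verbatim from the log above.
sample = [
    "2023-07-11 17:28:28,800 - INFO - [Eval] step: 33999, fid: 26.479848",
    "2023-07-11 18:10:30,975 - INFO - [Eval] step: 37999, fid: 32.256753",
    "2023-07-11 18:31:29,906 - INFO - [Eval] step: 39999, fid: 34.076071",
]
step, fid = best_eval(sample)
print(step, fid)  # 33999 26.479848
```

The same scan over the full log would identify which evaluation produced the checkpoint saved under `ckpt/best`.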
cgan_cifar10/samples.zip ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:118df371977488fcdbbecdd3e4a0629feb694ae0a84771201d0ff6cbe5845372
+ size 5669212
cgan_cifar10/tensorboard/events.out.tfevents.1689075161.jason-system.2278.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:de580274ccd602c38c93a1933517e9efa239411940944de70383d0ba9ee52185
+ size 8095936