xyfJASON committed
Commit 768f2bc
1 Parent(s): 90f4182

Upload cgan_cbn_cifar10 checkpoints and training logs
cgan_cbn_cifar10/ckpt/best/meta.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:18a2af67fb8211a2caf4c345e91cbdd54c8f68af5784d49db5a5679c5475cce0
+ size 425
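Each checkpoint file in this commit is stored as a Git LFS pointer rather than the binary itself; only the three key/value lines above live in the repo. A minimal sketch of reading such a pointer (the `version`/`oid`/`size` fields follow the LFS pointer spec; the helper function is illustrative, not part of this repo):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse the 'key value' lines of a Git LFS pointer into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The pointer content for cgan_cbn_cifar10/ckpt/best/meta.pt, as committed above.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:18a2af67fb8211a2caf4c345e91cbdd54c8f68af5784d49db5a5679c5475cce0
size 425"""

info = parse_lfs_pointer(pointer)
print(info["size"])  # 425  (size in bytes of the real file the pointer stands for)
```

The `oid` is the SHA-256 of the actual file content, which LFS fetches from the remote object store on checkout.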
cgan_cbn_cifar10/ckpt/best/model.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:297df12f2e27f5757ceeb971a734ce6b064d4d4760d9337d25051d94d854ae60
+ size 90975508
cgan_cbn_cifar10/ckpt/best/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c7fb0133d0d147659dd12380df8ce5b6b2fa63015963d69d6683a09bb81f16ac
+ size 181896667
cgan_cbn_cifar10/ckpt/step039999/meta.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:87eef79f18b95bc2ab817de168f547ab0a32ab86418ff6d379144fe58376c2c5
+ size 425
cgan_cbn_cifar10/ckpt/step039999/model.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b2ce792cbfdc4a4f681a59ece8856b9f5d51737151728679ed9967718822536e
+ size 90975508
cgan_cbn_cifar10/ckpt/step039999/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f0ac6bb2efc39bbee32ddb5b8212a06e488a481bf0fb83adb3dc06c219f64d0e
+ size 181896667
cgan_cbn_cifar10/config-2023-07-12-16-51-27.yaml ADDED
@@ -0,0 +1,63 @@
+ seed: 5678
+ data:
+ name: CIFAR-10
+ dataroot: /data/CIFAR-10/
+ img_size: 32
+ img_channels: 3
+ n_classes: 10
+ dataloader:
+ num_workers: 4
+ pin_memory: true
+ prefetch_factor: 2
+ G:
+ target: models.simple_cnn_cond.GeneratorConditionalCBN
+ params:
+ z_dim: 100
+ n_classes: 10
+ dim: 256
+ dim_mults:
+ - 4
+ - 2
+ - 1
+ out_dim: 3
+ with_bn: true
+ with_tanh: true
+ D:
+ target: models.simple_cnn_cond.DiscriminatorConditional
+ params:
+ n_classes: 10
+ in_dim: 3
+ dim: 256
+ dim_mults:
+ - 1
+ - 2
+ - 4
+ with_bn: true
+ train:
+ n_steps: 40000
+ batch_size: 512
+ d_iters: 5
+ resume: null
+ print_freq: 100
+ save_freq: 10000
+ sample_freq: 1000
+ eval_freq: 1000
+ n_samples_per_class: 6
+ loss_fn:
+ target: losses.vanilla_gan_loss.VanillaGANLoss
+ params:
+ lambda_r1_reg: 0.0
+ optim_G:
+ target: torch.optim.Adam
+ params:
+ lr: 0.0008
+ betas:
+ - 0.5
+ - 0.999
+ optim_D:
+ target: torch.optim.Adam
+ params:
+ lr: 0.0008
+ betas:
+ - 0.5
+ - 0.999
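The config wires up models, the loss, and optimizers through `target`/`params` pairs (e.g. `models.simple_cnn_cond.GeneratorConditionalCBN`, `torch.optim.Adam`). A common way to resolve such a spec is a small dynamic-import helper; the `instantiate` function below is an assumption for illustration, not code from this repo:

```python
import importlib

def instantiate(spec: dict):
    """Resolve a {'target': 'module.Class', 'params': {...}} spec into an object.

    Illustrative helper (not from this repo): splits the dotted path, imports
    the module, looks up the class, and calls it with the params as kwargs.
    """
    module_name, cls_name = spec["target"].rsplit(".", 1)
    cls = getattr(importlib.import_module(module_name), cls_name)
    return cls(**spec.get("params", {}))
```

With the config above, `instantiate(cfg["G"])` would build the conditional-BN generator and `instantiate(cfg["optim_G"])` the Adam optimizer (the latter additionally needs the model parameters passed in, which such loaders usually handle specially).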
cgan_cbn_cifar10/output-2023-07-12-16-51-27.log ADDED
@@ -0,0 +1,852 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ 2023-07-12 16:51:31,075 - INFO - Experiment directory: ./runs/cgan_cbn_cifar10/
2
+ 2023-07-12 16:51:31,076 - INFO - Number of processes: 1
3
+ 2023-07-12 16:51:31,077 - INFO - Distributed type: DistributedType.NO
4
+ 2023-07-12 16:51:31,077 - INFO - Mixed precision: no
5
+ 2023-07-12 16:51:31,077 - INFO - ==============================
6
+ 2023-07-12 16:51:31,597 - INFO - Size of training set: 50000
7
+ 2023-07-12 16:51:31,597 - INFO - Batch size per process: 512
8
+ 2023-07-12 16:51:31,597 - INFO - Total batch size: 512
9
+ 2023-07-12 16:51:31,597 - INFO - ==============================
10
+ 2023-07-12 16:51:32,623 - INFO - Start training...
11
+ 2023-07-12 16:52:30,815 - INFO - [Train] step: 99, loss_D: 0.003105, lr_D: 0.000800
12
+ 2023-07-12 16:52:30,902 - INFO - [Train] step: 99, loss_G: 10.790014, lr_G: 0.000800
13
+ 2023-07-12 16:53:27,833 - INFO - [Train] step: 199, loss_D: 0.005162, lr_D: 0.000800
14
+ 2023-07-12 16:53:27,920 - INFO - [Train] step: 199, loss_G: 6.575283, lr_G: 0.000800
15
+ 2023-07-12 16:54:24,950 - INFO - [Train] step: 299, loss_D: 0.015990, lr_D: 0.000800
16
+ 2023-07-12 16:54:25,037 - INFO - [Train] step: 299, loss_G: 5.417484, lr_G: 0.000800
17
+ 2023-07-12 16:55:22,225 - INFO - [Train] step: 399, loss_D: 0.014408, lr_D: 0.000800
18
+ 2023-07-12 16:55:22,312 - INFO - [Train] step: 399, loss_G: 5.476928, lr_G: 0.000800
19
+ 2023-07-12 16:56:19,515 - INFO - [Train] step: 499, loss_D: 0.021238, lr_D: 0.000800
20
+ 2023-07-12 16:56:19,603 - INFO - [Train] step: 499, loss_G: 6.185371, lr_G: 0.000800
21
+ 2023-07-12 16:57:16,831 - INFO - [Train] step: 599, loss_D: 0.018802, lr_D: 0.000800
22
+ 2023-07-12 16:57:16,918 - INFO - [Train] step: 599, loss_G: 5.295971, lr_G: 0.000800
23
+ 2023-07-12 16:58:14,312 - INFO - [Train] step: 699, loss_D: 0.009105, lr_D: 0.000800
24
+ 2023-07-12 16:58:14,400 - INFO - [Train] step: 699, loss_G: 5.853440, lr_G: 0.000800
25
+ 2023-07-12 16:59:11,670 - INFO - [Train] step: 799, loss_D: 0.014588, lr_D: 0.000800
26
+ 2023-07-12 16:59:11,758 - INFO - [Train] step: 799, loss_G: 6.018249, lr_G: 0.000800
27
+ 2023-07-12 17:00:09,009 - INFO - [Train] step: 899, loss_D: 0.025501, lr_D: 0.000800
28
+ 2023-07-12 17:00:09,096 - INFO - [Train] step: 899, loss_G: 5.412691, lr_G: 0.000800
29
+ 2023-07-12 17:01:06,349 - INFO - [Train] step: 999, loss_D: 0.004077, lr_D: 0.000800
30
+ 2023-07-12 17:01:06,436 - INFO - [Train] step: 999, loss_G: 7.350991, lr_G: 0.000800
31
+ 2023-07-12 17:02:10,784 - INFO - [Eval] step: 999, fid: 147.988934
32
+ 2023-07-12 17:03:08,233 - INFO - [Train] step: 1099, loss_D: 0.018384, lr_D: 0.000800
33
+ 2023-07-12 17:03:08,321 - INFO - [Train] step: 1099, loss_G: 6.915034, lr_G: 0.000800
34
+ 2023-07-12 17:04:05,642 - INFO - [Train] step: 1199, loss_D: 0.029909, lr_D: 0.000800
35
+ 2023-07-12 17:04:05,729 - INFO - [Train] step: 1199, loss_G: 5.504713, lr_G: 0.000800
36
+ 2023-07-12 17:05:03,202 - INFO - [Train] step: 1299, loss_D: 0.045890, lr_D: 0.000800
37
+ 2023-07-12 17:05:03,290 - INFO - [Train] step: 1299, loss_G: 5.696881, lr_G: 0.000800
38
+ 2023-07-12 17:06:00,626 - INFO - [Train] step: 1399, loss_D: 0.026205, lr_D: 0.000800
39
+ 2023-07-12 17:06:00,713 - INFO - [Train] step: 1399, loss_G: 4.888653, lr_G: 0.000800
40
+ 2023-07-12 17:06:58,037 - INFO - [Train] step: 1499, loss_D: 0.019429, lr_D: 0.000800
41
+ 2023-07-12 17:06:58,125 - INFO - [Train] step: 1499, loss_G: 5.075842, lr_G: 0.000800
42
+ 2023-07-12 17:07:55,441 - INFO - [Train] step: 1599, loss_D: 0.021472, lr_D: 0.000800
43
+ 2023-07-12 17:07:55,529 - INFO - [Train] step: 1599, loss_G: 5.127141, lr_G: 0.000800
44
+ 2023-07-12 17:08:52,842 - INFO - [Train] step: 1699, loss_D: 0.022238, lr_D: 0.000800
45
+ 2023-07-12 17:08:52,929 - INFO - [Train] step: 1699, loss_G: 4.819377, lr_G: 0.000800
46
+ 2023-07-12 17:09:50,264 - INFO - [Train] step: 1799, loss_D: 0.036398, lr_D: 0.000800
47
+ 2023-07-12 17:09:50,351 - INFO - [Train] step: 1799, loss_G: 5.162158, lr_G: 0.000800
48
+ 2023-07-12 17:10:47,667 - INFO - [Train] step: 1899, loss_D: 0.028905, lr_D: 0.000800
49
+ 2023-07-12 17:10:47,754 - INFO - [Train] step: 1899, loss_G: 5.107882, lr_G: 0.000800
50
+ 2023-07-12 17:11:45,195 - INFO - [Train] step: 1999, loss_D: 0.036849, lr_D: 0.000800
51
+ 2023-07-12 17:11:45,283 - INFO - [Train] step: 1999, loss_G: 4.894027, lr_G: 0.000800
52
+ 2023-07-12 17:12:49,366 - INFO - [Eval] step: 1999, fid: 95.961661
53
+ 2023-07-12 17:13:46,985 - INFO - [Train] step: 2099, loss_D: 0.045795, lr_D: 0.000800
54
+ 2023-07-12 17:13:47,072 - INFO - [Train] step: 2099, loss_G: 7.183059, lr_G: 0.000800
55
+ 2023-07-12 17:14:44,397 - INFO - [Train] step: 2199, loss_D: 0.018250, lr_D: 0.000800
56
+ 2023-07-12 17:14:44,485 - INFO - [Train] step: 2199, loss_G: 4.700519, lr_G: 0.000800
57
+ 2023-07-12 17:15:41,824 - INFO - [Train] step: 2299, loss_D: 0.030321, lr_D: 0.000800
58
+ 2023-07-12 17:15:41,912 - INFO - [Train] step: 2299, loss_G: 4.807626, lr_G: 0.000800
59
+ 2023-07-12 17:16:39,229 - INFO - [Train] step: 2399, loss_D: 0.013947, lr_D: 0.000800
60
+ 2023-07-12 17:16:39,317 - INFO - [Train] step: 2399, loss_G: 5.071939, lr_G: 0.000800
61
+ 2023-07-12 17:17:36,650 - INFO - [Train] step: 2499, loss_D: 0.017522, lr_D: 0.000800
62
+ 2023-07-12 17:17:36,737 - INFO - [Train] step: 2499, loss_G: 5.386928, lr_G: 0.000800
63
+ 2023-07-12 17:18:34,197 - INFO - [Train] step: 2599, loss_D: 0.030984, lr_D: 0.000800
64
+ 2023-07-12 17:18:34,285 - INFO - [Train] step: 2599, loss_G: 4.378212, lr_G: 0.000800
65
+ 2023-07-12 17:19:31,600 - INFO - [Train] step: 2699, loss_D: 0.027854, lr_D: 0.000800
66
+ 2023-07-12 17:19:31,687 - INFO - [Train] step: 2699, loss_G: 4.803050, lr_G: 0.000800
67
+ 2023-07-12 17:20:29,006 - INFO - [Train] step: 2799, loss_D: 0.028994, lr_D: 0.000800
68
+ 2023-07-12 17:20:29,093 - INFO - [Train] step: 2799, loss_G: 4.919062, lr_G: 0.000800
69
+ 2023-07-12 17:21:26,400 - INFO - [Train] step: 2899, loss_D: 0.020073, lr_D: 0.000800
70
+ 2023-07-12 17:21:26,488 - INFO - [Train] step: 2899, loss_G: 4.879203, lr_G: 0.000800
71
+ 2023-07-12 17:22:23,809 - INFO - [Train] step: 2999, loss_D: 0.078545, lr_D: 0.000800
72
+ 2023-07-12 17:22:23,896 - INFO - [Train] step: 2999, loss_G: 5.285282, lr_G: 0.000800
73
+ 2023-07-12 17:23:28,072 - INFO - [Eval] step: 2999, fid: 61.421845
74
+ 2023-07-12 17:24:25,717 - INFO - [Train] step: 3099, loss_D: 0.032204, lr_D: 0.000800
75
+ 2023-07-12 17:24:25,804 - INFO - [Train] step: 3099, loss_G: 4.897223, lr_G: 0.000800
76
+ 2023-07-12 17:25:23,147 - INFO - [Train] step: 3199, loss_D: 0.053038, lr_D: 0.000800
77
+ 2023-07-12 17:25:23,235 - INFO - [Train] step: 3199, loss_G: 5.073289, lr_G: 0.000800
78
+ 2023-07-12 17:26:20,726 - INFO - [Train] step: 3299, loss_D: 0.054510, lr_D: 0.000800
79
+ 2023-07-12 17:26:20,814 - INFO - [Train] step: 3299, loss_G: 4.930062, lr_G: 0.000800
80
+ 2023-07-12 17:27:18,183 - INFO - [Train] step: 3399, loss_D: 0.034935, lr_D: 0.000800
81
+ 2023-07-12 17:27:18,270 - INFO - [Train] step: 3399, loss_G: 3.990243, lr_G: 0.000800
82
+ 2023-07-12 17:28:15,598 - INFO - [Train] step: 3499, loss_D: 0.030087, lr_D: 0.000800
83
+ 2023-07-12 17:28:15,685 - INFO - [Train] step: 3499, loss_G: 4.579709, lr_G: 0.000800
84
+ 2023-07-12 17:29:13,024 - INFO - [Train] step: 3599, loss_D: 0.038519, lr_D: 0.000800
85
+ 2023-07-12 17:29:13,112 - INFO - [Train] step: 3599, loss_G: 4.418524, lr_G: 0.000800
86
+ 2023-07-12 17:30:10,438 - INFO - [Train] step: 3699, loss_D: 0.057550, lr_D: 0.000800
87
+ 2023-07-12 17:30:10,526 - INFO - [Train] step: 3699, loss_G: 4.605961, lr_G: 0.000800
88
+ 2023-07-12 17:31:07,887 - INFO - [Train] step: 3799, loss_D: 0.041060, lr_D: 0.000800
89
+ 2023-07-12 17:31:07,982 - INFO - [Train] step: 3799, loss_G: 4.834350, lr_G: 0.000800
90
+ 2023-07-12 17:32:05,456 - INFO - [Train] step: 3899, loss_D: 0.042648, lr_D: 0.000800
91
+ 2023-07-12 17:32:05,544 - INFO - [Train] step: 3899, loss_G: 5.262332, lr_G: 0.000800
92
+ 2023-07-12 17:33:02,865 - INFO - [Train] step: 3999, loss_D: 1.264918, lr_D: 0.000800
93
+ 2023-07-12 17:33:02,953 - INFO - [Train] step: 3999, loss_G: 4.694508, lr_G: 0.000800
94
+ 2023-07-12 17:34:07,152 - INFO - [Eval] step: 3999, fid: 46.189030
95
+ 2023-07-12 17:35:04,793 - INFO - [Train] step: 4099, loss_D: 0.003329, lr_D: 0.000800
96
+ 2023-07-12 17:35:04,881 - INFO - [Train] step: 4099, loss_G: 7.554510, lr_G: 0.000800
97
+ 2023-07-12 17:36:02,340 - INFO - [Train] step: 4199, loss_D: 0.040185, lr_D: 0.000800
98
+ 2023-07-12 17:36:02,428 - INFO - [Train] step: 4199, loss_G: 3.475946, lr_G: 0.000800
99
+ 2023-07-12 17:36:59,897 - INFO - [Train] step: 4299, loss_D: 0.027306, lr_D: 0.000800
100
+ 2023-07-12 17:36:59,985 - INFO - [Train] step: 4299, loss_G: 5.402829, lr_G: 0.000800
101
+ 2023-07-12 17:37:57,437 - INFO - [Train] step: 4399, loss_D: 0.051728, lr_D: 0.000800
102
+ 2023-07-12 17:37:57,524 - INFO - [Train] step: 4399, loss_G: 3.607202, lr_G: 0.000800
103
+ 2023-07-12 17:38:54,973 - INFO - [Train] step: 4499, loss_D: 0.056249, lr_D: 0.000800
104
+ 2023-07-12 17:38:55,060 - INFO - [Train] step: 4499, loss_G: 3.384413, lr_G: 0.000800
105
+ 2023-07-12 17:39:52,661 - INFO - [Train] step: 4599, loss_D: 0.007323, lr_D: 0.000800
106
+ 2023-07-12 17:39:52,749 - INFO - [Train] step: 4599, loss_G: 6.453446, lr_G: 0.000800
107
+ 2023-07-12 17:40:50,198 - INFO - [Train] step: 4699, loss_D: 0.024911, lr_D: 0.000800
108
+ 2023-07-12 17:40:50,286 - INFO - [Train] step: 4699, loss_G: 5.086204, lr_G: 0.000800
109
+ 2023-07-12 17:41:47,749 - INFO - [Train] step: 4799, loss_D: 0.014862, lr_D: 0.000800
110
+ 2023-07-12 17:41:47,837 - INFO - [Train] step: 4799, loss_G: 5.229910, lr_G: 0.000800
111
+ 2023-07-12 17:42:45,271 - INFO - [Train] step: 4899, loss_D: 0.031109, lr_D: 0.000800
112
+ 2023-07-12 17:42:45,359 - INFO - [Train] step: 4899, loss_G: 5.116955, lr_G: 0.000800
113
+ 2023-07-12 17:43:42,816 - INFO - [Train] step: 4999, loss_D: 0.065226, lr_D: 0.000800
114
+ 2023-07-12 17:43:42,903 - INFO - [Train] step: 4999, loss_G: 5.112177, lr_G: 0.000800
115
+ 2023-07-12 17:44:47,064 - INFO - [Eval] step: 4999, fid: 36.271797
116
+ 2023-07-12 17:45:44,714 - INFO - [Train] step: 5099, loss_D: 0.020115, lr_D: 0.000800
117
+ 2023-07-12 17:45:44,802 - INFO - [Train] step: 5099, loss_G: 5.193239, lr_G: 0.000800
118
+ 2023-07-12 17:46:42,379 - INFO - [Train] step: 5199, loss_D: 0.347856, lr_D: 0.000800
119
+ 2023-07-12 17:46:42,467 - INFO - [Train] step: 5199, loss_G: 1.892686, lr_G: 0.000800
120
+ 2023-07-12 17:47:39,934 - INFO - [Train] step: 5299, loss_D: 0.058115, lr_D: 0.000800
121
+ 2023-07-12 17:47:40,021 - INFO - [Train] step: 5299, loss_G: 4.653395, lr_G: 0.000800
122
+ 2023-07-12 17:48:37,460 - INFO - [Train] step: 5399, loss_D: 0.077972, lr_D: 0.000800
123
+ 2023-07-12 17:48:37,548 - INFO - [Train] step: 5399, loss_G: 3.208207, lr_G: 0.000800
124
+ 2023-07-12 17:49:34,989 - INFO - [Train] step: 5499, loss_D: 2.887662, lr_D: 0.000800
125
+ 2023-07-12 17:49:35,077 - INFO - [Train] step: 5499, loss_G: 8.952351, lr_G: 0.000800
126
+ 2023-07-12 17:50:32,512 - INFO - [Train] step: 5599, loss_D: 0.035710, lr_D: 0.000800
127
+ 2023-07-12 17:50:32,599 - INFO - [Train] step: 5599, loss_G: 5.363744, lr_G: 0.000800
128
+ 2023-07-12 17:51:30,067 - INFO - [Train] step: 5699, loss_D: 0.025094, lr_D: 0.000800
129
+ 2023-07-12 17:51:30,156 - INFO - [Train] step: 5699, loss_G: 5.028629, lr_G: 0.000800
130
+ 2023-07-12 17:52:27,609 - INFO - [Train] step: 5799, loss_D: 0.067504, lr_D: 0.000800
131
+ 2023-07-12 17:52:27,697 - INFO - [Train] step: 5799, loss_G: 5.370408, lr_G: 0.000800
132
+ 2023-07-12 17:53:25,291 - INFO - [Train] step: 5899, loss_D: 0.039531, lr_D: 0.000800
133
+ 2023-07-12 17:53:25,379 - INFO - [Train] step: 5899, loss_G: 4.985330, lr_G: 0.000800
134
+ 2023-07-12 17:54:22,840 - INFO - [Train] step: 5999, loss_D: 0.115845, lr_D: 0.000800
135
+ 2023-07-12 17:54:22,928 - INFO - [Train] step: 5999, loss_G: 4.274446, lr_G: 0.000800
136
+ 2023-07-12 17:55:27,145 - INFO - [Eval] step: 5999, fid: 34.028760
137
+ 2023-07-12 17:56:24,688 - INFO - [Train] step: 6099, loss_D: 0.033611, lr_D: 0.000800
138
+ 2023-07-12 17:56:24,776 - INFO - [Train] step: 6099, loss_G: 4.980865, lr_G: 0.000800
139
+ 2023-07-12 17:57:22,224 - INFO - [Train] step: 6199, loss_D: 0.048325, lr_D: 0.000800
140
+ 2023-07-12 17:57:22,312 - INFO - [Train] step: 6199, loss_G: 5.303787, lr_G: 0.000800
141
+ 2023-07-12 17:58:19,786 - INFO - [Train] step: 6299, loss_D: 0.041649, lr_D: 0.000800
142
+ 2023-07-12 17:58:19,873 - INFO - [Train] step: 6299, loss_G: 5.620918, lr_G: 0.000800
143
+ 2023-07-12 17:59:17,344 - INFO - [Train] step: 6399, loss_D: 0.023680, lr_D: 0.000800
144
+ 2023-07-12 17:59:17,431 - INFO - [Train] step: 6399, loss_G: 5.616631, lr_G: 0.000800
145
+ 2023-07-12 18:00:15,025 - INFO - [Train] step: 6499, loss_D: 0.036628, lr_D: 0.000800
146
+ 2023-07-12 18:00:15,113 - INFO - [Train] step: 6499, loss_G: 5.512962, lr_G: 0.000800
147
+ 2023-07-12 18:01:12,577 - INFO - [Train] step: 6599, loss_D: 0.023097, lr_D: 0.000800
148
+ 2023-07-12 18:01:12,664 - INFO - [Train] step: 6599, loss_G: 5.855290, lr_G: 0.000800
149
+ 2023-07-12 18:02:10,133 - INFO - [Train] step: 6699, loss_D: 0.028405, lr_D: 0.000800
150
+ 2023-07-12 18:02:10,221 - INFO - [Train] step: 6699, loss_G: 5.488470, lr_G: 0.000800
151
+ 2023-07-12 18:03:07,679 - INFO - [Train] step: 6799, loss_D: 0.021471, lr_D: 0.000800
152
+ 2023-07-12 18:03:07,767 - INFO - [Train] step: 6799, loss_G: 5.797908, lr_G: 0.000800
153
+ 2023-07-12 18:04:05,218 - INFO - [Train] step: 6899, loss_D: 0.058845, lr_D: 0.000800
154
+ 2023-07-12 18:04:05,306 - INFO - [Train] step: 6899, loss_G: 5.064159, lr_G: 0.000800
155
+ 2023-07-12 18:05:02,742 - INFO - [Train] step: 6999, loss_D: 0.020497, lr_D: 0.000800
156
+ 2023-07-12 18:05:02,830 - INFO - [Train] step: 6999, loss_G: 5.523105, lr_G: 0.000800
157
+ 2023-07-12 18:06:06,988 - INFO - [Eval] step: 6999, fid: 32.576500
158
+ 2023-07-12 18:07:04,593 - INFO - [Train] step: 7099, loss_D: 0.032421, lr_D: 0.000800
159
+ 2023-07-12 18:07:04,681 - INFO - [Train] step: 7099, loss_G: 5.685035, lr_G: 0.000800
160
+ 2023-07-12 18:08:02,316 - INFO - [Train] step: 7199, loss_D: 0.031736, lr_D: 0.000800
161
+ 2023-07-12 18:08:02,404 - INFO - [Train] step: 7199, loss_G: 5.395480, lr_G: 0.000800
162
+ 2023-07-12 18:08:59,916 - INFO - [Train] step: 7299, loss_D: 0.024387, lr_D: 0.000800
163
+ 2023-07-12 18:09:00,004 - INFO - [Train] step: 7299, loss_G: 6.032953, lr_G: 0.000800
164
+ 2023-07-12 18:09:57,523 - INFO - [Train] step: 7399, loss_D: 0.024259, lr_D: 0.000800
165
+ 2023-07-12 18:09:57,611 - INFO - [Train] step: 7399, loss_G: 5.144004, lr_G: 0.000800
166
+ 2023-07-12 18:10:55,097 - INFO - [Train] step: 7499, loss_D: 0.555654, lr_D: 0.000800
167
+ 2023-07-12 18:10:55,185 - INFO - [Train] step: 7499, loss_G: 1.863510, lr_G: 0.000800
168
+ 2023-07-12 18:11:52,686 - INFO - [Train] step: 7599, loss_D: 0.034270, lr_D: 0.000800
169
+ 2023-07-12 18:11:52,774 - INFO - [Train] step: 7599, loss_G: 5.184028, lr_G: 0.000800
170
+ 2023-07-12 18:12:50,269 - INFO - [Train] step: 7699, loss_D: 0.039935, lr_D: 0.000800
171
+ 2023-07-12 18:12:50,356 - INFO - [Train] step: 7699, loss_G: 5.671440, lr_G: 0.000800
172
+ 2023-07-12 18:13:47,983 - INFO - [Train] step: 7799, loss_D: 0.038383, lr_D: 0.000800
173
+ 2023-07-12 18:13:48,071 - INFO - [Train] step: 7799, loss_G: 5.034060, lr_G: 0.000800
174
+ 2023-07-12 18:14:45,546 - INFO - [Train] step: 7899, loss_D: 0.026917, lr_D: 0.000800
175
+ 2023-07-12 18:14:45,634 - INFO - [Train] step: 7899, loss_G: 5.547049, lr_G: 0.000800
176
+ 2023-07-12 18:15:43,134 - INFO - [Train] step: 7999, loss_D: 0.011145, lr_D: 0.000800
177
+ 2023-07-12 18:15:43,222 - INFO - [Train] step: 7999, loss_G: 5.550413, lr_G: 0.000800
178
+ 2023-07-12 18:16:47,438 - INFO - [Eval] step: 7999, fid: 32.241777
179
+ 2023-07-12 18:17:45,032 - INFO - [Train] step: 8099, loss_D: 0.014564, lr_D: 0.000800
180
+ 2023-07-12 18:17:45,120 - INFO - [Train] step: 8099, loss_G: 5.964281, lr_G: 0.000800
181
+ 2023-07-12 18:18:42,639 - INFO - [Train] step: 8199, loss_D: 0.000684, lr_D: 0.000800
182
+ 2023-07-12 18:18:42,726 - INFO - [Train] step: 8199, loss_G: 8.837653, lr_G: 0.000800
183
+ 2023-07-12 18:19:40,235 - INFO - [Train] step: 8299, loss_D: 0.017238, lr_D: 0.000800
184
+ 2023-07-12 18:19:40,322 - INFO - [Train] step: 8299, loss_G: 5.495771, lr_G: 0.000800
185
+ 2023-07-12 18:20:37,807 - INFO - [Train] step: 8399, loss_D: 0.011962, lr_D: 0.000800
186
+ 2023-07-12 18:20:37,894 - INFO - [Train] step: 8399, loss_G: 6.880361, lr_G: 0.000800
187
+ 2023-07-12 18:21:35,525 - INFO - [Train] step: 8499, loss_D: 0.030142, lr_D: 0.000800
188
+ 2023-07-12 18:21:35,612 - INFO - [Train] step: 8499, loss_G: 5.224750, lr_G: 0.000800
189
+ 2023-07-12 18:22:33,112 - INFO - [Train] step: 8599, loss_D: 0.023434, lr_D: 0.000800
190
+ 2023-07-12 18:22:33,199 - INFO - [Train] step: 8599, loss_G: 5.050735, lr_G: 0.000800
191
+ 2023-07-12 18:23:30,680 - INFO - [Train] step: 8699, loss_D: 0.013734, lr_D: 0.000800
192
+ 2023-07-12 18:23:30,768 - INFO - [Train] step: 8699, loss_G: 5.922662, lr_G: 0.000800
193
+ 2023-07-12 18:24:28,246 - INFO - [Train] step: 8799, loss_D: 0.010626, lr_D: 0.000800
194
+ 2023-07-12 18:24:28,334 - INFO - [Train] step: 8799, loss_G: 6.543083, lr_G: 0.000800
195
+ 2023-07-12 18:25:25,846 - INFO - [Train] step: 8899, loss_D: 0.029900, lr_D: 0.000800
196
+ 2023-07-12 18:25:25,934 - INFO - [Train] step: 8899, loss_G: 5.696473, lr_G: 0.000800
197
+ 2023-07-12 18:26:23,424 - INFO - [Train] step: 8999, loss_D: 0.126051, lr_D: 0.000800
198
+ 2023-07-12 18:26:23,512 - INFO - [Train] step: 8999, loss_G: 4.001851, lr_G: 0.000800
199
+ 2023-07-12 18:27:27,684 - INFO - [Eval] step: 8999, fid: 28.227711
200
+ 2023-07-12 18:28:25,482 - INFO - [Train] step: 9099, loss_D: 0.109626, lr_D: 0.000800
201
+ 2023-07-12 18:28:25,570 - INFO - [Train] step: 9099, loss_G: 5.276723, lr_G: 0.000800
202
+ 2023-07-12 18:29:23,076 - INFO - [Train] step: 9199, loss_D: 0.228694, lr_D: 0.000800
203
+ 2023-07-12 18:29:23,164 - INFO - [Train] step: 9199, loss_G: 3.498900, lr_G: 0.000800
204
+ 2023-07-12 18:30:20,680 - INFO - [Train] step: 9299, loss_D: 0.020664, lr_D: 0.000800
205
+ 2023-07-12 18:30:20,768 - INFO - [Train] step: 9299, loss_G: 5.587677, lr_G: 0.000800
206
+ 2023-07-12 18:31:18,286 - INFO - [Train] step: 9399, loss_D: 0.012350, lr_D: 0.000800
207
+ 2023-07-12 18:31:18,373 - INFO - [Train] step: 9399, loss_G: 6.273077, lr_G: 0.000800
208
+ 2023-07-12 18:32:15,891 - INFO - [Train] step: 9499, loss_D: 0.001696, lr_D: 0.000800
209
+ 2023-07-12 18:32:15,979 - INFO - [Train] step: 9499, loss_G: 8.534375, lr_G: 0.000800
210
+ 2023-07-12 18:33:13,486 - INFO - [Train] step: 9599, loss_D: 3.387591, lr_D: 0.000800
211
+ 2023-07-12 18:33:13,574 - INFO - [Train] step: 9599, loss_G: 2.096522, lr_G: 0.000800
212
+ 2023-07-12 18:34:11,097 - INFO - [Train] step: 9699, loss_D: 0.020986, lr_D: 0.000800
213
+ 2023-07-12 18:34:11,184 - INFO - [Train] step: 9699, loss_G: 5.463163, lr_G: 0.000800
214
+ 2023-07-12 18:35:08,811 - INFO - [Train] step: 9799, loss_D: 0.015239, lr_D: 0.000800
215
+ 2023-07-12 18:35:08,899 - INFO - [Train] step: 9799, loss_G: 6.112862, lr_G: 0.000800
216
+ 2023-07-12 18:36:06,406 - INFO - [Train] step: 9899, loss_D: 0.005519, lr_D: 0.000800
217
+ 2023-07-12 18:36:06,494 - INFO - [Train] step: 9899, loss_G: 8.183928, lr_G: 0.000800
218
+ 2023-07-12 18:37:03,990 - INFO - [Train] step: 9999, loss_D: 0.030426, lr_D: 0.000800
219
+ 2023-07-12 18:37:04,078 - INFO - [Train] step: 9999, loss_G: 5.927347, lr_G: 0.000800
220
+ 2023-07-12 18:38:08,210 - INFO - [Eval] step: 9999, fid: 30.084231
221
+ 2023-07-12 18:39:05,700 - INFO - [Train] step: 10099, loss_D: 0.013930, lr_D: 0.000800
222
+ 2023-07-12 18:39:05,788 - INFO - [Train] step: 10099, loss_G: 6.193681, lr_G: 0.000800
223
+ 2023-07-12 18:40:03,300 - INFO - [Train] step: 10199, loss_D: 0.012470, lr_D: 0.000800
224
+ 2023-07-12 18:40:03,388 - INFO - [Train] step: 10199, loss_G: 7.306033, lr_G: 0.000800
225
+ 2023-07-12 18:41:00,907 - INFO - [Train] step: 10299, loss_D: 0.061986, lr_D: 0.000800
226
+ 2023-07-12 18:41:00,995 - INFO - [Train] step: 10299, loss_G: 4.621758, lr_G: 0.000800
227
+ 2023-07-12 18:41:58,658 - INFO - [Train] step: 10399, loss_D: 0.016806, lr_D: 0.000800
228
+ 2023-07-12 18:41:58,746 - INFO - [Train] step: 10399, loss_G: 6.250135, lr_G: 0.000800
229
+ 2023-07-12 18:42:56,249 - INFO - [Train] step: 10499, loss_D: 0.001286, lr_D: 0.000800
230
+ 2023-07-12 18:42:56,337 - INFO - [Train] step: 10499, loss_G: 8.661914, lr_G: 0.000800
231
+ 2023-07-12 18:43:53,823 - INFO - [Train] step: 10599, loss_D: 0.000602, lr_D: 0.000800
232
+ 2023-07-12 18:43:53,911 - INFO - [Train] step: 10599, loss_G: 9.411369, lr_G: 0.000800
233
+ 2023-07-12 18:44:51,413 - INFO - [Train] step: 10699, loss_D: 0.016039, lr_D: 0.000800
234
+ 2023-07-12 18:44:51,501 - INFO - [Train] step: 10699, loss_G: 6.897484, lr_G: 0.000800
235
+ 2023-07-12 18:45:49,004 - INFO - [Train] step: 10799, loss_D: 0.065530, lr_D: 0.000800
236
+ 2023-07-12 18:45:49,092 - INFO - [Train] step: 10799, loss_G: 9.432408, lr_G: 0.000800
237
+ 2023-07-12 18:46:46,585 - INFO - [Train] step: 10899, loss_D: 0.014672, lr_D: 0.000800
238
+ 2023-07-12 18:46:46,673 - INFO - [Train] step: 10899, loss_G: 6.191032, lr_G: 0.000800
239
+ 2023-07-12 18:47:44,316 - INFO - [Train] step: 10999, loss_D: 0.014719, lr_D: 0.000800
240
+ 2023-07-12 18:47:44,404 - INFO - [Train] step: 10999, loss_G: 7.411528, lr_G: 0.000800
241
+ 2023-07-12 18:48:48,592 - INFO - [Eval] step: 10999, fid: 29.694119
242
+ 2023-07-12 18:49:45,827 - INFO - [Train] step: 11099, loss_D: 0.019559, lr_D: 0.000800
243
+ 2023-07-12 18:49:45,915 - INFO - [Train] step: 11099, loss_G: 6.056180, lr_G: 0.000800
244
+ 2023-07-12 18:50:43,301 - INFO - [Train] step: 11199, loss_D: 0.017542, lr_D: 0.000800
245
+ 2023-07-12 18:50:43,389 - INFO - [Train] step: 11199, loss_G: 6.981877, lr_G: 0.000800
246
+ 2023-07-12 18:51:40,805 - INFO - [Train] step: 11299, loss_D: 0.032792, lr_D: 0.000800
247
+ 2023-07-12 18:51:40,893 - INFO - [Train] step: 11299, loss_G: 5.256979, lr_G: 0.000800
248
+ 2023-07-12 18:52:38,305 - INFO - [Train] step: 11399, loss_D: 0.003058, lr_D: 0.000800
249
+ 2023-07-12 18:52:38,393 - INFO - [Train] step: 11399, loss_G: 8.941432, lr_G: 0.000800
250
+ 2023-07-12 18:53:35,801 - INFO - [Train] step: 11499, loss_D: 0.015946, lr_D: 0.000800
251
+ 2023-07-12 18:53:35,889 - INFO - [Train] step: 11499, loss_G: 6.630933, lr_G: 0.000800
252
+ 2023-07-12 18:54:33,283 - INFO - [Train] step: 11599, loss_D: 0.032323, lr_D: 0.000800
253
+ 2023-07-12 18:54:33,370 - INFO - [Train] step: 11599, loss_G: 4.722200, lr_G: 0.000800
254
+ 2023-07-12 18:55:30,897 - INFO - [Train] step: 11699, loss_D: 0.001752, lr_D: 0.000800
255
+ 2023-07-12 18:55:30,984 - INFO - [Train] step: 11699, loss_G: 8.421609, lr_G: 0.000800
256
+ 2023-07-12 18:56:28,360 - INFO - [Train] step: 11799, loss_D: 0.026858, lr_D: 0.000800
257
+ 2023-07-12 18:56:28,448 - INFO - [Train] step: 11799, loss_G: 6.427086, lr_G: 0.000800
258
+ 2023-07-12 18:57:25,831 - INFO - [Train] step: 11899, loss_D: 0.026251, lr_D: 0.000800
259
+ 2023-07-12 18:57:25,919 - INFO - [Train] step: 11899, loss_G: 6.259496, lr_G: 0.000800
260
+ 2023-07-12 18:58:23,311 - INFO - [Train] step: 11999, loss_D: 0.014363, lr_D: 0.000800
261
+ 2023-07-12 18:58:23,399 - INFO - [Train] step: 11999, loss_G: 6.759715, lr_G: 0.000800
262
+ 2023-07-12 18:59:27,592 - INFO - [Eval] step: 11999, fid: 29.804872
263
+ 2023-07-12 19:00:24,880 - INFO - [Train] step: 12099, loss_D: 0.007211, lr_D: 0.000800
264
+ 2023-07-12 19:00:24,968 - INFO - [Train] step: 12099, loss_G: 7.119853, lr_G: 0.000800
265
+ 2023-07-12 19:01:22,522 - INFO - [Train] step: 12199, loss_D: 0.030018, lr_D: 0.000800
266
+ 2023-07-12 19:01:22,610 - INFO - [Train] step: 12199, loss_G: 6.445191, lr_G: 0.000800
267
+ 2023-07-12 19:02:20,282 - INFO - [Train] step: 12299, loss_D: 0.015362, lr_D: 0.000800
268
+ 2023-07-12 19:02:20,370 - INFO - [Train] step: 12299, loss_G: 7.062533, lr_G: 0.000800
269
+ 2023-07-12 19:03:17,900 - INFO - [Train] step: 12399, loss_D: 0.091250, lr_D: 0.000800
270
+ 2023-07-12 19:03:17,988 - INFO - [Train] step: 12399, loss_G: 5.277500, lr_G: 0.000800
271
+ 2023-07-12 19:04:15,503 - INFO - [Train] step: 12499, loss_D: 0.014617, lr_D: 0.000800
272
+ 2023-07-12 19:04:15,592 - INFO - [Train] step: 12499, loss_G: 6.756591, lr_G: 0.000800
273
+ 2023-07-12 19:05:13,100 - INFO - [Train] step: 12599, loss_D: 0.167268, lr_D: 0.000800
274
+ 2023-07-12 19:05:13,188 - INFO - [Train] step: 12599, loss_G: 4.781672, lr_G: 0.000800
275
+ 2023-07-12 19:06:10,715 - INFO - [Train] step: 12699, loss_D: 0.008438, lr_D: 0.000800
276
+ 2023-07-12 19:06:10,802 - INFO - [Train] step: 12699, loss_G: 7.567505, lr_G: 0.000800
277
+ 2023-07-12 19:07:08,327 - INFO - [Train] step: 12799, loss_D: 0.007369, lr_D: 0.000800
278
+ 2023-07-12 19:07:08,415 - INFO - [Train] step: 12799, loss_G: 6.749778, lr_G: 0.000800
279
+ 2023-07-12 19:08:05,975 - INFO - [Train] step: 12899, loss_D: 0.000751, lr_D: 0.000800
280
+ 2023-07-12 19:08:06,063 - INFO - [Train] step: 12899, loss_G: 9.670005, lr_G: 0.000800
281
+ 2023-07-12 19:09:03,725 - INFO - [Train] step: 12999, loss_D: 0.012068, lr_D: 0.000800
282
+ 2023-07-12 19:09:03,813 - INFO - [Train] step: 12999, loss_G: 7.449256, lr_G: 0.000800
283
+ 2023-07-12 19:10:08,014 - INFO - [Eval] step: 12999, fid: 27.870969
284
+ 2023-07-12 19:11:05,663 - INFO - [Train] step: 13099, loss_D: 0.191236, lr_D: 0.000800
285
+ 2023-07-12 19:11:05,751 - INFO - [Train] step: 13099, loss_G: 4.029625, lr_G: 0.000800
286
+ 2023-07-12 19:12:03,278 - INFO - [Train] step: 13199, loss_D: 0.021085, lr_D: 0.000800
287
+ 2023-07-12 19:12:03,366 - INFO - [Train] step: 13199, loss_G: 7.048948, lr_G: 0.000800
288
+ 2023-07-12 19:13:00,899 - INFO - [Train] step: 13299, loss_D: 0.010232, lr_D: 0.000800
289
+ 2023-07-12 19:13:00,987 - INFO - [Train] step: 13299, loss_G: 7.128829, lr_G: 0.000800
290
+ 2023-07-12 19:13:58,538 - INFO - [Train] step: 13399, loss_D: 0.025881, lr_D: 0.000800
291
+ 2023-07-12 19:13:58,626 - INFO - [Train] step: 13399, loss_G: 6.640596, lr_G: 0.000800
292
+ 2023-07-12 19:14:56,132 - INFO - [Train] step: 13499, loss_D: 0.008684, lr_D: 0.000800
293
+ 2023-07-12 19:14:56,220 - INFO - [Train] step: 13499, loss_G: 7.248220, lr_G: 0.000800
294
+ 2023-07-12 19:15:53,867 - INFO - [Train] step: 13599, loss_D: 0.013442, lr_D: 0.000800
295
+ 2023-07-12 19:15:53,955 - INFO - [Train] step: 13599, loss_G: 7.150399, lr_G: 0.000800
296
+ 2023-07-12 19:16:51,476 - INFO - [Train] step: 13699, loss_D: 0.092759, lr_D: 0.000800
297
+ 2023-07-12 19:16:51,563 - INFO - [Train] step: 13699, loss_G: 5.712218, lr_G: 0.000800
298
+ 2023-07-12 19:17:49,082 - INFO - [Train] step: 13799, loss_D: 0.010183, lr_D: 0.000800
299
+ 2023-07-12 19:17:49,170 - INFO - [Train] step: 13799, loss_G: 6.829955, lr_G: 0.000800
300
+ 2023-07-12 19:18:46,688 - INFO - [Train] step: 13899, loss_D: 0.006534, lr_D: 0.000800
301
+ 2023-07-12 19:18:46,775 - INFO - [Train] step: 13899, loss_G: 7.313403, lr_G: 0.000800
302
+ 2023-07-12 19:19:44,267 - INFO - [Train] step: 13999, loss_D: 0.008968, lr_D: 0.000800
303
+ 2023-07-12 19:19:44,356 - INFO - [Train] step: 13999, loss_G: 7.097583, lr_G: 0.000800
304
+ 2023-07-12 19:20:48,545 - INFO - [Eval] step: 13999, fid: 29.171430
305
+ 2023-07-12 19:21:45,877 - INFO - [Train] step: 14099, loss_D: 0.154992, lr_D: 0.000800
+ 2023-07-12 19:21:45,965 - INFO - [Train] step: 14099, loss_G: 4.506565, lr_G: 0.000800
+ 2023-07-12 19:22:43,480 - INFO - [Train] step: 14199, loss_D: 0.006306, lr_D: 0.000800
+ 2023-07-12 19:22:43,567 - INFO - [Train] step: 14199, loss_G: 7.778669, lr_G: 0.000800
+ 2023-07-12 19:23:41,233 - INFO - [Train] step: 14299, loss_D: 0.033244, lr_D: 0.000800
+ 2023-07-12 19:23:41,321 - INFO - [Train] step: 14299, loss_G: 6.365058, lr_G: 0.000800
+ 2023-07-12 19:24:38,846 - INFO - [Train] step: 14399, loss_D: 0.011858, lr_D: 0.000800
+ 2023-07-12 19:24:38,934 - INFO - [Train] step: 14399, loss_G: 7.772148, lr_G: 0.000800
+ 2023-07-12 19:25:36,447 - INFO - [Train] step: 14499, loss_D: 0.000651, lr_D: 0.000800
+ 2023-07-12 19:25:36,535 - INFO - [Train] step: 14499, loss_G: 12.251223, lr_G: 0.000800
+ 2023-07-12 19:26:34,071 - INFO - [Train] step: 14599, loss_D: 0.020187, lr_D: 0.000800
+ 2023-07-12 19:26:34,158 - INFO - [Train] step: 14599, loss_G: 7.172268, lr_G: 0.000800
+ 2023-07-12 19:27:31,670 - INFO - [Train] step: 14699, loss_D: 0.008557, lr_D: 0.000800
+ 2023-07-12 19:27:31,758 - INFO - [Train] step: 14699, loss_G: 6.713667, lr_G: 0.000800
+ 2023-07-12 19:28:29,268 - INFO - [Train] step: 14799, loss_D: 0.004455, lr_D: 0.000800
+ 2023-07-12 19:28:29,356 - INFO - [Train] step: 14799, loss_G: 8.290122, lr_G: 0.000800
+ 2023-07-12 19:29:27,002 - INFO - [Train] step: 14899, loss_D: 0.148839, lr_D: 0.000800
+ 2023-07-12 19:29:27,090 - INFO - [Train] step: 14899, loss_G: 4.583262, lr_G: 0.000800
+ 2023-07-12 19:30:24,606 - INFO - [Train] step: 14999, loss_D: 0.001947, lr_D: 0.000800
+ 2023-07-12 19:30:24,694 - INFO - [Train] step: 14999, loss_G: 9.636635, lr_G: 0.000800
+ 2023-07-12 19:31:28,905 - INFO - [Eval] step: 14999, fid: 31.329617
+ 2023-07-12 19:32:26,223 - INFO - [Train] step: 15099, loss_D: 0.008853, lr_D: 0.000800
+ 2023-07-12 19:32:26,311 - INFO - [Train] step: 15099, loss_G: 8.022595, lr_G: 0.000800
+ 2023-07-12 19:33:23,839 - INFO - [Train] step: 15199, loss_D: 0.006613, lr_D: 0.000800
+ 2023-07-12 19:33:23,926 - INFO - [Train] step: 15199, loss_G: 8.065881, lr_G: 0.000800
+ 2023-07-12 19:34:21,453 - INFO - [Train] step: 15299, loss_D: 0.009662, lr_D: 0.000800
+ 2023-07-12 19:34:21,541 - INFO - [Train] step: 15299, loss_G: 7.805553, lr_G: 0.000800
+ 2023-07-12 19:35:19,088 - INFO - [Train] step: 15399, loss_D: 0.006433, lr_D: 0.000800
+ 2023-07-12 19:35:19,176 - INFO - [Train] step: 15399, loss_G: 7.157223, lr_G: 0.000800
+ 2023-07-12 19:36:16,726 - INFO - [Train] step: 15499, loss_D: 0.020738, lr_D: 0.000800
+ 2023-07-12 19:36:16,813 - INFO - [Train] step: 15499, loss_G: 7.369261, lr_G: 0.000800
+ 2023-07-12 19:37:14,483 - INFO - [Train] step: 15599, loss_D: 0.007516, lr_D: 0.000800
+ 2023-07-12 19:37:14,571 - INFO - [Train] step: 15599, loss_G: 8.355461, lr_G: 0.000800
+ 2023-07-12 19:38:12,087 - INFO - [Train] step: 15699, loss_D: 0.006930, lr_D: 0.000800
+ 2023-07-12 19:38:12,175 - INFO - [Train] step: 15699, loss_G: 8.547059, lr_G: 0.000800
+ 2023-07-12 19:39:09,721 - INFO - [Train] step: 15799, loss_D: 0.007758, lr_D: 0.000800
+ 2023-07-12 19:39:09,808 - INFO - [Train] step: 15799, loss_G: 7.711467, lr_G: 0.000800
+ 2023-07-12 19:40:07,335 - INFO - [Train] step: 15899, loss_D: 0.668111, lr_D: 0.000800
+ 2023-07-12 19:40:07,423 - INFO - [Train] step: 15899, loss_G: 1.805636, lr_G: 0.000800
+ 2023-07-12 19:41:04,937 - INFO - [Train] step: 15999, loss_D: 0.014439, lr_D: 0.000800
+ 2023-07-12 19:41:05,025 - INFO - [Train] step: 15999, loss_G: 7.707753, lr_G: 0.000800
+ 2023-07-12 19:42:09,224 - INFO - [Eval] step: 15999, fid: 27.496213
+ 2023-07-12 19:43:06,872 - INFO - [Train] step: 16099, loss_D: 0.004317, lr_D: 0.000800
+ 2023-07-12 19:43:06,959 - INFO - [Train] step: 16099, loss_G: 7.620277, lr_G: 0.000800
+ 2023-07-12 19:44:04,589 - INFO - [Train] step: 16199, loss_D: 0.002995, lr_D: 0.000800
+ 2023-07-12 19:44:04,677 - INFO - [Train] step: 16199, loss_G: 10.655226, lr_G: 0.000800
+ 2023-07-12 19:45:02,175 - INFO - [Train] step: 16299, loss_D: 0.015546, lr_D: 0.000800
+ 2023-07-12 19:45:02,262 - INFO - [Train] step: 16299, loss_G: 8.730984, lr_G: 0.000800
+ 2023-07-12 19:45:59,769 - INFO - [Train] step: 16399, loss_D: 0.004836, lr_D: 0.000800
+ 2023-07-12 19:45:59,857 - INFO - [Train] step: 16399, loss_G: 8.217114, lr_G: 0.000800
+ 2023-07-12 19:46:57,375 - INFO - [Train] step: 16499, loss_D: 0.018905, lr_D: 0.000800
+ 2023-07-12 19:46:57,463 - INFO - [Train] step: 16499, loss_G: 7.328541, lr_G: 0.000800
+ 2023-07-12 19:47:54,961 - INFO - [Train] step: 16599, loss_D: 0.004462, lr_D: 0.000800
+ 2023-07-12 19:47:55,049 - INFO - [Train] step: 16599, loss_G: 8.639082, lr_G: 0.000800
+ 2023-07-12 19:48:52,557 - INFO - [Train] step: 16699, loss_D: 0.003930, lr_D: 0.000800
+ 2023-07-12 19:48:52,645 - INFO - [Train] step: 16699, loss_G: 8.739486, lr_G: 0.000800
+ 2023-07-12 19:49:50,114 - INFO - [Train] step: 16799, loss_D: 0.624540, lr_D: 0.000800
+ 2023-07-12 19:49:50,201 - INFO - [Train] step: 16799, loss_G: 2.562969, lr_G: 0.000800
+ 2023-07-12 19:50:47,821 - INFO - [Train] step: 16899, loss_D: 0.007945, lr_D: 0.000800
+ 2023-07-12 19:50:47,908 - INFO - [Train] step: 16899, loss_G: 8.452702, lr_G: 0.000800
+ 2023-07-12 19:51:45,397 - INFO - [Train] step: 16999, loss_D: 0.006168, lr_D: 0.000800
+ 2023-07-12 19:51:45,485 - INFO - [Train] step: 16999, loss_G: 7.918242, lr_G: 0.000800
+ 2023-07-12 19:52:49,625 - INFO - [Eval] step: 16999, fid: 29.205224
+ 2023-07-12 19:53:46,966 - INFO - [Train] step: 17099, loss_D: 0.003305, lr_D: 0.000800
+ 2023-07-12 19:53:47,053 - INFO - [Train] step: 17099, loss_G: 8.687705, lr_G: 0.000800
+ 2023-07-12 19:54:44,570 - INFO - [Train] step: 17199, loss_D: 0.251634, lr_D: 0.000800
+ 2023-07-12 19:54:44,658 - INFO - [Train] step: 17199, loss_G: 4.278350, lr_G: 0.000800
+ 2023-07-12 19:55:42,146 - INFO - [Train] step: 17299, loss_D: 0.012229, lr_D: 0.000800
+ 2023-07-12 19:55:42,234 - INFO - [Train] step: 17299, loss_G: 8.460243, lr_G: 0.000800
+ 2023-07-12 19:56:39,727 - INFO - [Train] step: 17399, loss_D: 0.004488, lr_D: 0.000800
+ 2023-07-12 19:56:39,815 - INFO - [Train] step: 17399, loss_G: 8.270636, lr_G: 0.000800
+ 2023-07-12 19:57:37,438 - INFO - [Train] step: 17499, loss_D: 0.002650, lr_D: 0.000800
+ 2023-07-12 19:57:37,526 - INFO - [Train] step: 17499, loss_G: 8.917006, lr_G: 0.000800
+ 2023-07-12 19:58:35,022 - INFO - [Train] step: 17599, loss_D: 0.064571, lr_D: 0.000800
+ 2023-07-12 19:58:35,110 - INFO - [Train] step: 17599, loss_G: 6.797041, lr_G: 0.000800
+ 2023-07-12 19:59:32,589 - INFO - [Train] step: 17699, loss_D: 0.020608, lr_D: 0.000800
+ 2023-07-12 19:59:32,677 - INFO - [Train] step: 17699, loss_G: 7.263035, lr_G: 0.000800
+ 2023-07-12 20:00:30,166 - INFO - [Train] step: 17799, loss_D: 0.004761, lr_D: 0.000800
+ 2023-07-12 20:00:30,254 - INFO - [Train] step: 17799, loss_G: 8.363796, lr_G: 0.000800
+ 2023-07-12 20:01:27,742 - INFO - [Train] step: 17899, loss_D: 0.003941, lr_D: 0.000800
+ 2023-07-12 20:01:27,830 - INFO - [Train] step: 17899, loss_G: 8.821142, lr_G: 0.000800
+ 2023-07-12 20:02:25,322 - INFO - [Train] step: 17999, loss_D: 0.127479, lr_D: 0.000800
+ 2023-07-12 20:02:25,409 - INFO - [Train] step: 17999, loss_G: 5.232607, lr_G: 0.000800
+ 2023-07-12 20:03:29,705 - INFO - [Eval] step: 17999, fid: 25.614563
+ 2023-07-12 20:04:27,432 - INFO - [Train] step: 18099, loss_D: 0.009604, lr_D: 0.000800
+ 2023-07-12 20:04:27,519 - INFO - [Train] step: 18099, loss_G: 7.408797, lr_G: 0.000800
+ 2023-07-12 20:05:25,146 - INFO - [Train] step: 18199, loss_D: 0.005565, lr_D: 0.000800
+ 2023-07-12 20:05:25,234 - INFO - [Train] step: 18199, loss_G: 8.682463, lr_G: 0.000800
+ 2023-07-12 20:06:22,728 - INFO - [Train] step: 18299, loss_D: 0.000129, lr_D: 0.000800
+ 2023-07-12 20:06:22,816 - INFO - [Train] step: 18299, loss_G: 14.515563, lr_G: 0.000800
+ 2023-07-12 20:07:20,316 - INFO - [Train] step: 18399, loss_D: 0.092455, lr_D: 0.000800
+ 2023-07-12 20:07:20,404 - INFO - [Train] step: 18399, loss_G: 6.524065, lr_G: 0.000800
+ 2023-07-12 20:08:17,898 - INFO - [Train] step: 18499, loss_D: 0.013248, lr_D: 0.000800
+ 2023-07-12 20:08:17,985 - INFO - [Train] step: 18499, loss_G: 8.964443, lr_G: 0.000800
+ 2023-07-12 20:09:15,495 - INFO - [Train] step: 18599, loss_D: 0.006180, lr_D: 0.000800
+ 2023-07-12 20:09:15,583 - INFO - [Train] step: 18599, loss_G: 8.387197, lr_G: 0.000800
+ 2023-07-12 20:10:13,068 - INFO - [Train] step: 18699, loss_D: 0.349644, lr_D: 0.000800
+ 2023-07-12 20:10:13,155 - INFO - [Train] step: 18699, loss_G: 2.922549, lr_G: 0.000800
+ 2023-07-12 20:11:10,775 - INFO - [Train] step: 18799, loss_D: 0.015498, lr_D: 0.000800
+ 2023-07-12 20:11:10,863 - INFO - [Train] step: 18799, loss_G: 8.309088, lr_G: 0.000800
+ 2023-07-12 20:12:08,354 - INFO - [Train] step: 18899, loss_D: 0.005159, lr_D: 0.000800
+ 2023-07-12 20:12:08,442 - INFO - [Train] step: 18899, loss_G: 7.997553, lr_G: 0.000800
+ 2023-07-12 20:13:05,912 - INFO - [Train] step: 18999, loss_D: 0.003083, lr_D: 0.000800
+ 2023-07-12 20:13:06,000 - INFO - [Train] step: 18999, loss_G: 8.636139, lr_G: 0.000800
+ 2023-07-12 20:14:10,174 - INFO - [Eval] step: 18999, fid: 31.495495
+ 2023-07-12 20:15:07,530 - INFO - [Train] step: 19099, loss_D: 0.196236, lr_D: 0.000800
+ 2023-07-12 20:15:07,618 - INFO - [Train] step: 19099, loss_G: 4.485887, lr_G: 0.000800
+ 2023-07-12 20:16:05,132 - INFO - [Train] step: 19199, loss_D: 0.000892, lr_D: 0.000800
+ 2023-07-12 20:16:05,220 - INFO - [Train] step: 19199, loss_G: 11.486435, lr_G: 0.000800
+ 2023-07-12 20:17:02,755 - INFO - [Train] step: 19299, loss_D: 0.011981, lr_D: 0.000800
+ 2023-07-12 20:17:02,843 - INFO - [Train] step: 19299, loss_G: 8.722584, lr_G: 0.000800
+ 2023-07-12 20:18:00,385 - INFO - [Train] step: 19399, loss_D: 0.001844, lr_D: 0.000800
+ 2023-07-12 20:18:00,473 - INFO - [Train] step: 19399, loss_G: 9.310270, lr_G: 0.000800
+ 2023-07-12 20:18:58,131 - INFO - [Train] step: 19499, loss_D: 0.120050, lr_D: 0.000800
+ 2023-07-12 20:18:58,219 - INFO - [Train] step: 19499, loss_G: 5.309950, lr_G: 0.000800
+ 2023-07-12 20:19:55,751 - INFO - [Train] step: 19599, loss_D: 0.008602, lr_D: 0.000800
+ 2023-07-12 20:19:55,839 - INFO - [Train] step: 19599, loss_G: 8.204571, lr_G: 0.000800
+ 2023-07-12 20:20:53,350 - INFO - [Train] step: 19699, loss_D: 0.002973, lr_D: 0.000800
+ 2023-07-12 20:20:53,438 - INFO - [Train] step: 19699, loss_G: 9.221530, lr_G: 0.000800
+ 2023-07-12 20:21:50,960 - INFO - [Train] step: 19799, loss_D: 0.000326, lr_D: 0.000800
+ 2023-07-12 20:21:51,048 - INFO - [Train] step: 19799, loss_G: 11.997440, lr_G: 0.000800
+ 2023-07-12 20:22:48,578 - INFO - [Train] step: 19899, loss_D: 0.037789, lr_D: 0.000800
+ 2023-07-12 20:22:48,666 - INFO - [Train] step: 19899, loss_G: 8.102247, lr_G: 0.000800
+ 2023-07-12 20:23:46,169 - INFO - [Train] step: 19999, loss_D: 0.082576, lr_D: 0.000800
+ 2023-07-12 20:23:46,257 - INFO - [Train] step: 19999, loss_G: 5.310421, lr_G: 0.000800
+ 2023-07-12 20:24:50,370 - INFO - [Eval] step: 19999, fid: 25.273269
+ 2023-07-12 20:25:48,365 - INFO - [Train] step: 20099, loss_D: 0.006555, lr_D: 0.000800
+ 2023-07-12 20:25:48,453 - INFO - [Train] step: 20099, loss_G: 8.638845, lr_G: 0.000800
+ 2023-07-12 20:26:45,987 - INFO - [Train] step: 20199, loss_D: 0.009030, lr_D: 0.000800
+ 2023-07-12 20:26:46,075 - INFO - [Train] step: 20199, loss_G: 9.081997, lr_G: 0.000800
+ 2023-07-12 20:27:43,613 - INFO - [Train] step: 20299, loss_D: 0.003769, lr_D: 0.000800
+ 2023-07-12 20:27:43,701 - INFO - [Train] step: 20299, loss_G: 9.091711, lr_G: 0.000800
+ 2023-07-12 20:28:41,239 - INFO - [Train] step: 20399, loss_D: 0.029936, lr_D: 0.000800
+ 2023-07-12 20:28:41,327 - INFO - [Train] step: 20399, loss_G: 7.126809, lr_G: 0.000800
+ 2023-07-12 20:29:38,849 - INFO - [Train] step: 20499, loss_D: 0.006551, lr_D: 0.000800
+ 2023-07-12 20:29:38,936 - INFO - [Train] step: 20499, loss_G: 8.915304, lr_G: 0.000800
+ 2023-07-12 20:30:36,466 - INFO - [Train] step: 20599, loss_D: 0.004902, lr_D: 0.000800
+ 2023-07-12 20:30:36,554 - INFO - [Train] step: 20599, loss_G: 8.482403, lr_G: 0.000800
+ 2023-07-12 20:31:34,227 - INFO - [Train] step: 20699, loss_D: 0.002523, lr_D: 0.000800
+ 2023-07-12 20:31:34,315 - INFO - [Train] step: 20699, loss_G: 9.256380, lr_G: 0.000800
+ 2023-07-12 20:32:31,834 - INFO - [Train] step: 20799, loss_D: 0.032943, lr_D: 0.000800
+ 2023-07-12 20:32:31,922 - INFO - [Train] step: 20799, loss_G: 7.857731, lr_G: 0.000800
+ 2023-07-12 20:33:29,450 - INFO - [Train] step: 20899, loss_D: 0.003886, lr_D: 0.000800
+ 2023-07-12 20:33:29,538 - INFO - [Train] step: 20899, loss_G: 9.247038, lr_G: 0.000800
+ 2023-07-12 20:34:27,067 - INFO - [Train] step: 20999, loss_D: 0.002908, lr_D: 0.000800
+ 2023-07-12 20:34:27,155 - INFO - [Train] step: 20999, loss_G: 8.331944, lr_G: 0.000800
+ 2023-07-12 20:35:31,345 - INFO - [Eval] step: 20999, fid: 30.028896
+ 2023-07-12 20:36:28,638 - INFO - [Train] step: 21099, loss_D: 0.002126, lr_D: 0.000800
+ 2023-07-12 20:36:28,726 - INFO - [Train] step: 21099, loss_G: 9.205379, lr_G: 0.000800
+ 2023-07-12 20:37:26,289 - INFO - [Train] step: 21199, loss_D: 0.149846, lr_D: 0.000800
+ 2023-07-12 20:37:26,377 - INFO - [Train] step: 21199, loss_G: 4.221058, lr_G: 0.000800
+ 2023-07-12 20:38:23,908 - INFO - [Train] step: 21299, loss_D: 0.003761, lr_D: 0.000800
+ 2023-07-12 20:38:23,995 - INFO - [Train] step: 21299, loss_G: 9.138874, lr_G: 0.000800
+ 2023-07-12 20:39:21,655 - INFO - [Train] step: 21399, loss_D: 0.002520, lr_D: 0.000800
+ 2023-07-12 20:39:21,743 - INFO - [Train] step: 21399, loss_G: 10.075739, lr_G: 0.000800
+ 2023-07-12 20:40:19,279 - INFO - [Train] step: 21499, loss_D: 0.018885, lr_D: 0.000800
+ 2023-07-12 20:40:19,367 - INFO - [Train] step: 21499, loss_G: 8.425512, lr_G: 0.000800
+ 2023-07-12 20:41:16,895 - INFO - [Train] step: 21599, loss_D: 0.006437, lr_D: 0.000800
+ 2023-07-12 20:41:16,983 - INFO - [Train] step: 21599, loss_G: 9.219998, lr_G: 0.000800
+ 2023-07-12 20:42:14,524 - INFO - [Train] step: 21699, loss_D: 0.003460, lr_D: 0.000800
+ 2023-07-12 20:42:14,612 - INFO - [Train] step: 21699, loss_G: 9.281570, lr_G: 0.000800
+ 2023-07-12 20:43:12,136 - INFO - [Train] step: 21799, loss_D: 0.358669, lr_D: 0.000800
+ 2023-07-12 20:43:12,224 - INFO - [Train] step: 21799, loss_G: 3.485884, lr_G: 0.000800
+ 2023-07-12 20:44:09,745 - INFO - [Train] step: 21899, loss_D: 0.011900, lr_D: 0.000800
+ 2023-07-12 20:44:09,833 - INFO - [Train] step: 21899, loss_G: 7.709840, lr_G: 0.000800
+ 2023-07-12 20:45:07,483 - INFO - [Train] step: 21999, loss_D: 0.003525, lr_D: 0.000800
+ 2023-07-12 20:45:07,571 - INFO - [Train] step: 21999, loss_G: 9.405258, lr_G: 0.000800
+ 2023-07-12 20:46:11,719 - INFO - [Eval] step: 21999, fid: 30.593891
+ 2023-07-12 20:47:09,011 - INFO - [Train] step: 22099, loss_D: 0.001346, lr_D: 0.000800
+ 2023-07-12 20:47:09,098 - INFO - [Train] step: 22099, loss_G: 10.648615, lr_G: 0.000800
+ 2023-07-12 20:48:06,649 - INFO - [Train] step: 22199, loss_D: 0.028642, lr_D: 0.000800
+ 2023-07-12 20:48:06,737 - INFO - [Train] step: 22199, loss_G: 7.590566, lr_G: 0.000800
+ 2023-07-12 20:49:04,276 - INFO - [Train] step: 22299, loss_D: 0.000499, lr_D: 0.000800
+ 2023-07-12 20:49:04,364 - INFO - [Train] step: 22299, loss_G: 11.025435, lr_G: 0.000800
+ 2023-07-12 20:50:01,903 - INFO - [Train] step: 22399, loss_D: 0.033459, lr_D: 0.000800
+ 2023-07-12 20:50:01,990 - INFO - [Train] step: 22399, loss_G: 7.106796, lr_G: 0.000800
+ 2023-07-12 20:50:59,535 - INFO - [Train] step: 22499, loss_D: 0.009488, lr_D: 0.000800
+ 2023-07-12 20:50:59,623 - INFO - [Train] step: 22499, loss_G: 9.747847, lr_G: 0.000800
+ 2023-07-12 20:51:57,162 - INFO - [Train] step: 22599, loss_D: 0.002946, lr_D: 0.000800
+ 2023-07-12 20:51:57,249 - INFO - [Train] step: 22599, loss_G: 8.806899, lr_G: 0.000800
+ 2023-07-12 20:52:54,949 - INFO - [Train] step: 22699, loss_D: 0.002408, lr_D: 0.000800
+ 2023-07-12 20:52:55,037 - INFO - [Train] step: 22699, loss_G: 9.545119, lr_G: 0.000800
+ 2023-07-12 20:53:52,575 - INFO - [Train] step: 22799, loss_D: 0.119984, lr_D: 0.000800
+ 2023-07-12 20:53:52,663 - INFO - [Train] step: 22799, loss_G: 5.494000, lr_G: 0.000800
+ 2023-07-12 20:54:50,182 - INFO - [Train] step: 22899, loss_D: 0.004186, lr_D: 0.000800
+ 2023-07-12 20:54:50,270 - INFO - [Train] step: 22899, loss_G: 9.423959, lr_G: 0.000800
+ 2023-07-12 20:55:47,789 - INFO - [Train] step: 22999, loss_D: 0.006070, lr_D: 0.000800
+ 2023-07-12 20:55:47,877 - INFO - [Train] step: 22999, loss_G: 9.565794, lr_G: 0.000800
+ 2023-07-12 20:56:51,991 - INFO - [Eval] step: 22999, fid: 31.954991
+ 2023-07-12 20:57:49,242 - INFO - [Train] step: 23099, loss_D: 0.002778, lr_D: 0.000800
+ 2023-07-12 20:57:49,330 - INFO - [Train] step: 23099, loss_G: 9.118521, lr_G: 0.000800
+ 2023-07-12 20:58:46,773 - INFO - [Train] step: 23199, loss_D: 0.034686, lr_D: 0.000800
+ 2023-07-12 20:58:46,860 - INFO - [Train] step: 23199, loss_G: 7.610370, lr_G: 0.000800
+ 2023-07-12 20:59:44,451 - INFO - [Train] step: 23299, loss_D: 0.006708, lr_D: 0.000800
+ 2023-07-12 20:59:44,539 - INFO - [Train] step: 23299, loss_G: 8.725147, lr_G: 0.000800
+ 2023-07-12 21:00:41,966 - INFO - [Train] step: 23399, loss_D: 0.040776, lr_D: 0.000800
+ 2023-07-12 21:00:42,053 - INFO - [Train] step: 23399, loss_G: 7.141646, lr_G: 0.000800
+ 2023-07-12 21:01:39,496 - INFO - [Train] step: 23499, loss_D: 0.004511, lr_D: 0.000800
+ 2023-07-12 21:01:39,584 - INFO - [Train] step: 23499, loss_G: 8.720704, lr_G: 0.000800
+ 2023-07-12 21:02:37,007 - INFO - [Train] step: 23599, loss_D: 0.004814, lr_D: 0.000800
+ 2023-07-12 21:02:37,095 - INFO - [Train] step: 23599, loss_G: 9.989141, lr_G: 0.000800
+ 2023-07-12 21:03:34,586 - INFO - [Train] step: 23699, loss_D: 0.066452, lr_D: 0.000800
+ 2023-07-12 21:03:34,674 - INFO - [Train] step: 23699, loss_G: 6.562090, lr_G: 0.000800
+ 2023-07-12 21:04:32,124 - INFO - [Train] step: 23799, loss_D: 0.003175, lr_D: 0.000800
+ 2023-07-12 21:04:32,212 - INFO - [Train] step: 23799, loss_G: 8.416210, lr_G: 0.000800
+ 2023-07-12 21:05:29,651 - INFO - [Train] step: 23899, loss_D: 0.001774, lr_D: 0.000800
+ 2023-07-12 21:05:29,739 - INFO - [Train] step: 23899, loss_G: 9.443905, lr_G: 0.000800
+ 2023-07-12 21:06:27,332 - INFO - [Train] step: 23999, loss_D: 0.002429, lr_D: 0.000800
+ 2023-07-12 21:06:27,420 - INFO - [Train] step: 23999, loss_G: 10.313082, lr_G: 0.000800
+ 2023-07-12 21:07:31,607 - INFO - [Eval] step: 23999, fid: 30.087386
+ 2023-07-12 21:08:28,836 - INFO - [Train] step: 24099, loss_D: 0.056333, lr_D: 0.000800
+ 2023-07-12 21:08:28,924 - INFO - [Train] step: 24099, loss_G: 7.034041, lr_G: 0.000800
+ 2023-07-12 21:09:26,365 - INFO - [Train] step: 24199, loss_D: 0.002856, lr_D: 0.000800
+ 2023-07-12 21:09:26,452 - INFO - [Train] step: 24199, loss_G: 9.530584, lr_G: 0.000800
+ 2023-07-12 21:10:23,885 - INFO - [Train] step: 24299, loss_D: 0.002775, lr_D: 0.000800
+ 2023-07-12 21:10:23,973 - INFO - [Train] step: 24299, loss_G: 9.159224, lr_G: 0.000800
+ 2023-07-12 21:11:21,421 - INFO - [Train] step: 24399, loss_D: 0.004340, lr_D: 0.000800
+ 2023-07-12 21:11:21,509 - INFO - [Train] step: 24399, loss_G: 9.501318, lr_G: 0.000800
+ 2023-07-12 21:12:18,943 - INFO - [Train] step: 24499, loss_D: 0.090740, lr_D: 0.000800
+ 2023-07-12 21:12:19,031 - INFO - [Train] step: 24499, loss_G: 6.190283, lr_G: 0.000800
+ 2023-07-12 21:13:16,616 - INFO - [Train] step: 24599, loss_D: 0.284641, lr_D: 0.000800
+ 2023-07-12 21:13:16,703 - INFO - [Train] step: 24599, loss_G: 4.444631, lr_G: 0.000800
+ 2023-07-12 21:14:14,129 - INFO - [Train] step: 24699, loss_D: 0.005125, lr_D: 0.000800
+ 2023-07-12 21:14:14,217 - INFO - [Train] step: 24699, loss_G: 8.887693, lr_G: 0.000800
+ 2023-07-12 21:15:11,637 - INFO - [Train] step: 24799, loss_D: 0.001778, lr_D: 0.000800
+ 2023-07-12 21:15:11,725 - INFO - [Train] step: 24799, loss_G: 10.052690, lr_G: 0.000800
+ 2023-07-12 21:16:09,147 - INFO - [Train] step: 24899, loss_D: 0.003375, lr_D: 0.000800
+ 2023-07-12 21:16:09,235 - INFO - [Train] step: 24899, loss_G: 9.332700, lr_G: 0.000800
+ 2023-07-12 21:17:06,630 - INFO - [Train] step: 24999, loss_D: 0.090686, lr_D: 0.000800
+ 2023-07-12 21:17:06,717 - INFO - [Train] step: 24999, loss_G: 6.137249, lr_G: 0.000800
+ 2023-07-12 21:18:10,925 - INFO - [Eval] step: 24999, fid: 29.141241
+ 2023-07-12 21:19:08,154 - INFO - [Train] step: 25099, loss_D: 0.007973, lr_D: 0.000800
+ 2023-07-12 21:19:08,242 - INFO - [Train] step: 25099, loss_G: 9.243431, lr_G: 0.000800
+ 2023-07-12 21:20:05,675 - INFO - [Train] step: 25199, loss_D: 0.002208, lr_D: 0.000800
+ 2023-07-12 21:20:05,762 - INFO - [Train] step: 25199, loss_G: 10.213819, lr_G: 0.000800
+ 2023-07-12 21:21:03,343 - INFO - [Train] step: 25299, loss_D: 0.001321, lr_D: 0.000800
+ 2023-07-12 21:21:03,431 - INFO - [Train] step: 25299, loss_G: 9.658791, lr_G: 0.000800
+ 2023-07-12 21:22:00,869 - INFO - [Train] step: 25399, loss_D: 0.061258, lr_D: 0.000800
+ 2023-07-12 21:22:00,957 - INFO - [Train] step: 25399, loss_G: 9.202701, lr_G: 0.000800
+ 2023-07-12 21:22:58,385 - INFO - [Train] step: 25499, loss_D: 0.004342, lr_D: 0.000800
+ 2023-07-12 21:22:58,474 - INFO - [Train] step: 25499, loss_G: 9.858047, lr_G: 0.000800
+ 2023-07-12 21:23:55,882 - INFO - [Train] step: 25599, loss_D: 0.005552, lr_D: 0.000800
+ 2023-07-12 21:23:55,970 - INFO - [Train] step: 25599, loss_G: 9.966296, lr_G: 0.000800
+ 2023-07-12 21:24:53,403 - INFO - [Train] step: 25699, loss_D: 0.001028, lr_D: 0.000800
+ 2023-07-12 21:24:53,491 - INFO - [Train] step: 25699, loss_G: 12.139588, lr_G: 0.000800
+ 2023-07-12 21:25:50,933 - INFO - [Train] step: 25799, loss_D: 0.017251, lr_D: 0.000800
+ 2023-07-12 21:25:51,020 - INFO - [Train] step: 25799, loss_G: 9.335152, lr_G: 0.000800
+ 2023-07-12 21:26:48,612 - INFO - [Train] step: 25899, loss_D: 0.004782, lr_D: 0.000800
+ 2023-07-12 21:26:48,701 - INFO - [Train] step: 25899, loss_G: 9.611577, lr_G: 0.000800
+ 2023-07-12 21:27:46,132 - INFO - [Train] step: 25999, loss_D: 0.001584, lr_D: 0.000800
+ 2023-07-12 21:27:46,220 - INFO - [Train] step: 25999, loss_G: 10.825005, lr_G: 0.000800
+ 2023-07-12 21:28:50,455 - INFO - [Eval] step: 25999, fid: 31.382735
+ 2023-07-12 21:29:47,738 - INFO - [Train] step: 26099, loss_D: 0.018527, lr_D: 0.000800
+ 2023-07-12 21:29:47,825 - INFO - [Train] step: 26099, loss_G: 7.776439, lr_G: 0.000800
+ 2023-07-12 21:30:45,270 - INFO - [Train] step: 26199, loss_D: 0.004127, lr_D: 0.000800
+ 2023-07-12 21:30:45,357 - INFO - [Train] step: 26199, loss_G: 10.917329, lr_G: 0.000800
+ 2023-07-12 21:31:42,808 - INFO - [Train] step: 26299, loss_D: 0.003236, lr_D: 0.000800
+ 2023-07-12 21:31:42,897 - INFO - [Train] step: 26299, loss_G: 9.382841, lr_G: 0.000800
+ 2023-07-12 21:32:40,345 - INFO - [Train] step: 26399, loss_D: 0.001694, lr_D: 0.000800
+ 2023-07-12 21:32:40,433 - INFO - [Train] step: 26399, loss_G: 10.693460, lr_G: 0.000800
+ 2023-07-12 21:33:37,883 - INFO - [Train] step: 26499, loss_D: 0.012227, lr_D: 0.000800
+ 2023-07-12 21:33:37,971 - INFO - [Train] step: 26499, loss_G: 8.775512, lr_G: 0.000800
+ 2023-07-12 21:34:35,571 - INFO - [Train] step: 26599, loss_D: 0.006952, lr_D: 0.000800
+ 2023-07-12 21:34:35,659 - INFO - [Train] step: 26599, loss_G: 9.833460, lr_G: 0.000800
+ 2023-07-12 21:35:33,158 - INFO - [Train] step: 26699, loss_D: 0.004025, lr_D: 0.000800
+ 2023-07-12 21:35:33,246 - INFO - [Train] step: 26699, loss_G: 10.285404, lr_G: 0.000800
+ 2023-07-12 21:36:30,686 - INFO - [Train] step: 26799, loss_D: 0.016425, lr_D: 0.000800
+ 2023-07-12 21:36:30,773 - INFO - [Train] step: 26799, loss_G: 8.288247, lr_G: 0.000800
+ 2023-07-12 21:37:28,201 - INFO - [Train] step: 26899, loss_D: 0.008729, lr_D: 0.000800
+ 2023-07-12 21:37:28,289 - INFO - [Train] step: 26899, loss_G: 9.647539, lr_G: 0.000800
+ 2023-07-12 21:38:25,707 - INFO - [Train] step: 26999, loss_D: 0.003342, lr_D: 0.000800
+ 2023-07-12 21:38:25,795 - INFO - [Train] step: 26999, loss_G: 10.802924, lr_G: 0.000800
+ 2023-07-12 21:39:29,941 - INFO - [Eval] step: 26999, fid: 33.825623
+ 2023-07-12 21:40:27,234 - INFO - [Train] step: 27099, loss_D: 0.274537, lr_D: 0.000800
+ 2023-07-12 21:40:27,321 - INFO - [Train] step: 27099, loss_G: 4.335018, lr_G: 0.000800
+ 2023-07-12 21:41:24,924 - INFO - [Train] step: 27199, loss_D: 0.007387, lr_D: 0.000800
+ 2023-07-12 21:41:25,012 - INFO - [Train] step: 27199, loss_G: 9.896776, lr_G: 0.000800
+ 2023-07-12 21:42:22,496 - INFO - [Train] step: 27299, loss_D: 0.001368, lr_D: 0.000800
+ 2023-07-12 21:42:22,585 - INFO - [Train] step: 27299, loss_G: 10.709142, lr_G: 0.000800
+ 2023-07-12 21:43:20,033 - INFO - [Train] step: 27399, loss_D: 0.000269, lr_D: 0.000800
+ 2023-07-12 21:43:20,120 - INFO - [Train] step: 27399, loss_G: 11.924906, lr_G: 0.000800
+ 2023-07-12 21:44:17,577 - INFO - [Train] step: 27499, loss_D: 0.030256, lr_D: 0.000800
+ 2023-07-12 21:44:17,664 - INFO - [Train] step: 27499, loss_G: 7.979499, lr_G: 0.000800
+ 2023-07-12 21:45:15,119 - INFO - [Train] step: 27599, loss_D: 0.006751, lr_D: 0.000800
+ 2023-07-12 21:45:15,206 - INFO - [Train] step: 27599, loss_G: 9.640858, lr_G: 0.000800
+ 2023-07-12 21:46:12,696 - INFO - [Train] step: 27699, loss_D: 0.001246, lr_D: 0.000800
+ 2023-07-12 21:46:12,784 - INFO - [Train] step: 27699, loss_G: 10.003857, lr_G: 0.000800
+ 2023-07-12 21:47:10,222 - INFO - [Train] step: 27799, loss_D: 0.255972, lr_D: 0.000800
+ 2023-07-12 21:47:10,309 - INFO - [Train] step: 27799, loss_G: 4.942447, lr_G: 0.000800
+ 2023-07-12 21:48:07,903 - INFO - [Train] step: 27899, loss_D: 0.009320, lr_D: 0.000800
+ 2023-07-12 21:48:07,991 - INFO - [Train] step: 27899, loss_G: 9.027043, lr_G: 0.000800
+ 2023-07-12 21:49:05,469 - INFO - [Train] step: 27999, loss_D: 0.002308, lr_D: 0.000800
+ 2023-07-12 21:49:05,557 - INFO - [Train] step: 27999, loss_G: 9.631499, lr_G: 0.000800
+ 2023-07-12 21:50:09,757 - INFO - [Eval] step: 27999, fid: 31.702556
+ 2023-07-12 21:51:07,010 - INFO - [Train] step: 28099, loss_D: 0.003240, lr_D: 0.000800
+ 2023-07-12 21:51:07,098 - INFO - [Train] step: 28099, loss_G: 10.972561, lr_G: 0.000800
+ 2023-07-12 21:52:04,528 - INFO - [Train] step: 28199, loss_D: 0.000051, lr_D: 0.000800
+ 2023-07-12 21:52:04,615 - INFO - [Train] step: 28199, loss_G: 15.420893, lr_G: 0.000800
+ 2023-07-12 21:53:02,062 - INFO - [Train] step: 28299, loss_D: 0.070368, lr_D: 0.000800
+ 2023-07-12 21:53:02,150 - INFO - [Train] step: 28299, loss_G: 5.961785, lr_G: 0.000800
+ 2023-07-12 21:53:59,591 - INFO - [Train] step: 28399, loss_D: 0.007552, lr_D: 0.000800
+ 2023-07-12 21:53:59,678 - INFO - [Train] step: 28399, loss_G: 10.017643, lr_G: 0.000800
+ 2023-07-12 21:54:57,250 - INFO - [Train] step: 28499, loss_D: 0.004088, lr_D: 0.000800
+ 2023-07-12 21:54:57,338 - INFO - [Train] step: 28499, loss_G: 11.113300, lr_G: 0.000800
+ 2023-07-12 21:55:54,770 - INFO - [Train] step: 28599, loss_D: 0.013245, lr_D: 0.000800
+ 2023-07-12 21:55:54,857 - INFO - [Train] step: 28599, loss_G: 11.390383, lr_G: 0.000800
+ 2023-07-12 21:56:52,272 - INFO - [Train] step: 28699, loss_D: 0.007095, lr_D: 0.000800
+ 2023-07-12 21:56:52,360 - INFO - [Train] step: 28699, loss_G: 9.659991, lr_G: 0.000800
+ 2023-07-12 21:57:49,809 - INFO - [Train] step: 28799, loss_D: 0.003698, lr_D: 0.000800
+ 2023-07-12 21:57:49,897 - INFO - [Train] step: 28799, loss_G: 10.763487, lr_G: 0.000800
+ 2023-07-12 21:58:47,345 - INFO - [Train] step: 28899, loss_D: 0.006696, lr_D: 0.000800
+ 2023-07-12 21:58:47,433 - INFO - [Train] step: 28899, loss_G: 9.704466, lr_G: 0.000800
+ 2023-07-12 21:59:44,869 - INFO - [Train] step: 28999, loss_D: 0.271052, lr_D: 0.000800
+ 2023-07-12 21:59:44,957 - INFO - [Train] step: 28999, loss_G: 4.130814, lr_G: 0.000800
+ 2023-07-12 22:00:49,138 - INFO - [Eval] step: 28999, fid: 28.163407
+ 2023-07-12 22:01:46,409 - INFO - [Train] step: 29099, loss_D: 0.003764, lr_D: 0.000800
+ 2023-07-12 22:01:46,496 - INFO - [Train] step: 29099, loss_G: 10.126225, lr_G: 0.000800
+ 2023-07-12 22:02:44,098 - INFO - [Train] step: 29199, loss_D: 0.001515, lr_D: 0.000800
+ 2023-07-12 22:02:44,185 - INFO - [Train] step: 29199, loss_G: 10.568647, lr_G: 0.000800
+ 2023-07-12 22:03:41,616 - INFO - [Train] step: 29299, loss_D: 0.003810, lr_D: 0.000800
+ 2023-07-12 22:03:41,703 - INFO - [Train] step: 29299, loss_G: 10.424676, lr_G: 0.000800
+ 2023-07-12 22:04:39,150 - INFO - [Train] step: 29399, loss_D: 0.039448, lr_D: 0.000800
+ 2023-07-12 22:04:39,238 - INFO - [Train] step: 29399, loss_G: 7.600483, lr_G: 0.000800
+ 2023-07-12 22:05:36,654 - INFO - [Train] step: 29499, loss_D: 0.004987, lr_D: 0.000800
+ 2023-07-12 22:05:36,742 - INFO - [Train] step: 29499, loss_G: 10.001286, lr_G: 0.000800
+ 2023-07-12 22:06:34,216 - INFO - [Train] step: 29599, loss_D: 0.001375, lr_D: 0.000800
+ 2023-07-12 22:06:34,303 - INFO - [Train] step: 29599, loss_G: 11.971235, lr_G: 0.000800
+ 2023-07-12 22:07:31,757 - INFO - [Train] step: 29699, loss_D: 0.008408, lr_D: 0.000800
+ 2023-07-12 22:07:31,844 - INFO - [Train] step: 29699, loss_G: 10.731680, lr_G: 0.000800
+ 2023-07-12 22:08:29,442 - INFO - [Train] step: 29799, loss_D: 0.011289, lr_D: 0.000800
+ 2023-07-12 22:08:29,530 - INFO - [Train] step: 29799, loss_G: 9.489864, lr_G: 0.000800
+ 2023-07-12 22:09:26,981 - INFO - [Train] step: 29899, loss_D: 0.004255, lr_D: 0.000800
+ 2023-07-12 22:09:27,069 - INFO - [Train] step: 29899, loss_G: 10.597130, lr_G: 0.000800
+ 2023-07-12 22:10:24,524 - INFO - [Train] step: 29999, loss_D: 0.006684, lr_D: 0.000800
+ 2023-07-12 22:10:24,611 - INFO - [Train] step: 29999, loss_G: 10.497586, lr_G: 0.000800
+ 2023-07-12 22:11:28,863 - INFO - [Eval] step: 29999, fid: 32.671274
+ 2023-07-12 22:12:26,338 - INFO - [Train] step: 30099, loss_D: 0.053281, lr_D: 0.000800
+ 2023-07-12 22:12:26,426 - INFO - [Train] step: 30099, loss_G: 7.095622, lr_G: 0.000800
+ 2023-07-12 22:13:23,873 - INFO - [Train] step: 30199, loss_D: 0.001930, lr_D: 0.000800
+ 2023-07-12 22:13:23,960 - INFO - [Train] step: 30199, loss_G: 10.507368, lr_G: 0.000800
+ 2023-07-12 22:14:21,395 - INFO - [Train] step: 30299, loss_D: 0.000255, lr_D: 0.000800
+ 2023-07-12 22:14:21,482 - INFO - [Train] step: 30299, loss_G: 12.630657, lr_G: 0.000800
+ 2023-07-12 22:15:19,069 - INFO - [Train] step: 30399, loss_D: 0.007309, lr_D: 0.000800
+ 2023-07-12 22:15:19,157 - INFO - [Train] step: 30399, loss_G: 10.190763, lr_G: 0.000800
+ 2023-07-12 22:16:16,605 - INFO - [Train] step: 30499, loss_D: 0.001074, lr_D: 0.000800
+ 2023-07-12 22:16:16,692 - INFO - [Train] step: 30499, loss_G: 10.938757, lr_G: 0.000800
+ 2023-07-12 22:17:14,122 - INFO - [Train] step: 30599, loss_D: 0.001955, lr_D: 0.000800
+ 2023-07-12 22:17:14,210 - INFO - [Train] step: 30599, loss_G: 11.106121, lr_G: 0.000800
653
+ 2023-07-12 22:18:11,677 - INFO - [Train] step: 30699, loss_D: 0.220415, lr_D: 0.000800
654
+ 2023-07-12 22:18:11,765 - INFO - [Train] step: 30699, loss_G: 4.759703, lr_G: 0.000800
655
+ 2023-07-12 22:19:09,216 - INFO - [Train] step: 30799, loss_D: 0.003759, lr_D: 0.000800
656
+ 2023-07-12 22:19:09,303 - INFO - [Train] step: 30799, loss_G: 9.949022, lr_G: 0.000800
657
+ 2023-07-12 22:20:06,738 - INFO - [Train] step: 30899, loss_D: 0.002265, lr_D: 0.000800
658
+ 2023-07-12 22:20:06,826 - INFO - [Train] step: 30899, loss_G: 11.118217, lr_G: 0.000800
659
+ 2023-07-12 22:21:04,256 - INFO - [Train] step: 30999, loss_D: 0.003710, lr_D: 0.000800
660
+ 2023-07-12 22:21:04,343 - INFO - [Train] step: 30999, loss_G: 10.478477, lr_G: 0.000800
661
+ 2023-07-12 22:22:08,572 - INFO - [Eval] step: 30999, fid: 33.573450
662
+ 2023-07-12 22:23:05,981 - INFO - [Train] step: 31099, loss_D: 0.175706, lr_D: 0.000800
663
+ 2023-07-12 22:23:06,069 - INFO - [Train] step: 31099, loss_G: 5.722181, lr_G: 0.000800
664
+ 2023-07-12 22:24:03,525 - INFO - [Train] step: 31199, loss_D: 0.005773, lr_D: 0.000800
665
+ 2023-07-12 22:24:03,613 - INFO - [Train] step: 31199, loss_G: 9.832756, lr_G: 0.000800
666
+ 2023-07-12 22:25:01,061 - INFO - [Train] step: 31299, loss_D: 0.001319, lr_D: 0.000800
667
+ 2023-07-12 22:25:01,149 - INFO - [Train] step: 31299, loss_G: 11.099640, lr_G: 0.000800
668
+ 2023-07-12 22:25:58,614 - INFO - [Train] step: 31399, loss_D: 0.008508, lr_D: 0.000800
669
+ 2023-07-12 22:25:58,702 - INFO - [Train] step: 31399, loss_G: 11.431013, lr_G: 0.000800
670
+ 2023-07-12 22:26:56,138 - INFO - [Train] step: 31499, loss_D: 0.026461, lr_D: 0.000800
671
+ 2023-07-12 22:26:56,226 - INFO - [Train] step: 31499, loss_G: 8.374590, lr_G: 0.000800
672
+ 2023-07-12 22:27:53,637 - INFO - [Train] step: 31599, loss_D: 0.002943, lr_D: 0.000800
673
+ 2023-07-12 22:27:53,725 - INFO - [Train] step: 31599, loss_G: 10.384491, lr_G: 0.000800
674
+ 2023-07-12 22:28:51,298 - INFO - [Train] step: 31699, loss_D: 0.001426, lr_D: 0.000800
675
+ 2023-07-12 22:28:51,386 - INFO - [Train] step: 31699, loss_G: 11.429605, lr_G: 0.000800
676
+ 2023-07-12 22:29:48,843 - INFO - [Train] step: 31799, loss_D: 0.003666, lr_D: 0.000800
677
+ 2023-07-12 22:29:48,931 - INFO - [Train] step: 31799, loss_G: 11.365452, lr_G: 0.000800
678
+ 2023-07-12 22:30:46,384 - INFO - [Train] step: 31899, loss_D: 0.008970, lr_D: 0.000800
679
+ 2023-07-12 22:30:46,471 - INFO - [Train] step: 31899, loss_G: 10.068462, lr_G: 0.000800
680
+ 2023-07-12 22:31:43,901 - INFO - [Train] step: 31999, loss_D: 0.010737, lr_D: 0.000800
681
+ 2023-07-12 22:31:43,988 - INFO - [Train] step: 31999, loss_G: 9.883368, lr_G: 0.000800
682
+ 2023-07-12 22:32:48,133 - INFO - [Eval] step: 31999, fid: 35.443428
+ 2023-07-12 22:33:45,399 - INFO - [Train] step: 32099, loss_D: 0.006147, lr_D: 0.000800
+ 2023-07-12 22:33:45,487 - INFO - [Train] step: 32099, loss_G: 10.407740, lr_G: 0.000800
+ 2023-07-12 22:34:42,939 - INFO - [Train] step: 32199, loss_D: 0.019705, lr_D: 0.000800
+ 2023-07-12 22:34:43,026 - INFO - [Train] step: 32199, loss_G: 9.505739, lr_G: 0.000800
+ 2023-07-12 22:35:40,458 - INFO - [Train] step: 32299, loss_D: 0.003729, lr_D: 0.000800
+ 2023-07-12 22:35:40,546 - INFO - [Train] step: 32299, loss_G: 12.261547, lr_G: 0.000800
+ 2023-07-12 22:36:38,165 - INFO - [Train] step: 32399, loss_D: 0.001340, lr_D: 0.000800
+ 2023-07-12 22:36:38,253 - INFO - [Train] step: 32399, loss_G: 12.019957, lr_G: 0.000800
+ 2023-07-12 22:37:35,693 - INFO - [Train] step: 32499, loss_D: 0.035626, lr_D: 0.000800
+ 2023-07-12 22:37:35,780 - INFO - [Train] step: 32499, loss_G: 8.075516, lr_G: 0.000800
+ 2023-07-12 22:38:33,234 - INFO - [Train] step: 32599, loss_D: 0.005533, lr_D: 0.000800
+ 2023-07-12 22:38:33,322 - INFO - [Train] step: 32599, loss_G: 10.814066, lr_G: 0.000800
+ 2023-07-12 22:39:30,751 - INFO - [Train] step: 32699, loss_D: 0.003841, lr_D: 0.000800
+ 2023-07-12 22:39:30,839 - INFO - [Train] step: 32699, loss_G: 10.745924, lr_G: 0.000800
+ 2023-07-12 22:40:28,256 - INFO - [Train] step: 32799, loss_D: 0.000586, lr_D: 0.000800
+ 2023-07-12 22:40:28,344 - INFO - [Train] step: 32799, loss_G: 13.178089, lr_G: 0.000800
+ 2023-07-12 22:41:25,778 - INFO - [Train] step: 32899, loss_D: 0.014644, lr_D: 0.000800
+ 2023-07-12 22:41:25,866 - INFO - [Train] step: 32899, loss_G: 11.053964, lr_G: 0.000800
+ 2023-07-12 22:42:23,449 - INFO - [Train] step: 32999, loss_D: 0.005550, lr_D: 0.000800
+ 2023-07-12 22:42:23,536 - INFO - [Train] step: 32999, loss_G: 10.551401, lr_G: 0.000800
+ 2023-07-12 22:43:27,725 - INFO - [Eval] step: 32999, fid: 28.576473
+ 2023-07-12 22:44:25,027 - INFO - [Train] step: 33099, loss_D: 0.001653, lr_D: 0.000800
+ 2023-07-12 22:44:25,114 - INFO - [Train] step: 33099, loss_G: 11.080724, lr_G: 0.000800
+ 2023-07-12 22:45:22,570 - INFO - [Train] step: 33199, loss_D: 0.002321, lr_D: 0.000800
+ 2023-07-12 22:45:22,657 - INFO - [Train] step: 33199, loss_G: 9.947018, lr_G: 0.000800
+ 2023-07-12 22:46:20,111 - INFO - [Train] step: 33299, loss_D: 0.003834, lr_D: 0.000800
+ 2023-07-12 22:46:20,198 - INFO - [Train] step: 33299, loss_G: 10.804760, lr_G: 0.000800
+ 2023-07-12 22:47:17,611 - INFO - [Train] step: 33399, loss_D: 0.014321, lr_D: 0.000800
+ 2023-07-12 22:47:17,699 - INFO - [Train] step: 33399, loss_G: 11.016256, lr_G: 0.000800
+ 2023-07-12 22:48:15,163 - INFO - [Train] step: 33499, loss_D: 0.004392, lr_D: 0.000800
+ 2023-07-12 22:48:15,252 - INFO - [Train] step: 33499, loss_G: 11.169493, lr_G: 0.000800
+ 2023-07-12 22:49:12,737 - INFO - [Train] step: 33599, loss_D: 0.011924, lr_D: 0.000800
+ 2023-07-12 22:49:12,824 - INFO - [Train] step: 33599, loss_G: 9.494864, lr_G: 0.000800
+ 2023-07-12 22:50:10,395 - INFO - [Train] step: 33699, loss_D: 0.003595, lr_D: 0.000800
+ 2023-07-12 22:50:10,482 - INFO - [Train] step: 33699, loss_G: 11.242525, lr_G: 0.000800
+ 2023-07-12 22:51:08,012 - INFO - [Train] step: 33799, loss_D: 0.009201, lr_D: 0.000800
+ 2023-07-12 22:51:08,100 - INFO - [Train] step: 33799, loss_G: 10.786451, lr_G: 0.000800
+ 2023-07-12 22:52:05,560 - INFO - [Train] step: 33899, loss_D: 0.023879, lr_D: 0.000800
+ 2023-07-12 22:52:05,648 - INFO - [Train] step: 33899, loss_G: 9.217155, lr_G: 0.000800
+ 2023-07-12 22:53:03,146 - INFO - [Train] step: 33999, loss_D: 0.006648, lr_D: 0.000800
+ 2023-07-12 22:53:03,234 - INFO - [Train] step: 33999, loss_G: 12.203970, lr_G: 0.000800
+ 2023-07-12 22:54:07,477 - INFO - [Eval] step: 33999, fid: 33.351323
+ 2023-07-12 22:55:04,751 - INFO - [Train] step: 34099, loss_D: 0.001176, lr_D: 0.000800
+ 2023-07-12 22:55:04,839 - INFO - [Train] step: 34099, loss_G: 11.682909, lr_G: 0.000800
+ 2023-07-12 22:56:02,307 - INFO - [Train] step: 34199, loss_D: 0.007645, lr_D: 0.000800
+ 2023-07-12 22:56:02,395 - INFO - [Train] step: 34199, loss_G: 11.872160, lr_G: 0.000800
+ 2023-07-12 22:56:59,973 - INFO - [Train] step: 34299, loss_D: 0.007828, lr_D: 0.000800
+ 2023-07-12 22:57:00,061 - INFO - [Train] step: 34299, loss_G: 10.393745, lr_G: 0.000800
+ 2023-07-12 22:57:57,490 - INFO - [Train] step: 34399, loss_D: 0.005186, lr_D: 0.000800
+ 2023-07-12 22:57:57,578 - INFO - [Train] step: 34399, loss_G: 10.462216, lr_G: 0.000800
+ 2023-07-12 22:58:55,023 - INFO - [Train] step: 34499, loss_D: 0.004028, lr_D: 0.000800
+ 2023-07-12 22:58:55,111 - INFO - [Train] step: 34499, loss_G: 11.343937, lr_G: 0.000800
+ 2023-07-12 22:59:52,576 - INFO - [Train] step: 34599, loss_D: 0.020670, lr_D: 0.000800
+ 2023-07-12 22:59:52,663 - INFO - [Train] step: 34599, loss_G: 9.974153, lr_G: 0.000800
+ 2023-07-12 23:00:50,102 - INFO - [Train] step: 34699, loss_D: 0.005216, lr_D: 0.000800
+ 2023-07-12 23:00:50,189 - INFO - [Train] step: 34699, loss_G: 11.225788, lr_G: 0.000800
+ 2023-07-12 23:01:47,630 - INFO - [Train] step: 34799, loss_D: 0.011507, lr_D: 0.000800
+ 2023-07-12 23:01:47,718 - INFO - [Train] step: 34799, loss_G: 11.041207, lr_G: 0.000800
+ 2023-07-12 23:02:45,148 - INFO - [Train] step: 34899, loss_D: 0.095275, lr_D: 0.000800
+ 2023-07-12 23:02:45,235 - INFO - [Train] step: 34899, loss_G: 6.455341, lr_G: 0.000800
+ 2023-07-12 23:03:42,786 - INFO - [Train] step: 34999, loss_D: 0.002610, lr_D: 0.000800
+ 2023-07-12 23:03:42,873 - INFO - [Train] step: 34999, loss_G: 10.728231, lr_G: 0.000800
+ 2023-07-12 23:04:47,104 - INFO - [Eval] step: 34999, fid: 30.765701
+ 2023-07-12 23:05:44,397 - INFO - [Train] step: 35099, loss_D: 0.005187, lr_D: 0.000800
+ 2023-07-12 23:05:44,485 - INFO - [Train] step: 35099, loss_G: 10.715443, lr_G: 0.000800
+ 2023-07-12 23:06:41,922 - INFO - [Train] step: 35199, loss_D: 0.010139, lr_D: 0.000800
+ 2023-07-12 23:06:42,010 - INFO - [Train] step: 35199, loss_G: 10.884523, lr_G: 0.000800
+ 2023-07-12 23:07:39,428 - INFO - [Train] step: 35299, loss_D: 0.005389, lr_D: 0.000800
+ 2023-07-12 23:07:39,515 - INFO - [Train] step: 35299, loss_G: 10.399111, lr_G: 0.000800
+ 2023-07-12 23:08:36,981 - INFO - [Train] step: 35399, loss_D: 0.003755, lr_D: 0.000800
+ 2023-07-12 23:08:37,069 - INFO - [Train] step: 35399, loss_G: 10.443095, lr_G: 0.000800
+ 2023-07-12 23:09:34,519 - INFO - [Train] step: 35499, loss_D: 0.002526, lr_D: 0.000800
+ 2023-07-12 23:09:34,607 - INFO - [Train] step: 35499, loss_G: 11.863384, lr_G: 0.000800
+ 2023-07-12 23:10:32,178 - INFO - [Train] step: 35599, loss_D: 0.116400, lr_D: 0.000800
+ 2023-07-12 23:10:32,266 - INFO - [Train] step: 35599, loss_G: 7.125185, lr_G: 0.000800
+ 2023-07-12 23:11:29,672 - INFO - [Train] step: 35699, loss_D: 0.000549, lr_D: 0.000800
+ 2023-07-12 23:11:29,760 - INFO - [Train] step: 35699, loss_G: 16.245834, lr_G: 0.000800
+ 2023-07-12 23:12:27,201 - INFO - [Train] step: 35799, loss_D: 0.010855, lr_D: 0.000800
+ 2023-07-12 23:12:27,289 - INFO - [Train] step: 35799, loss_G: 10.353662, lr_G: 0.000800
+ 2023-07-12 23:13:24,703 - INFO - [Train] step: 35899, loss_D: 0.002622, lr_D: 0.000800
+ 2023-07-12 23:13:24,791 - INFO - [Train] step: 35899, loss_G: 10.945267, lr_G: 0.000800
+ 2023-07-12 23:14:22,233 - INFO - [Train] step: 35999, loss_D: 0.005417, lr_D: 0.000800
+ 2023-07-12 23:14:22,321 - INFO - [Train] step: 35999, loss_G: 11.604994, lr_G: 0.000800
+ 2023-07-12 23:15:26,492 - INFO - [Eval] step: 35999, fid: 33.813961
+ 2023-07-12 23:16:23,835 - INFO - [Train] step: 36099, loss_D: 0.003256, lr_D: 0.000800
+ 2023-07-12 23:16:23,922 - INFO - [Train] step: 36099, loss_G: 12.505394, lr_G: 0.000800
+ 2023-07-12 23:17:21,363 - INFO - [Train] step: 36199, loss_D: 0.013223, lr_D: 0.000800
+ 2023-07-12 23:17:21,450 - INFO - [Train] step: 36199, loss_G: 10.757958, lr_G: 0.000800
+ 2023-07-12 23:18:19,030 - INFO - [Train] step: 36299, loss_D: 0.006589, lr_D: 0.000800
+ 2023-07-12 23:18:19,118 - INFO - [Train] step: 36299, loss_G: 10.596739, lr_G: 0.000800
+ 2023-07-12 23:19:16,590 - INFO - [Train] step: 36399, loss_D: 0.002567, lr_D: 0.000800
+ 2023-07-12 23:19:16,678 - INFO - [Train] step: 36399, loss_G: 11.028767, lr_G: 0.000800
+ 2023-07-12 23:20:14,158 - INFO - [Train] step: 36499, loss_D: 0.004308, lr_D: 0.000800
+ 2023-07-12 23:20:14,246 - INFO - [Train] step: 36499, loss_G: 11.318540, lr_G: 0.000800
+ 2023-07-12 23:21:11,697 - INFO - [Train] step: 36599, loss_D: 0.011383, lr_D: 0.000800
+ 2023-07-12 23:21:11,785 - INFO - [Train] step: 36599, loss_G: 10.280243, lr_G: 0.000800
+ 2023-07-12 23:22:09,210 - INFO - [Train] step: 36699, loss_D: 0.002741, lr_D: 0.000800
+ 2023-07-12 23:22:09,297 - INFO - [Train] step: 36699, loss_G: 11.439182, lr_G: 0.000800
+ 2023-07-12 23:23:06,763 - INFO - [Train] step: 36799, loss_D: 0.001333, lr_D: 0.000800
+ 2023-07-12 23:23:06,851 - INFO - [Train] step: 36799, loss_G: 11.009180, lr_G: 0.000800
+ 2023-07-12 23:24:04,417 - INFO - [Train] step: 36899, loss_D: 0.004642, lr_D: 0.000800
+ 2023-07-12 23:24:04,505 - INFO - [Train] step: 36899, loss_G: 11.503386, lr_G: 0.000800
+ 2023-07-12 23:25:01,947 - INFO - [Train] step: 36999, loss_D: 0.060789, lr_D: 0.000800
+ 2023-07-12 23:25:02,035 - INFO - [Train] step: 36999, loss_G: 7.120116, lr_G: 0.000800
+ 2023-07-12 23:26:06,249 - INFO - [Eval] step: 36999, fid: 30.180895
+ 2023-07-12 23:27:03,490 - INFO - [Train] step: 37099, loss_D: 0.002030, lr_D: 0.000800
+ 2023-07-12 23:27:03,578 - INFO - [Train] step: 37099, loss_G: 11.002737, lr_G: 0.000800
+ 2023-07-12 23:28:01,023 - INFO - [Train] step: 37199, loss_D: 0.001246, lr_D: 0.000800
+ 2023-07-12 23:28:01,111 - INFO - [Train] step: 37199, loss_G: 10.928606, lr_G: 0.000800
+ 2023-07-12 23:28:58,544 - INFO - [Train] step: 37299, loss_D: 0.001570, lr_D: 0.000800
+ 2023-07-12 23:28:58,632 - INFO - [Train] step: 37299, loss_G: 12.395868, lr_G: 0.000800
+ 2023-07-12 23:29:56,066 - INFO - [Train] step: 37399, loss_D: 0.027853, lr_D: 0.000800
+ 2023-07-12 23:29:56,154 - INFO - [Train] step: 37399, loss_G: 8.463290, lr_G: 0.000800
+ 2023-07-12 23:30:53,579 - INFO - [Train] step: 37499, loss_D: 0.004656, lr_D: 0.000800
+ 2023-07-12 23:30:53,666 - INFO - [Train] step: 37499, loss_G: 10.859463, lr_G: 0.000800
+ 2023-07-12 23:31:51,253 - INFO - [Train] step: 37599, loss_D: 0.001431, lr_D: 0.000800
+ 2023-07-12 23:31:51,341 - INFO - [Train] step: 37599, loss_G: 11.592205, lr_G: 0.000800
+ 2023-07-12 23:32:48,781 - INFO - [Train] step: 37699, loss_D: 0.000697, lr_D: 0.000800
+ 2023-07-12 23:32:48,869 - INFO - [Train] step: 37699, loss_G: 12.845480, lr_G: 0.000800
+ 2023-07-12 23:33:46,300 - INFO - [Train] step: 37799, loss_D: 0.016359, lr_D: 0.000800
+ 2023-07-12 23:33:46,388 - INFO - [Train] step: 37799, loss_G: 9.111818, lr_G: 0.000800
+ 2023-07-12 23:34:43,819 - INFO - [Train] step: 37899, loss_D: 0.001703, lr_D: 0.000800
+ 2023-07-12 23:34:43,907 - INFO - [Train] step: 37899, loss_G: 10.874673, lr_G: 0.000800
+ 2023-07-12 23:35:41,330 - INFO - [Train] step: 37999, loss_D: 0.005083, lr_D: 0.000800
+ 2023-07-12 23:35:41,418 - INFO - [Train] step: 37999, loss_G: 11.865730, lr_G: 0.000800
+ 2023-07-12 23:36:45,535 - INFO - [Eval] step: 37999, fid: 33.537507
+ 2023-07-12 23:37:42,810 - INFO - [Train] step: 38099, loss_D: 0.006491, lr_D: 0.000800
+ 2023-07-12 23:37:42,897 - INFO - [Train] step: 38099, loss_G: 12.005985, lr_G: 0.000800
+ 2023-07-12 23:38:40,481 - INFO - [Train] step: 38199, loss_D: 0.009269, lr_D: 0.000800
+ 2023-07-12 23:38:40,569 - INFO - [Train] step: 38199, loss_G: 10.862595, lr_G: 0.000800
+ 2023-07-12 23:39:37,992 - INFO - [Train] step: 38299, loss_D: 0.002769, lr_D: 0.000800
+ 2023-07-12 23:39:38,079 - INFO - [Train] step: 38299, loss_G: 12.476723, lr_G: 0.000800
+ 2023-07-12 23:40:35,535 - INFO - [Train] step: 38399, loss_D: 0.002460, lr_D: 0.000800
+ 2023-07-12 23:40:35,622 - INFO - [Train] step: 38399, loss_G: 11.591688, lr_G: 0.000800
+ 2023-07-12 23:41:33,038 - INFO - [Train] step: 38499, loss_D: 0.054696, lr_D: 0.000800
+ 2023-07-12 23:41:33,126 - INFO - [Train] step: 38499, loss_G: 7.878085, lr_G: 0.000800
+ 2023-07-12 23:42:30,591 - INFO - [Train] step: 38599, loss_D: 0.003342, lr_D: 0.000800
+ 2023-07-12 23:42:30,678 - INFO - [Train] step: 38599, loss_G: 11.112391, lr_G: 0.000800
+ 2023-07-12 23:43:28,146 - INFO - [Train] step: 38699, loss_D: 0.002558, lr_D: 0.000800
+ 2023-07-12 23:43:28,234 - INFO - [Train] step: 38699, loss_G: 11.609528, lr_G: 0.000800
+ 2023-07-12 23:44:25,672 - INFO - [Train] step: 38799, loss_D: 0.005021, lr_D: 0.000800
+ 2023-07-12 23:44:25,759 - INFO - [Train] step: 38799, loss_G: 11.698695, lr_G: 0.000800
+ 2023-07-12 23:45:23,302 - INFO - [Train] step: 38899, loss_D: 0.003490, lr_D: 0.000800
+ 2023-07-12 23:45:23,390 - INFO - [Train] step: 38899, loss_G: 12.095455, lr_G: 0.000800
+ 2023-07-12 23:46:20,807 - INFO - [Train] step: 38999, loss_D: 0.007112, lr_D: 0.000800
+ 2023-07-12 23:46:20,894 - INFO - [Train] step: 38999, loss_G: 11.458523, lr_G: 0.000800
+ 2023-07-12 23:47:25,070 - INFO - [Eval] step: 38999, fid: 28.558995
+ 2023-07-12 23:48:22,341 - INFO - [Train] step: 39099, loss_D: 0.001426, lr_D: 0.000800
+ 2023-07-12 23:48:22,428 - INFO - [Train] step: 39099, loss_G: 11.771294, lr_G: 0.000800
+ 2023-07-12 23:49:19,900 - INFO - [Train] step: 39199, loss_D: 0.002724, lr_D: 0.000800
+ 2023-07-12 23:49:19,988 - INFO - [Train] step: 39199, loss_G: 12.342468, lr_G: 0.000800
+ 2023-07-12 23:50:17,423 - INFO - [Train] step: 39299, loss_D: 0.263416, lr_D: 0.000800
+ 2023-07-12 23:50:17,510 - INFO - [Train] step: 39299, loss_G: 5.966827, lr_G: 0.000800
+ 2023-07-12 23:51:14,929 - INFO - [Train] step: 39399, loss_D: 0.007828, lr_D: 0.000800
+ 2023-07-12 23:51:15,016 - INFO - [Train] step: 39399, loss_G: 10.605097, lr_G: 0.000800
+ 2023-07-12 23:52:12,560 - INFO - [Train] step: 39499, loss_D: 0.001735, lr_D: 0.000800
+ 2023-07-12 23:52:12,648 - INFO - [Train] step: 39499, loss_G: 12.437737, lr_G: 0.000800
+ 2023-07-12 23:53:10,076 - INFO - [Train] step: 39599, loss_D: 0.160521, lr_D: 0.000800
+ 2023-07-12 23:53:10,164 - INFO - [Train] step: 39599, loss_G: 6.120922, lr_G: 0.000800
+ 2023-07-12 23:54:07,610 - INFO - [Train] step: 39699, loss_D: 0.001688, lr_D: 0.000800
+ 2023-07-12 23:54:07,697 - INFO - [Train] step: 39699, loss_G: 12.907055, lr_G: 0.000800
+ 2023-07-12 23:55:05,153 - INFO - [Train] step: 39799, loss_D: 0.002692, lr_D: 0.000800
+ 2023-07-12 23:55:05,241 - INFO - [Train] step: 39799, loss_G: 12.623354, lr_G: 0.000800
+ 2023-07-12 23:56:02,663 - INFO - [Train] step: 39899, loss_D: 0.007917, lr_D: 0.000800
+ 2023-07-12 23:56:02,751 - INFO - [Train] step: 39899, loss_G: 10.545288, lr_G: 0.000800
+ 2023-07-12 23:57:00,166 - INFO - [Train] step: 39999, loss_D: 0.003061, lr_D: 0.000800
+ 2023-07-12 23:57:00,254 - INFO - [Train] step: 39999, loss_G: 12.191031, lr_G: 0.000800
+ 2023-07-12 23:58:04,475 - INFO - [Eval] step: 39999, fid: 30.039670
+ 2023-07-12 23:58:04,687 - INFO - Best FID score: 25.273268722103296
+ 2023-07-12 23:58:04,687 - INFO - End of training
cgan_cbn_cifar10/samples.zip ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e350f06a8ee52d4d5f9e0a292078ca822d9005026bd3437b48a1d27831d82344
+ size 5708547
cgan_cbn_cifar10/tensorboard/events.out.tfevents.1689180691.jason-system.1481.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8891f48bf5d7509cc2752efaf22eb7f35fae22fc4d44fc289b02f8e7b11430cf
+ size 8095936