xyfJASON committed on
Commit 006273d
1 Parent(s): ff128dd

Upload wgan_cifar10 checkpoints and training logs

wgan_cifar10/ckpt/best/meta.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:fe9cc9e1733e4c4c213e07d7756f836c4c12eeb211b5164e5ed4cf9f875ea8ac
+ size 425
wgan_cifar10/ckpt/best/model.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1172c714983868cff63df1d6457c26071fbd03252e0f7c6479520733cca9e128
+ size 90682196
wgan_cifar10/ckpt/best/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c5d67ea60a9516d311709bd253766180a0c2b23d01de34a6974b65a6779689fe
+ size 90652559
wgan_cifar10/ckpt/step039999/meta.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8e7a378489a5f9d70e87f0b5523a463502db1f7ac14123eb79883dbf4abc7161
+ size 425
wgan_cifar10/ckpt/step039999/model.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c8c230fa5cc39dcbdd77d7979838c9a4d457f3fbb483eb7eb3139ebc33082170
+ size 90682196
wgan_cifar10/ckpt/step039999/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8a7fec833e618e8f47d86c2ba4e249dc4a62d2c8c6271386caa91f316cb2f6dd
+ size 90652559
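
The six files above are Git LFS pointer files: each records only the LFS spec version, the SHA-256 of the real object, and its size in bytes, while the actual checkpoint bytes live in LFS storage. Below is a minimal sketch of pulling one of these checkpoints from the Hub and loading it with PyTorch; the repo id and the internal structure of model.pt are assumptions, not something stated in this commit.

    import torch
    from huggingface_hub import hf_hub_download

    # Assumed repo id; adjust to wherever this commit actually lives.
    ckpt_path = hf_hub_download(
        repo_id="xyfJASON/wgan_cifar10",
        filename="wgan_cifar10/ckpt/best/model.pt",
    )

    # Load on CPU; the file is roughly 90 MB per the pointer's size field.
    state = torch.load(ckpt_path, map_location="cpu")
    print(type(state))  # likely a dict of state_dicts (e.g. generator / discriminator), but that is a guess
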
wgan_cifar10/config-2023-06-29-15-31-40.yaml ADDED
@@ -0,0 +1,57 @@
+ seed: 5678
+ data:
+ name: CIFAR-10
+ dataroot: /data/CIFAR-10/
+ img_size: 32
+ img_channels: 3
+ dataloader:
+ num_workers: 4
+ pin_memory: true
+ prefetch_factor: 2
+ G:
+ target: models.simple_cnn.Generator
+ params:
+ z_dim: 100
+ dim: 256
+ dim_mults:
+ - 4
+ - 2
+ - 1
+ out_dim: 3
+ with_bn: true
+ with_tanh: true
+ D:
+ target: models.simple_cnn.Discriminator
+ params:
+ in_dim: 3
+ dim: 256
+ dim_mults:
+ - 1
+ - 2
+ - 4
+ with_bn: true
+ train:
+ n_steps: 40000
+ batch_size: 512
+ d_iters: 5
+ resume: null
+ print_freq: 100
+ save_freq: 10000
+ sample_freq: 1000
+ eval_freq: 1000
+ n_samples: 64
+ weight_clip:
+ - -0.01
+ - 0.01
+ loss_fn:
+ target: losses.wgan_loss.WGANLoss
+ params:
+ lambda_gp: 0.0
+ optim_G:
+ target: torch.optim.RMSprop
+ params:
+ lr: 0.0002
+ optim_D:
+ target: torch.optim.RMSprop
+ params:
+ lr: 0.0002
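
The config above follows a target/params convention (a dotted import path plus keyword arguments) and describes classic WGAN training: 5 critic updates per generator update, weight clipping to [-0.01, 0.01], RMSprop at lr 0.0002 for both networks, and lambda_gp: 0.0, i.e. no gradient penalty. A minimal sketch of how such an entry is typically instantiated, and of one critic update with weight clipping under these settings, follows; it is illustrative only, not this repo's trainer, and the helper names are invented.

    import importlib
    import torch

    def instantiate(cfg):
        """Build an object from {'target': 'module.Class', 'params': {...}}."""
        module_name, cls_name = cfg["target"].rsplit(".", 1)
        cls = getattr(importlib.import_module(module_name), cls_name)
        return cls(**cfg.get("params", {}))

    # e.g. G = instantiate(config["G"]);  optimizer_D = torch.optim.RMSprop(D.parameters(), lr=2e-4)

    def critic_step(D, G, x_real, z, optimizer_D, clip=(-0.01, 0.01)):
        # WGAN critic loss: E[D(fake)] - E[D(real)], then clip the critic weights.
        loss_D = D(G(z).detach()).mean() - D(x_real).mean()
        optimizer_D.zero_grad()
        loss_D.backward()
        optimizer_D.step()
        for p in D.parameters():
            p.data.clamp_(*clip)
        return loss_D.item()
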
wgan_cifar10/output-2023-06-29-15-31-40.log ADDED
@@ -0,0 +1,852 @@
1
+ 2023-06-29 15:31:40,233 - INFO - Experiment directory: ./runs/wgan_cifar10
2
+ 2023-06-29 15:31:40,233 - INFO - Number of processes: 1
3
+ 2023-06-29 15:31:40,233 - INFO - Distributed type: DistributedType.NO
4
+ 2023-06-29 15:31:40,233 - INFO - Mixed precision: no
5
+ 2023-06-29 15:31:40,233 - INFO - ==============================
6
+ 2023-06-29 15:31:40,702 - INFO - Size of training set: 50000
7
+ 2023-06-29 15:31:40,702 - INFO - Batch size per process: 512
8
+ 2023-06-29 15:31:40,703 - INFO - Total batch size: 512
9
+ 2023-06-29 15:31:40,703 - INFO - ==============================
10
+ 2023-06-29 15:31:41,619 - INFO - Start training...
11
+ 2023-06-29 15:32:38,820 - INFO - [Train] step: 99, loss_D: -2.536046, lr_D: 0.000200
12
+ 2023-06-29 15:32:38,901 - INFO - [Train] step: 99, loss_G: 1.299868, lr_G: 0.000200
13
+ 2023-06-29 15:33:34,660 - INFO - [Train] step: 199, loss_D: -2.314827, lr_D: 0.000200
14
+ 2023-06-29 15:33:34,741 - INFO - [Train] step: 199, loss_G: 1.296806, lr_G: 0.000200
15
+ 2023-06-29 15:34:30,629 - INFO - [Train] step: 299, loss_D: -2.633831, lr_D: 0.000200
16
+ 2023-06-29 15:34:30,710 - INFO - [Train] step: 299, loss_G: 1.301573, lr_G: 0.000200
17
+ 2023-06-29 15:35:26,719 - INFO - [Train] step: 399, loss_D: -2.871247, lr_D: 0.000200
18
+ 2023-06-29 15:35:26,800 - INFO - [Train] step: 399, loss_G: 1.431868, lr_G: 0.000200
19
+ 2023-06-29 15:36:22,806 - INFO - [Train] step: 499, loss_D: -2.918863, lr_D: 0.000200
20
+ 2023-06-29 15:36:22,887 - INFO - [Train] step: 499, loss_G: 1.451805, lr_G: 0.000200
21
+ 2023-06-29 15:37:18,894 - INFO - [Train] step: 599, loss_D: -2.475342, lr_D: 0.000200
22
+ 2023-06-29 15:37:18,974 - INFO - [Train] step: 599, loss_G: 1.244406, lr_G: 0.000200
23
+ 2023-06-29 15:38:15,131 - INFO - [Train] step: 699, loss_D: -1.910297, lr_D: 0.000200
24
+ 2023-06-29 15:38:15,212 - INFO - [Train] step: 699, loss_G: 1.360220, lr_G: 0.000200
25
+ 2023-06-29 15:39:11,250 - INFO - [Train] step: 799, loss_D: -2.558151, lr_D: 0.000200
26
+ 2023-06-29 15:39:11,331 - INFO - [Train] step: 799, loss_G: 1.235151, lr_G: 0.000200
27
+ 2023-06-29 15:40:07,339 - INFO - [Train] step: 899, loss_D: -1.907712, lr_D: 0.000200
28
+ 2023-06-29 15:40:07,419 - INFO - [Train] step: 899, loss_G: 1.330655, lr_G: 0.000200
29
+ 2023-06-29 15:41:03,465 - INFO - [Train] step: 999, loss_D: -1.550007, lr_D: 0.000200
30
+ 2023-06-29 15:41:03,547 - INFO - [Train] step: 999, loss_G: 1.253951, lr_G: 0.000200
31
+ 2023-06-29 15:42:07,898 - INFO - [Eval] step: 999, fid: 141.521986
32
+ 2023-06-29 15:43:03,926 - INFO - [Train] step: 1099, loss_D: -1.581714, lr_D: 0.000200
33
+ 2023-06-29 15:43:04,007 - INFO - [Train] step: 1099, loss_G: 0.663381, lr_G: 0.000200
34
+ 2023-06-29 15:44:00,093 - INFO - [Train] step: 1199, loss_D: -2.781310, lr_D: 0.000200
35
+ 2023-06-29 15:44:00,174 - INFO - [Train] step: 1199, loss_G: 1.395338, lr_G: 0.000200
36
+ 2023-06-29 15:44:56,460 - INFO - [Train] step: 1299, loss_D: -1.426236, lr_D: 0.000200
37
+ 2023-06-29 15:44:56,542 - INFO - [Train] step: 1299, loss_G: 1.228306, lr_G: 0.000200
38
+ 2023-06-29 15:45:52,621 - INFO - [Train] step: 1399, loss_D: -1.334370, lr_D: 0.000200
39
+ 2023-06-29 15:45:52,702 - INFO - [Train] step: 1399, loss_G: 1.272762, lr_G: 0.000200
40
+ 2023-06-29 15:46:48,804 - INFO - [Train] step: 1499, loss_D: -1.227064, lr_D: 0.000200
41
+ 2023-06-29 15:46:48,885 - INFO - [Train] step: 1499, loss_G: 0.575165, lr_G: 0.000200
42
+ 2023-06-29 15:47:44,977 - INFO - [Train] step: 1599, loss_D: -0.277768, lr_D: 0.000200
43
+ 2023-06-29 15:47:45,058 - INFO - [Train] step: 1599, loss_G: 1.219209, lr_G: 0.000200
44
+ 2023-06-29 15:48:41,165 - INFO - [Train] step: 1699, loss_D: -2.772713, lr_D: 0.000200
45
+ 2023-06-29 15:48:41,246 - INFO - [Train] step: 1699, loss_G: 1.380306, lr_G: 0.000200
46
+ 2023-06-29 15:49:37,329 - INFO - [Train] step: 1799, loss_D: -0.546433, lr_D: 0.000200
47
+ 2023-06-29 15:49:37,410 - INFO - [Train] step: 1799, loss_G: 0.954659, lr_G: 0.000200
48
+ 2023-06-29 15:50:33,535 - INFO - [Train] step: 1899, loss_D: -1.484231, lr_D: 0.000200
49
+ 2023-06-29 15:50:33,616 - INFO - [Train] step: 1899, loss_G: 1.238492, lr_G: 0.000200
50
+ 2023-06-29 15:51:29,898 - INFO - [Train] step: 1999, loss_D: -1.300908, lr_D: 0.000200
51
+ 2023-06-29 15:51:29,980 - INFO - [Train] step: 1999, loss_G: 0.318852, lr_G: 0.000200
52
+ 2023-06-29 15:52:34,131 - INFO - [Eval] step: 1999, fid: 130.083544
53
+ 2023-06-29 15:53:30,268 - INFO - [Train] step: 2099, loss_D: -2.620611, lr_D: 0.000200
54
+ 2023-06-29 15:53:30,349 - INFO - [Train] step: 2099, loss_G: 1.283782, lr_G: 0.000200
55
+ 2023-06-29 15:54:26,440 - INFO - [Train] step: 2199, loss_D: -1.801735, lr_D: 0.000200
56
+ 2023-06-29 15:54:26,522 - INFO - [Train] step: 2199, loss_G: 0.745474, lr_G: 0.000200
57
+ 2023-06-29 15:55:22,607 - INFO - [Train] step: 2299, loss_D: -1.499837, lr_D: 0.000200
58
+ 2023-06-29 15:55:22,688 - INFO - [Train] step: 2299, loss_G: 0.296975, lr_G: 0.000200
59
+ 2023-06-29 15:56:18,786 - INFO - [Train] step: 2399, loss_D: -1.333253, lr_D: 0.000200
60
+ 2023-06-29 15:56:18,867 - INFO - [Train] step: 2399, loss_G: 0.298166, lr_G: 0.000200
61
+ 2023-06-29 15:57:14,965 - INFO - [Train] step: 2499, loss_D: -1.467808, lr_D: 0.000200
62
+ 2023-06-29 15:57:15,046 - INFO - [Train] step: 2499, loss_G: 1.099047, lr_G: 0.000200
63
+ 2023-06-29 15:58:11,277 - INFO - [Train] step: 2599, loss_D: -1.491312, lr_D: 0.000200
64
+ 2023-06-29 15:58:11,358 - INFO - [Train] step: 2599, loss_G: 1.142577, lr_G: 0.000200
65
+ 2023-06-29 15:59:07,448 - INFO - [Train] step: 2699, loss_D: -0.748707, lr_D: 0.000200
66
+ 2023-06-29 15:59:07,529 - INFO - [Train] step: 2699, loss_G: 0.555365, lr_G: 0.000200
67
+ 2023-06-29 16:00:03,620 - INFO - [Train] step: 2799, loss_D: -1.530109, lr_D: 0.000200
68
+ 2023-06-29 16:00:03,701 - INFO - [Train] step: 2799, loss_G: 0.568366, lr_G: 0.000200
69
+ 2023-06-29 16:00:59,794 - INFO - [Train] step: 2899, loss_D: -1.178074, lr_D: 0.000200
70
+ 2023-06-29 16:00:59,874 - INFO - [Train] step: 2899, loss_G: 1.251658, lr_G: 0.000200
71
+ 2023-06-29 16:01:55,961 - INFO - [Train] step: 2999, loss_D: -1.334241, lr_D: 0.000200
72
+ 2023-06-29 16:01:56,042 - INFO - [Train] step: 2999, loss_G: 0.050308, lr_G: 0.000200
73
+ 2023-06-29 16:03:00,140 - INFO - [Eval] step: 2999, fid: 109.416049
74
+ 2023-06-29 16:03:56,345 - INFO - [Train] step: 3099, loss_D: -1.053510, lr_D: 0.000200
75
+ 2023-06-29 16:03:56,426 - INFO - [Train] step: 3099, loss_G: 0.138814, lr_G: 0.000200
76
+ 2023-06-29 16:04:52,483 - INFO - [Train] step: 3199, loss_D: -2.660021, lr_D: 0.000200
77
+ 2023-06-29 16:04:52,563 - INFO - [Train] step: 3199, loss_G: 1.311523, lr_G: 0.000200
78
+ 2023-06-29 16:05:48,868 - INFO - [Train] step: 3299, loss_D: -1.333482, lr_D: 0.000200
79
+ 2023-06-29 16:05:48,949 - INFO - [Train] step: 3299, loss_G: 0.843449, lr_G: 0.000200
80
+ 2023-06-29 16:06:45,061 - INFO - [Train] step: 3399, loss_D: -0.863752, lr_D: 0.000200
81
+ 2023-06-29 16:06:45,142 - INFO - [Train] step: 3399, loss_G: 1.018586, lr_G: 0.000200
82
+ 2023-06-29 16:07:41,197 - INFO - [Train] step: 3499, loss_D: -0.864138, lr_D: 0.000200
83
+ 2023-06-29 16:07:41,279 - INFO - [Train] step: 3499, loss_G: 1.098570, lr_G: 0.000200
84
+ 2023-06-29 16:08:37,352 - INFO - [Train] step: 3599, loss_D: -1.403031, lr_D: 0.000200
85
+ 2023-06-29 16:08:37,433 - INFO - [Train] step: 3599, loss_G: 1.103705, lr_G: 0.000200
86
+ 2023-06-29 16:09:33,509 - INFO - [Train] step: 3699, loss_D: -1.173839, lr_D: 0.000200
87
+ 2023-06-29 16:09:33,590 - INFO - [Train] step: 3699, loss_G: 0.165722, lr_G: 0.000200
88
+ 2023-06-29 16:10:29,683 - INFO - [Train] step: 3799, loss_D: -1.334900, lr_D: 0.000200
89
+ 2023-06-29 16:10:29,764 - INFO - [Train] step: 3799, loss_G: 0.845531, lr_G: 0.000200
90
+ 2023-06-29 16:11:26,006 - INFO - [Train] step: 3899, loss_D: -1.433599, lr_D: 0.000200
91
+ 2023-06-29 16:11:26,087 - INFO - [Train] step: 3899, loss_G: 0.599591, lr_G: 0.000200
92
+ 2023-06-29 16:12:22,173 - INFO - [Train] step: 3999, loss_D: -1.366960, lr_D: 0.000200
93
+ 2023-06-29 16:12:22,253 - INFO - [Train] step: 3999, loss_G: 0.302691, lr_G: 0.000200
94
+ 2023-06-29 16:13:26,374 - INFO - [Eval] step: 3999, fid: 108.948503
95
+ 2023-06-29 16:14:22,554 - INFO - [Train] step: 4099, loss_D: -1.443813, lr_D: 0.000200
96
+ 2023-06-29 16:14:22,636 - INFO - [Train] step: 4099, loss_G: 0.465111, lr_G: 0.000200
97
+ 2023-06-29 16:15:18,721 - INFO - [Train] step: 4199, loss_D: -0.781468, lr_D: 0.000200
98
+ 2023-06-29 16:15:18,802 - INFO - [Train] step: 4199, loss_G: 1.075934, lr_G: 0.000200
99
+ 2023-06-29 16:16:14,935 - INFO - [Train] step: 4299, loss_D: -1.616517, lr_D: 0.000200
100
+ 2023-06-29 16:16:15,015 - INFO - [Train] step: 4299, loss_G: 0.737859, lr_G: 0.000200
101
+ 2023-06-29 16:17:11,095 - INFO - [Train] step: 4399, loss_D: -1.107855, lr_D: 0.000200
102
+ 2023-06-29 16:17:11,176 - INFO - [Train] step: 4399, loss_G: 1.140422, lr_G: 0.000200
103
+ 2023-06-29 16:18:07,261 - INFO - [Train] step: 4499, loss_D: -2.042777, lr_D: 0.000200
104
+ 2023-06-29 16:18:07,342 - INFO - [Train] step: 4499, loss_G: 1.088213, lr_G: 0.000200
105
+ 2023-06-29 16:19:03,593 - INFO - [Train] step: 4599, loss_D: -0.847451, lr_D: 0.000200
106
+ 2023-06-29 16:19:03,674 - INFO - [Train] step: 4599, loss_G: 0.604658, lr_G: 0.000200
107
+ 2023-06-29 16:19:59,784 - INFO - [Train] step: 4699, loss_D: -1.088606, lr_D: 0.000200
108
+ 2023-06-29 16:19:59,865 - INFO - [Train] step: 4699, loss_G: 1.001215, lr_G: 0.000200
109
+ 2023-06-29 16:20:55,958 - INFO - [Train] step: 4799, loss_D: -1.995230, lr_D: 0.000200
110
+ 2023-06-29 16:20:56,039 - INFO - [Train] step: 4799, loss_G: 0.740384, lr_G: 0.000200
111
+ 2023-06-29 16:21:52,136 - INFO - [Train] step: 4899, loss_D: -0.701935, lr_D: 0.000200
112
+ 2023-06-29 16:21:52,217 - INFO - [Train] step: 4899, loss_G: 0.871169, lr_G: 0.000200
113
+ 2023-06-29 16:22:48,293 - INFO - [Train] step: 4999, loss_D: -0.964997, lr_D: 0.000200
114
+ 2023-06-29 16:22:48,374 - INFO - [Train] step: 4999, loss_G: 1.048569, lr_G: 0.000200
115
+ 2023-06-29 16:23:52,344 - INFO - [Eval] step: 4999, fid: 86.620488
116
+ 2023-06-29 16:24:48,480 - INFO - [Train] step: 5099, loss_D: -0.410015, lr_D: 0.000200
117
+ 2023-06-29 16:24:48,561 - INFO - [Train] step: 5099, loss_G: 0.272456, lr_G: 0.000200
118
+ 2023-06-29 16:25:44,801 - INFO - [Train] step: 5199, loss_D: -1.588230, lr_D: 0.000200
119
+ 2023-06-29 16:25:44,882 - INFO - [Train] step: 5199, loss_G: 0.570502, lr_G: 0.000200
120
+ 2023-06-29 16:26:40,952 - INFO - [Train] step: 5299, loss_D: -0.893380, lr_D: 0.000200
121
+ 2023-06-29 16:26:41,034 - INFO - [Train] step: 5299, loss_G: 0.970169, lr_G: 0.000200
122
+ 2023-06-29 16:27:37,119 - INFO - [Train] step: 5399, loss_D: -1.808293, lr_D: 0.000200
123
+ 2023-06-29 16:27:37,200 - INFO - [Train] step: 5399, loss_G: 1.149795, lr_G: 0.000200
124
+ 2023-06-29 16:28:33,293 - INFO - [Train] step: 5499, loss_D: -1.070094, lr_D: 0.000200
125
+ 2023-06-29 16:28:33,374 - INFO - [Train] step: 5499, loss_G: 0.582415, lr_G: 0.000200
126
+ 2023-06-29 16:29:29,423 - INFO - [Train] step: 5599, loss_D: -0.588687, lr_D: 0.000200
127
+ 2023-06-29 16:29:29,504 - INFO - [Train] step: 5599, loss_G: 0.314720, lr_G: 0.000200
128
+ 2023-06-29 16:30:25,580 - INFO - [Train] step: 5699, loss_D: -0.958954, lr_D: 0.000200
129
+ 2023-06-29 16:30:25,661 - INFO - [Train] step: 5699, loss_G: -0.083431, lr_G: 0.000200
130
+ 2023-06-29 16:31:21,767 - INFO - [Train] step: 5799, loss_D: -0.613495, lr_D: 0.000200
131
+ 2023-06-29 16:31:21,848 - INFO - [Train] step: 5799, loss_G: -0.206521, lr_G: 0.000200
132
+ 2023-06-29 16:32:18,105 - INFO - [Train] step: 5899, loss_D: -0.859607, lr_D: 0.000200
133
+ 2023-06-29 16:32:18,186 - INFO - [Train] step: 5899, loss_G: -0.453748, lr_G: 0.000200
134
+ 2023-06-29 16:33:14,273 - INFO - [Train] step: 5999, loss_D: -0.591561, lr_D: 0.000200
135
+ 2023-06-29 16:33:14,355 - INFO - [Train] step: 5999, loss_G: -0.189856, lr_G: 0.000200
136
+ 2023-06-29 16:34:18,355 - INFO - [Eval] step: 5999, fid: 79.783066
137
+ 2023-06-29 16:35:14,486 - INFO - [Train] step: 6099, loss_D: -0.646017, lr_D: 0.000200
138
+ 2023-06-29 16:35:14,567 - INFO - [Train] step: 6099, loss_G: -0.426766, lr_G: 0.000200
139
+ 2023-06-29 16:36:10,643 - INFO - [Train] step: 6199, loss_D: -0.668536, lr_D: 0.000200
140
+ 2023-06-29 16:36:10,724 - INFO - [Train] step: 6199, loss_G: -0.197903, lr_G: 0.000200
141
+ 2023-06-29 16:37:06,774 - INFO - [Train] step: 6299, loss_D: -0.816381, lr_D: 0.000200
142
+ 2023-06-29 16:37:06,855 - INFO - [Train] step: 6299, loss_G: 1.118829, lr_G: 0.000200
143
+ 2023-06-29 16:38:02,907 - INFO - [Train] step: 6399, loss_D: -0.696219, lr_D: 0.000200
144
+ 2023-06-29 16:38:02,989 - INFO - [Train] step: 6399, loss_G: -0.586171, lr_G: 0.000200
145
+ 2023-06-29 16:38:59,224 - INFO - [Train] step: 6499, loss_D: -0.724043, lr_D: 0.000200
146
+ 2023-06-29 16:38:59,305 - INFO - [Train] step: 6499, loss_G: 1.047213, lr_G: 0.000200
147
+ 2023-06-29 16:39:55,377 - INFO - [Train] step: 6599, loss_D: -0.650036, lr_D: 0.000200
148
+ 2023-06-29 16:39:55,457 - INFO - [Train] step: 6599, loss_G: -0.185448, lr_G: 0.000200
149
+ 2023-06-29 16:40:51,537 - INFO - [Train] step: 6699, loss_D: -0.658937, lr_D: 0.000200
150
+ 2023-06-29 16:40:51,618 - INFO - [Train] step: 6699, loss_G: 0.642112, lr_G: 0.000200
151
+ 2023-06-29 16:41:47,684 - INFO - [Train] step: 6799, loss_D: -0.492009, lr_D: 0.000200
152
+ 2023-06-29 16:41:47,764 - INFO - [Train] step: 6799, loss_G: 0.157293, lr_G: 0.000200
153
+ 2023-06-29 16:42:43,812 - INFO - [Train] step: 6899, loss_D: -1.471045, lr_D: 0.000200
154
+ 2023-06-29 16:42:43,892 - INFO - [Train] step: 6899, loss_G: 0.487549, lr_G: 0.000200
155
+ 2023-06-29 16:43:39,951 - INFO - [Train] step: 6999, loss_D: -1.243823, lr_D: 0.000200
156
+ 2023-06-29 16:43:40,032 - INFO - [Train] step: 6999, loss_G: 0.337886, lr_G: 0.000200
157
+ 2023-06-29 16:44:44,095 - INFO - [Eval] step: 6999, fid: 74.545250
158
+ 2023-06-29 16:45:40,198 - INFO - [Train] step: 7099, loss_D: -1.308108, lr_D: 0.000200
159
+ 2023-06-29 16:45:40,278 - INFO - [Train] step: 7099, loss_G: 0.125601, lr_G: 0.000200
160
+ 2023-06-29 16:46:36,550 - INFO - [Train] step: 7199, loss_D: -1.163555, lr_D: 0.000200
161
+ 2023-06-29 16:46:36,631 - INFO - [Train] step: 7199, loss_G: 0.903597, lr_G: 0.000200
162
+ 2023-06-29 16:47:32,693 - INFO - [Train] step: 7299, loss_D: -1.231115, lr_D: 0.000200
163
+ 2023-06-29 16:47:32,774 - INFO - [Train] step: 7299, loss_G: 0.390586, lr_G: 0.000200
164
+ 2023-06-29 16:48:28,836 - INFO - [Train] step: 7399, loss_D: -0.698335, lr_D: 0.000200
165
+ 2023-06-29 16:48:28,917 - INFO - [Train] step: 7399, loss_G: -0.633322, lr_G: 0.000200
166
+ 2023-06-29 16:49:25,009 - INFO - [Train] step: 7499, loss_D: -0.448978, lr_D: 0.000200
167
+ 2023-06-29 16:49:25,090 - INFO - [Train] step: 7499, loss_G: 0.413854, lr_G: 0.000200
168
+ 2023-06-29 16:50:21,165 - INFO - [Train] step: 7599, loss_D: -0.826074, lr_D: 0.000200
169
+ 2023-06-29 16:50:21,246 - INFO - [Train] step: 7599, loss_G: 0.081575, lr_G: 0.000200
170
+ 2023-06-29 16:51:17,302 - INFO - [Train] step: 7699, loss_D: -1.286108, lr_D: 0.000200
171
+ 2023-06-29 16:51:17,382 - INFO - [Train] step: 7699, loss_G: 1.209222, lr_G: 0.000200
172
+ 2023-06-29 16:52:13,646 - INFO - [Train] step: 7799, loss_D: -0.515108, lr_D: 0.000200
173
+ 2023-06-29 16:52:13,727 - INFO - [Train] step: 7799, loss_G: -0.075957, lr_G: 0.000200
174
+ 2023-06-29 16:53:09,775 - INFO - [Train] step: 7899, loss_D: -0.606370, lr_D: 0.000200
175
+ 2023-06-29 16:53:09,856 - INFO - [Train] step: 7899, loss_G: -0.259059, lr_G: 0.000200
176
+ 2023-06-29 16:54:05,884 - INFO - [Train] step: 7999, loss_D: -0.692962, lr_D: 0.000200
177
+ 2023-06-29 16:54:05,965 - INFO - [Train] step: 7999, loss_G: -0.613615, lr_G: 0.000200
178
+ 2023-06-29 16:55:10,031 - INFO - [Eval] step: 7999, fid: 70.045487
179
+ 2023-06-29 16:56:06,146 - INFO - [Train] step: 8099, loss_D: -0.540420, lr_D: 0.000200
180
+ 2023-06-29 16:56:06,227 - INFO - [Train] step: 8099, loss_G: 0.951155, lr_G: 0.000200
181
+ 2023-06-29 16:57:02,288 - INFO - [Train] step: 8199, loss_D: -0.352719, lr_D: 0.000200
182
+ 2023-06-29 16:57:02,368 - INFO - [Train] step: 8199, loss_G: 0.431055, lr_G: 0.000200
183
+ 2023-06-29 16:57:58,418 - INFO - [Train] step: 8299, loss_D: -1.170076, lr_D: 0.000200
184
+ 2023-06-29 16:57:58,499 - INFO - [Train] step: 8299, loss_G: 0.892625, lr_G: 0.000200
185
+ 2023-06-29 16:58:54,548 - INFO - [Train] step: 8399, loss_D: -0.536502, lr_D: 0.000200
186
+ 2023-06-29 16:58:54,628 - INFO - [Train] step: 8399, loss_G: -0.433586, lr_G: 0.000200
187
+ 2023-06-29 16:59:50,847 - INFO - [Train] step: 8499, loss_D: -0.406223, lr_D: 0.000200
188
+ 2023-06-29 16:59:50,929 - INFO - [Train] step: 8499, loss_G: -0.137544, lr_G: 0.000200
189
+ 2023-06-29 17:00:46,971 - INFO - [Train] step: 8599, loss_D: -0.908966, lr_D: 0.000200
190
+ 2023-06-29 17:00:47,052 - INFO - [Train] step: 8599, loss_G: 0.702197, lr_G: 0.000200
191
+ 2023-06-29 17:01:43,092 - INFO - [Train] step: 8699, loss_D: -1.059980, lr_D: 0.000200
192
+ 2023-06-29 17:01:43,173 - INFO - [Train] step: 8699, loss_G: 0.695262, lr_G: 0.000200
193
+ 2023-06-29 17:02:39,212 - INFO - [Train] step: 8799, loss_D: -0.681741, lr_D: 0.000200
194
+ 2023-06-29 17:02:39,292 - INFO - [Train] step: 8799, loss_G: -0.685544, lr_G: 0.000200
195
+ 2023-06-29 17:03:35,354 - INFO - [Train] step: 8899, loss_D: -0.537733, lr_D: 0.000200
196
+ 2023-06-29 17:03:35,435 - INFO - [Train] step: 8899, loss_G: 0.877044, lr_G: 0.000200
197
+ 2023-06-29 17:04:31,522 - INFO - [Train] step: 8999, loss_D: -0.919181, lr_D: 0.000200
198
+ 2023-06-29 17:04:31,603 - INFO - [Train] step: 8999, loss_G: 0.063250, lr_G: 0.000200
199
+ 2023-06-29 17:05:35,665 - INFO - [Eval] step: 8999, fid: 70.877376
200
+ 2023-06-29 17:06:31,744 - INFO - [Train] step: 9099, loss_D: -0.437882, lr_D: 0.000200
201
+ 2023-06-29 17:06:31,825 - INFO - [Train] step: 9099, loss_G: 0.745878, lr_G: 0.000200
202
+ 2023-06-29 17:07:27,858 - INFO - [Train] step: 9199, loss_D: -0.784309, lr_D: 0.000200
203
+ 2023-06-29 17:07:27,939 - INFO - [Train] step: 9199, loss_G: 0.838880, lr_G: 0.000200
204
+ 2023-06-29 17:08:23,990 - INFO - [Train] step: 9299, loss_D: -0.711158, lr_D: 0.000200
205
+ 2023-06-29 17:08:24,071 - INFO - [Train] step: 9299, loss_G: 0.640251, lr_G: 0.000200
206
+ 2023-06-29 17:09:20,145 - INFO - [Train] step: 9399, loss_D: -0.770175, lr_D: 0.000200
207
+ 2023-06-29 17:09:20,226 - INFO - [Train] step: 9399, loss_G: -0.648681, lr_G: 0.000200
208
+ 2023-06-29 17:10:16,272 - INFO - [Train] step: 9499, loss_D: -0.385077, lr_D: 0.000200
209
+ 2023-06-29 17:10:16,353 - INFO - [Train] step: 9499, loss_G: 0.353278, lr_G: 0.000200
210
+ 2023-06-29 17:11:12,456 - INFO - [Train] step: 9599, loss_D: -0.432066, lr_D: 0.000200
211
+ 2023-06-29 17:11:12,537 - INFO - [Train] step: 9599, loss_G: -0.357730, lr_G: 0.000200
212
+ 2023-06-29 17:12:08,644 - INFO - [Train] step: 9699, loss_D: -1.025551, lr_D: 0.000200
213
+ 2023-06-29 17:12:08,725 - INFO - [Train] step: 9699, loss_G: 0.590583, lr_G: 0.000200
214
+ 2023-06-29 17:13:04,926 - INFO - [Train] step: 9799, loss_D: -1.554502, lr_D: 0.000200
215
+ 2023-06-29 17:13:05,007 - INFO - [Train] step: 9799, loss_G: -0.306077, lr_G: 0.000200
216
+ 2023-06-29 17:14:01,048 - INFO - [Train] step: 9899, loss_D: -1.194705, lr_D: 0.000200
217
+ 2023-06-29 17:14:01,128 - INFO - [Train] step: 9899, loss_G: 1.170041, lr_G: 0.000200
218
+ 2023-06-29 17:14:57,208 - INFO - [Train] step: 9999, loss_D: -0.610925, lr_D: 0.000200
219
+ 2023-06-29 17:14:57,289 - INFO - [Train] step: 9999, loss_G: 0.715918, lr_G: 0.000200
220
+ 2023-06-29 17:16:01,426 - INFO - [Eval] step: 9999, fid: 64.962130
221
+ 2023-06-29 17:16:57,648 - INFO - [Train] step: 10099, loss_D: -0.956697, lr_D: 0.000200
222
+ 2023-06-29 17:16:57,729 - INFO - [Train] step: 10099, loss_G: -0.437733, lr_G: 0.000200
223
+ 2023-06-29 17:17:53,788 - INFO - [Train] step: 10199, loss_D: -0.395838, lr_D: 0.000200
224
+ 2023-06-29 17:17:53,869 - INFO - [Train] step: 10199, loss_G: 0.869936, lr_G: 0.000200
225
+ 2023-06-29 17:18:49,913 - INFO - [Train] step: 10299, loss_D: -0.295774, lr_D: 0.000200
226
+ 2023-06-29 17:18:49,994 - INFO - [Train] step: 10299, loss_G: 0.514070, lr_G: 0.000200
227
+ 2023-06-29 17:19:46,212 - INFO - [Train] step: 10399, loss_D: -1.209484, lr_D: 0.000200
228
+ 2023-06-29 17:19:46,294 - INFO - [Train] step: 10399, loss_G: 1.122522, lr_G: 0.000200
229
+ 2023-06-29 17:20:42,349 - INFO - [Train] step: 10499, loss_D: -0.047249, lr_D: 0.000200
230
+ 2023-06-29 17:20:42,430 - INFO - [Train] step: 10499, loss_G: -0.724041, lr_G: 0.000200
231
+ 2023-06-29 17:21:38,506 - INFO - [Train] step: 10599, loss_D: -0.478335, lr_D: 0.000200
232
+ 2023-06-29 17:21:38,587 - INFO - [Train] step: 10599, loss_G: -0.450097, lr_G: 0.000200
233
+ 2023-06-29 17:22:34,649 - INFO - [Train] step: 10699, loss_D: -1.531432, lr_D: 0.000200
234
+ 2023-06-29 17:22:34,730 - INFO - [Train] step: 10699, loss_G: -0.149178, lr_G: 0.000200
235
+ 2023-06-29 17:23:30,778 - INFO - [Train] step: 10799, loss_D: -0.983194, lr_D: 0.000200
236
+ 2023-06-29 17:23:30,859 - INFO - [Train] step: 10799, loss_G: 0.238813, lr_G: 0.000200
237
+ 2023-06-29 17:24:26,904 - INFO - [Train] step: 10899, loss_D: -1.341863, lr_D: 0.000200
238
+ 2023-06-29 17:24:26,985 - INFO - [Train] step: 10899, loss_G: 0.026060, lr_G: 0.000200
239
+ 2023-06-29 17:25:23,204 - INFO - [Train] step: 10999, loss_D: -0.570181, lr_D: 0.000200
240
+ 2023-06-29 17:25:23,286 - INFO - [Train] step: 10999, loss_G: -0.242370, lr_G: 0.000200
241
+ 2023-06-29 17:26:27,367 - INFO - [Eval] step: 10999, fid: 62.563432
242
+ 2023-06-29 17:27:23,514 - INFO - [Train] step: 11099, loss_D: -0.280303, lr_D: 0.000200
243
+ 2023-06-29 17:27:23,596 - INFO - [Train] step: 11099, loss_G: 0.884322, lr_G: 0.000200
244
+ 2023-06-29 17:28:19,664 - INFO - [Train] step: 11199, loss_D: -0.200627, lr_D: 0.000200
245
+ 2023-06-29 17:28:19,745 - INFO - [Train] step: 11199, loss_G: -0.087638, lr_G: 0.000200
246
+ 2023-06-29 17:29:15,836 - INFO - [Train] step: 11299, loss_D: -0.420731, lr_D: 0.000200
247
+ 2023-06-29 17:29:15,917 - INFO - [Train] step: 11299, loss_G: -0.588285, lr_G: 0.000200
248
+ 2023-06-29 17:30:11,964 - INFO - [Train] step: 11399, loss_D: -1.198455, lr_D: 0.000200
249
+ 2023-06-29 17:30:12,045 - INFO - [Train] step: 11399, loss_G: 0.999904, lr_G: 0.000200
250
+ 2023-06-29 17:31:08,103 - INFO - [Train] step: 11499, loss_D: -0.439501, lr_D: 0.000200
251
+ 2023-06-29 17:31:08,184 - INFO - [Train] step: 11499, loss_G: 0.134601, lr_G: 0.000200
252
+ 2023-06-29 17:32:04,240 - INFO - [Train] step: 11599, loss_D: -0.381982, lr_D: 0.000200
253
+ 2023-06-29 17:32:04,320 - INFO - [Train] step: 11599, loss_G: -0.489952, lr_G: 0.000200
254
+ 2023-06-29 17:33:00,565 - INFO - [Train] step: 11699, loss_D: -0.519575, lr_D: 0.000200
255
+ 2023-06-29 17:33:00,646 - INFO - [Train] step: 11699, loss_G: -0.531970, lr_G: 0.000200
256
+ 2023-06-29 17:33:56,740 - INFO - [Train] step: 11799, loss_D: -0.440978, lr_D: 0.000200
257
+ 2023-06-29 17:33:56,820 - INFO - [Train] step: 11799, loss_G: 0.863738, lr_G: 0.000200
258
+ 2023-06-29 17:34:52,858 - INFO - [Train] step: 11899, loss_D: -0.738905, lr_D: 0.000200
259
+ 2023-06-29 17:34:52,939 - INFO - [Train] step: 11899, loss_G: -0.795318, lr_G: 0.000200
260
+ 2023-06-29 17:35:48,992 - INFO - [Train] step: 11999, loss_D: -0.509335, lr_D: 0.000200
261
+ 2023-06-29 17:35:49,073 - INFO - [Train] step: 11999, loss_G: -0.135098, lr_G: 0.000200
262
+ 2023-06-29 17:36:53,131 - INFO - [Eval] step: 11999, fid: 59.676505
263
+ 2023-06-29 17:37:49,187 - INFO - [Train] step: 12099, loss_D: -0.339884, lr_D: 0.000200
264
+ 2023-06-29 17:37:49,267 - INFO - [Train] step: 12099, loss_G: 0.338395, lr_G: 0.000200
265
+ 2023-06-29 17:38:45,144 - INFO - [Train] step: 12199, loss_D: -0.419338, lr_D: 0.000200
266
+ 2023-06-29 17:38:45,224 - INFO - [Train] step: 12199, loss_G: 0.814205, lr_G: 0.000200
267
+ 2023-06-29 17:39:41,278 - INFO - [Train] step: 12299, loss_D: -0.919625, lr_D: 0.000200
268
+ 2023-06-29 17:39:41,360 - INFO - [Train] step: 12299, loss_G: 1.173610, lr_G: 0.000200
269
+ 2023-06-29 17:40:37,222 - INFO - [Train] step: 12399, loss_D: -0.450915, lr_D: 0.000200
270
+ 2023-06-29 17:40:37,302 - INFO - [Train] step: 12399, loss_G: -0.164965, lr_G: 0.000200
271
+ 2023-06-29 17:41:33,218 - INFO - [Train] step: 12499, loss_D: -0.438179, lr_D: 0.000200
272
+ 2023-06-29 17:41:33,299 - INFO - [Train] step: 12499, loss_G: -0.835378, lr_G: 0.000200
273
+ 2023-06-29 17:42:29,190 - INFO - [Train] step: 12599, loss_D: -0.348132, lr_D: 0.000200
274
+ 2023-06-29 17:42:29,271 - INFO - [Train] step: 12599, loss_G: 0.154851, lr_G: 0.000200
275
+ 2023-06-29 17:43:25,186 - INFO - [Train] step: 12699, loss_D: -0.297570, lr_D: 0.000200
276
+ 2023-06-29 17:43:25,266 - INFO - [Train] step: 12699, loss_G: 0.977051, lr_G: 0.000200
277
+ 2023-06-29 17:44:21,152 - INFO - [Train] step: 12799, loss_D: -0.171625, lr_D: 0.000200
278
+ 2023-06-29 17:44:21,232 - INFO - [Train] step: 12799, loss_G: 0.122617, lr_G: 0.000200
279
+ 2023-06-29 17:45:17,145 - INFO - [Train] step: 12899, loss_D: -0.823310, lr_D: 0.000200
280
+ 2023-06-29 17:45:17,226 - INFO - [Train] step: 12899, loss_G: 0.794641, lr_G: 0.000200
281
+ 2023-06-29 17:46:13,266 - INFO - [Train] step: 12999, loss_D: -0.381076, lr_D: 0.000200
282
+ 2023-06-29 17:46:13,347 - INFO - [Train] step: 12999, loss_G: 0.048126, lr_G: 0.000200
283
+ 2023-06-29 17:47:17,550 - INFO - [Eval] step: 12999, fid: 58.681690
284
+ 2023-06-29 17:48:13,613 - INFO - [Train] step: 13099, loss_D: -0.448997, lr_D: 0.000200
285
+ 2023-06-29 17:48:13,693 - INFO - [Train] step: 13099, loss_G: -0.565688, lr_G: 0.000200
286
+ 2023-06-29 17:49:09,595 - INFO - [Train] step: 13199, loss_D: -1.243984, lr_D: 0.000200
287
+ 2023-06-29 17:49:09,676 - INFO - [Train] step: 13199, loss_G: 0.287285, lr_G: 0.000200
288
+ 2023-06-29 17:50:05,578 - INFO - [Train] step: 13299, loss_D: -0.290654, lr_D: 0.000200
289
+ 2023-06-29 17:50:05,658 - INFO - [Train] step: 13299, loss_G: -0.117949, lr_G: 0.000200
290
+ 2023-06-29 17:51:01,542 - INFO - [Train] step: 13399, loss_D: -0.395656, lr_D: 0.000200
291
+ 2023-06-29 17:51:01,623 - INFO - [Train] step: 13399, loss_G: -0.493619, lr_G: 0.000200
292
+ 2023-06-29 17:51:57,530 - INFO - [Train] step: 13499, loss_D: -0.443512, lr_D: 0.000200
293
+ 2023-06-29 17:51:57,611 - INFO - [Train] step: 13499, loss_G: 0.814565, lr_G: 0.000200
294
+ 2023-06-29 17:52:53,689 - INFO - [Train] step: 13599, loss_D: -0.342547, lr_D: 0.000200
295
+ 2023-06-29 17:52:53,770 - INFO - [Train] step: 13599, loss_G: -0.662215, lr_G: 0.000200
296
+ 2023-06-29 17:53:49,646 - INFO - [Train] step: 13699, loss_D: -0.347762, lr_D: 0.000200
297
+ 2023-06-29 17:53:49,726 - INFO - [Train] step: 13699, loss_G: 0.612988, lr_G: 0.000200
298
+ 2023-06-29 17:54:45,590 - INFO - [Train] step: 13799, loss_D: -0.422387, lr_D: 0.000200
299
+ 2023-06-29 17:54:45,671 - INFO - [Train] step: 13799, loss_G: 0.926356, lr_G: 0.000200
300
+ 2023-06-29 17:55:41,528 - INFO - [Train] step: 13899, loss_D: -0.357768, lr_D: 0.000200
301
+ 2023-06-29 17:55:41,609 - INFO - [Train] step: 13899, loss_G: 0.319675, lr_G: 0.000200
302
+ 2023-06-29 17:56:37,486 - INFO - [Train] step: 13999, loss_D: -0.584301, lr_D: 0.000200
303
+ 2023-06-29 17:56:37,566 - INFO - [Train] step: 13999, loss_G: -0.080352, lr_G: 0.000200
304
+ 2023-06-29 17:57:41,640 - INFO - [Eval] step: 13999, fid: 57.308234
305
+ 2023-06-29 17:58:37,721 - INFO - [Train] step: 14099, loss_D: -0.889745, lr_D: 0.000200
306
+ 2023-06-29 17:58:37,802 - INFO - [Train] step: 14099, loss_G: -0.841540, lr_G: 0.000200
307
+ 2023-06-29 17:59:33,682 - INFO - [Train] step: 14199, loss_D: -0.440458, lr_D: 0.000200
308
+ 2023-06-29 17:59:33,763 - INFO - [Train] step: 14199, loss_G: -0.187589, lr_G: 0.000200
309
+ 2023-06-29 18:00:29,819 - INFO - [Train] step: 14299, loss_D: -0.350256, lr_D: 0.000200
310
+ 2023-06-29 18:00:29,900 - INFO - [Train] step: 14299, loss_G: -0.290799, lr_G: 0.000200
311
+ 2023-06-29 18:01:25,806 - INFO - [Train] step: 14399, loss_D: -0.770788, lr_D: 0.000200
312
+ 2023-06-29 18:01:25,887 - INFO - [Train] step: 14399, loss_G: 0.730464, lr_G: 0.000200
313
+ 2023-06-29 18:02:21,805 - INFO - [Train] step: 14499, loss_D: -0.394986, lr_D: 0.000200
314
+ 2023-06-29 18:02:21,886 - INFO - [Train] step: 14499, loss_G: -0.081612, lr_G: 0.000200
315
+ 2023-06-29 18:03:17,735 - INFO - [Train] step: 14599, loss_D: -0.355028, lr_D: 0.000200
316
+ 2023-06-29 18:03:17,815 - INFO - [Train] step: 14599, loss_G: 0.267899, lr_G: 0.000200
317
+ 2023-06-29 18:04:13,658 - INFO - [Train] step: 14699, loss_D: -1.142739, lr_D: 0.000200
318
+ 2023-06-29 18:04:13,739 - INFO - [Train] step: 14699, loss_G: -0.808706, lr_G: 0.000200
319
+ 2023-06-29 18:05:09,585 - INFO - [Train] step: 14799, loss_D: -0.221553, lr_D: 0.000200
320
+ 2023-06-29 18:05:09,665 - INFO - [Train] step: 14799, loss_G: 0.753036, lr_G: 0.000200
321
+ 2023-06-29 18:06:05,695 - INFO - [Train] step: 14899, loss_D: -0.248396, lr_D: 0.000200
322
+ 2023-06-29 18:06:05,776 - INFO - [Train] step: 14899, loss_G: -0.233466, lr_G: 0.000200
323
+ 2023-06-29 18:07:01,641 - INFO - [Train] step: 14999, loss_D: -0.356773, lr_D: 0.000200
324
+ 2023-06-29 18:07:01,722 - INFO - [Train] step: 14999, loss_G: -0.278331, lr_G: 0.000200
325
+ 2023-06-29 18:08:05,816 - INFO - [Eval] step: 14999, fid: 55.594016
326
+ 2023-06-29 18:09:01,853 - INFO - [Train] step: 15099, loss_D: -0.227824, lr_D: 0.000200
327
+ 2023-06-29 18:09:01,934 - INFO - [Train] step: 15099, loss_G: 0.480639, lr_G: 0.000200
328
+ 2023-06-29 18:09:57,805 - INFO - [Train] step: 15199, loss_D: -0.563642, lr_D: 0.000200
329
+ 2023-06-29 18:09:57,886 - INFO - [Train] step: 15199, loss_G: -0.322533, lr_G: 0.000200
330
+ 2023-06-29 18:10:53,766 - INFO - [Train] step: 15299, loss_D: -0.464957, lr_D: 0.000200
331
+ 2023-06-29 18:10:53,846 - INFO - [Train] step: 15299, loss_G: -0.721107, lr_G: 0.000200
332
+ 2023-06-29 18:11:49,727 - INFO - [Train] step: 15399, loss_D: -0.621022, lr_D: 0.000200
333
+ 2023-06-29 18:11:49,808 - INFO - [Train] step: 15399, loss_G: 0.688873, lr_G: 0.000200
334
+ 2023-06-29 18:12:45,717 - INFO - [Train] step: 15499, loss_D: -0.908276, lr_D: 0.000200
335
+ 2023-06-29 18:12:45,797 - INFO - [Train] step: 15499, loss_G: 0.178746, lr_G: 0.000200
336
+ 2023-06-29 18:13:41,877 - INFO - [Train] step: 15599, loss_D: -0.404183, lr_D: 0.000200
337
+ 2023-06-29 18:13:41,958 - INFO - [Train] step: 15599, loss_G: -0.783284, lr_G: 0.000200
338
+ 2023-06-29 18:14:37,852 - INFO - [Train] step: 15699, loss_D: -0.355067, lr_D: 0.000200
339
+ 2023-06-29 18:14:37,932 - INFO - [Train] step: 15699, loss_G: -0.090385, lr_G: 0.000200
340
+ 2023-06-29 18:15:33,830 - INFO - [Train] step: 15799, loss_D: -0.208509, lr_D: 0.000200
341
+ 2023-06-29 18:15:33,911 - INFO - [Train] step: 15799, loss_G: 0.151047, lr_G: 0.000200
342
+ 2023-06-29 18:16:29,780 - INFO - [Train] step: 15899, loss_D: -0.520379, lr_D: 0.000200
343
+ 2023-06-29 18:16:29,861 - INFO - [Train] step: 15899, loss_G: 0.641194, lr_G: 0.000200
344
+ 2023-06-29 18:17:25,755 - INFO - [Train] step: 15999, loss_D: -0.672088, lr_D: 0.000200
345
+ 2023-06-29 18:17:25,835 - INFO - [Train] step: 15999, loss_G: 0.211358, lr_G: 0.000200
346
+ 2023-06-29 18:18:29,985 - INFO - [Eval] step: 15999, fid: 55.079816
347
+ 2023-06-29 18:19:26,043 - INFO - [Train] step: 16099, loss_D: -0.686529, lr_D: 0.000200
348
+ 2023-06-29 18:19:26,124 - INFO - [Train] step: 16099, loss_G: 1.066148, lr_G: 0.000200
349
+ 2023-06-29 18:20:22,163 - INFO - [Train] step: 16199, loss_D: -0.419679, lr_D: 0.000200
350
+ 2023-06-29 18:20:22,244 - INFO - [Train] step: 16199, loss_G: 0.884453, lr_G: 0.000200
351
+ 2023-06-29 18:21:18,113 - INFO - [Train] step: 16299, loss_D: -0.216057, lr_D: 0.000200
352
+ 2023-06-29 18:21:18,194 - INFO - [Train] step: 16299, loss_G: 0.181024, lr_G: 0.000200
353
+ 2023-06-29 18:22:14,079 - INFO - [Train] step: 16399, loss_D: -0.392544, lr_D: 0.000200
354
+ 2023-06-29 18:22:14,159 - INFO - [Train] step: 16399, loss_G: 0.207282, lr_G: 0.000200
355
+ 2023-06-29 18:23:10,037 - INFO - [Train] step: 16499, loss_D: -0.857889, lr_D: 0.000200
356
+ 2023-06-29 18:23:10,118 - INFO - [Train] step: 16499, loss_G: 0.467662, lr_G: 0.000200
357
+ 2023-06-29 18:24:05,970 - INFO - [Train] step: 16599, loss_D: -0.176511, lr_D: 0.000200
358
+ 2023-06-29 18:24:06,050 - INFO - [Train] step: 16599, loss_G: 0.708342, lr_G: 0.000200
359
+ 2023-06-29 18:25:01,920 - INFO - [Train] step: 16699, loss_D: -0.345966, lr_D: 0.000200
360
+ 2023-06-29 18:25:02,000 - INFO - [Train] step: 16699, loss_G: -0.199189, lr_G: 0.000200
361
+ 2023-06-29 18:25:57,925 - INFO - [Train] step: 16799, loss_D: -0.691300, lr_D: 0.000200
362
+ 2023-06-29 18:25:58,006 - INFO - [Train] step: 16799, loss_G: 0.919125, lr_G: 0.000200
363
+ 2023-06-29 18:26:54,068 - INFO - [Train] step: 16899, loss_D: -0.222382, lr_D: 0.000200
364
+ 2023-06-29 18:26:54,149 - INFO - [Train] step: 16899, loss_G: 0.249175, lr_G: 0.000200
365
+ 2023-06-29 18:27:50,006 - INFO - [Train] step: 16999, loss_D: -0.205306, lr_D: 0.000200
366
+ 2023-06-29 18:27:50,087 - INFO - [Train] step: 16999, loss_G: -0.099644, lr_G: 0.000200
367
+ 2023-06-29 18:28:54,189 - INFO - [Eval] step: 16999, fid: 53.539638
368
+ 2023-06-29 18:29:50,275 - INFO - [Train] step: 17099, loss_D: -0.513577, lr_D: 0.000200
369
+ 2023-06-29 18:29:50,356 - INFO - [Train] step: 17099, loss_G: 0.931904, lr_G: 0.000200
370
+ 2023-06-29 18:30:46,219 - INFO - [Train] step: 17199, loss_D: -0.255238, lr_D: 0.000200
371
+ 2023-06-29 18:30:46,299 - INFO - [Train] step: 17199, loss_G: -0.037735, lr_G: 0.000200
372
+ 2023-06-29 18:31:42,190 - INFO - [Train] step: 17299, loss_D: -0.681707, lr_D: 0.000200
373
+ 2023-06-29 18:31:42,271 - INFO - [Train] step: 17299, loss_G: 0.908049, lr_G: 0.000200
374
+ 2023-06-29 18:32:38,149 - INFO - [Train] step: 17399, loss_D: -0.256309, lr_D: 0.000200
375
+ 2023-06-29 18:32:38,230 - INFO - [Train] step: 17399, loss_G: 0.309436, lr_G: 0.000200
376
+ 2023-06-29 18:33:34,263 - INFO - [Train] step: 17499, loss_D: -0.259828, lr_D: 0.000200
377
+ 2023-06-29 18:33:34,344 - INFO - [Train] step: 17499, loss_G: 0.339073, lr_G: 0.000200
378
+ 2023-06-29 18:34:30,203 - INFO - [Train] step: 17599, loss_D: -0.359323, lr_D: 0.000200
379
+ 2023-06-29 18:34:30,284 - INFO - [Train] step: 17599, loss_G: -0.370633, lr_G: 0.000200
380
+ 2023-06-29 18:35:26,160 - INFO - [Train] step: 17699, loss_D: -0.486307, lr_D: 0.000200
381
+ 2023-06-29 18:35:26,240 - INFO - [Train] step: 17699, loss_G: 0.707744, lr_G: 0.000200
382
+ 2023-06-29 18:36:22,174 - INFO - [Train] step: 17799, loss_D: -0.213498, lr_D: 0.000200
383
+ 2023-06-29 18:36:22,255 - INFO - [Train] step: 17799, loss_G: 0.594274, lr_G: 0.000200
384
+ 2023-06-29 18:37:18,141 - INFO - [Train] step: 17899, loss_D: -0.377751, lr_D: 0.000200
385
+ 2023-06-29 18:37:18,222 - INFO - [Train] step: 17899, loss_G: 0.543531, lr_G: 0.000200
386
+ 2023-06-29 18:38:14,094 - INFO - [Train] step: 17999, loss_D: -0.684372, lr_D: 0.000200
387
+ 2023-06-29 18:38:14,175 - INFO - [Train] step: 17999, loss_G: -0.341155, lr_G: 0.000200
388
+ 2023-06-29 18:39:18,333 - INFO - [Eval] step: 17999, fid: 53.375392
389
+ 2023-06-29 18:40:14,436 - INFO - [Train] step: 18099, loss_D: -0.283494, lr_D: 0.000200
390
+ 2023-06-29 18:40:14,517 - INFO - [Train] step: 18099, loss_G: 0.039420, lr_G: 0.000200
391
+ 2023-06-29 18:41:10,574 - INFO - [Train] step: 18199, loss_D: -0.268967, lr_D: 0.000200
392
+ 2023-06-29 18:41:10,655 - INFO - [Train] step: 18199, loss_G: 0.587384, lr_G: 0.000200
393
+ 2023-06-29 18:42:06,560 - INFO - [Train] step: 18299, loss_D: -0.443104, lr_D: 0.000200
394
+ 2023-06-29 18:42:06,641 - INFO - [Train] step: 18299, loss_G: 1.050815, lr_G: 0.000200
395
+ 2023-06-29 18:43:02,529 - INFO - [Train] step: 18399, loss_D: -0.277237, lr_D: 0.000200
396
+ 2023-06-29 18:43:02,610 - INFO - [Train] step: 18399, loss_G: 0.223883, lr_G: 0.000200
397
+ 2023-06-29 18:43:58,511 - INFO - [Train] step: 18499, loss_D: -0.327148, lr_D: 0.000200
398
+ 2023-06-29 18:43:58,592 - INFO - [Train] step: 18499, loss_G: 0.570275, lr_G: 0.000200
399
+ 2023-06-29 18:44:54,458 - INFO - [Train] step: 18599, loss_D: -0.260445, lr_D: 0.000200
400
+ 2023-06-29 18:44:54,539 - INFO - [Train] step: 18599, loss_G: 0.236552, lr_G: 0.000200
401
+ 2023-06-29 18:45:50,428 - INFO - [Train] step: 18699, loss_D: -0.594039, lr_D: 0.000200
402
+ 2023-06-29 18:45:50,509 - INFO - [Train] step: 18699, loss_G: 0.821654, lr_G: 0.000200
403
+ 2023-06-29 18:46:46,568 - INFO - [Train] step: 18799, loss_D: -0.642464, lr_D: 0.000200
404
+ 2023-06-29 18:46:46,648 - INFO - [Train] step: 18799, loss_G: 0.513774, lr_G: 0.000200
405
+ 2023-06-29 18:47:42,511 - INFO - [Train] step: 18899, loss_D: -0.348884, lr_D: 0.000200
406
+ 2023-06-29 18:47:42,592 - INFO - [Train] step: 18899, loss_G: 0.379078, lr_G: 0.000200
407
+ 2023-06-29 18:48:38,473 - INFO - [Train] step: 18999, loss_D: -0.232302, lr_D: 0.000200
408
+ 2023-06-29 18:48:38,554 - INFO - [Train] step: 18999, loss_G: -0.231867, lr_G: 0.000200
409
+ 2023-06-29 18:49:42,625 - INFO - [Eval] step: 18999, fid: 52.194281
410
+ 2023-06-29 18:50:38,694 - INFO - [Train] step: 19099, loss_D: -0.299051, lr_D: 0.000200
411
+ 2023-06-29 18:50:38,775 - INFO - [Train] step: 19099, loss_G: 0.621282, lr_G: 0.000200
412
+ 2023-06-29 18:51:34,636 - INFO - [Train] step: 19199, loss_D: -0.622900, lr_D: 0.000200
413
+ 2023-06-29 18:51:34,718 - INFO - [Train] step: 19199, loss_G: -0.551604, lr_G: 0.000200
414
+ 2023-06-29 18:52:30,628 - INFO - [Train] step: 19299, loss_D: -0.284863, lr_D: 0.000200
415
+ 2023-06-29 18:52:30,709 - INFO - [Train] step: 19299, loss_G: 0.672477, lr_G: 0.000200
416
+ 2023-06-29 18:53:26,639 - INFO - [Train] step: 19399, loss_D: -0.267192, lr_D: 0.000200
417
+ 2023-06-29 18:53:26,719 - INFO - [Train] step: 19399, loss_G: 0.599022, lr_G: 0.000200
418
+ 2023-06-29 18:54:22,743 - INFO - [Train] step: 19499, loss_D: -0.970493, lr_D: 0.000200
419
+ 2023-06-29 18:54:22,824 - INFO - [Train] step: 19499, loss_G: 0.028820, lr_G: 0.000200
420
+ 2023-06-29 18:55:18,690 - INFO - [Train] step: 19599, loss_D: -0.142857, lr_D: 0.000200
421
+ 2023-06-29 18:55:18,771 - INFO - [Train] step: 19599, loss_G: -0.188066, lr_G: 0.000200
422
+ 2023-06-29 18:56:14,634 - INFO - [Train] step: 19699, loss_D: -0.281831, lr_D: 0.000200
423
+ 2023-06-29 18:56:14,714 - INFO - [Train] step: 19699, loss_G: 0.728521, lr_G: 0.000200
424
+ 2023-06-29 18:57:10,578 - INFO - [Train] step: 19799, loss_D: -0.855503, lr_D: 0.000200
425
+ 2023-06-29 18:57:10,659 - INFO - [Train] step: 19799, loss_G: -0.447113, lr_G: 0.000200
426
+ 2023-06-29 18:58:06,529 - INFO - [Train] step: 19899, loss_D: -0.368946, lr_D: 0.000200
427
+ 2023-06-29 18:58:06,610 - INFO - [Train] step: 19899, loss_G: 0.016978, lr_G: 0.000200
428
+ 2023-06-29 18:59:02,471 - INFO - [Train] step: 19999, loss_D: -0.337500, lr_D: 0.000200
429
+ 2023-06-29 18:59:02,552 - INFO - [Train] step: 19999, loss_G: 0.687786, lr_G: 0.000200
430
+ 2023-06-29 19:00:06,688 - INFO - [Eval] step: 19999, fid: 52.579975
431
+ 2023-06-29 19:01:02,826 - INFO - [Train] step: 20099, loss_D: -0.241059, lr_D: 0.000200
432
+ 2023-06-29 19:01:02,906 - INFO - [Train] step: 20099, loss_G: 0.095442, lr_G: 0.000200
433
+ 2023-06-29 19:01:58,852 - INFO - [Train] step: 20199, loss_D: -0.288343, lr_D: 0.000200
434
+ 2023-06-29 19:01:58,933 - INFO - [Train] step: 20199, loss_G: -0.009804, lr_G: 0.000200
435
+ 2023-06-29 19:02:54,984 - INFO - [Train] step: 20299, loss_D: -0.164176, lr_D: 0.000200
436
+ 2023-06-29 19:02:55,066 - INFO - [Train] step: 20299, loss_G: 0.152547, lr_G: 0.000200
437
+ 2023-06-29 19:03:51,115 - INFO - [Train] step: 20399, loss_D: -0.260839, lr_D: 0.000200
438
+ 2023-06-29 19:03:51,196 - INFO - [Train] step: 20399, loss_G: -0.090593, lr_G: 0.000200
439
+ 2023-06-29 19:04:47,231 - INFO - [Train] step: 20499, loss_D: -0.213408, lr_D: 0.000200
440
+ 2023-06-29 19:04:47,312 - INFO - [Train] step: 20499, loss_G: -0.041496, lr_G: 0.000200
441
+ 2023-06-29 19:05:43,216 - INFO - [Train] step: 20599, loss_D: -0.623333, lr_D: 0.000200
442
+ 2023-06-29 19:05:43,297 - INFO - [Train] step: 20599, loss_G: -0.457506, lr_G: 0.000200
443
+ 2023-06-29 19:06:39,322 - INFO - [Train] step: 20699, loss_D: -0.166982, lr_D: 0.000200
444
+ 2023-06-29 19:06:39,403 - INFO - [Train] step: 20699, loss_G: 0.261889, lr_G: 0.000200
445
+ 2023-06-29 19:07:35,232 - INFO - [Train] step: 20799, loss_D: -0.457764, lr_D: 0.000200
446
+ 2023-06-29 19:07:35,313 - INFO - [Train] step: 20799, loss_G: 0.631069, lr_G: 0.000200
447
+ 2023-06-29 19:08:31,167 - INFO - [Train] step: 20899, loss_D: -0.339654, lr_D: 0.000200
448
+ 2023-06-29 19:08:31,248 - INFO - [Train] step: 20899, loss_G: -0.474465, lr_G: 0.000200
449
+ 2023-06-29 19:09:27,186 - INFO - [Train] step: 20999, loss_D: -0.128796, lr_D: 0.000200
450
+ 2023-06-29 19:09:27,267 - INFO - [Train] step: 20999, loss_G: 0.401827, lr_G: 0.000200
451
+ 2023-06-29 19:10:31,411 - INFO - [Eval] step: 20999, fid: 51.779370
452
+ 2023-06-29 19:11:27,473 - INFO - [Train] step: 21099, loss_D: -0.250321, lr_D: 0.000200
453
+ 2023-06-29 19:11:27,554 - INFO - [Train] step: 21099, loss_G: -0.277908, lr_G: 0.000200
454
+ 2023-06-29 19:12:23,466 - INFO - [Train] step: 21199, loss_D: -0.249741, lr_D: 0.000200
455
+ 2023-06-29 19:12:23,547 - INFO - [Train] step: 21199, loss_G: 0.101692, lr_G: 0.000200
456
+ 2023-06-29 19:13:19,403 - INFO - [Train] step: 21299, loss_D: -0.150207, lr_D: 0.000200
457
+ 2023-06-29 19:13:19,484 - INFO - [Train] step: 21299, loss_G: -0.212442, lr_G: 0.000200
458
+ 2023-06-29 19:14:15,516 - INFO - [Train] step: 21399, loss_D: -0.832891, lr_D: 0.000200
459
+ 2023-06-29 19:14:15,597 - INFO - [Train] step: 21399, loss_G: -0.761273, lr_G: 0.000200
460
+ 2023-06-29 19:15:11,463 - INFO - [Train] step: 21499, loss_D: -0.370257, lr_D: 0.000200
461
+ 2023-06-29 19:15:11,543 - INFO - [Train] step: 21499, loss_G: 0.313626, lr_G: 0.000200
462
+ 2023-06-29 19:16:07,419 - INFO - [Train] step: 21599, loss_D: -0.107755, lr_D: 0.000200
463
+ 2023-06-29 19:16:07,500 - INFO - [Train] step: 21599, loss_G: -0.365736, lr_G: 0.000200
464
+ 2023-06-29 19:17:03,414 - INFO - [Train] step: 21699, loss_D: -0.409263, lr_D: 0.000200
465
+ 2023-06-29 19:17:03,495 - INFO - [Train] step: 21699, loss_G: -0.882881, lr_G: 0.000200
466
+ 2023-06-29 19:17:59,404 - INFO - [Train] step: 21799, loss_D: -0.337196, lr_D: 0.000200
467
+ 2023-06-29 19:17:59,485 - INFO - [Train] step: 21799, loss_G: 0.237929, lr_G: 0.000200
468
+ 2023-06-29 19:18:55,352 - INFO - [Train] step: 21899, loss_D: -0.349589, lr_D: 0.000200
469
+ 2023-06-29 19:18:55,433 - INFO - [Train] step: 21899, loss_G: -0.264714, lr_G: 0.000200
470
+ 2023-06-29 19:19:51,487 - INFO - [Train] step: 21999, loss_D: -0.177130, lr_D: 0.000200
471
+ 2023-06-29 19:19:51,568 - INFO - [Train] step: 21999, loss_G: 0.349135, lr_G: 0.000200
472
+ 2023-06-29 19:20:55,714 - INFO - [Eval] step: 21999, fid: 52.581239
473
+ 2023-06-29 19:21:51,588 - INFO - [Train] step: 22099, loss_D: -0.189163, lr_D: 0.000200
474
+ 2023-06-29 19:21:51,668 - INFO - [Train] step: 22099, loss_G: -0.278413, lr_G: 0.000200
475
+ 2023-06-29 19:22:47,544 - INFO - [Train] step: 22199, loss_D: -0.205289, lr_D: 0.000200
476
+ 2023-06-29 19:22:47,624 - INFO - [Train] step: 22199, loss_G: -0.038330, lr_G: 0.000200
477
+ 2023-06-29 19:23:43,521 - INFO - [Train] step: 22299, loss_D: -0.182958, lr_D: 0.000200
478
+ 2023-06-29 19:23:43,602 - INFO - [Train] step: 22299, loss_G: -0.039926, lr_G: 0.000200
479
+ 2023-06-29 19:24:39,506 - INFO - [Train] step: 22399, loss_D: -0.514402, lr_D: 0.000200
480
+ 2023-06-29 19:24:39,587 - INFO - [Train] step: 22399, loss_G: 0.739511, lr_G: 0.000200
481
+ 2023-06-29 19:25:35,460 - INFO - [Train] step: 22499, loss_D: 0.050178, lr_D: 0.000200
482
+ 2023-06-29 19:25:35,541 - INFO - [Train] step: 22499, loss_G: 0.264819, lr_G: 0.000200
483
+ 2023-06-29 19:26:31,446 - INFO - [Train] step: 22599, loss_D: -0.254130, lr_D: 0.000200
484
+ 2023-06-29 19:26:31,527 - INFO - [Train] step: 22599, loss_G: 0.175104, lr_G: 0.000200
485
+ 2023-06-29 19:27:27,634 - INFO - [Train] step: 22699, loss_D: -0.184589, lr_D: 0.000200
486
+ 2023-06-29 19:27:27,715 - INFO - [Train] step: 22699, loss_G: -0.246641, lr_G: 0.000200
487
+ 2023-06-29 19:28:23,650 - INFO - [Train] step: 22799, loss_D: -0.213186, lr_D: 0.000200
488
+ 2023-06-29 19:28:23,730 - INFO - [Train] step: 22799, loss_G: -0.610001, lr_G: 0.000200
489
+ 2023-06-29 19:29:19,621 - INFO - [Train] step: 22899, loss_D: -0.247078, lr_D: 0.000200
490
+ 2023-06-29 19:29:19,702 - INFO - [Train] step: 22899, loss_G: -0.320821, lr_G: 0.000200
491
+ 2023-06-29 19:30:15,566 - INFO - [Train] step: 22999, loss_D: -0.456847, lr_D: 0.000200
492
+ 2023-06-29 19:30:15,647 - INFO - [Train] step: 22999, loss_G: 0.769439, lr_G: 0.000200
493
+ 2023-06-29 19:31:19,785 - INFO - [Eval] step: 22999, fid: 51.117403
494
+ 2023-06-29 19:32:15,838 - INFO - [Train] step: 23099, loss_D: -0.208380, lr_D: 0.000200
495
+ 2023-06-29 19:32:15,918 - INFO - [Train] step: 23099, loss_G: 0.683111, lr_G: 0.000200
496
+ 2023-06-29 19:33:11,801 - INFO - [Train] step: 23199, loss_D: -0.265071, lr_D: 0.000200
497
+ 2023-06-29 19:33:11,882 - INFO - [Train] step: 23199, loss_G: -0.604456, lr_G: 0.000200
498
+ 2023-06-29 19:34:07,941 - INFO - [Train] step: 23299, loss_D: -0.326309, lr_D: 0.000200
499
+ 2023-06-29 19:34:08,022 - INFO - [Train] step: 23299, loss_G: -0.073202, lr_G: 0.000200
500
+ 2023-06-29 19:35:03,896 - INFO - [Train] step: 23399, loss_D: -0.162443, lr_D: 0.000200
501
+ 2023-06-29 19:35:03,977 - INFO - [Train] step: 23399, loss_G: -0.032662, lr_G: 0.000200
502
+ 2023-06-29 19:35:59,851 - INFO - [Train] step: 23499, loss_D: -0.381471, lr_D: 0.000200
503
+ 2023-06-29 19:35:59,931 - INFO - [Train] step: 23499, loss_G: 0.751587, lr_G: 0.000200
504
+ 2023-06-29 19:36:55,820 - INFO - [Train] step: 23599, loss_D: -0.243243, lr_D: 0.000200
505
+ 2023-06-29 19:36:55,901 - INFO - [Train] step: 23599, loss_G: 0.584602, lr_G: 0.000200
506
+ 2023-06-29 19:37:51,755 - INFO - [Train] step: 23699, loss_D: -0.737125, lr_D: 0.000200
507
+ 2023-06-29 19:37:51,836 - INFO - [Train] step: 23699, loss_G: 0.322124, lr_G: 0.000200
508
+ 2023-06-29 19:38:47,685 - INFO - [Train] step: 23799, loss_D: -0.314371, lr_D: 0.000200
509
+ 2023-06-29 19:38:47,766 - INFO - [Train] step: 23799, loss_G: 0.370130, lr_G: 0.000200
510
+ 2023-06-29 19:39:43,660 - INFO - [Train] step: 23899, loss_D: -0.186635, lr_D: 0.000200
511
+ 2023-06-29 19:39:43,740 - INFO - [Train] step: 23899, loss_G: 0.151037, lr_G: 0.000200
512
+ 2023-06-29 19:40:39,791 - INFO - [Train] step: 23999, loss_D: -0.409385, lr_D: 0.000200
513
+ 2023-06-29 19:40:39,872 - INFO - [Train] step: 23999, loss_G: 0.728581, lr_G: 0.000200
514
+ 2023-06-29 19:41:43,952 - INFO - [Eval] step: 23999, fid: 50.710913
515
+ 2023-06-29 19:42:40,048 - INFO - [Train] step: 24099, loss_D: -0.272841, lr_D: 0.000200
516
+ 2023-06-29 19:42:40,128 - INFO - [Train] step: 24099, loss_G: -0.784432, lr_G: 0.000200
517
+ 2023-06-29 19:43:35,989 - INFO - [Train] step: 24199, loss_D: -0.236885, lr_D: 0.000200
518
+ 2023-06-29 19:43:36,069 - INFO - [Train] step: 24199, loss_G: 0.312796, lr_G: 0.000200
519
+ 2023-06-29 19:44:31,942 - INFO - [Train] step: 24299, loss_D: -0.084026, lr_D: 0.000200
520
+ 2023-06-29 19:44:32,023 - INFO - [Train] step: 24299, loss_G: 0.198190, lr_G: 0.000200
521
+ 2023-06-29 19:45:27,894 - INFO - [Train] step: 24399, loss_D: -0.112706, lr_D: 0.000200
522
+ 2023-06-29 19:45:27,975 - INFO - [Train] step: 24399, loss_G: -0.234522, lr_G: 0.000200
523
+ 2023-06-29 19:46:23,866 - INFO - [Train] step: 24499, loss_D: -0.121795, lr_D: 0.000200
524
+ 2023-06-29 19:46:23,946 - INFO - [Train] step: 24499, loss_G: 0.195130, lr_G: 0.000200
525
+ 2023-06-29 19:47:20,002 - INFO - [Train] step: 24599, loss_D: -0.064105, lr_D: 0.000200
526
+ 2023-06-29 19:47:20,082 - INFO - [Train] step: 24599, loss_G: -0.277513, lr_G: 0.000200
527
+ 2023-06-29 19:48:15,942 - INFO - [Train] step: 24699, loss_D: -0.283035, lr_D: 0.000200
528
+ 2023-06-29 19:48:16,023 - INFO - [Train] step: 24699, loss_G: 0.790682, lr_G: 0.000200
529
+ 2023-06-29 19:49:11,879 - INFO - [Train] step: 24799, loss_D: -0.298452, lr_D: 0.000200
530
+ 2023-06-29 19:49:11,960 - INFO - [Train] step: 24799, loss_G: 0.495903, lr_G: 0.000200
531
+ 2023-06-29 19:50:07,870 - INFO - [Train] step: 24899, loss_D: -0.316555, lr_D: 0.000200
532
+ 2023-06-29 19:50:07,950 - INFO - [Train] step: 24899, loss_G: 0.613321, lr_G: 0.000200
533
+ 2023-06-29 19:51:03,845 - INFO - [Train] step: 24999, loss_D: -0.189986, lr_D: 0.000200
534
+ 2023-06-29 19:51:03,925 - INFO - [Train] step: 24999, loss_G: 0.436718, lr_G: 0.000200
535
+ 2023-06-29 19:52:08,072 - INFO - [Eval] step: 24999, fid: 50.424016
536
+ 2023-06-29 19:53:04,148 - INFO - [Train] step: 25099, loss_D: -0.222354, lr_D: 0.000200
537
+ 2023-06-29 19:53:04,228 - INFO - [Train] step: 25099, loss_G: 0.303798, lr_G: 0.000200
538
+ 2023-06-29 19:54:00,102 - INFO - [Train] step: 25199, loss_D: -0.316311, lr_D: 0.000200
539
+ 2023-06-29 19:54:00,183 - INFO - [Train] step: 25199, loss_G: 0.543657, lr_G: 0.000200
540
+ 2023-06-29 19:54:56,234 - INFO - [Train] step: 25299, loss_D: -0.186369, lr_D: 0.000200
541
+ 2023-06-29 19:54:56,315 - INFO - [Train] step: 25299, loss_G: 0.462160, lr_G: 0.000200
542
+ 2023-06-29 19:55:52,183 - INFO - [Train] step: 25399, loss_D: -0.223183, lr_D: 0.000200
543
+ 2023-06-29 19:55:52,263 - INFO - [Train] step: 25399, loss_G: 0.169095, lr_G: 0.000200
544
+ 2023-06-29 19:56:48,182 - INFO - [Train] step: 25499, loss_D: -0.095240, lr_D: 0.000200
545
+ 2023-06-29 19:56:48,263 - INFO - [Train] step: 25499, loss_G: 0.117177, lr_G: 0.000200
546
+ 2023-06-29 19:57:44,129 - INFO - [Train] step: 25599, loss_D: -0.198062, lr_D: 0.000200
547
+ 2023-06-29 19:57:44,210 - INFO - [Train] step: 25599, loss_G: 0.751349, lr_G: 0.000200
548
+ 2023-06-29 19:58:40,114 - INFO - [Train] step: 25699, loss_D: -0.352834, lr_D: 0.000200
549
+ 2023-06-29 19:58:40,194 - INFO - [Train] step: 25699, loss_G: 0.791671, lr_G: 0.000200
550
+ 2023-06-29 19:59:36,068 - INFO - [Train] step: 25799, loss_D: -0.312917, lr_D: 0.000200
551
+ 2023-06-29 19:59:36,149 - INFO - [Train] step: 25799, loss_G: 0.621706, lr_G: 0.000200
552
+ 2023-06-29 20:00:32,233 - INFO - [Train] step: 25899, loss_D: -0.257795, lr_D: 0.000200
553
+ 2023-06-29 20:00:32,314 - INFO - [Train] step: 25899, loss_G: -0.375227, lr_G: 0.000200
554
+ 2023-06-29 20:01:28,168 - INFO - [Train] step: 25999, loss_D: -0.159485, lr_D: 0.000200
555
+ 2023-06-29 20:01:28,248 - INFO - [Train] step: 25999, loss_G: -0.256657, lr_G: 0.000200
556
+ 2023-06-29 20:02:32,381 - INFO - [Eval] step: 25999, fid: 49.719738
557
+ 2023-06-29 20:03:28,488 - INFO - [Train] step: 26099, loss_D: -0.364207, lr_D: 0.000200
558
+ 2023-06-29 20:03:28,568 - INFO - [Train] step: 26099, loss_G: -0.091527, lr_G: 0.000200
559
+ 2023-06-29 20:04:24,454 - INFO - [Train] step: 26199, loss_D: -0.051903, lr_D: 0.000200
560
+ 2023-06-29 20:04:24,535 - INFO - [Train] step: 26199, loss_G: -0.384937, lr_G: 0.000200
561
+ 2023-06-29 20:05:20,387 - INFO - [Train] step: 26299, loss_D: -0.283852, lr_D: 0.000200
562
+ 2023-06-29 20:05:20,468 - INFO - [Train] step: 26299, loss_G: -0.062731, lr_G: 0.000200
563
+ 2023-06-29 20:06:16,406 - INFO - [Train] step: 26399, loss_D: -0.156551, lr_D: 0.000200
564
+ 2023-06-29 20:06:16,487 - INFO - [Train] step: 26399, loss_G: -0.098009, lr_G: 0.000200
565
+ 2023-06-29 20:07:12,341 - INFO - [Train] step: 26499, loss_D: -0.541418, lr_D: 0.000200
566
+ 2023-06-29 20:07:12,421 - INFO - [Train] step: 26499, loss_G: 0.012804, lr_G: 0.000200
567
+ 2023-06-29 20:08:08,537 - INFO - [Train] step: 26599, loss_D: -0.147421, lr_D: 0.000200
568
+ 2023-06-29 20:08:08,618 - INFO - [Train] step: 26599, loss_G: -0.102931, lr_G: 0.000200
569
+ 2023-06-29 20:09:04,485 - INFO - [Train] step: 26699, loss_D: -0.225948, lr_D: 0.000200
570
+ 2023-06-29 20:09:04,566 - INFO - [Train] step: 26699, loss_G: -0.221375, lr_G: 0.000200
571
+ 2023-06-29 20:10:00,443 - INFO - [Train] step: 26799, loss_D: -0.304943, lr_D: 0.000200
572
+ 2023-06-29 20:10:00,523 - INFO - [Train] step: 26799, loss_G: -0.621033, lr_G: 0.000200
573
+ 2023-06-29 20:10:56,376 - INFO - [Train] step: 26899, loss_D: -0.073950, lr_D: 0.000200
574
+ 2023-06-29 20:10:56,456 - INFO - [Train] step: 26899, loss_G: -0.364380, lr_G: 0.000200
575
+ 2023-06-29 20:11:52,321 - INFO - [Train] step: 26999, loss_D: -0.138982, lr_D: 0.000200
576
+ 2023-06-29 20:11:52,402 - INFO - [Train] step: 26999, loss_G: -0.339443, lr_G: 0.000200
577
+ 2023-06-29 20:12:56,460 - INFO - [Eval] step: 26999, fid: 50.436127
578
+ 2023-06-29 20:13:52,304 - INFO - [Train] step: 27099, loss_D: -0.440900, lr_D: 0.000200
579
+ 2023-06-29 20:13:52,384 - INFO - [Train] step: 27099, loss_G: -0.604666, lr_G: 0.000200
580
+ 2023-06-29 20:14:48,452 - INFO - [Train] step: 27199, loss_D: -0.540771, lr_D: 0.000200
581
+ 2023-06-29 20:14:48,532 - INFO - [Train] step: 27199, loss_G: -0.810424, lr_G: 0.000200
582
+ 2023-06-29 20:15:44,443 - INFO - [Train] step: 27299, loss_D: -0.339045, lr_D: 0.000200
583
+ 2023-06-29 20:15:44,524 - INFO - [Train] step: 27299, loss_G: -0.938320, lr_G: 0.000200
584
+ 2023-06-29 20:16:40,401 - INFO - [Train] step: 27399, loss_D: -0.291234, lr_D: 0.000200
585
+ 2023-06-29 20:16:40,482 - INFO - [Train] step: 27399, loss_G: -0.508957, lr_G: 0.000200
586
+ 2023-06-29 20:17:36,350 - INFO - [Train] step: 27499, loss_D: -0.227649, lr_D: 0.000200
587
+ 2023-06-29 20:17:36,431 - INFO - [Train] step: 27499, loss_G: -0.916310, lr_G: 0.000200
588
+ 2023-06-29 20:18:32,298 - INFO - [Train] step: 27599, loss_D: -0.210387, lr_D: 0.000200
589
+ 2023-06-29 20:18:32,379 - INFO - [Train] step: 27599, loss_G: -0.551878, lr_G: 0.000200
590
+ 2023-06-29 20:19:28,244 - INFO - [Train] step: 27699, loss_D: -0.269410, lr_D: 0.000200
591
+ 2023-06-29 20:19:28,325 - INFO - [Train] step: 27699, loss_G: -0.440470, lr_G: 0.000200
592
+ 2023-06-29 20:20:24,211 - INFO - [Train] step: 27799, loss_D: -0.308587, lr_D: 0.000200
593
+ 2023-06-29 20:20:24,291 - INFO - [Train] step: 27799, loss_G: -0.101410, lr_G: 0.000200
594
+ 2023-06-29 20:21:20,315 - INFO - [Train] step: 27899, loss_D: -0.094001, lr_D: 0.000200
595
+ 2023-06-29 20:21:20,396 - INFO - [Train] step: 27899, loss_G: -0.402393, lr_G: 0.000200
596
+ 2023-06-29 20:22:16,246 - INFO - [Train] step: 27999, loss_D: -0.099825, lr_D: 0.000200
597
+ 2023-06-29 20:22:16,326 - INFO - [Train] step: 27999, loss_G: -0.267738, lr_G: 0.000200
598
+ 2023-06-29 20:23:20,431 - INFO - [Eval] step: 27999, fid: 50.253050
599
+ 2023-06-29 20:24:16,260 - INFO - [Train] step: 28099, loss_D: -0.237462, lr_D: 0.000200
600
+ 2023-06-29 20:24:16,341 - INFO - [Train] step: 28099, loss_G: -0.686338, lr_G: 0.000200
601
+ 2023-06-29 20:25:12,220 - INFO - [Train] step: 28199, loss_D: -0.321778, lr_D: 0.000200
602
+ 2023-06-29 20:25:12,301 - INFO - [Train] step: 28199, loss_G: 0.132861, lr_G: 0.000200
603
+ 2023-06-29 20:26:08,152 - INFO - [Train] step: 28299, loss_D: -0.188186, lr_D: 0.000200
604
+ 2023-06-29 20:26:08,233 - INFO - [Train] step: 28299, loss_G: -0.278624, lr_G: 0.000200
605
+ 2023-06-29 20:27:04,144 - INFO - [Train] step: 28399, loss_D: -0.268780, lr_D: 0.000200
606
+ 2023-06-29 20:27:04,225 - INFO - [Train] step: 28399, loss_G: 0.054225, lr_G: 0.000200
607
+ 2023-06-29 20:28:00,293 - INFO - [Train] step: 28499, loss_D: -0.079942, lr_D: 0.000200
608
+ 2023-06-29 20:28:00,374 - INFO - [Train] step: 28499, loss_G: -0.512941, lr_G: 0.000200
609
+ 2023-06-29 20:28:56,257 - INFO - [Train] step: 28599, loss_D: -0.077265, lr_D: 0.000200
610
+ 2023-06-29 20:28:56,338 - INFO - [Train] step: 28599, loss_G: -0.300269, lr_G: 0.000200
611
+ 2023-06-29 20:29:52,207 - INFO - [Train] step: 28699, loss_D: -0.091510, lr_D: 0.000200
612
+ 2023-06-29 20:29:52,288 - INFO - [Train] step: 28699, loss_G: -0.181123, lr_G: 0.000200
613
+ 2023-06-29 20:30:48,190 - INFO - [Train] step: 28799, loss_D: -0.192811, lr_D: 0.000200
614
+ 2023-06-29 20:30:48,271 - INFO - [Train] step: 28799, loss_G: -0.819615, lr_G: 0.000200
615
+ 2023-06-29 20:31:44,162 - INFO - [Train] step: 28899, loss_D: -0.263077, lr_D: 0.000200
616
+ 2023-06-29 20:31:44,243 - INFO - [Train] step: 28899, loss_G: -0.597801, lr_G: 0.000200
617
+ 2023-06-29 20:32:40,092 - INFO - [Train] step: 28999, loss_D: -0.150362, lr_D: 0.000200
618
+ 2023-06-29 20:32:40,173 - INFO - [Train] step: 28999, loss_G: -0.336823, lr_G: 0.000200
619
+ 2023-06-29 20:33:44,255 - INFO - [Eval] step: 28999, fid: 50.814191
620
+ 2023-06-29 20:34:40,192 - INFO - [Train] step: 29099, loss_D: -0.033635, lr_D: 0.000200
621
+ 2023-06-29 20:34:40,272 - INFO - [Train] step: 29099, loss_G: -0.845183, lr_G: 0.000200
622
+ 2023-06-29 20:35:36,298 - INFO - [Train] step: 29199, loss_D: -0.166216, lr_D: 0.000200
623
+ 2023-06-29 20:35:36,379 - INFO - [Train] step: 29199, loss_G: 0.252297, lr_G: 0.000200
624
+ 2023-06-29 20:36:32,245 - INFO - [Train] step: 29299, loss_D: -0.224406, lr_D: 0.000200
625
+ 2023-06-29 20:36:32,326 - INFO - [Train] step: 29299, loss_G: -0.335757, lr_G: 0.000200
626
+ 2023-06-29 20:37:28,206 - INFO - [Train] step: 29399, loss_D: -0.127075, lr_D: 0.000200
627
+ 2023-06-29 20:37:28,287 - INFO - [Train] step: 29399, loss_G: -0.286405, lr_G: 0.000200
628
+ 2023-06-29 20:38:24,179 - INFO - [Train] step: 29499, loss_D: -0.171968, lr_D: 0.000200
629
+ 2023-06-29 20:38:24,260 - INFO - [Train] step: 29499, loss_G: 0.456022, lr_G: 0.000200
630
+ 2023-06-29 20:39:20,156 - INFO - [Train] step: 29599, loss_D: -0.025580, lr_D: 0.000200
631
+ 2023-06-29 20:39:20,237 - INFO - [Train] step: 29599, loss_G: -0.191437, lr_G: 0.000200
632
+ 2023-06-29 20:40:16,135 - INFO - [Train] step: 29699, loss_D: -0.155578, lr_D: 0.000200
633
+ 2023-06-29 20:40:16,215 - INFO - [Train] step: 29699, loss_G: 0.374811, lr_G: 0.000200
634
+ 2023-06-29 20:41:12,285 - INFO - [Train] step: 29799, loss_D: -0.144225, lr_D: 0.000200
635
+ 2023-06-29 20:41:12,366 - INFO - [Train] step: 29799, loss_G: -0.540611, lr_G: 0.000200
636
+ 2023-06-29 20:42:08,206 - INFO - [Train] step: 29899, loss_D: -0.075165, lr_D: 0.000200
637
+ 2023-06-29 20:42:08,287 - INFO - [Train] step: 29899, loss_G: -0.497311, lr_G: 0.000200
638
+ 2023-06-29 20:43:04,162 - INFO - [Train] step: 29999, loss_D: -0.172618, lr_D: 0.000200
639
+ 2023-06-29 20:43:04,242 - INFO - [Train] step: 29999, loss_G: 0.414521, lr_G: 0.000200
640
+ 2023-06-29 20:44:08,311 - INFO - [Eval] step: 29999, fid: 50.717629
641
+ 2023-06-29 20:45:04,295 - INFO - [Train] step: 30099, loss_D: -0.151617, lr_D: 0.000200
642
+ 2023-06-29 20:45:04,376 - INFO - [Train] step: 30099, loss_G: 0.553588, lr_G: 0.000200
643
+ 2023-06-29 20:46:00,268 - INFO - [Train] step: 30199, loss_D: -0.096222, lr_D: 0.000200
644
+ 2023-06-29 20:46:00,349 - INFO - [Train] step: 30199, loss_G: 0.294322, lr_G: 0.000200
645
+ 2023-06-29 20:46:56,215 - INFO - [Train] step: 30299, loss_D: -0.178306, lr_D: 0.000200
646
+ 2023-06-29 20:46:56,296 - INFO - [Train] step: 30299, loss_G: -0.236296, lr_G: 0.000200
647
+ 2023-06-29 20:47:52,400 - INFO - [Train] step: 30399, loss_D: -0.035864, lr_D: 0.000200
648
+ 2023-06-29 20:47:52,481 - INFO - [Train] step: 30399, loss_G: -0.515588, lr_G: 0.000200
649
+ 2023-06-29 20:48:48,348 - INFO - [Train] step: 30499, loss_D: -0.235994, lr_D: 0.000200
650
+ 2023-06-29 20:48:48,429 - INFO - [Train] step: 30499, loss_G: -0.208434, lr_G: 0.000200
651
+ 2023-06-29 20:49:44,274 - INFO - [Train] step: 30599, loss_D: -0.105147, lr_D: 0.000200
652
+ 2023-06-29 20:49:44,354 - INFO - [Train] step: 30599, loss_G: -0.314841, lr_G: 0.000200
653
+ 2023-06-29 20:50:40,254 - INFO - [Train] step: 30699, loss_D: -0.187854, lr_D: 0.000200
654
+ 2023-06-29 20:50:40,335 - INFO - [Train] step: 30699, loss_G: -0.712218, lr_G: 0.000200
655
+ 2023-06-29 20:51:36,203 - INFO - [Train] step: 30799, loss_D: 0.006439, lr_D: 0.000200
656
+ 2023-06-29 20:51:36,283 - INFO - [Train] step: 30799, loss_G: -0.117921, lr_G: 0.000200
657
+ 2023-06-29 20:52:32,132 - INFO - [Train] step: 30899, loss_D: -0.017228, lr_D: 0.000200
658
+ 2023-06-29 20:52:32,213 - INFO - [Train] step: 30899, loss_G: -0.116085, lr_G: 0.000200
659
+ 2023-06-29 20:53:28,096 - INFO - [Train] step: 30999, loss_D: -0.031700, lr_D: 0.000200
660
+ 2023-06-29 20:53:28,177 - INFO - [Train] step: 30999, loss_G: -0.086945, lr_G: 0.000200
661
+ 2023-06-29 20:54:32,313 - INFO - [Eval] step: 30999, fid: 52.107770
662
+ 2023-06-29 20:55:28,404 - INFO - [Train] step: 31099, loss_D: -0.101991, lr_D: 0.000200
663
+ 2023-06-29 20:55:28,484 - INFO - [Train] step: 31099, loss_G: 0.212889, lr_G: 0.000200
664
+ 2023-06-29 20:56:24,369 - INFO - [Train] step: 31199, loss_D: -0.025268, lr_D: 0.000200
665
+ 2023-06-29 20:56:24,449 - INFO - [Train] step: 31199, loss_G: -0.352042, lr_G: 0.000200
666
+ 2023-06-29 20:57:20,313 - INFO - [Train] step: 31299, loss_D: -0.123498, lr_D: 0.000200
667
+ 2023-06-29 20:57:20,393 - INFO - [Train] step: 31299, loss_G: -0.502366, lr_G: 0.000200
668
+ 2023-06-29 20:58:16,261 - INFO - [Train] step: 31399, loss_D: -0.664067, lr_D: 0.000200
669
+ 2023-06-29 20:58:16,342 - INFO - [Train] step: 31399, loss_G: -0.460703, lr_G: 0.000200
670
+ 2023-06-29 20:59:12,221 - INFO - [Train] step: 31499, loss_D: -0.013609, lr_D: 0.000200
671
+ 2023-06-29 20:59:12,302 - INFO - [Train] step: 31499, loss_G: -0.684312, lr_G: 0.000200
672
+ 2023-06-29 21:00:08,159 - INFO - [Train] step: 31599, loss_D: -0.391789, lr_D: 0.000200
673
+ 2023-06-29 21:00:08,240 - INFO - [Train] step: 31599, loss_G: -0.442279, lr_G: 0.000200
674
+ 2023-06-29 21:01:04,278 - INFO - [Train] step: 31699, loss_D: -0.046703, lr_D: 0.000200
675
+ 2023-06-29 21:01:04,359 - INFO - [Train] step: 31699, loss_G: -0.880185, lr_G: 0.000200
676
+ 2023-06-29 21:02:00,255 - INFO - [Train] step: 31799, loss_D: -0.528566, lr_D: 0.000200
677
+ 2023-06-29 21:02:00,336 - INFO - [Train] step: 31799, loss_G: -0.168644, lr_G: 0.000200
678
+ 2023-06-29 21:02:56,233 - INFO - [Train] step: 31899, loss_D: -0.559059, lr_D: 0.000200
679
+ 2023-06-29 21:02:56,314 - INFO - [Train] step: 31899, loss_G: -1.108367, lr_G: 0.000200
680
+ 2023-06-29 21:03:52,146 - INFO - [Train] step: 31999, loss_D: -0.977145, lr_D: 0.000200
681
+ 2023-06-29 21:03:52,227 - INFO - [Train] step: 31999, loss_G: 0.000788, lr_G: 0.000200
682
+ 2023-06-29 21:04:56,367 - INFO - [Eval] step: 31999, fid: 56.424180
683
+ 2023-06-29 21:05:52,167 - INFO - [Train] step: 32099, loss_D: -0.902921, lr_D: 0.000200
684
+ 2023-06-29 21:05:52,248 - INFO - [Train] step: 32099, loss_G: -0.214012, lr_G: 0.000200
685
+ 2023-06-29 21:06:48,093 - INFO - [Train] step: 32199, loss_D: -0.497733, lr_D: 0.000200
686
+ 2023-06-29 21:06:48,174 - INFO - [Train] step: 32199, loss_G: -0.498960, lr_G: 0.000200
687
+ 2023-06-29 21:07:44,024 - INFO - [Train] step: 32299, loss_D: -0.633462, lr_D: 0.000200
688
+ 2023-06-29 21:07:44,104 - INFO - [Train] step: 32299, loss_G: -0.159241, lr_G: 0.000200
689
+ 2023-06-29 21:08:40,121 - INFO - [Train] step: 32399, loss_D: -0.143592, lr_D: 0.000200
690
+ 2023-06-29 21:08:40,202 - INFO - [Train] step: 32399, loss_G: -0.425177, lr_G: 0.000200
691
+ 2023-06-29 21:09:36,035 - INFO - [Train] step: 32499, loss_D: -0.462151, lr_D: 0.000200
692
+ 2023-06-29 21:09:36,116 - INFO - [Train] step: 32499, loss_G: -0.405348, lr_G: 0.000200
693
+ 2023-06-29 21:10:31,957 - INFO - [Train] step: 32599, loss_D: -0.370812, lr_D: 0.000200
694
+ 2023-06-29 21:10:32,038 - INFO - [Train] step: 32599, loss_G: -0.228857, lr_G: 0.000200
695
+ 2023-06-29 21:11:27,871 - INFO - [Train] step: 32699, loss_D: -0.833916, lr_D: 0.000200
696
+ 2023-06-29 21:11:27,952 - INFO - [Train] step: 32699, loss_G: -1.127996, lr_G: 0.000200
697
+ 2023-06-29 21:12:23,789 - INFO - [Train] step: 32799, loss_D: -1.042816, lr_D: 0.000200
698
+ 2023-06-29 21:12:23,870 - INFO - [Train] step: 32799, loss_G: -0.110342, lr_G: 0.000200
699
+ 2023-06-29 21:13:19,747 - INFO - [Train] step: 32899, loss_D: -0.535119, lr_D: 0.000200
700
+ 2023-06-29 21:13:19,828 - INFO - [Train] step: 32899, loss_G: -0.082344, lr_G: 0.000200
701
+ 2023-06-29 21:14:15,864 - INFO - [Train] step: 32999, loss_D: -0.679631, lr_D: 0.000200
702
+ 2023-06-29 21:14:15,945 - INFO - [Train] step: 32999, loss_G: -0.347484, lr_G: 0.000200
703
+ 2023-06-29 21:15:20,075 - INFO - [Eval] step: 32999, fid: 57.090600
704
+ 2023-06-29 21:16:15,895 - INFO - [Train] step: 33099, loss_D: -0.061740, lr_D: 0.000200
705
+ 2023-06-29 21:16:15,976 - INFO - [Train] step: 33099, loss_G: 0.153369, lr_G: 0.000200
706
+ 2023-06-29 21:17:11,829 - INFO - [Train] step: 33199, loss_D: -0.096634, lr_D: 0.000200
707
+ 2023-06-29 21:17:11,909 - INFO - [Train] step: 33199, loss_G: -0.033168, lr_G: 0.000200
708
+ 2023-06-29 21:18:07,769 - INFO - [Train] step: 33299, loss_D: -0.149542, lr_D: 0.000200
709
+ 2023-06-29 21:18:07,849 - INFO - [Train] step: 33299, loss_G: -0.480602, lr_G: 0.000200
710
+ 2023-06-29 21:19:03,772 - INFO - [Train] step: 33399, loss_D: -0.155791, lr_D: 0.000200
711
+ 2023-06-29 21:19:03,853 - INFO - [Train] step: 33399, loss_G: -0.547603, lr_G: 0.000200
712
+ 2023-06-29 21:19:59,747 - INFO - [Train] step: 33499, loss_D: -0.165494, lr_D: 0.000200
713
+ 2023-06-29 21:19:59,827 - INFO - [Train] step: 33499, loss_G: -0.336066, lr_G: 0.000200
714
+ 2023-06-29 21:20:55,705 - INFO - [Train] step: 33599, loss_D: -0.080641, lr_D: 0.000200
715
+ 2023-06-29 21:20:55,785 - INFO - [Train] step: 33599, loss_G: -0.362375, lr_G: 0.000200
716
+ 2023-06-29 21:21:51,828 - INFO - [Train] step: 33699, loss_D: -0.582405, lr_D: 0.000200
717
+ 2023-06-29 21:21:51,908 - INFO - [Train] step: 33699, loss_G: -0.837543, lr_G: 0.000200
718
+ 2023-06-29 21:22:47,797 - INFO - [Train] step: 33799, loss_D: -0.521492, lr_D: 0.000200
719
+ 2023-06-29 21:22:47,878 - INFO - [Train] step: 33799, loss_G: -0.381544, lr_G: 0.000200
720
+ 2023-06-29 21:23:43,746 - INFO - [Train] step: 33899, loss_D: -0.544216, lr_D: 0.000200
721
+ 2023-06-29 21:23:43,826 - INFO - [Train] step: 33899, loss_G: -1.067221, lr_G: 0.000200
722
+ 2023-06-29 21:24:39,651 - INFO - [Train] step: 33999, loss_D: -0.183867, lr_D: 0.000200
723
+ 2023-06-29 21:24:39,731 - INFO - [Train] step: 33999, loss_G: -0.155556, lr_G: 0.000200
724
+ 2023-06-29 21:25:43,874 - INFO - [Eval] step: 33999, fid: 57.256065
725
+ 2023-06-29 21:26:39,692 - INFO - [Train] step: 34099, loss_D: -0.395606, lr_D: 0.000200
726
+ 2023-06-29 21:26:39,772 - INFO - [Train] step: 34099, loss_G: -0.545324, lr_G: 0.000200
727
+ 2023-06-29 21:27:35,588 - INFO - [Train] step: 34199, loss_D: -0.256745, lr_D: 0.000200
728
+ 2023-06-29 21:27:35,669 - INFO - [Train] step: 34199, loss_G: -0.265838, lr_G: 0.000200
729
+ 2023-06-29 21:28:31,684 - INFO - [Train] step: 34299, loss_D: -0.492628, lr_D: 0.000200
730
+ 2023-06-29 21:28:31,766 - INFO - [Train] step: 34299, loss_G: -0.726785, lr_G: 0.000200
731
+ 2023-06-29 21:29:27,622 - INFO - [Train] step: 34399, loss_D: -0.602279, lr_D: 0.000200
732
+ 2023-06-29 21:29:27,702 - INFO - [Train] step: 34399, loss_G: -1.145029, lr_G: 0.000200
733
+ 2023-06-29 21:30:23,525 - INFO - [Train] step: 34499, loss_D: -0.330295, lr_D: 0.000200
734
+ 2023-06-29 21:30:23,606 - INFO - [Train] step: 34499, loss_G: -1.068182, lr_G: 0.000200
735
+ 2023-06-29 21:31:19,432 - INFO - [Train] step: 34599, loss_D: -0.749275, lr_D: 0.000200
736
+ 2023-06-29 21:31:19,513 - INFO - [Train] step: 34599, loss_G: 0.058408, lr_G: 0.000200
737
+ 2023-06-29 21:32:15,345 - INFO - [Train] step: 34699, loss_D: -0.032100, lr_D: 0.000200
738
+ 2023-06-29 21:32:15,426 - INFO - [Train] step: 34699, loss_G: -0.837189, lr_G: 0.000200
739
+ 2023-06-29 21:33:11,263 - INFO - [Train] step: 34799, loss_D: -0.771454, lr_D: 0.000200
740
+ 2023-06-29 21:33:11,344 - INFO - [Train] step: 34799, loss_G: -0.805041, lr_G: 0.000200
741
+ 2023-06-29 21:34:07,155 - INFO - [Train] step: 34899, loss_D: -0.096931, lr_D: 0.000200
742
+ 2023-06-29 21:34:07,235 - INFO - [Train] step: 34899, loss_G: -0.406319, lr_G: 0.000200
743
+ 2023-06-29 21:35:03,229 - INFO - [Train] step: 34999, loss_D: -0.492362, lr_D: 0.000200
744
+ 2023-06-29 21:35:03,310 - INFO - [Train] step: 34999, loss_G: -0.603581, lr_G: 0.000200
745
+ 2023-06-29 21:36:07,416 - INFO - [Eval] step: 34999, fid: 58.468545
746
+ 2023-06-29 21:37:03,205 - INFO - [Train] step: 35099, loss_D: -0.145390, lr_D: 0.000200
747
+ 2023-06-29 21:37:03,285 - INFO - [Train] step: 35099, loss_G: -0.267470, lr_G: 0.000200
748
+ 2023-06-29 21:37:59,114 - INFO - [Train] step: 35199, loss_D: -0.030206, lr_D: 0.000200
749
+ 2023-06-29 21:37:59,194 - INFO - [Train] step: 35199, loss_G: 0.284759, lr_G: 0.000200
750
+ 2023-06-29 21:38:55,048 - INFO - [Train] step: 35299, loss_D: -0.531895, lr_D: 0.000200
751
+ 2023-06-29 21:38:55,128 - INFO - [Train] step: 35299, loss_G: 0.029803, lr_G: 0.000200
752
+ 2023-06-29 21:39:50,945 - INFO - [Train] step: 35399, loss_D: -0.129737, lr_D: 0.000200
753
+ 2023-06-29 21:39:51,026 - INFO - [Train] step: 35399, loss_G: -0.589824, lr_G: 0.000200
754
+ 2023-06-29 21:40:46,853 - INFO - [Train] step: 35499, loss_D: -0.051374, lr_D: 0.000200
755
+ 2023-06-29 21:40:46,933 - INFO - [Train] step: 35499, loss_G: 0.235879, lr_G: 0.000200
756
+ 2023-06-29 21:41:42,915 - INFO - [Train] step: 35599, loss_D: -0.613551, lr_D: 0.000200
757
+ 2023-06-29 21:41:42,996 - INFO - [Train] step: 35599, loss_G: -0.609162, lr_G: 0.000200
758
+ 2023-06-29 21:42:38,838 - INFO - [Train] step: 35699, loss_D: -0.068948, lr_D: 0.000200
759
+ 2023-06-29 21:42:38,918 - INFO - [Train] step: 35699, loss_G: -0.962997, lr_G: 0.000200
760
+ 2023-06-29 21:43:34,741 - INFO - [Train] step: 35799, loss_D: -0.699282, lr_D: 0.000200
761
+ 2023-06-29 21:43:34,821 - INFO - [Train] step: 35799, loss_G: -0.002667, lr_G: 0.000200
762
+ 2023-06-29 21:44:30,644 - INFO - [Train] step: 35899, loss_D: -0.715806, lr_D: 0.000200
763
+ 2023-06-29 21:44:30,724 - INFO - [Train] step: 35899, loss_G: -0.428615, lr_G: 0.000200
764
+ 2023-06-29 21:45:26,558 - INFO - [Train] step: 35999, loss_D: -0.230025, lr_D: 0.000200
765
+ 2023-06-29 21:45:26,638 - INFO - [Train] step: 35999, loss_G: -0.478223, lr_G: 0.000200
766
+ 2023-06-29 21:46:30,716 - INFO - [Eval] step: 35999, fid: 58.636501
767
+ 2023-06-29 21:47:26,529 - INFO - [Train] step: 36099, loss_D: -0.472976, lr_D: 0.000200
768
+ 2023-06-29 21:47:26,609 - INFO - [Train] step: 36099, loss_G: -1.136947, lr_G: 0.000200
769
+ 2023-06-29 21:48:22,456 - INFO - [Train] step: 36199, loss_D: -0.849616, lr_D: 0.000200
770
+ 2023-06-29 21:48:22,536 - INFO - [Train] step: 36199, loss_G: -0.400073, lr_G: 0.000200
771
+ 2023-06-29 21:49:18,578 - INFO - [Train] step: 36299, loss_D: -0.659632, lr_D: 0.000200
772
+ 2023-06-29 21:49:18,659 - INFO - [Train] step: 36299, loss_G: 0.019345, lr_G: 0.000200
773
+ 2023-06-29 21:50:14,518 - INFO - [Train] step: 36399, loss_D: -0.278380, lr_D: 0.000200
774
+ 2023-06-29 21:50:14,599 - INFO - [Train] step: 36399, loss_G: -0.489173, lr_G: 0.000200
775
+ 2023-06-29 21:51:10,454 - INFO - [Train] step: 36499, loss_D: -0.326016, lr_D: 0.000200
776
+ 2023-06-29 21:51:10,535 - INFO - [Train] step: 36499, loss_G: -0.999476, lr_G: 0.000200
777
+ 2023-06-29 21:52:06,385 - INFO - [Train] step: 36599, loss_D: -0.609397, lr_D: 0.000200
778
+ 2023-06-29 21:52:06,465 - INFO - [Train] step: 36599, loss_G: -0.175923, lr_G: 0.000200
779
+ 2023-06-29 21:53:02,309 - INFO - [Train] step: 36699, loss_D: -0.062609, lr_D: 0.000200
780
+ 2023-06-29 21:53:02,390 - INFO - [Train] step: 36699, loss_G: -0.243956, lr_G: 0.000200
781
+ 2023-06-29 21:53:58,247 - INFO - [Train] step: 36799, loss_D: -1.201340, lr_D: 0.000200
782
+ 2023-06-29 21:53:58,327 - INFO - [Train] step: 36799, loss_G: -0.016666, lr_G: 0.000200
783
+ 2023-06-29 21:54:54,329 - INFO - [Train] step: 36899, loss_D: -0.773190, lr_D: 0.000200
784
+ 2023-06-29 21:54:54,410 - INFO - [Train] step: 36899, loss_G: -0.076329, lr_G: 0.000200
785
+ 2023-06-29 21:55:50,251 - INFO - [Train] step: 36999, loss_D: -0.248432, lr_D: 0.000200
786
+ 2023-06-29 21:55:50,331 - INFO - [Train] step: 36999, loss_G: -0.414903, lr_G: 0.000200
787
+ 2023-06-29 21:56:54,581 - INFO - [Eval] step: 36999, fid: 57.801297
788
+ 2023-06-29 21:57:50,379 - INFO - [Train] step: 37099, loss_D: 0.036923, lr_D: 0.000200
789
+ 2023-06-29 21:57:50,460 - INFO - [Train] step: 37099, loss_G: -0.189487, lr_G: 0.000200
790
+ 2023-06-29 21:58:46,339 - INFO - [Train] step: 37199, loss_D: -0.634635, lr_D: 0.000200
791
+ 2023-06-29 21:58:46,420 - INFO - [Train] step: 37199, loss_G: -0.734248, lr_G: 0.000200
792
+ 2023-06-29 21:59:42,246 - INFO - [Train] step: 37299, loss_D: -0.120110, lr_D: 0.000200
793
+ 2023-06-29 21:59:42,327 - INFO - [Train] step: 37299, loss_G: -0.241276, lr_G: 0.000200
794
+ 2023-06-29 22:00:38,158 - INFO - [Train] step: 37399, loss_D: -0.290200, lr_D: 0.000200
795
+ 2023-06-29 22:00:38,238 - INFO - [Train] step: 37399, loss_G: -0.061077, lr_G: 0.000200
796
+ 2023-06-29 22:01:34,109 - INFO - [Train] step: 37499, loss_D: -0.093320, lr_D: 0.000200
797
+ 2023-06-29 22:01:34,189 - INFO - [Train] step: 37499, loss_G: -0.534815, lr_G: 0.000200
798
+ 2023-06-29 22:02:30,235 - INFO - [Train] step: 37599, loss_D: -0.425610, lr_D: 0.000200
799
+ 2023-06-29 22:02:30,316 - INFO - [Train] step: 37599, loss_G: -0.569052, lr_G: 0.000200
800
+ 2023-06-29 22:03:26,154 - INFO - [Train] step: 37699, loss_D: -0.086899, lr_D: 0.000200
801
+ 2023-06-29 22:03:26,235 - INFO - [Train] step: 37699, loss_G: -0.195559, lr_G: 0.000200
802
+ 2023-06-29 22:04:22,072 - INFO - [Train] step: 37799, loss_D: -0.443643, lr_D: 0.000200
803
+ 2023-06-29 22:04:22,152 - INFO - [Train] step: 37799, loss_G: -0.341666, lr_G: 0.000200
804
+ 2023-06-29 22:05:17,998 - INFO - [Train] step: 37899, loss_D: -0.548785, lr_D: 0.000200
805
+ 2023-06-29 22:05:18,078 - INFO - [Train] step: 37899, loss_G: -0.773496, lr_G: 0.000200
806
+ 2023-06-29 22:06:13,923 - INFO - [Train] step: 37999, loss_D: -0.123959, lr_D: 0.000200
807
+ 2023-06-29 22:06:14,004 - INFO - [Train] step: 37999, loss_G: -0.172185, lr_G: 0.000200
808
+ 2023-06-29 22:07:18,097 - INFO - [Eval] step: 37999, fid: 57.867587
809
+ 2023-06-29 22:08:13,944 - INFO - [Train] step: 38099, loss_D: -0.025133, lr_D: 0.000200
810
+ 2023-06-29 22:08:14,024 - INFO - [Train] step: 38099, loss_G: 0.233737, lr_G: 0.000200
811
+ 2023-06-29 22:09:10,060 - INFO - [Train] step: 38199, loss_D: -0.873192, lr_D: 0.000200
812
+ 2023-06-29 22:09:10,140 - INFO - [Train] step: 38199, loss_G: 0.076724, lr_G: 0.000200
813
+ 2023-06-29 22:10:06,030 - INFO - [Train] step: 38299, loss_D: -0.693048, lr_D: 0.000200
814
+ 2023-06-29 22:10:06,111 - INFO - [Train] step: 38299, loss_G: -0.163037, lr_G: 0.000200
815
+ 2023-06-29 22:11:01,962 - INFO - [Train] step: 38399, loss_D: -0.290819, lr_D: 0.000200
816
+ 2023-06-29 22:11:02,043 - INFO - [Train] step: 38399, loss_G: 0.120225, lr_G: 0.000200
817
+ 2023-06-29 22:11:57,920 - INFO - [Train] step: 38499, loss_D: -0.463271, lr_D: 0.000200
818
+ 2023-06-29 22:11:58,001 - INFO - [Train] step: 38499, loss_G: 0.037867, lr_G: 0.000200
819
+ 2023-06-29 22:12:53,895 - INFO - [Train] step: 38599, loss_D: -0.687023, lr_D: 0.000200
820
+ 2023-06-29 22:12:53,975 - INFO - [Train] step: 38599, loss_G: -0.011624, lr_G: 0.000200
821
+ 2023-06-29 22:13:49,854 - INFO - [Train] step: 38699, loss_D: -1.064394, lr_D: 0.000200
822
+ 2023-06-29 22:13:49,935 - INFO - [Train] step: 38699, loss_G: -0.213514, lr_G: 0.000200
823
+ 2023-06-29 22:14:45,815 - INFO - [Train] step: 38799, loss_D: -0.418050, lr_D: 0.000200
824
+ 2023-06-29 22:14:45,895 - INFO - [Train] step: 38799, loss_G: -0.971072, lr_G: 0.000200
825
+ 2023-06-29 22:15:41,867 - INFO - [Train] step: 38899, loss_D: -0.126263, lr_D: 0.000200
826
+ 2023-06-29 22:15:41,948 - INFO - [Train] step: 38899, loss_G: -0.516200, lr_G: 0.000200
827
+ 2023-06-29 22:16:37,786 - INFO - [Train] step: 38999, loss_D: -0.241696, lr_D: 0.000200
828
+ 2023-06-29 22:16:37,867 - INFO - [Train] step: 38999, loss_G: -0.793683, lr_G: 0.000200
829
+ 2023-06-29 22:17:41,990 - INFO - [Eval] step: 38999, fid: 58.003176
830
+ 2023-06-29 22:18:37,760 - INFO - [Train] step: 39099, loss_D: -0.415770, lr_D: 0.000200
831
+ 2023-06-29 22:18:37,840 - INFO - [Train] step: 39099, loss_G: -0.628297, lr_G: 0.000200
832
+ 2023-06-29 22:19:33,688 - INFO - [Train] step: 39199, loss_D: -0.137480, lr_D: 0.000200
833
+ 2023-06-29 22:19:33,769 - INFO - [Train] step: 39199, loss_G: -0.146063, lr_G: 0.000200
834
+ 2023-06-29 22:20:29,616 - INFO - [Train] step: 39299, loss_D: -0.468586, lr_D: 0.000200
835
+ 2023-06-29 22:20:29,697 - INFO - [Train] step: 39299, loss_G: -0.089141, lr_G: 0.000200
836
+ 2023-06-29 22:21:25,557 - INFO - [Train] step: 39399, loss_D: -0.178473, lr_D: 0.000200
837
+ 2023-06-29 22:21:25,637 - INFO - [Train] step: 39399, loss_G: -0.972881, lr_G: 0.000200
838
+ 2023-06-29 22:22:21,679 - INFO - [Train] step: 39499, loss_D: -0.062802, lr_D: 0.000200
839
+ 2023-06-29 22:22:21,759 - INFO - [Train] step: 39499, loss_G: 0.119458, lr_G: 0.000200
840
+ 2023-06-29 22:23:17,582 - INFO - [Train] step: 39599, loss_D: -0.019836, lr_D: 0.000200
841
+ 2023-06-29 22:23:17,662 - INFO - [Train] step: 39599, loss_G: -0.719415, lr_G: 0.000200
842
+ 2023-06-29 22:24:13,490 - INFO - [Train] step: 39699, loss_D: -0.549166, lr_D: 0.000200
843
+ 2023-06-29 22:24:13,570 - INFO - [Train] step: 39699, loss_G: -0.316521, lr_G: 0.000200
844
+ 2023-06-29 22:25:09,408 - INFO - [Train] step: 39799, loss_D: -0.161954, lr_D: 0.000200
845
+ 2023-06-29 22:25:09,489 - INFO - [Train] step: 39799, loss_G: -0.456900, lr_G: 0.000200
846
+ 2023-06-29 22:26:05,342 - INFO - [Train] step: 39899, loss_D: -0.362520, lr_D: 0.000200
847
+ 2023-06-29 22:26:05,422 - INFO - [Train] step: 39899, loss_G: -0.831629, lr_G: 0.000200
848
+ 2023-06-29 22:27:01,268 - INFO - [Train] step: 39999, loss_D: -0.082073, lr_D: 0.000200
849
+ 2023-06-29 22:27:01,349 - INFO - [Train] step: 39999, loss_G: -0.797569, lr_G: 0.000200
850
+ 2023-06-29 22:28:05,474 - INFO - [Eval] step: 39999, fid: 58.146436
851
+ 2023-06-29 22:28:05,596 - INFO - Best FID score: 49.719737961805265
852
+ 2023-06-29 22:28:05,597 - INFO - End of training
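
The log above uses a fixed `[Train]` / `[Eval]` line format (`loss_D` and `loss_G` every 100 steps, `fid` every 1000 steps), so the curves can be recovered by plain text parsing without TensorBoard. The following is a minimal sketch, not part of this repository; the log path is a placeholder, and the regular expressions simply mirror the line format shown above.

```python
import re
from collections import defaultdict

# Patterns mirroring the log lines above, e.g.
#   ... - INFO - [Train] step: 25099, loss_D: -0.222354, lr_D: 0.000200
#   ... - INFO - [Eval] step: 25999, fid: 49.719738
TRAIN_RE = re.compile(r"\[Train\] step: (\d+), (loss_[DG]): (-?\d+\.\d+)")
EVAL_RE = re.compile(r"\[Eval\] step: (\d+), fid: (\d+\.\d+)")

def parse_log(path):
    """Collect (step, value) pairs for loss_D, loss_G and the periodic FID."""
    series = defaultdict(list)
    with open(path) as f:
        for line in f:
            m = TRAIN_RE.search(line)
            if m:
                series[m.group(2)].append((int(m.group(1)), float(m.group(3))))
                continue
            m = EVAL_RE.search(line)
            if m:
                series["fid"].append((int(m.group(1)), float(m.group(2))))
    return series

if __name__ == "__main__":
    # Placeholder path -- substitute the uploaded training log.
    curves = parse_log("path/to/output.log")
    best_step, best_fid = min(curves["fid"], key=lambda sv: sv[1])
    # For this run the minimum should match the reported "Best FID score"
    # (about 49.72, evaluated at step 25999).
    print(f"best FID {best_fid:.6f} at step {best_step}")
```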
wgan_cifar10/samples.zip ADDED
@@ -0,0 +1,3 @@
 
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:f3905dddf2970d5902605962a6a4b241cb9ba7ff02764d5bd3590a6c03027a54
3
+ size 5929779
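
As with the checkpoints, `samples.zip` is tracked with Git LFS, so the hunk above only adds a three-line pointer (`version`, `oid sha256:...`, `size`) rather than the archive itself. Below is a hypothetical verification sketch: it parses such a pointer and checks that a locally fetched copy matches the recorded SHA-256 and byte size (both file paths are placeholders).

```python
import hashlib

def parse_lfs_pointer(pointer_path):
    """Read a Git LFS pointer file: 'version', 'oid sha256:<hex>', 'size <bytes>'."""
    fields = {}
    with open(pointer_path) as f:
        for line in f:
            key, _, value = line.strip().partition(" ")
            fields[key] = value
    oid = fields["oid"].split(":", 1)[1]  # drop the 'sha256:' prefix
    return oid, int(fields["size"])

def verify(blob_path, oid, size):
    """Compare a downloaded blob against the oid/size recorded in its pointer."""
    h, n = hashlib.sha256(), 0
    with open(blob_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
            n += len(chunk)
    return h.hexdigest() == oid and n == size

# Placeholder usage -- the pointer as checked out without LFS, and the archive
# as fetched with `git lfs pull`:
# oid, size = parse_lfs_pointer("samples.zip.pointer")
# print(verify("samples.zip", oid, size))
```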
wgan_cifar10/tensorboard/events.out.tfevents.1688052700.jason-system.4698.0 ADDED
@@ -0,0 +1,3 @@
 
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:1291b9faf22999aae6dd3e268fa2660ca30d1ef64dc9e5937a19df6673af1ea5
3
+ size 8095936
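
The TensorBoard event file under `wgan_cifar10/tensorboard/` mirrors the scalars printed in the log above. Once the LFS object is fetched, the curves can be inspected with `tensorboard --logdir wgan_cifar10/tensorboard` or read programmatically; a minimal sketch assuming the `tensorboard` package is installed, and without assuming particular summary tag names:

```python
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

# Point the accumulator at the directory containing the event file added above;
# size_guidance={"scalars": 0} keeps every logged scalar instead of sampling.
ea = EventAccumulator("wgan_cifar10/tensorboard", size_guidance={"scalars": 0})
ea.Reload()

# Tag names depend on how the training script named its summaries, so list them
# rather than hard-coding specific ones.
for tag in ea.Tags()["scalars"]:
    events = ea.Scalars(tag)
    last = events[-1]
    print(f"{tag}: {len(events)} points, last = {last.value:.6f} at step {last.step}")
```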