GANs-Implementations/gan_cifar10/output-2023-06-19-06-41-43.log
2023-06-19 06:41:43,797 - INFO - Experiment directory: ./runs/gan_cifar10/
2023-06-19 06:41:43,797 - INFO - Number of processes: 1
2023-06-19 06:41:43,797 - INFO - Distributed type: DistributedType.NO
2023-06-19 06:41:43,797 - INFO - Mixed precision: no
2023-06-19 06:41:43,797 - INFO - ==============================
2023-06-19 06:41:44,083 - INFO - Size of training set: 50000
2023-06-19 06:41:44,083 - INFO - Batch size per process: 512
2023-06-19 06:41:44,083 - INFO - Total batch size: 512
2023-06-19 06:41:44,083 - INFO - ==============================
2023-06-19 06:41:44,612 - INFO - Start training...
2023-06-19 06:42:41,116 - INFO - [Train] step: 99, loss_D: 0.015413, lr_D: 0.000800
2023-06-19 06:42:41,197 - INFO - [Train] step: 99, loss_G: 8.533952, lr_G: 0.000800
2023-06-19 06:43:37,102 - INFO - [Train] step: 199, loss_D: 0.013439, lr_D: 0.000800
2023-06-19 06:43:37,183 - INFO - [Train] step: 199, loss_G: 5.837270, lr_G: 0.000800
2023-06-19 06:44:33,289 - INFO - [Train] step: 299, loss_D: 0.013354, lr_D: 0.000800
2023-06-19 06:44:33,371 - INFO - [Train] step: 299, loss_G: 5.493741, lr_G: 0.000800
2023-06-19 06:45:29,520 - INFO - [Train] step: 399, loss_D: 0.011446, lr_D: 0.000800
2023-06-19 06:45:29,601 - INFO - [Train] step: 399, loss_G: 6.936215, lr_G: 0.000800
2023-06-19 06:46:25,774 - INFO - [Train] step: 499, loss_D: 0.017225, lr_D: 0.000800
2023-06-19 06:46:25,855 - INFO - [Train] step: 499, loss_G: 6.011294, lr_G: 0.000800
2023-06-19 06:47:22,039 - INFO - [Train] step: 599, loss_D: 2.128814, lr_D: 0.000800
2023-06-19 06:47:22,120 - INFO - [Train] step: 599, loss_G: 7.097383, lr_G: 0.000800
2023-06-19 06:48:18,457 - INFO - [Train] step: 699, loss_D: 0.008529, lr_D: 0.000800
2023-06-19 06:48:18,538 - INFO - [Train] step: 699, loss_G: 5.912282, lr_G: 0.000800
2023-06-19 06:49:14,675 - INFO - [Train] step: 799, loss_D: 0.002951, lr_D: 0.000800
2023-06-19 06:49:14,757 - INFO - [Train] step: 799, loss_G: 7.359232, lr_G: 0.000800
2023-06-19 06:50:10,911 - INFO - [Train] step: 899, loss_D: 0.004851, lr_D: 0.000800
2023-06-19 06:50:10,992 - INFO - [Train] step: 899, loss_G: 6.559004, lr_G: 0.000800
2023-06-19 06:51:07,166 - INFO - [Train] step: 999, loss_D: 0.036782, lr_D: 0.000800
2023-06-19 06:51:07,248 - INFO - [Train] step: 999, loss_G: 6.337249, lr_G: 0.000800
2023-06-19 06:52:11,297 - INFO - [Eval] step: 999, fid: 116.689816
2023-06-19 06:53:07,431 - INFO - [Train] step: 1099, loss_D: 0.015905, lr_D: 0.000800
2023-06-19 06:53:07,512 - INFO - [Train] step: 1099, loss_G: 4.911904, lr_G: 0.000800
2023-06-19 06:54:03,736 - INFO - [Train] step: 1199, loss_D: 0.009055, lr_D: 0.000800
2023-06-19 06:54:03,817 - INFO - [Train] step: 1199, loss_G: 6.414727, lr_G: 0.000800
2023-06-19 06:55:00,198 - INFO - [Train] step: 1299, loss_D: 0.031957, lr_D: 0.000800
2023-06-19 06:55:00,280 - INFO - [Train] step: 1299, loss_G: 5.431652, lr_G: 0.000800
2023-06-19 06:55:56,600 - INFO - [Train] step: 1399, loss_D: 0.008166, lr_D: 0.000800
2023-06-19 06:55:56,682 - INFO - [Train] step: 1399, loss_G: 6.883259, lr_G: 0.000800
2023-06-19 06:56:52,961 - INFO - [Train] step: 1499, loss_D: 0.014873, lr_D: 0.000800
2023-06-19 06:56:53,043 - INFO - [Train] step: 1499, loss_G: 4.983109, lr_G: 0.000800
2023-06-19 06:57:49,259 - INFO - [Train] step: 1599, loss_D: 0.029732, lr_D: 0.000800
2023-06-19 06:57:49,341 - INFO - [Train] step: 1599, loss_G: 5.318909, lr_G: 0.000800
2023-06-19 06:58:45,567 - INFO - [Train] step: 1699, loss_D: 0.017640, lr_D: 0.000800
2023-06-19 06:58:45,649 - INFO - [Train] step: 1699, loss_G: 5.521641, lr_G: 0.000800
2023-06-19 06:59:41,898 - INFO - [Train] step: 1799, loss_D: 0.019546, lr_D: 0.000800
2023-06-19 06:59:41,979 - INFO - [Train] step: 1799, loss_G: 4.917681, lr_G: 0.000800
2023-06-19 07:00:38,241 - INFO - [Train] step: 1899, loss_D: 0.023809, lr_D: 0.000800
2023-06-19 07:00:38,322 - INFO - [Train] step: 1899, loss_G: 4.472042, lr_G: 0.000800
2023-06-19 07:01:34,701 - INFO - [Train] step: 1999, loss_D: 0.014409, lr_D: 0.000800
2023-06-19 07:01:34,783 - INFO - [Train] step: 1999, loss_G: 5.573315, lr_G: 0.000800
2023-06-19 07:02:38,954 - INFO - [Eval] step: 1999, fid: 73.604650
2023-06-19 07:03:35,296 - INFO - [Train] step: 2099, loss_D: 0.037417, lr_D: 0.000800
2023-06-19 07:03:35,377 - INFO - [Train] step: 2099, loss_G: 5.429271, lr_G: 0.000800
2023-06-19 07:04:31,611 - INFO - [Train] step: 2199, loss_D: 0.072442, lr_D: 0.000800
2023-06-19 07:04:31,693 - INFO - [Train] step: 2199, loss_G: 6.113911, lr_G: 0.000800
2023-06-19 07:05:27,966 - INFO - [Train] step: 2299, loss_D: 0.030240, lr_D: 0.000800
2023-06-19 07:05:28,047 - INFO - [Train] step: 2299, loss_G: 5.275312, lr_G: 0.000800
2023-06-19 07:06:24,295 - INFO - [Train] step: 2399, loss_D: 0.027413, lr_D: 0.000800
2023-06-19 07:06:24,377 - INFO - [Train] step: 2399, loss_G: 4.359949, lr_G: 0.000800
2023-06-19 07:07:20,608 - INFO - [Train] step: 2499, loss_D: 0.092313, lr_D: 0.000800
2023-06-19 07:07:20,690 - INFO - [Train] step: 2499, loss_G: 5.406056, lr_G: 0.000800
2023-06-19 07:08:17,081 - INFO - [Train] step: 2599, loss_D: 0.003134, lr_D: 0.000800
2023-06-19 07:08:17,163 - INFO - [Train] step: 2599, loss_G: 6.895704, lr_G: 0.000800
2023-06-19 07:09:13,392 - INFO - [Train] step: 2699, loss_D: 0.032127, lr_D: 0.000800
2023-06-19 07:09:13,474 - INFO - [Train] step: 2699, loss_G: 6.057601, lr_G: 0.000800
2023-06-19 07:10:09,725 - INFO - [Train] step: 2799, loss_D: 0.033065, lr_D: 0.000800
2023-06-19 07:10:09,807 - INFO - [Train] step: 2799, loss_G: 4.513358, lr_G: 0.000800
2023-06-19 07:11:06,021 - INFO - [Train] step: 2899, loss_D: 0.128139, lr_D: 0.000800
2023-06-19 07:11:06,102 - INFO - [Train] step: 2899, loss_G: 4.308289, lr_G: 0.000800
2023-06-19 07:12:02,330 - INFO - [Train] step: 2999, loss_D: 0.021178, lr_D: 0.000800
2023-06-19 07:12:02,412 - INFO - [Train] step: 2999, loss_G: 4.663414, lr_G: 0.000800
2023-06-19 07:13:06,549 - INFO - [Eval] step: 2999, fid: 59.465212
2023-06-19 07:14:02,944 - INFO - [Train] step: 3099, loss_D: 0.035834, lr_D: 0.000800
2023-06-19 07:14:03,025 - INFO - [Train] step: 3099, loss_G: 5.410472, lr_G: 0.000800
2023-06-19 07:14:59,268 - INFO - [Train] step: 3199, loss_D: 0.018581, lr_D: 0.000800
2023-06-19 07:14:59,349 - INFO - [Train] step: 3199, loss_G: 5.978933, lr_G: 0.000800
2023-06-19 07:15:55,769 - INFO - [Train] step: 3299, loss_D: 0.091326, lr_D: 0.000800
2023-06-19 07:15:55,851 - INFO - [Train] step: 3299, loss_G: 6.560109, lr_G: 0.000800
2023-06-19 07:16:52,110 - INFO - [Train] step: 3399, loss_D: 0.039605, lr_D: 0.000800
2023-06-19 07:16:52,191 - INFO - [Train] step: 3399, loss_G: 4.526829, lr_G: 0.000800
2023-06-19 07:17:48,463 - INFO - [Train] step: 3499, loss_D: 0.046596, lr_D: 0.000800
2023-06-19 07:17:48,544 - INFO - [Train] step: 3499, loss_G: 5.566928, lr_G: 0.000800
2023-06-19 07:18:44,784 - INFO - [Train] step: 3599, loss_D: 0.051578, lr_D: 0.000800
2023-06-19 07:18:44,866 - INFO - [Train] step: 3599, loss_G: 4.628630, lr_G: 0.000800
2023-06-19 07:19:41,136 - INFO - [Train] step: 3699, loss_D: 0.875083, lr_D: 0.000800
2023-06-19 07:19:41,218 - INFO - [Train] step: 3699, loss_G: 2.216166, lr_G: 0.000800
2023-06-19 07:20:37,482 - INFO - [Train] step: 3799, loss_D: 0.004091, lr_D: 0.000800
2023-06-19 07:20:37,563 - INFO - [Train] step: 3799, loss_G: 6.654954, lr_G: 0.000800
2023-06-19 07:21:33,978 - INFO - [Train] step: 3899, loss_D: 0.121475, lr_D: 0.000800
2023-06-19 07:21:34,059 - INFO - [Train] step: 3899, loss_G: 2.944900, lr_G: 0.000800
2023-06-19 07:22:30,332 - INFO - [Train] step: 3999, loss_D: 0.000624, lr_D: 0.000800
2023-06-19 07:22:30,413 - INFO - [Train] step: 3999, loss_G: 8.885368, lr_G: 0.000800
2023-06-19 07:23:34,675 - INFO - [Eval] step: 3999, fid: 89.851385
2023-06-19 07:24:30,768 - INFO - [Train] step: 4099, loss_D: 0.058851, lr_D: 0.000800
2023-06-19 07:24:30,850 - INFO - [Train] step: 4099, loss_G: 4.135179, lr_G: 0.000800
2023-06-19 07:25:27,096 - INFO - [Train] step: 4199, loss_D: 0.056411, lr_D: 0.000800
2023-06-19 07:25:27,177 - INFO - [Train] step: 4199, loss_G: 5.088530, lr_G: 0.000800
2023-06-19 07:26:23,462 - INFO - [Train] step: 4299, loss_D: 0.054239, lr_D: 0.000800
2023-06-19 07:26:23,543 - INFO - [Train] step: 4299, loss_G: 4.548752, lr_G: 0.000800
2023-06-19 07:27:19,796 - INFO - [Train] step: 4399, loss_D: 0.018338, lr_D: 0.000800
2023-06-19 07:27:19,878 - INFO - [Train] step: 4399, loss_G: 5.101788, lr_G: 0.000800
2023-06-19 07:28:16,157 - INFO - [Train] step: 4499, loss_D: 0.037503, lr_D: 0.000800
2023-06-19 07:28:16,238 - INFO - [Train] step: 4499, loss_G: 5.311071, lr_G: 0.000800
2023-06-19 07:29:12,651 - INFO - [Train] step: 4599, loss_D: 0.065745, lr_D: 0.000800
2023-06-19 07:29:12,732 - INFO - [Train] step: 4599, loss_G: 4.077043, lr_G: 0.000800
2023-06-19 07:30:08,992 - INFO - [Train] step: 4699, loss_D: 0.053056, lr_D: 0.000800
2023-06-19 07:30:09,074 - INFO - [Train] step: 4699, loss_G: 4.404660, lr_G: 0.000800
2023-06-19 07:31:05,346 - INFO - [Train] step: 4799, loss_D: 0.113238, lr_D: 0.000800
2023-06-19 07:31:05,427 - INFO - [Train] step: 4799, loss_G: 3.932576, lr_G: 0.000800
2023-06-19 07:32:01,688 - INFO - [Train] step: 4899, loss_D: 0.058167, lr_D: 0.000800
2023-06-19 07:32:01,770 - INFO - [Train] step: 4899, loss_G: 4.172578, lr_G: 0.000800
2023-06-19 07:32:58,055 - INFO - [Train] step: 4999, loss_D: 0.001282, lr_D: 0.000800
2023-06-19 07:32:58,137 - INFO - [Train] step: 4999, loss_G: 7.975042, lr_G: 0.000800
2023-06-19 07:34:02,470 - INFO - [Eval] step: 4999, fid: 73.093654
2023-06-19 07:34:58,585 - INFO - [Train] step: 5099, loss_D: 0.052123, lr_D: 0.000800
2023-06-19 07:34:58,666 - INFO - [Train] step: 5099, loss_G: 4.467057, lr_G: 0.000800
2023-06-19 07:35:55,094 - INFO - [Train] step: 5199, loss_D: 0.062087, lr_D: 0.000800
2023-06-19 07:35:55,176 - INFO - [Train] step: 5199, loss_G: 5.340582, lr_G: 0.000800
2023-06-19 07:36:51,444 - INFO - [Train] step: 5299, loss_D: 0.049948, lr_D: 0.000800
2023-06-19 07:36:51,525 - INFO - [Train] step: 5299, loss_G: 4.679468, lr_G: 0.000800
2023-06-19 07:37:47,776 - INFO - [Train] step: 5399, loss_D: 0.089791, lr_D: 0.000800
2023-06-19 07:37:47,858 - INFO - [Train] step: 5399, loss_G: 4.531784, lr_G: 0.000800
2023-06-19 07:38:44,138 - INFO - [Train] step: 5499, loss_D: 0.065635, lr_D: 0.000800
2023-06-19 07:38:44,219 - INFO - [Train] step: 5499, loss_G: 5.405015, lr_G: 0.000800
2023-06-19 07:39:40,458 - INFO - [Train] step: 5599, loss_D: 0.068110, lr_D: 0.000800
2023-06-19 07:39:40,540 - INFO - [Train] step: 5599, loss_G: 3.712178, lr_G: 0.000800
2023-06-19 07:40:36,835 - INFO - [Train] step: 5699, loss_D: 0.050999, lr_D: 0.000800
2023-06-19 07:40:36,917 - INFO - [Train] step: 5699, loss_G: 4.633172, lr_G: 0.000800
2023-06-19 07:41:33,190 - INFO - [Train] step: 5799, loss_D: 0.109617, lr_D: 0.000800
2023-06-19 07:41:33,271 - INFO - [Train] step: 5799, loss_G: 3.421207, lr_G: 0.000800
2023-06-19 07:42:29,705 - INFO - [Train] step: 5899, loss_D: 0.059575, lr_D: 0.000800
2023-06-19 07:42:29,787 - INFO - [Train] step: 5899, loss_G: 4.240124, lr_G: 0.000800
2023-06-19 07:43:26,059 - INFO - [Train] step: 5999, loss_D: 0.000489, lr_D: 0.000800
2023-06-19 07:43:26,140 - INFO - [Train] step: 5999, loss_G: 11.243366, lr_G: 0.000800
2023-06-19 07:44:30,365 - INFO - [Eval] step: 5999, fid: 40.310135
2023-06-19 07:45:26,777 - INFO - [Train] step: 6099, loss_D: 0.027193, lr_D: 0.000800
2023-06-19 07:45:26,858 - INFO - [Train] step: 6099, loss_G: 5.202495, lr_G: 0.000800
2023-06-19 07:46:23,114 - INFO - [Train] step: 6199, loss_D: 0.026119, lr_D: 0.000800
2023-06-19 07:46:23,195 - INFO - [Train] step: 6199, loss_G: 6.296937, lr_G: 0.000800
2023-06-19 07:47:19,412 - INFO - [Train] step: 6299, loss_D: 0.061847, lr_D: 0.000800
2023-06-19 07:47:19,494 - INFO - [Train] step: 6299, loss_G: 4.704062, lr_G: 0.000800
2023-06-19 07:48:15,725 - INFO - [Train] step: 6399, loss_D: 0.030437, lr_D: 0.000800
2023-06-19 07:48:15,806 - INFO - [Train] step: 6399, loss_G: 5.458444, lr_G: 0.000800
2023-06-19 07:49:12,193 - INFO - [Train] step: 6499, loss_D: 0.045358, lr_D: 0.000800
2023-06-19 07:49:12,275 - INFO - [Train] step: 6499, loss_G: 4.910829, lr_G: 0.000800
2023-06-19 07:50:08,514 - INFO - [Train] step: 6599, loss_D: 0.002097, lr_D: 0.000800
2023-06-19 07:50:08,596 - INFO - [Train] step: 6599, loss_G: 8.095797, lr_G: 0.000800
2023-06-19 07:51:04,876 - INFO - [Train] step: 6699, loss_D: 0.000352, lr_D: 0.000800
2023-06-19 07:51:04,958 - INFO - [Train] step: 6699, loss_G: 18.736599, lr_G: 0.000800
2023-06-19 07:52:01,207 - INFO - [Train] step: 6799, loss_D: 0.045580, lr_D: 0.000800
2023-06-19 07:52:01,288 - INFO - [Train] step: 6799, loss_G: 6.744883, lr_G: 0.000800
2023-06-19 07:52:57,573 - INFO - [Train] step: 6899, loss_D: 0.169104, lr_D: 0.000800
2023-06-19 07:52:57,654 - INFO - [Train] step: 6899, loss_G: 3.048979, lr_G: 0.000800
2023-06-19 07:53:53,906 - INFO - [Train] step: 6999, loss_D: 0.500750, lr_D: 0.000800
2023-06-19 07:53:53,988 - INFO - [Train] step: 6999, loss_G: 11.189466, lr_G: 0.000800
2023-06-19 07:54:58,265 - INFO - [Eval] step: 6999, fid: 34.780130
2023-06-19 07:55:54,726 - INFO - [Train] step: 7099, loss_D: 0.052225, lr_D: 0.000800
2023-06-19 07:55:54,807 - INFO - [Train] step: 7099, loss_G: 5.818181, lr_G: 0.000800
2023-06-19 07:56:51,233 - INFO - [Train] step: 7199, loss_D: 0.045626, lr_D: 0.000800
2023-06-19 07:56:51,315 - INFO - [Train] step: 7199, loss_G: 4.660539, lr_G: 0.000800
2023-06-19 07:57:47,629 - INFO - [Train] step: 7299, loss_D: 0.001194, lr_D: 0.000800
2023-06-19 07:57:47,710 - INFO - [Train] step: 7299, loss_G: 8.500381, lr_G: 0.000800
2023-06-19 07:58:43,988 - INFO - [Train] step: 7399, loss_D: 0.037506, lr_D: 0.000800
2023-06-19 07:58:44,069 - INFO - [Train] step: 7399, loss_G: 5.260871, lr_G: 0.000800
2023-06-19 07:59:40,350 - INFO - [Train] step: 7499, loss_D: 0.037570, lr_D: 0.000800
2023-06-19 07:59:40,431 - INFO - [Train] step: 7499, loss_G: 5.033466, lr_G: 0.000800
2023-06-19 08:00:36,696 - INFO - [Train] step: 7599, loss_D: 0.051622, lr_D: 0.000800
2023-06-19 08:00:36,777 - INFO - [Train] step: 7599, loss_G: 4.794438, lr_G: 0.000800
2023-06-19 08:01:33,021 - INFO - [Train] step: 7699, loss_D: 0.498722, lr_D: 0.000800
2023-06-19 08:01:33,102 - INFO - [Train] step: 7699, loss_G: 2.245214, lr_G: 0.000800
2023-06-19 08:02:29,496 - INFO - [Train] step: 7799, loss_D: 0.013566, lr_D: 0.000800
2023-06-19 08:02:29,578 - INFO - [Train] step: 7799, loss_G: 6.209473, lr_G: 0.000800
2023-06-19 08:03:25,849 - INFO - [Train] step: 7899, loss_D: 0.000463, lr_D: 0.000800
2023-06-19 08:03:25,931 - INFO - [Train] step: 7899, loss_G: 8.900876, lr_G: 0.000800
2023-06-19 08:04:22,196 - INFO - [Train] step: 7999, loss_D: 0.036793, lr_D: 0.000800
2023-06-19 08:04:22,278 - INFO - [Train] step: 7999, loss_G: 6.037778, lr_G: 0.000800
2023-06-19 08:05:26,463 - INFO - [Eval] step: 7999, fid: 36.936231
2023-06-19 08:06:22,570 - INFO - [Train] step: 8099, loss_D: 0.112042, lr_D: 0.000800
2023-06-19 08:06:22,651 - INFO - [Train] step: 8099, loss_G: 3.544972, lr_G: 0.000800
2023-06-19 08:07:18,903 - INFO - [Train] step: 8199, loss_D: 0.455392, lr_D: 0.000800
2023-06-19 08:07:18,984 - INFO - [Train] step: 8199, loss_G: 5.307905, lr_G: 0.000800
2023-06-19 08:08:15,237 - INFO - [Train] step: 8299, loss_D: 0.051783, lr_D: 0.000800
2023-06-19 08:08:15,318 - INFO - [Train] step: 8299, loss_G: 5.489743, lr_G: 0.000800
2023-06-19 08:09:11,558 - INFO - [Train] step: 8399, loss_D: 0.091903, lr_D: 0.000800
2023-06-19 08:09:11,639 - INFO - [Train] step: 8399, loss_G: 5.354832, lr_G: 0.000800
2023-06-19 08:10:08,109 - INFO - [Train] step: 8499, loss_D: 0.055279, lr_D: 0.000800
2023-06-19 08:10:08,191 - INFO - [Train] step: 8499, loss_G: 4.809944, lr_G: 0.000800
2023-06-19 08:11:04,460 - INFO - [Train] step: 8599, loss_D: 0.036242, lr_D: 0.000800
2023-06-19 08:11:04,541 - INFO - [Train] step: 8599, loss_G: 5.183571, lr_G: 0.000800
2023-06-19 08:12:00,790 - INFO - [Train] step: 8699, loss_D: 0.062853, lr_D: 0.000800
2023-06-19 08:12:00,872 - INFO - [Train] step: 8699, loss_G: 4.411985, lr_G: 0.000800
2023-06-19 08:12:57,137 - INFO - [Train] step: 8799, loss_D: 0.001296, lr_D: 0.000800
2023-06-19 08:12:57,218 - INFO - [Train] step: 8799, loss_G: 8.633904, lr_G: 0.000800
2023-06-19 08:13:53,493 - INFO - [Train] step: 8899, loss_D: 0.068424, lr_D: 0.000800
2023-06-19 08:13:53,574 - INFO - [Train] step: 8899, loss_G: 4.135492, lr_G: 0.000800
2023-06-19 08:14:49,859 - INFO - [Train] step: 8999, loss_D: 0.048434, lr_D: 0.000800
2023-06-19 08:14:49,941 - INFO - [Train] step: 8999, loss_G: 5.008927, lr_G: 0.000800
2023-06-19 08:15:54,206 - INFO - [Eval] step: 8999, fid: 32.497251
2023-06-19 08:16:50,769 - INFO - [Train] step: 9099, loss_D: 0.929024, lr_D: 0.000800
2023-06-19 08:16:50,850 - INFO - [Train] step: 9099, loss_G: 2.507557, lr_G: 0.000800
2023-06-19 08:17:47,106 - INFO - [Train] step: 9199, loss_D: 0.265569, lr_D: 0.000800
2023-06-19 08:17:47,188 - INFO - [Train] step: 9199, loss_G: 4.884586, lr_G: 0.000800
2023-06-19 08:18:43,461 - INFO - [Train] step: 9299, loss_D: 0.000266, lr_D: 0.000800
2023-06-19 08:18:43,542 - INFO - [Train] step: 9299, loss_G: 10.169846, lr_G: 0.000800
2023-06-19 08:19:39,802 - INFO - [Train] step: 9399, loss_D: 0.031350, lr_D: 0.000800
2023-06-19 08:19:39,884 - INFO - [Train] step: 9399, loss_G: 5.980072, lr_G: 0.000800
2023-06-19 08:20:36,153 - INFO - [Train] step: 9499, loss_D: 0.027726, lr_D: 0.000800
2023-06-19 08:20:36,235 - INFO - [Train] step: 9499, loss_G: 5.035431, lr_G: 0.000800
2023-06-19 08:21:32,498 - INFO - [Train] step: 9599, loss_D: 0.000259, lr_D: 0.000800
2023-06-19 08:21:32,580 - INFO - [Train] step: 9599, loss_G: 9.871834, lr_G: 0.000800
2023-06-19 08:22:28,853 - INFO - [Train] step: 9699, loss_D: 0.000199, lr_D: 0.000800
2023-06-19 08:22:28,933 - INFO - [Train] step: 9699, loss_G: 9.494620, lr_G: 0.000800
2023-06-19 08:23:25,343 - INFO - [Train] step: 9799, loss_D: 0.000142, lr_D: 0.000800
2023-06-19 08:23:25,424 - INFO - [Train] step: 9799, loss_G: 11.081049, lr_G: 0.000800
2023-06-19 08:24:21,701 - INFO - [Train] step: 9899, loss_D: 0.000194, lr_D: 0.000800
2023-06-19 08:24:21,783 - INFO - [Train] step: 9899, loss_G: 14.348421, lr_G: 0.000800
2023-06-19 08:25:18,046 - INFO - [Train] step: 9999, loss_D: 0.023221, lr_D: 0.000800
2023-06-19 08:25:18,127 - INFO - [Train] step: 9999, loss_G: 8.355829, lr_G: 0.000800
2023-06-19 08:26:22,395 - INFO - [Eval] step: 9999, fid: 40.475013
2023-06-19 08:27:18,663 - INFO - [Train] step: 10099, loss_D: 0.039605, lr_D: 0.000800
2023-06-19 08:27:18,744 - INFO - [Train] step: 10099, loss_G: 4.941957, lr_G: 0.000800
2023-06-19 08:28:14,996 - INFO - [Train] step: 10199, loss_D: 0.037892, lr_D: 0.000800
2023-06-19 08:28:15,078 - INFO - [Train] step: 10199, loss_G: 5.328730, lr_G: 0.000800
2023-06-19 08:29:11,304 - INFO - [Train] step: 10299, loss_D: 0.053915, lr_D: 0.000800
2023-06-19 08:29:11,386 - INFO - [Train] step: 10299, loss_G: 5.666798, lr_G: 0.000800
2023-06-19 08:30:07,841 - INFO - [Train] step: 10399, loss_D: 0.029876, lr_D: 0.000800
2023-06-19 08:30:07,923 - INFO - [Train] step: 10399, loss_G: 5.185546, lr_G: 0.000800
2023-06-19 08:31:04,185 - INFO - [Train] step: 10499, loss_D: 0.125697, lr_D: 0.000800
2023-06-19 08:31:04,267 - INFO - [Train] step: 10499, loss_G: 4.291004, lr_G: 0.000800
2023-06-19 08:32:00,536 - INFO - [Train] step: 10599, loss_D: 0.507708, lr_D: 0.000800
2023-06-19 08:32:00,618 - INFO - [Train] step: 10599, loss_G: 1.926178, lr_G: 0.000800
2023-06-19 08:32:56,886 - INFO - [Train] step: 10699, loss_D: 0.000461, lr_D: 0.000800
2023-06-19 08:32:56,967 - INFO - [Train] step: 10699, loss_G: 9.017170, lr_G: 0.000800
2023-06-19 08:33:53,225 - INFO - [Train] step: 10799, loss_D: 0.000108, lr_D: 0.000800
2023-06-19 08:33:53,307 - INFO - [Train] step: 10799, loss_G: 9.532263, lr_G: 0.000800
2023-06-19 08:34:49,527 - INFO - [Train] step: 10899, loss_D: 0.000061, lr_D: 0.000800
2023-06-19 08:34:49,608 - INFO - [Train] step: 10899, loss_G: 10.892931, lr_G: 0.000800
2023-06-19 08:35:46,015 - INFO - [Train] step: 10999, loss_D: 0.002290, lr_D: 0.000800
2023-06-19 08:35:46,097 - INFO - [Train] step: 10999, loss_G: 15.626179, lr_G: 0.000800
2023-06-19 08:36:50,949 - INFO - [Eval] step: 10999, fid: 215.174965
2023-06-19 08:37:47,095 - INFO - [Train] step: 11099, loss_D: 0.022483, lr_D: 0.000800
2023-06-19 08:37:47,177 - INFO - [Train] step: 11099, loss_G: 8.285819, lr_G: 0.000800
2023-06-19 08:38:43,425 - INFO - [Train] step: 11199, loss_D: 0.026775, lr_D: 0.000800
2023-06-19 08:38:43,507 - INFO - [Train] step: 11199, loss_G: 6.539058, lr_G: 0.000800
2023-06-19 08:39:39,776 - INFO - [Train] step: 11299, loss_D: 0.043612, lr_D: 0.000800
2023-06-19 08:39:39,857 - INFO - [Train] step: 11299, loss_G: 5.061043, lr_G: 0.000800
2023-06-19 08:40:36,111 - INFO - [Train] step: 11399, loss_D: 0.033471, lr_D: 0.000800
2023-06-19 08:40:36,192 - INFO - [Train] step: 11399, loss_G: 5.674434, lr_G: 0.000800
2023-06-19 08:41:32,474 - INFO - [Train] step: 11499, loss_D: 0.026429, lr_D: 0.000800
2023-06-19 08:41:32,556 - INFO - [Train] step: 11499, loss_G: 5.672029, lr_G: 0.000800
2023-06-19 08:42:28,809 - INFO - [Train] step: 11599, loss_D: 0.142755, lr_D: 0.000800
2023-06-19 08:42:28,890 - INFO - [Train] step: 11599, loss_G: 5.084485, lr_G: 0.000800
2023-06-19 08:43:25,325 - INFO - [Train] step: 11699, loss_D: 0.185015, lr_D: 0.000800
2023-06-19 08:43:25,406 - INFO - [Train] step: 11699, loss_G: 5.503608, lr_G: 0.000800
2023-06-19 08:44:21,652 - INFO - [Train] step: 11799, loss_D: 0.042929, lr_D: 0.000800
2023-06-19 08:44:21,733 - INFO - [Train] step: 11799, loss_G: 5.564418, lr_G: 0.000800
2023-06-19 08:45:17,992 - INFO - [Train] step: 11899, loss_D: 0.024136, lr_D: 0.000800
2023-06-19 08:45:18,074 - INFO - [Train] step: 11899, loss_G: 5.376736, lr_G: 0.000800
2023-06-19 08:46:14,316 - INFO - [Train] step: 11999, loss_D: 0.244321, lr_D: 0.000800
2023-06-19 08:46:14,398 - INFO - [Train] step: 11999, loss_G: 2.660049, lr_G: 0.000800
2023-06-19 08:47:18,650 - INFO - [Eval] step: 11999, fid: 28.888697
2023-06-19 08:48:15,124 - INFO - [Train] step: 12099, loss_D: 0.022625, lr_D: 0.000800
2023-06-19 08:48:15,205 - INFO - [Train] step: 12099, loss_G: 4.904669, lr_G: 0.000800
2023-06-19 08:49:11,438 - INFO - [Train] step: 12199, loss_D: 0.061994, lr_D: 0.000800
2023-06-19 08:49:11,519 - INFO - [Train] step: 12199, loss_G: 6.860122, lr_G: 0.000800
2023-06-19 08:50:07,958 - INFO - [Train] step: 12299, loss_D: 0.000414, lr_D: 0.000800
2023-06-19 08:50:08,039 - INFO - [Train] step: 12299, loss_G: 9.327923, lr_G: 0.000800
2023-06-19 08:51:04,304 - INFO - [Train] step: 12399, loss_D: 0.000579, lr_D: 0.000800
2023-06-19 08:51:04,386 - INFO - [Train] step: 12399, loss_G: 12.291039, lr_G: 0.000800
2023-06-19 08:52:00,666 - INFO - [Train] step: 12499, loss_D: 0.017507, lr_D: 0.000800
2023-06-19 08:52:00,747 - INFO - [Train] step: 12499, loss_G: 6.256863, lr_G: 0.000800
2023-06-19 08:52:57,018 - INFO - [Train] step: 12599, loss_D: 0.029427, lr_D: 0.000800
2023-06-19 08:52:57,099 - INFO - [Train] step: 12599, loss_G: 5.838950, lr_G: 0.000800
2023-06-19 08:53:53,380 - INFO - [Train] step: 12699, loss_D: 0.009009, lr_D: 0.000800
2023-06-19 08:53:53,461 - INFO - [Train] step: 12699, loss_G: 6.909416, lr_G: 0.000800
2023-06-19 08:54:49,704 - INFO - [Train] step: 12799, loss_D: 0.025587, lr_D: 0.000800
2023-06-19 08:54:49,786 - INFO - [Train] step: 12799, loss_G: 6.415884, lr_G: 0.000800
2023-06-19 08:55:46,047 - INFO - [Train] step: 12899, loss_D: 0.003898, lr_D: 0.000800
2023-06-19 08:55:46,128 - INFO - [Train] step: 12899, loss_G: 8.048004, lr_G: 0.000800
2023-06-19 08:56:42,566 - INFO - [Train] step: 12999, loss_D: 0.030631, lr_D: 0.000800
2023-06-19 08:56:42,647 - INFO - [Train] step: 12999, loss_G: 4.843826, lr_G: 0.000800
2023-06-19 08:57:46,959 - INFO - [Eval] step: 12999, fid: 29.100301
2023-06-19 08:58:43,140 - INFO - [Train] step: 13099, loss_D: 0.019388, lr_D: 0.000800
2023-06-19 08:58:43,222 - INFO - [Train] step: 13099, loss_G: 5.473764, lr_G: 0.000800
2023-06-19 08:59:39,477 - INFO - [Train] step: 13199, loss_D: 0.577754, lr_D: 0.000800
2023-06-19 08:59:39,558 - INFO - [Train] step: 13199, loss_G: 4.489534, lr_G: 0.000800
2023-06-19 09:00:35,842 - INFO - [Train] step: 13299, loss_D: 0.002029, lr_D: 0.000800
2023-06-19 09:00:35,924 - INFO - [Train] step: 13299, loss_G: 8.638842, lr_G: 0.000800
2023-06-19 09:01:32,189 - INFO - [Train] step: 13399, loss_D: 0.007819, lr_D: 0.000800
2023-06-19 09:01:32,271 - INFO - [Train] step: 13399, loss_G: 7.257872, lr_G: 0.000800
2023-06-19 09:02:28,551 - INFO - [Train] step: 13499, loss_D: 0.014060, lr_D: 0.000800
2023-06-19 09:02:28,632 - INFO - [Train] step: 13499, loss_G: 6.161894, lr_G: 0.000800
2023-06-19 09:03:25,053 - INFO - [Train] step: 13599, loss_D: 0.011795, lr_D: 0.000800
2023-06-19 09:03:25,134 - INFO - [Train] step: 13599, loss_G: 6.126879, lr_G: 0.000800
2023-06-19 09:04:21,391 - INFO - [Train] step: 13699, loss_D: 0.031737, lr_D: 0.000800
2023-06-19 09:04:21,473 - INFO - [Train] step: 13699, loss_G: 6.573884, lr_G: 0.000800
2023-06-19 09:05:17,757 - INFO - [Train] step: 13799, loss_D: 0.031759, lr_D: 0.000800
2023-06-19 09:05:17,839 - INFO - [Train] step: 13799, loss_G: 6.990473, lr_G: 0.000800
2023-06-19 09:06:14,090 - INFO - [Train] step: 13899, loss_D: 0.407232, lr_D: 0.000800
2023-06-19 09:06:14,171 - INFO - [Train] step: 13899, loss_G: 4.700905, lr_G: 0.000800
2023-06-19 09:07:10,424 - INFO - [Train] step: 13999, loss_D: 0.166616, lr_D: 0.000800
2023-06-19 09:07:10,505 - INFO - [Train] step: 13999, loss_G: 3.873073, lr_G: 0.000800
2023-06-19 09:08:14,687 - INFO - [Eval] step: 13999, fid: 30.122273
2023-06-19 09:09:10,782 - INFO - [Train] step: 14099, loss_D: 0.011168, lr_D: 0.000800
2023-06-19 09:09:10,864 - INFO - [Train] step: 14099, loss_G: 6.435178, lr_G: 0.000800
2023-06-19 09:10:07,125 - INFO - [Train] step: 14199, loss_D: 0.022821, lr_D: 0.000800
2023-06-19 09:10:07,206 - INFO - [Train] step: 14199, loss_G: 6.191652, lr_G: 0.000800
2023-06-19 09:11:03,617 - INFO - [Train] step: 14299, loss_D: 0.012507, lr_D: 0.000800
2023-06-19 09:11:03,699 - INFO - [Train] step: 14299, loss_G: 6.257763, lr_G: 0.000800
2023-06-19 09:11:59,971 - INFO - [Train] step: 14399, loss_D: 0.714168, lr_D: 0.000800
2023-06-19 09:12:00,052 - INFO - [Train] step: 14399, loss_G: 2.642855, lr_G: 0.000800
2023-06-19 09:12:56,294 - INFO - [Train] step: 14499, loss_D: 0.017482, lr_D: 0.000800
2023-06-19 09:12:56,375 - INFO - [Train] step: 14499, loss_G: 5.921625, lr_G: 0.000800
2023-06-19 09:13:52,665 - INFO - [Train] step: 14599, loss_D: 0.099598, lr_D: 0.000800
2023-06-19 09:13:52,746 - INFO - [Train] step: 14599, loss_G: 5.173916, lr_G: 0.000800
2023-06-19 09:14:48,978 - INFO - [Train] step: 14699, loss_D: 0.004284, lr_D: 0.000800
2023-06-19 09:14:49,060 - INFO - [Train] step: 14699, loss_G: 7.982943, lr_G: 0.000800
2023-06-19 09:15:45,286 - INFO - [Train] step: 14799, loss_D: 0.011505, lr_D: 0.000800
2023-06-19 09:15:45,367 - INFO - [Train] step: 14799, loss_G: 6.908368, lr_G: 0.000800
2023-06-19 09:16:41,790 - INFO - [Train] step: 14899, loss_D: 0.327825, lr_D: 0.000800
2023-06-19 09:16:41,872 - INFO - [Train] step: 14899, loss_G: 3.189372, lr_G: 0.000800
2023-06-19 09:17:38,130 - INFO - [Train] step: 14999, loss_D: 0.010167, lr_D: 0.000800
2023-06-19 09:17:38,212 - INFO - [Train] step: 14999, loss_G: 6.653577, lr_G: 0.000800
2023-06-19 09:18:42,464 - INFO - [Eval] step: 14999, fid: 27.506807
2023-06-19 09:19:38,913 - INFO - [Train] step: 15099, loss_D: 0.037868, lr_D: 0.000800
2023-06-19 09:19:38,995 - INFO - [Train] step: 15099, loss_G: 5.994465, lr_G: 0.000800
2023-06-19 09:20:35,270 - INFO - [Train] step: 15199, loss_D: 0.000691, lr_D: 0.000800
2023-06-19 09:20:35,351 - INFO - [Train] step: 15199, loss_G: 10.475203, lr_G: 0.000800
2023-06-19 09:21:31,644 - INFO - [Train] step: 15299, loss_D: 0.017640, lr_D: 0.000800
2023-06-19 09:21:31,726 - INFO - [Train] step: 15299, loss_G: 7.131041, lr_G: 0.000800
2023-06-19 09:22:27,972 - INFO - [Train] step: 15399, loss_D: 0.010676, lr_D: 0.000800
2023-06-19 09:22:28,053 - INFO - [Train] step: 15399, loss_G: 7.494817, lr_G: 0.000800
2023-06-19 09:23:24,349 - INFO - [Train] step: 15499, loss_D: 0.028482, lr_D: 0.000800
2023-06-19 09:23:24,430 - INFO - [Train] step: 15499, loss_G: 6.060194, lr_G: 0.000800
2023-06-19 09:24:20,829 - INFO - [Train] step: 15599, loss_D: 0.013458, lr_D: 0.000800
2023-06-19 09:24:20,911 - INFO - [Train] step: 15599, loss_G: 6.910407, lr_G: 0.000800
2023-06-19 09:25:17,192 - INFO - [Train] step: 15699, loss_D: 0.075469, lr_D: 0.000800
2023-06-19 09:25:17,274 - INFO - [Train] step: 15699, loss_G: 7.791945, lr_G: 0.000800
2023-06-19 09:26:13,534 - INFO - [Train] step: 15799, loss_D: 0.022131, lr_D: 0.000800
2023-06-19 09:26:13,616 - INFO - [Train] step: 15799, loss_G: 6.624705, lr_G: 0.000800
2023-06-19 09:27:09,887 - INFO - [Train] step: 15899, loss_D: 0.019269, lr_D: 0.000800
2023-06-19 09:27:09,968 - INFO - [Train] step: 15899, loss_G: 5.893013, lr_G: 0.000800
2023-06-19 09:28:06,233 - INFO - [Train] step: 15999, loss_D: 0.004084, lr_D: 0.000800
2023-06-19 09:28:06,315 - INFO - [Train] step: 15999, loss_G: 7.806884, lr_G: 0.000800
2023-06-19 09:29:10,574 - INFO - [Eval] step: 15999, fid: 26.743954
2023-06-19 09:30:07,036 - INFO - [Train] step: 16099, loss_D: 0.085360, lr_D: 0.000800
2023-06-19 09:30:07,117 - INFO - [Train] step: 16099, loss_G: 5.665245, lr_G: 0.000800
2023-06-19 09:31:03,523 - INFO - [Train] step: 16199, loss_D: 0.021303, lr_D: 0.000800
2023-06-19 09:31:03,605 - INFO - [Train] step: 16199, loss_G: 7.096385, lr_G: 0.000800
2023-06-19 09:31:59,868 - INFO - [Train] step: 16299, loss_D: 0.002254, lr_D: 0.000800
2023-06-19 09:31:59,949 - INFO - [Train] step: 16299, loss_G: 9.358388, lr_G: 0.000800
2023-06-19 09:32:56,186 - INFO - [Train] step: 16399, loss_D: 0.004077, lr_D: 0.000800
2023-06-19 09:32:56,267 - INFO - [Train] step: 16399, loss_G: 9.084323, lr_G: 0.000800
2023-06-19 09:33:52,523 - INFO - [Train] step: 16499, loss_D: 0.053801, lr_D: 0.000800
2023-06-19 09:33:52,604 - INFO - [Train] step: 16499, loss_G: 5.032917, lr_G: 0.000800
2023-06-19 09:34:48,882 - INFO - [Train] step: 16599, loss_D: 0.008170, lr_D: 0.000800
2023-06-19 09:34:48,964 - INFO - [Train] step: 16599, loss_G: 7.771974, lr_G: 0.000800
2023-06-19 09:35:45,213 - INFO - [Train] step: 16699, loss_D: 0.000208, lr_D: 0.000800
2023-06-19 09:35:45,294 - INFO - [Train] step: 16699, loss_G: 10.715098, lr_G: 0.000800
2023-06-19 09:36:41,563 - INFO - [Train] step: 16799, loss_D: 0.023118, lr_D: 0.000800
2023-06-19 09:36:41,644 - INFO - [Train] step: 16799, loss_G: 7.210239, lr_G: 0.000800
2023-06-19 09:37:38,091 - INFO - [Train] step: 16899, loss_D: 0.101323, lr_D: 0.000800
2023-06-19 09:37:38,173 - INFO - [Train] step: 16899, loss_G: 5.057130, lr_G: 0.000800
2023-06-19 09:38:34,440 - INFO - [Train] step: 16999, loss_D: 0.010221, lr_D: 0.000800
2023-06-19 09:38:34,521 - INFO - [Train] step: 16999, loss_G: 7.213800, lr_G: 0.000800
2023-06-19 09:39:38,775 - INFO - [Eval] step: 16999, fid: 26.515884
2023-06-19 09:40:35,225 - INFO - [Train] step: 17099, loss_D: 0.001176, lr_D: 0.000800
2023-06-19 09:40:35,307 - INFO - [Train] step: 17099, loss_G: 9.803704, lr_G: 0.000800
2023-06-19 09:41:31,590 - INFO - [Train] step: 17199, loss_D: 0.014084, lr_D: 0.000800
2023-06-19 09:41:31,672 - INFO - [Train] step: 17199, loss_G: 7.598066, lr_G: 0.000800
2023-06-19 09:42:27,937 - INFO - [Train] step: 17299, loss_D: 0.009302, lr_D: 0.000800
2023-06-19 09:42:28,019 - INFO - [Train] step: 17299, loss_G: 8.007270, lr_G: 0.000800
2023-06-19 09:43:24,321 - INFO - [Train] step: 17399, loss_D: 0.014920, lr_D: 0.000800
2023-06-19 09:43:24,403 - INFO - [Train] step: 17399, loss_G: 8.444987, lr_G: 0.000800
2023-06-19 09:44:20,885 - INFO - [Train] step: 17499, loss_D: 0.334956, lr_D: 0.000800
2023-06-19 09:44:20,967 - INFO - [Train] step: 17499, loss_G: 2.592437, lr_G: 0.000800
2023-06-19 09:45:17,283 - INFO - [Train] step: 17599, loss_D: 0.017689, lr_D: 0.000800
2023-06-19 09:45:17,364 - INFO - [Train] step: 17599, loss_G: 6.750707, lr_G: 0.000800
2023-06-19 09:46:13,649 - INFO - [Train] step: 17699, loss_D: 0.005535, lr_D: 0.000800
2023-06-19 09:46:13,731 - INFO - [Train] step: 17699, loss_G: 7.371028, lr_G: 0.000800
2023-06-19 09:47:10,012 - INFO - [Train] step: 17799, loss_D: 0.129696, lr_D: 0.000800
2023-06-19 09:47:10,093 - INFO - [Train] step: 17799, loss_G: 5.671164, lr_G: 0.000800
2023-06-19 09:48:06,359 - INFO - [Train] step: 17899, loss_D: 0.004967, lr_D: 0.000800
2023-06-19 09:48:06,449 - INFO - [Train] step: 17899, loss_G: 8.622005, lr_G: 0.000800
2023-06-19 09:49:02,708 - INFO - [Train] step: 17999, loss_D: 0.008736, lr_D: 0.000800
2023-06-19 09:49:02,789 - INFO - [Train] step: 17999, loss_G: 7.477597, lr_G: 0.000800
2023-06-19 09:50:06,956 - INFO - [Eval] step: 17999, fid: 28.273360
2023-06-19 09:51:03,097 - INFO - [Train] step: 18099, loss_D: 0.294615, lr_D: 0.000800
2023-06-19 09:51:03,178 - INFO - [Train] step: 18099, loss_G: 3.543846, lr_G: 0.000800
2023-06-19 09:51:59,610 - INFO - [Train] step: 18199, loss_D: 0.013412, lr_D: 0.000800
2023-06-19 09:51:59,692 - INFO - [Train] step: 18199, loss_G: 6.912338, lr_G: 0.000800
2023-06-19 09:52:55,987 - INFO - [Train] step: 18299, loss_D: 0.000341, lr_D: 0.000800
2023-06-19 09:52:56,068 - INFO - [Train] step: 18299, loss_G: 10.787582, lr_G: 0.000800
2023-06-19 09:53:52,334 - INFO - [Train] step: 18399, loss_D: 0.002645, lr_D: 0.000800
2023-06-19 09:53:52,415 - INFO - [Train] step: 18399, loss_G: 10.226908, lr_G: 0.000800
2023-06-19 09:54:48,728 - INFO - [Train] step: 18499, loss_D: 0.000037, lr_D: 0.000800
2023-06-19 09:54:48,809 - INFO - [Train] step: 18499, loss_G: 14.196541, lr_G: 0.000800
2023-06-19 09:55:45,123 - INFO - [Train] step: 18599, loss_D: 0.000001, lr_D: 0.000800
2023-06-19 09:55:45,205 - INFO - [Train] step: 18599, loss_G: 19.047920, lr_G: 0.000800
2023-06-19 09:56:41,469 - INFO - [Train] step: 18699, loss_D: 0.002114, lr_D: 0.000800
2023-06-19 09:56:41,550 - INFO - [Train] step: 18699, loss_G: 50.957649, lr_G: 0.000800
2023-06-19 09:57:37,997 - INFO - [Train] step: 18799, loss_D: 0.027844, lr_D: 0.000800
2023-06-19 09:57:38,078 - INFO - [Train] step: 18799, loss_G: 10.140493, lr_G: 0.000800
2023-06-19 09:58:34,386 - INFO - [Train] step: 18899, loss_D: 0.021737, lr_D: 0.000800
2023-06-19 09:58:34,468 - INFO - [Train] step: 18899, loss_G: 7.396913, lr_G: 0.000800
2023-06-19 09:59:30,739 - INFO - [Train] step: 18999, loss_D: 0.017253, lr_D: 0.000800
2023-06-19 09:59:30,820 - INFO - [Train] step: 18999, loss_G: 7.064800, lr_G: 0.000800
2023-06-19 10:00:35,141 - INFO - [Eval] step: 18999, fid: 30.954746
2023-06-19 10:01:31,313 - INFO - [Train] step: 19099, loss_D: 0.306815, lr_D: 0.000800
2023-06-19 10:01:31,395 - INFO - [Train] step: 19099, loss_G: 4.023575, lr_G: 0.000800
2023-06-19 10:02:27,680 - INFO - [Train] step: 19199, loss_D: 0.008726, lr_D: 0.000800
2023-06-19 10:02:27,762 - INFO - [Train] step: 19199, loss_G: 6.844458, lr_G: 0.000800
2023-06-19 10:03:24,033 - INFO - [Train] step: 19299, loss_D: 0.037297, lr_D: 0.000800
2023-06-19 10:03:24,114 - INFO - [Train] step: 19299, loss_G: 5.857257, lr_G: 0.000800
2023-06-19 10:04:20,434 - INFO - [Train] step: 19399, loss_D: 0.007506, lr_D: 0.000800
2023-06-19 10:04:20,515 - INFO - [Train] step: 19399, loss_G: 7.727375, lr_G: 0.000800
2023-06-19 10:05:16,950 - INFO - [Train] step: 19499, loss_D: 0.165511, lr_D: 0.000800
2023-06-19 10:05:17,031 - INFO - [Train] step: 19499, loss_G: 6.270530, lr_G: 0.000800
2023-06-19 10:06:13,317 - INFO - [Train] step: 19599, loss_D: 0.015582, lr_D: 0.000800
2023-06-19 10:06:13,399 - INFO - [Train] step: 19599, loss_G: 6.429938, lr_G: 0.000800
2023-06-19 10:07:09,700 - INFO - [Train] step: 19699, loss_D: 0.008207, lr_D: 0.000800
2023-06-19 10:07:09,781 - INFO - [Train] step: 19699, loss_G: 7.206859, lr_G: 0.000800
2023-06-19 10:08:06,074 - INFO - [Train] step: 19799, loss_D: 0.206190, lr_D: 0.000800
2023-06-19 10:08:06,155 - INFO - [Train] step: 19799, loss_G: 4.818207, lr_G: 0.000800
2023-06-19 10:09:02,442 - INFO - [Train] step: 19899, loss_D: 0.054809, lr_D: 0.000800
2023-06-19 10:09:02,523 - INFO - [Train] step: 19899, loss_G: 6.096661, lr_G: 0.000800
2023-06-19 10:09:58,825 - INFO - [Train] step: 19999, loss_D: 0.015607, lr_D: 0.000800
2023-06-19 10:09:58,906 - INFO - [Train] step: 19999, loss_G: 7.729172, lr_G: 0.000800
2023-06-19 10:11:03,361 - INFO - [Eval] step: 19999, fid: 26.709874
2023-06-19 10:11:59,815 - INFO - [Train] step: 20099, loss_D: 0.004610, lr_D: 0.000800
2023-06-19 10:11:59,896 - INFO - [Train] step: 20099, loss_G: 8.992717, lr_G: 0.000800
2023-06-19 10:12:56,189 - INFO - [Train] step: 20199, loss_D: 0.010060, lr_D: 0.000800
2023-06-19 10:12:56,270 - INFO - [Train] step: 20199, loss_G: 7.224313, lr_G: 0.000800
2023-06-19 10:13:52,542 - INFO - [Train] step: 20299, loss_D: 0.071417, lr_D: 0.000800
2023-06-19 10:13:52,624 - INFO - [Train] step: 20299, loss_G: 5.728765, lr_G: 0.000800
2023-06-19 10:14:48,901 - INFO - [Train] step: 20399, loss_D: 0.002243, lr_D: 0.000800
2023-06-19 10:14:48,983 - INFO - [Train] step: 20399, loss_G: 10.067745, lr_G: 0.000800
2023-06-19 10:15:45,253 - INFO - [Train] step: 20499, loss_D: 0.029250, lr_D: 0.000800
2023-06-19 10:15:45,335 - INFO - [Train] step: 20499, loss_G: 6.081541, lr_G: 0.000800
2023-06-19 10:16:41,608 - INFO - [Train] step: 20599, loss_D: 0.007327, lr_D: 0.000800
2023-06-19 10:16:41,690 - INFO - [Train] step: 20599, loss_G: 9.448675, lr_G: 0.000800
2023-06-19 10:17:38,171 - INFO - [Train] step: 20699, loss_D: 0.011942, lr_D: 0.000800
2023-06-19 10:17:38,254 - INFO - [Train] step: 20699, loss_G: 7.802257, lr_G: 0.000800
2023-06-19 10:18:34,540 - INFO - [Train] step: 20799, loss_D: 0.003363, lr_D: 0.000800
2023-06-19 10:18:34,622 - INFO - [Train] step: 20799, loss_G: 8.547530, lr_G: 0.000800
2023-06-19 10:19:30,866 - INFO - [Train] step: 20899, loss_D: 0.035474, lr_D: 0.000800
2023-06-19 10:19:30,948 - INFO - [Train] step: 20899, loss_G: 6.378475, lr_G: 0.000800
2023-06-19 10:20:27,224 - INFO - [Train] step: 20999, loss_D: 0.006648, lr_D: 0.000800
2023-06-19 10:20:27,306 - INFO - [Train] step: 20999, loss_G: 8.762024, lr_G: 0.000800
2023-06-19 10:21:31,560 - INFO - [Eval] step: 20999, fid: 25.397414
2023-06-19 10:22:28,010 - INFO - [Train] step: 21099, loss_D: 0.003545, lr_D: 0.000800
2023-06-19 10:22:28,092 - INFO - [Train] step: 21099, loss_G: 9.304377, lr_G: 0.000800
2023-06-19 10:23:24,342 - INFO - [Train] step: 21199, loss_D: 0.020836, lr_D: 0.000800
2023-06-19 10:23:24,424 - INFO - [Train] step: 21199, loss_G: 7.370076, lr_G: 0.000800
2023-06-19 10:24:20,653 - INFO - [Train] step: 21299, loss_D: 0.060249, lr_D: 0.000800
2023-06-19 10:24:20,735 - INFO - [Train] step: 21299, loss_G: 5.891047, lr_G: 0.000800
2023-06-19 10:25:17,150 - INFO - [Train] step: 21399, loss_D: 0.007618, lr_D: 0.000800
2023-06-19 10:25:17,231 - INFO - [Train] step: 21399, loss_G: 7.760221, lr_G: 0.000800
2023-06-19 10:26:13,504 - INFO - [Train] step: 21499, loss_D: 0.000106, lr_D: 0.000800
2023-06-19 10:26:13,585 - INFO - [Train] step: 21499, loss_G: 14.872833, lr_G: 0.000800
2023-06-19 10:27:09,829 - INFO - [Train] step: 21599, loss_D: 0.017287, lr_D: 0.000800
2023-06-19 10:27:09,911 - INFO - [Train] step: 21599, loss_G: 8.061073, lr_G: 0.000800
2023-06-19 10:28:06,192 - INFO - [Train] step: 21699, loss_D: 0.039322, lr_D: 0.000800
2023-06-19 10:28:06,273 - INFO - [Train] step: 21699, loss_G: 7.578096, lr_G: 0.000800
2023-06-19 10:29:02,548 - INFO - [Train] step: 21799, loss_D: 0.001613, lr_D: 0.000800
2023-06-19 10:29:02,630 - INFO - [Train] step: 21799, loss_G: 9.368841, lr_G: 0.000800
2023-06-19 10:29:58,893 - INFO - [Train] step: 21899, loss_D: 0.013110, lr_D: 0.000800
2023-06-19 10:29:58,974 - INFO - [Train] step: 21899, loss_G: 6.986366, lr_G: 0.000800
2023-06-19 10:30:55,401 - INFO - [Train] step: 21999, loss_D: 0.001971, lr_D: 0.000800
2023-06-19 10:30:55,483 - INFO - [Train] step: 21999, loss_G: 9.163670, lr_G: 0.000800
2023-06-19 10:31:59,737 - INFO - [Eval] step: 21999, fid: 25.756052
2023-06-19 10:32:55,886 - INFO - [Train] step: 22099, loss_D: 0.051381, lr_D: 0.000800
2023-06-19 10:32:55,967 - INFO - [Train] step: 22099, loss_G: 6.960755, lr_G: 0.000800
2023-06-19 10:33:52,246 - INFO - [Train] step: 22199, loss_D: 0.005855, lr_D: 0.000800
2023-06-19 10:33:52,327 - INFO - [Train] step: 22199, loss_G: 8.768633, lr_G: 0.000800
2023-06-19 10:34:48,608 - INFO - [Train] step: 22299, loss_D: 0.005912, lr_D: 0.000800
2023-06-19 10:34:48,690 - INFO - [Train] step: 22299, loss_G: 8.356956, lr_G: 0.000800
2023-06-19 10:35:44,995 - INFO - [Train] step: 22399, loss_D: 0.310064, lr_D: 0.000800
2023-06-19 10:35:45,076 - INFO - [Train] step: 22399, loss_G: 2.828297, lr_G: 0.000800
2023-06-19 10:36:41,358 - INFO - [Train] step: 22499, loss_D: 0.010998, lr_D: 0.000800
2023-06-19 10:36:41,439 - INFO - [Train] step: 22499, loss_G: 8.017771, lr_G: 0.000800
2023-06-19 10:37:37,720 - INFO - [Train] step: 22599, loss_D: 0.003303, lr_D: 0.000800
2023-06-19 10:37:37,801 - INFO - [Train] step: 22599, loss_G: 8.769112, lr_G: 0.000800
2023-06-19 10:38:34,224 - INFO - [Train] step: 22699, loss_D: 0.190521, lr_D: 0.000800
2023-06-19 10:38:34,305 - INFO - [Train] step: 22699, loss_G: 3.065347, lr_G: 0.000800
2023-06-19 10:39:30,566 - INFO - [Train] step: 22799, loss_D: 0.021095, lr_D: 0.000800
2023-06-19 10:39:30,647 - INFO - [Train] step: 22799, loss_G: 6.967254, lr_G: 0.000800
2023-06-19 10:40:26,896 - INFO - [Train] step: 22899, loss_D: 0.009425, lr_D: 0.000800
2023-06-19 10:40:26,978 - INFO - [Train] step: 22899, loss_G: 8.613314, lr_G: 0.000800
2023-06-19 10:41:23,258 - INFO - [Train] step: 22999, loss_D: 0.008403, lr_D: 0.000800
2023-06-19 10:41:23,339 - INFO - [Train] step: 22999, loss_G: 8.406750, lr_G: 0.000800
2023-06-19 10:42:27,600 - INFO - [Eval] step: 22999, fid: 25.771838
2023-06-19 10:43:23,717 - INFO - [Train] step: 23099, loss_D: 0.020543, lr_D: 0.000800
2023-06-19 10:43:23,798 - INFO - [Train] step: 23099, loss_G: 7.145589, lr_G: 0.000800
2023-06-19 10:44:20,054 - INFO - [Train] step: 23199, loss_D: 0.005995, lr_D: 0.000800
2023-06-19 10:44:20,135 - INFO - [Train] step: 23199, loss_G: 8.671112, lr_G: 0.000800
2023-06-19 10:45:16,555 - INFO - [Train] step: 23299, loss_D: 0.003808, lr_D: 0.000800
2023-06-19 10:45:16,637 - INFO - [Train] step: 23299, loss_G: 8.970214, lr_G: 0.000800
2023-06-19 10:46:12,897 - INFO - [Train] step: 23399, loss_D: 0.003767, lr_D: 0.000800
2023-06-19 10:46:12,979 - INFO - [Train] step: 23399, loss_G: 8.721260, lr_G: 0.000800
2023-06-19 10:47:09,248 - INFO - [Train] step: 23499, loss_D: 0.013924, lr_D: 0.000800
2023-06-19 10:47:09,329 - INFO - [Train] step: 23499, loss_G: 8.125748, lr_G: 0.000800
2023-06-19 10:48:05,601 - INFO - [Train] step: 23599, loss_D: 0.005601, lr_D: 0.000800
2023-06-19 10:48:05,683 - INFO - [Train] step: 23599, loss_G: 8.969748, lr_G: 0.000800
2023-06-19 10:49:01,958 - INFO - [Train] step: 23699, loss_D: 0.002137, lr_D: 0.000800
2023-06-19 10:49:02,040 - INFO - [Train] step: 23699, loss_G: 9.643715, lr_G: 0.000800
2023-06-19 10:49:58,285 - INFO - [Train] step: 23799, loss_D: 0.007226, lr_D: 0.000800
2023-06-19 10:49:58,366 - INFO - [Train] step: 23799, loss_G: 8.375148, lr_G: 0.000800
2023-06-19 10:50:54,623 - INFO - [Train] step: 23899, loss_D: 0.021870, lr_D: 0.000800
2023-06-19 10:50:54,704 - INFO - [Train] step: 23899, loss_G: 7.694623, lr_G: 0.000800
2023-06-19 10:51:51,167 - INFO - [Train] step: 23999, loss_D: 0.004569, lr_D: 0.000800
2023-06-19 10:51:51,248 - INFO - [Train] step: 23999, loss_G: 9.047634, lr_G: 0.000800
2023-06-19 10:52:55,464 - INFO - [Eval] step: 23999, fid: 26.118727
2023-06-19 10:53:51,595 - INFO - [Train] step: 24099, loss_D: 0.009521, lr_D: 0.000800
2023-06-19 10:53:51,676 - INFO - [Train] step: 24099, loss_G: 8.060238, lr_G: 0.000800
2023-06-19 10:54:47,932 - INFO - [Train] step: 24199, loss_D: 5.293205, lr_D: 0.000800
2023-06-19 10:54:48,013 - INFO - [Train] step: 24199, loss_G: 10.166002, lr_G: 0.000800
2023-06-19 10:55:44,281 - INFO - [Train] step: 24299, loss_D: 0.006884, lr_D: 0.000800
2023-06-19 10:55:44,363 - INFO - [Train] step: 24299, loss_G: 8.022972, lr_G: 0.000800
2023-06-19 10:56:40,635 - INFO - [Train] step: 24399, loss_D: 0.004234, lr_D: 0.000800
2023-06-19 10:56:40,716 - INFO - [Train] step: 24399, loss_G: 9.634136, lr_G: 0.000800
2023-06-19 10:57:37,004 - INFO - [Train] step: 24499, loss_D: 0.002879, lr_D: 0.000800
2023-06-19 10:57:37,085 - INFO - [Train] step: 24499, loss_G: 8.980850, lr_G: 0.000800
2023-06-19 10:58:33,495 - INFO - [Train] step: 24599, loss_D: 0.093795, lr_D: 0.000800
2023-06-19 10:58:33,576 - INFO - [Train] step: 24599, loss_G: 5.437840, lr_G: 0.000800
2023-06-19 10:59:29,844 - INFO - [Train] step: 24699, loss_D: 0.007088, lr_D: 0.000800
2023-06-19 10:59:29,926 - INFO - [Train] step: 24699, loss_G: 8.685680, lr_G: 0.000800
2023-06-19 11:00:26,199 - INFO - [Train] step: 24799, loss_D: 0.009576, lr_D: 0.000800
2023-06-19 11:00:26,280 - INFO - [Train] step: 24799, loss_G: 8.529745, lr_G: 0.000800
2023-06-19 11:01:22,533 - INFO - [Train] step: 24899, loss_D: 0.000447, lr_D: 0.000800
2023-06-19 11:01:22,614 - INFO - [Train] step: 24899, loss_G: 11.849860, lr_G: 0.000800
2023-06-19 11:02:18,894 - INFO - [Train] step: 24999, loss_D: 0.021879, lr_D: 0.000800
2023-06-19 11:02:18,976 - INFO - [Train] step: 24999, loss_G: 8.313786, lr_G: 0.000800
2023-06-19 11:03:23,260 - INFO - [Eval] step: 24999, fid: 28.001092
2023-06-19 11:04:19,326 - INFO - [Train] step: 25099, loss_D: 0.004321, lr_D: 0.000800
2023-06-19 11:04:19,408 - INFO - [Train] step: 25099, loss_G: 9.784521, lr_G: 0.000800
2023-06-19 11:05:15,672 - INFO - [Train] step: 25199, loss_D: 0.061796, lr_D: 0.000800
2023-06-19 11:05:15,753 - INFO - [Train] step: 25199, loss_G: 5.563177, lr_G: 0.000800
2023-06-19 11:06:12,186 - INFO - [Train] step: 25299, loss_D: 0.003681, lr_D: 0.000800
2023-06-19 11:06:12,267 - INFO - [Train] step: 25299, loss_G: 8.598686, lr_G: 0.000800
2023-06-19 11:07:08,529 - INFO - [Train] step: 25399, loss_D: 0.008597, lr_D: 0.000800
2023-06-19 11:07:08,611 - INFO - [Train] step: 25399, loss_G: 8.085169, lr_G: 0.000800
2023-06-19 11:08:04,881 - INFO - [Train] step: 25499, loss_D: 0.000262, lr_D: 0.000800
2023-06-19 11:08:04,962 - INFO - [Train] step: 25499, loss_G: 11.739882, lr_G: 0.000800
2023-06-19 11:09:01,242 - INFO - [Train] step: 25599, loss_D: 0.000061, lr_D: 0.000800
2023-06-19 11:09:01,324 - INFO - [Train] step: 25599, loss_G: 16.554337, lr_G: 0.000800
2023-06-19 11:09:57,576 - INFO - [Train] step: 25699, loss_D: 0.013710, lr_D: 0.000800
2023-06-19 11:09:57,658 - INFO - [Train] step: 25699, loss_G: 8.769447, lr_G: 0.000800
2023-06-19 11:10:53,913 - INFO - [Train] step: 25799, loss_D: 0.018371, lr_D: 0.000800
2023-06-19 11:10:53,994 - INFO - [Train] step: 25799, loss_G: 7.862812, lr_G: 0.000800
2023-06-19 11:11:50,433 - INFO - [Train] step: 25899, loss_D: 0.363609, lr_D: 0.000800
2023-06-19 11:11:50,514 - INFO - [Train] step: 25899, loss_G: 4.302042, lr_G: 0.000800
2023-06-19 11:12:46,791 - INFO - [Train] step: 25999, loss_D: 0.004013, lr_D: 0.000800
2023-06-19 11:12:46,872 - INFO - [Train] step: 25999, loss_G: 8.095881, lr_G: 0.000800
2023-06-19 11:13:51,098 - INFO - [Eval] step: 25999, fid: 25.757806
2023-06-19 11:14:47,142 - INFO - [Train] step: 26099, loss_D: 0.003661, lr_D: 0.000800
2023-06-19 11:14:47,223 - INFO - [Train] step: 26099, loss_G: 9.544034, lr_G: 0.000800
2023-06-19 11:15:43,502 - INFO - [Train] step: 26199, loss_D: 0.003157, lr_D: 0.000800
2023-06-19 11:15:43,583 - INFO - [Train] step: 26199, loss_G: 8.982965, lr_G: 0.000800
2023-06-19 11:16:39,825 - INFO - [Train] step: 26299, loss_D: 0.018165, lr_D: 0.000800
2023-06-19 11:16:39,906 - INFO - [Train] step: 26299, loss_G: 7.508321, lr_G: 0.000800
2023-06-19 11:17:36,157 - INFO - [Train] step: 26399, loss_D: 0.003346, lr_D: 0.000800
2023-06-19 11:17:36,239 - INFO - [Train] step: 26399, loss_G: 9.692275, lr_G: 0.000800
2023-06-19 11:18:32,528 - INFO - [Train] step: 26499, loss_D: 0.002445, lr_D: 0.000800
2023-06-19 11:18:32,609 - INFO - [Train] step: 26499, loss_G: 9.254383, lr_G: 0.000800
2023-06-19 11:19:29,025 - INFO - [Train] step: 26599, loss_D: 0.174502, lr_D: 0.000800
2023-06-19 11:19:29,107 - INFO - [Train] step: 26599, loss_G: 5.490209, lr_G: 0.000800
2023-06-19 11:20:25,352 - INFO - [Train] step: 26699, loss_D: 0.004485, lr_D: 0.000800
2023-06-19 11:20:25,433 - INFO - [Train] step: 26699, loss_G: 9.488066, lr_G: 0.000800
2023-06-19 11:21:21,744 - INFO - [Train] step: 26799, loss_D: 0.007973, lr_D: 0.000800
2023-06-19 11:21:21,826 - INFO - [Train] step: 26799, loss_G: 8.609036, lr_G: 0.000800
2023-06-19 11:22:18,079 - INFO - [Train] step: 26899, loss_D: 0.008277, lr_D: 0.000800
2023-06-19 11:22:18,160 - INFO - [Train] step: 26899, loss_G: 8.023023, lr_G: 0.000800
2023-06-19 11:23:14,424 - INFO - [Train] step: 26999, loss_D: 0.121804, lr_D: 0.000800
2023-06-19 11:23:14,505 - INFO - [Train] step: 26999, loss_G: 5.195994, lr_G: 0.000800
2023-06-19 11:24:18,766 - INFO - [Eval] step: 26999, fid: 24.723112
2023-06-19 11:25:15,108 - INFO - [Train] step: 27099, loss_D: 0.006868, lr_D: 0.000800
2023-06-19 11:25:15,189 - INFO - [Train] step: 27099, loss_G: 8.795389, lr_G: 0.000800
2023-06-19 11:26:11,589 - INFO - [Train] step: 27199, loss_D: 0.005106, lr_D: 0.000800
2023-06-19 11:26:11,671 - INFO - [Train] step: 27199, loss_G: 9.769424, lr_G: 0.000800
2023-06-19 11:27:07,915 - INFO - [Train] step: 27299, loss_D: 0.005870, lr_D: 0.000800
2023-06-19 11:27:07,997 - INFO - [Train] step: 27299, loss_G: 9.289549, lr_G: 0.000800
2023-06-19 11:28:04,263 - INFO - [Train] step: 27399, loss_D: 0.003386, lr_D: 0.000800
2023-06-19 11:28:04,345 - INFO - [Train] step: 27399, loss_G: 8.780186, lr_G: 0.000800
2023-06-19 11:29:00,642 - INFO - [Train] step: 27499, loss_D: 0.000794, lr_D: 0.000800
2023-06-19 11:29:00,723 - INFO - [Train] step: 27499, loss_G: 10.100098, lr_G: 0.000800
2023-06-19 11:29:56,961 - INFO - [Train] step: 27599, loss_D: 0.020527, lr_D: 0.000800
2023-06-19 11:29:57,043 - INFO - [Train] step: 27599, loss_G: 7.842453, lr_G: 0.000800
2023-06-19 11:30:53,279 - INFO - [Train] step: 27699, loss_D: 0.003047, lr_D: 0.000800
2023-06-19 11:30:53,360 - INFO - [Train] step: 27699, loss_G: 7.742270, lr_G: 0.000800
2023-06-19 11:31:49,636 - INFO - [Train] step: 27799, loss_D: 0.055188, lr_D: 0.000800
2023-06-19 11:31:49,717 - INFO - [Train] step: 27799, loss_G: 7.827262, lr_G: 0.000800
2023-06-19 11:32:46,136 - INFO - [Train] step: 27899, loss_D: 0.002219, lr_D: 0.000800
2023-06-19 11:32:46,218 - INFO - [Train] step: 27899, loss_G: 9.040241, lr_G: 0.000800
2023-06-19 11:33:42,485 - INFO - [Train] step: 27999, loss_D: 0.011439, lr_D: 0.000800
2023-06-19 11:33:42,567 - INFO - [Train] step: 27999, loss_G: 8.140722, lr_G: 0.000800
2023-06-19 11:34:46,788 - INFO - [Eval] step: 27999, fid: 25.234811
2023-06-19 11:35:42,944 - INFO - [Train] step: 28099, loss_D: 0.000918, lr_D: 0.000800
2023-06-19 11:35:43,033 - INFO - [Train] step: 28099, loss_G: 11.322145, lr_G: 0.000800
2023-06-19 11:36:39,312 - INFO - [Train] step: 28199, loss_D: 0.009330, lr_D: 0.000800
2023-06-19 11:36:39,394 - INFO - [Train] step: 28199, loss_G: 8.585947, lr_G: 0.000800
2023-06-19 11:37:35,683 - INFO - [Train] step: 28299, loss_D: 0.007605, lr_D: 0.000800
2023-06-19 11:37:35,764 - INFO - [Train] step: 28299, loss_G: 9.822062, lr_G: 0.000800
2023-06-19 11:38:32,052 - INFO - [Train] step: 28399, loss_D: 0.002367, lr_D: 0.000800
2023-06-19 11:38:32,133 - INFO - [Train] step: 28399, loss_G: 8.717586, lr_G: 0.000800
2023-06-19 11:39:28,584 - INFO - [Train] step: 28499, loss_D: 0.096420, lr_D: 0.000800
2023-06-19 11:39:28,665 - INFO - [Train] step: 28499, loss_G: 6.096228, lr_G: 0.000800
2023-06-19 11:40:24,968 - INFO - [Train] step: 28599, loss_D: 0.005420, lr_D: 0.000800
2023-06-19 11:40:25,049 - INFO - [Train] step: 28599, loss_G: 8.772751, lr_G: 0.000800
2023-06-19 11:41:21,332 - INFO - [Train] step: 28699, loss_D: 0.004260, lr_D: 0.000800
2023-06-19 11:41:21,413 - INFO - [Train] step: 28699, loss_G: 9.933729, lr_G: 0.000800
2023-06-19 11:42:17,664 - INFO - [Train] step: 28799, loss_D: 0.006357, lr_D: 0.000800
2023-06-19 11:42:17,746 - INFO - [Train] step: 28799, loss_G: 8.441674, lr_G: 0.000800
2023-06-19 11:43:14,013 - INFO - [Train] step: 28899, loss_D: 0.007275, lr_D: 0.000800
2023-06-19 11:43:14,094 - INFO - [Train] step: 28899, loss_G: 9.099104, lr_G: 0.000800
2023-06-19 11:44:10,354 - INFO - [Train] step: 28999, loss_D: 0.038664, lr_D: 0.000800
2023-06-19 11:44:10,436 - INFO - [Train] step: 28999, loss_G: 7.379375, lr_G: 0.000800
2023-06-19 11:45:14,717 - INFO - [Eval] step: 28999, fid: 26.149791
2023-06-19 11:46:10,849 - INFO - [Train] step: 29099, loss_D: 0.004095, lr_D: 0.000800
2023-06-19 11:46:10,930 - INFO - [Train] step: 29099, loss_G: 9.334392, lr_G: 0.000800
2023-06-19 11:47:07,322 - INFO - [Train] step: 29199, loss_D: 0.002066, lr_D: 0.000800
2023-06-19 11:47:07,404 - INFO - [Train] step: 29199, loss_G: 10.648560, lr_G: 0.000800
2023-06-19 11:48:03,683 - INFO - [Train] step: 29299, loss_D: 0.015892, lr_D: 0.000800
2023-06-19 11:48:03,765 - INFO - [Train] step: 29299, loss_G: 9.506716, lr_G: 0.000800
2023-06-19 11:49:00,057 - INFO - [Train] step: 29399, loss_D: 0.003407, lr_D: 0.000800
2023-06-19 11:49:00,138 - INFO - [Train] step: 29399, loss_G: 10.270064, lr_G: 0.000800
2023-06-19 11:49:56,408 - INFO - [Train] step: 29499, loss_D: 0.006997, lr_D: 0.000800
2023-06-19 11:49:56,490 - INFO - [Train] step: 29499, loss_G: 9.503865, lr_G: 0.000800
2023-06-19 11:50:52,828 - INFO - [Train] step: 29599, loss_D: 0.004040, lr_D: 0.000800
2023-06-19 11:50:52,910 - INFO - [Train] step: 29599, loss_G: 9.636610, lr_G: 0.000800
2023-06-19 11:51:49,200 - INFO - [Train] step: 29699, loss_D: 0.048498, lr_D: 0.000800
2023-06-19 11:51:49,281 - INFO - [Train] step: 29699, loss_G: 7.963527, lr_G: 0.000800
2023-06-19 11:52:45,693 - INFO - [Train] step: 29799, loss_D: 0.002416, lr_D: 0.000800
2023-06-19 11:52:45,774 - INFO - [Train] step: 29799, loss_G: 8.374304, lr_G: 0.000800
2023-06-19 11:53:42,059 - INFO - [Train] step: 29899, loss_D: 0.004716, lr_D: 0.000800
2023-06-19 11:53:42,141 - INFO - [Train] step: 29899, loss_G: 8.465893, lr_G: 0.000800
2023-06-19 11:54:38,464 - INFO - [Train] step: 29999, loss_D: 0.008135, lr_D: 0.000800
2023-06-19 11:54:38,546 - INFO - [Train] step: 29999, loss_G: 8.660975, lr_G: 0.000800
2023-06-19 11:55:42,809 - INFO - [Eval] step: 29999, fid: 26.852624
2023-06-19 11:56:39,125 - INFO - [Train] step: 30099, loss_D: 0.003586, lr_D: 0.000800
2023-06-19 11:56:39,207 - INFO - [Train] step: 30099, loss_G: 10.338894, lr_G: 0.000800
2023-06-19 11:57:35,492 - INFO - [Train] step: 30199, loss_D: 0.020470, lr_D: 0.000800
2023-06-19 11:57:35,573 - INFO - [Train] step: 30199, loss_G: 8.378054, lr_G: 0.000800
2023-06-19 11:58:31,866 - INFO - [Train] step: 30299, loss_D: 0.001775, lr_D: 0.000800
2023-06-19 11:58:31,948 - INFO - [Train] step: 30299, loss_G: 10.728696, lr_G: 0.000800
2023-06-19 11:59:28,416 - INFO - [Train] step: 30399, loss_D: 0.001200, lr_D: 0.000800
2023-06-19 11:59:28,498 - INFO - [Train] step: 30399, loss_G: 11.781256, lr_G: 0.000800
2023-06-19 12:00:24,785 - INFO - [Train] step: 30499, loss_D: 0.224735, lr_D: 0.000800
2023-06-19 12:00:24,866 - INFO - [Train] step: 30499, loss_G: 5.669914, lr_G: 0.000800
2023-06-19 12:01:21,146 - INFO - [Train] step: 30599, loss_D: 0.006167, lr_D: 0.000800
2023-06-19 12:01:21,227 - INFO - [Train] step: 30599, loss_G: 10.104169, lr_G: 0.000800
2023-06-19 12:02:17,539 - INFO - [Train] step: 30699, loss_D: 0.001090, lr_D: 0.000800
2023-06-19 12:02:17,629 - INFO - [Train] step: 30699, loss_G: 11.915120, lr_G: 0.000800
2023-06-19 12:03:13,918 - INFO - [Train] step: 30799, loss_D: 0.004009, lr_D: 0.000800
2023-06-19 12:03:14,000 - INFO - [Train] step: 30799, loss_G: 10.667809, lr_G: 0.000800
2023-06-19 12:04:10,272 - INFO - [Train] step: 30899, loss_D: 0.014340, lr_D: 0.000800
2023-06-19 12:04:10,354 - INFO - [Train] step: 30899, loss_G: 8.695286, lr_G: 0.000800
2023-06-19 12:05:06,618 - INFO - [Train] step: 30999, loss_D: 0.003437, lr_D: 0.000800
2023-06-19 12:05:06,699 - INFO - [Train] step: 30999, loss_G: 9.217859, lr_G: 0.000800
2023-06-19 12:06:10,881 - INFO - [Eval] step: 30999, fid: 26.265511
2023-06-19 12:07:07,230 - INFO - [Train] step: 31099, loss_D: 0.005145, lr_D: 0.000800
2023-06-19 12:07:07,312 - INFO - [Train] step: 31099, loss_G: 9.676643, lr_G: 0.000800
2023-06-19 12:08:03,591 - INFO - [Train] step: 31199, loss_D: 0.009473, lr_D: 0.000800
2023-06-19 12:08:03,672 - INFO - [Train] step: 31199, loss_G: 9.445164, lr_G: 0.000800
2023-06-19 12:08:59,972 - INFO - [Train] step: 31299, loss_D: 0.004654, lr_D: 0.000800
2023-06-19 12:09:00,053 - INFO - [Train] step: 31299, loss_G: 10.288494, lr_G: 0.000800
2023-06-19 12:09:56,360 - INFO - [Train] step: 31399, loss_D: 0.004871, lr_D: 0.000800
2023-06-19 12:09:56,441 - INFO - [Train] step: 31399, loss_G: 10.122545, lr_G: 0.000800
2023-06-19 12:10:52,714 - INFO - [Train] step: 31499, loss_D: 0.021394, lr_D: 0.000800
2023-06-19 12:10:52,795 - INFO - [Train] step: 31499, loss_G: 8.525179, lr_G: 0.000800
2023-06-19 12:11:49,077 - INFO - [Train] step: 31599, loss_D: 0.003494, lr_D: 0.000800
2023-06-19 12:11:49,159 - INFO - [Train] step: 31599, loss_G: 10.136247, lr_G: 0.000800
2023-06-19 12:12:45,599 - INFO - [Train] step: 31699, loss_D: 0.006916, lr_D: 0.000800
2023-06-19 12:12:45,681 - INFO - [Train] step: 31699, loss_G: 10.488957, lr_G: 0.000800
2023-06-19 12:13:41,993 - INFO - [Train] step: 31799, loss_D: 0.000768, lr_D: 0.000800
2023-06-19 12:13:42,074 - INFO - [Train] step: 31799, loss_G: 11.945264, lr_G: 0.000800
2023-06-19 12:14:38,351 - INFO - [Train] step: 31899, loss_D: 0.011987, lr_D: 0.000800
2023-06-19 12:14:38,433 - INFO - [Train] step: 31899, loss_G: 10.492186, lr_G: 0.000800
2023-06-19 12:15:34,710 - INFO - [Train] step: 31999, loss_D: 0.005240, lr_D: 0.000800
2023-06-19 12:15:34,792 - INFO - [Train] step: 31999, loss_G: 10.031154, lr_G: 0.000800
2023-06-19 12:16:39,028 - INFO - [Eval] step: 31999, fid: 25.038627
2023-06-19 12:17:35,182 - INFO - [Train] step: 32099, loss_D: 0.000749, lr_D: 0.000800
2023-06-19 12:17:35,264 - INFO - [Train] step: 32099, loss_G: 11.330594, lr_G: 0.000800
2023-06-19 12:18:31,573 - INFO - [Train] step: 32199, loss_D: 0.024478, lr_D: 0.000800
2023-06-19 12:18:31,655 - INFO - [Train] step: 32199, loss_G: 9.056559, lr_G: 0.000800
2023-06-19 12:19:27,985 - INFO - [Train] step: 32299, loss_D: 0.000302, lr_D: 0.000800
2023-06-19 12:19:28,066 - INFO - [Train] step: 32299, loss_G: 12.596922, lr_G: 0.000800
2023-06-19 12:20:24,538 - INFO - [Train] step: 32399, loss_D: 0.013377, lr_D: 0.000800
2023-06-19 12:20:24,619 - INFO - [Train] step: 32399, loss_G: 8.407433, lr_G: 0.000800
2023-06-19 12:21:20,930 - INFO - [Train] step: 32499, loss_D: 0.006369, lr_D: 0.000800
2023-06-19 12:21:21,012 - INFO - [Train] step: 32499, loss_G: 9.876619, lr_G: 0.000800
2023-06-19 12:22:17,275 - INFO - [Train] step: 32599, loss_D: 0.000893, lr_D: 0.000800
2023-06-19 12:22:17,357 - INFO - [Train] step: 32599, loss_G: 11.647959, lr_G: 0.000800
2023-06-19 12:23:13,676 - INFO - [Train] step: 32699, loss_D: 0.053971, lr_D: 0.000800
2023-06-19 12:23:13,758 - INFO - [Train] step: 32699, loss_G: 8.137446, lr_G: 0.000800
2023-06-19 12:24:10,052 - INFO - [Train] step: 32799, loss_D: 0.005899, lr_D: 0.000800
2023-06-19 12:24:10,134 - INFO - [Train] step: 32799, loss_G: 8.866812, lr_G: 0.000800
2023-06-19 12:25:06,399 - INFO - [Train] step: 32899, loss_D: 0.004563, lr_D: 0.000800
2023-06-19 12:25:06,480 - INFO - [Train] step: 32899, loss_G: 9.923428, lr_G: 0.000800
2023-06-19 12:26:02,940 - INFO - [Train] step: 32999, loss_D: 0.000620, lr_D: 0.000800
2023-06-19 12:26:03,022 - INFO - [Train] step: 32999, loss_G: 13.718840, lr_G: 0.000800
2023-06-19 12:27:07,259 - INFO - [Eval] step: 32999, fid: 26.603290
2023-06-19 12:28:03,306 - INFO - [Train] step: 33099, loss_D: 0.007643, lr_D: 0.000800
2023-06-19 12:28:03,387 - INFO - [Train] step: 33099, loss_G: 9.105486, lr_G: 0.000800
2023-06-19 12:28:59,676 - INFO - [Train] step: 33199, loss_D: 0.002157, lr_D: 0.000800
2023-06-19 12:28:59,757 - INFO - [Train] step: 33199, loss_G: 11.308694, lr_G: 0.000800
2023-06-19 12:29:56,036 - INFO - [Train] step: 33299, loss_D: 0.002308, lr_D: 0.000800
2023-06-19 12:29:56,118 - INFO - [Train] step: 33299, loss_G: 10.692598, lr_G: 0.000800
2023-06-19 12:30:52,399 - INFO - [Train] step: 33399, loss_D: 0.008776, lr_D: 0.000800
2023-06-19 12:30:52,480 - INFO - [Train] step: 33399, loss_G: 8.519817, lr_G: 0.000800
2023-06-19 12:31:48,730 - INFO - [Train] step: 33499, loss_D: 0.003622, lr_D: 0.000800
2023-06-19 12:31:48,811 - INFO - [Train] step: 33499, loss_G: 9.664337, lr_G: 0.000800
2023-06-19 12:32:45,048 - INFO - [Train] step: 33599, loss_D: 0.003138, lr_D: 0.000800
2023-06-19 12:32:45,129 - INFO - [Train] step: 33599, loss_G: 10.477394, lr_G: 0.000800
2023-06-19 12:33:41,600 - INFO - [Train] step: 33699, loss_D: 0.005032, lr_D: 0.000800
2023-06-19 12:33:41,681 - INFO - [Train] step: 33699, loss_G: 10.819779, lr_G: 0.000800
2023-06-19 12:34:37,938 - INFO - [Train] step: 33799, loss_D: 0.005506, lr_D: 0.000800
2023-06-19 12:34:38,019 - INFO - [Train] step: 33799, loss_G: 10.194463, lr_G: 0.000800
2023-06-19 12:35:34,265 - INFO - [Train] step: 33899, loss_D: 0.003714, lr_D: 0.000800
2023-06-19 12:35:34,346 - INFO - [Train] step: 33899, loss_G: 9.746992, lr_G: 0.000800
2023-06-19 12:36:30,577 - INFO - [Train] step: 33999, loss_D: 0.002382, lr_D: 0.000800
2023-06-19 12:36:30,658 - INFO - [Train] step: 33999, loss_G: 10.747717, lr_G: 0.000800
2023-06-19 12:37:34,822 - INFO - [Eval] step: 33999, fid: 25.654502
2023-06-19 12:38:30,863 - INFO - [Train] step: 34099, loss_D: 0.001848, lr_D: 0.000800
2023-06-19 12:38:30,945 - INFO - [Train] step: 34099, loss_G: 10.710767, lr_G: 0.000800
2023-06-19 12:39:27,158 - INFO - [Train] step: 34199, loss_D: 0.007456, lr_D: 0.000800
2023-06-19 12:39:27,240 - INFO - [Train] step: 34199, loss_G: 10.304926, lr_G: 0.000800
2023-06-19 12:40:23,695 - INFO - [Train] step: 34299, loss_D: 0.003262, lr_D: 0.000800
2023-06-19 12:40:23,777 - INFO - [Train] step: 34299, loss_G: 10.961802, lr_G: 0.000800
2023-06-19 12:41:20,038 - INFO - [Train] step: 34399, loss_D: 0.366804, lr_D: 0.000800
2023-06-19 12:41:20,120 - INFO - [Train] step: 34399, loss_G: 4.456480, lr_G: 0.000800
2023-06-19 12:42:16,394 - INFO - [Train] step: 34499, loss_D: 0.017164, lr_D: 0.000800
2023-06-19 12:42:16,475 - INFO - [Train] step: 34499, loss_G: 8.946349, lr_G: 0.000800
2023-06-19 12:43:12,725 - INFO - [Train] step: 34599, loss_D: 0.009574, lr_D: 0.000800
2023-06-19 12:43:12,807 - INFO - [Train] step: 34599, loss_G: 10.685642, lr_G: 0.000800
2023-06-19 12:44:09,055 - INFO - [Train] step: 34699, loss_D: 0.002727, lr_D: 0.000800
2023-06-19 12:44:09,136 - INFO - [Train] step: 34699, loss_G: 10.573668, lr_G: 0.000800
2023-06-19 12:45:05,384 - INFO - [Train] step: 34799, loss_D: 0.003997, lr_D: 0.000800
2023-06-19 12:45:05,466 - INFO - [Train] step: 34799, loss_G: 10.150403, lr_G: 0.000800
2023-06-19 12:46:01,715 - INFO - [Train] step: 34899, loss_D: 0.027277, lr_D: 0.000800
2023-06-19 12:46:01,796 - INFO - [Train] step: 34899, loss_G: 8.610386, lr_G: 0.000800
2023-06-19 12:46:58,234 - INFO - [Train] step: 34999, loss_D: 0.006121, lr_D: 0.000800
2023-06-19 12:46:58,315 - INFO - [Train] step: 34999, loss_G: 10.681460, lr_G: 0.000800
2023-06-19 12:48:02,450 - INFO - [Eval] step: 34999, fid: 25.471493
2023-06-19 12:48:58,480 - INFO - [Train] step: 35099, loss_D: 0.000111, lr_D: 0.000800
2023-06-19 12:48:58,561 - INFO - [Train] step: 35099, loss_G: 13.410505, lr_G: 0.000800
2023-06-19 12:49:54,821 - INFO - [Train] step: 35199, loss_D: 0.014391, lr_D: 0.000800
2023-06-19 12:49:54,903 - INFO - [Train] step: 35199, loss_G: 9.602562, lr_G: 0.000800
2023-06-19 12:50:51,166 - INFO - [Train] step: 35299, loss_D: 0.000349, lr_D: 0.000800
2023-06-19 12:50:51,247 - INFO - [Train] step: 35299, loss_G: 13.211998, lr_G: 0.000800
2023-06-19 12:51:47,492 - INFO - [Train] step: 35399, loss_D: 0.004499, lr_D: 0.000800
2023-06-19 12:51:47,574 - INFO - [Train] step: 35399, loss_G: 11.220223, lr_G: 0.000800
2023-06-19 12:52:43,829 - INFO - [Train] step: 35499, loss_D: 0.024201, lr_D: 0.000800
2023-06-19 12:52:43,910 - INFO - [Train] step: 35499, loss_G: 8.887648, lr_G: 0.000800
2023-06-19 12:53:40,364 - INFO - [Train] step: 35599, loss_D: 0.001907, lr_D: 0.000800
2023-06-19 12:53:40,446 - INFO - [Train] step: 35599, loss_G: 11.214556, lr_G: 0.000800
2023-06-19 12:54:36,698 - INFO - [Train] step: 35699, loss_D: 0.007069, lr_D: 0.000800
2023-06-19 12:54:36,780 - INFO - [Train] step: 35699, loss_G: 10.153419, lr_G: 0.000800
2023-06-19 12:55:33,029 - INFO - [Train] step: 35799, loss_D: 0.003130, lr_D: 0.000800
2023-06-19 12:55:33,111 - INFO - [Train] step: 35799, loss_G: 11.044560, lr_G: 0.000800
2023-06-19 12:56:29,359 - INFO - [Train] step: 35899, loss_D: 0.087296, lr_D: 0.000800
2023-06-19 12:56:29,440 - INFO - [Train] step: 35899, loss_G: 6.357778, lr_G: 0.000800
2023-06-19 12:57:25,683 - INFO - [Train] step: 35999, loss_D: 0.004684, lr_D: 0.000800
2023-06-19 12:57:25,765 - INFO - [Train] step: 35999, loss_G: 10.256704, lr_G: 0.000800
2023-06-19 12:58:29,929 - INFO - [Eval] step: 35999, fid: 26.434470
2023-06-19 12:59:25,971 - INFO - [Train] step: 36099, loss_D: 0.000226, lr_D: 0.000800
2023-06-19 12:59:26,053 - INFO - [Train] step: 36099, loss_G: 11.894859, lr_G: 0.000800
2023-06-19 13:00:22,302 - INFO - [Train] step: 36199, loss_D: 0.008387, lr_D: 0.000800
2023-06-19 13:00:22,384 - INFO - [Train] step: 36199, loss_G: 8.659111, lr_G: 0.000800
2023-06-19 13:01:18,801 - INFO - [Train] step: 36299, loss_D: 0.002968, lr_D: 0.000800
2023-06-19 13:01:18,883 - INFO - [Train] step: 36299, loss_G: 9.986971, lr_G: 0.000800
2023-06-19 13:02:15,140 - INFO - [Train] step: 36399, loss_D: 0.002641, lr_D: 0.000800
2023-06-19 13:02:15,222 - INFO - [Train] step: 36399, loss_G: 11.220833, lr_G: 0.000800
2023-06-19 13:03:11,468 - INFO - [Train] step: 36499, loss_D: 0.003678, lr_D: 0.000800
2023-06-19 13:03:11,550 - INFO - [Train] step: 36499, loss_G: 11.765590, lr_G: 0.000800
2023-06-19 13:04:07,799 - INFO - [Train] step: 36599, loss_D: 0.048867, lr_D: 0.000800
2023-06-19 13:04:07,880 - INFO - [Train] step: 36599, loss_G: 7.117158, lr_G: 0.000800
2023-06-19 13:05:04,135 - INFO - [Train] step: 36699, loss_D: 0.006428, lr_D: 0.000800
2023-06-19 13:05:04,216 - INFO - [Train] step: 36699, loss_G: 10.462889, lr_G: 0.000800
2023-06-19 13:06:00,453 - INFO - [Train] step: 36799, loss_D: 0.005732, lr_D: 0.000800
2023-06-19 13:06:00,535 - INFO - [Train] step: 36799, loss_G: 10.994985, lr_G: 0.000800
2023-06-19 13:06:56,966 - INFO - [Train] step: 36899, loss_D: 0.001270, lr_D: 0.000800
2023-06-19 13:06:57,047 - INFO - [Train] step: 36899, loss_G: 9.580173, lr_G: 0.000800
2023-06-19 13:07:53,283 - INFO - [Train] step: 36999, loss_D: 0.000198, lr_D: 0.000800
2023-06-19 13:07:53,365 - INFO - [Train] step: 36999, loss_G: 16.088608, lr_G: 0.000800
2023-06-19 13:08:57,515 - INFO - [Eval] step: 36999, fid: 26.905189
2023-06-19 13:09:53,587 - INFO - [Train] step: 37099, loss_D: 0.006555, lr_D: 0.000800
2023-06-19 13:09:53,669 - INFO - [Train] step: 37099, loss_G: 10.930222, lr_G: 0.000800
2023-06-19 13:10:49,926 - INFO - [Train] step: 37199, loss_D: 0.003756, lr_D: 0.000800
2023-06-19 13:10:50,008 - INFO - [Train] step: 37199, loss_G: 10.169632, lr_G: 0.000800
2023-06-19 13:11:46,273 - INFO - [Train] step: 37299, loss_D: 0.000409, lr_D: 0.000800
2023-06-19 13:11:46,354 - INFO - [Train] step: 37299, loss_G: 13.813995, lr_G: 0.000800
2023-06-19 13:12:42,636 - INFO - [Train] step: 37399, loss_D: 0.006838, lr_D: 0.000800
2023-06-19 13:12:42,718 - INFO - [Train] step: 37399, loss_G: 10.233875, lr_G: 0.000800
2023-06-19 13:13:38,967 - INFO - [Train] step: 37499, loss_D: 0.001814, lr_D: 0.000800
2023-06-19 13:13:39,048 - INFO - [Train] step: 37499, loss_G: 10.526942, lr_G: 0.000800
2023-06-19 13:14:35,506 - INFO - [Train] step: 37599, loss_D: 0.002768, lr_D: 0.000800
2023-06-19 13:14:35,588 - INFO - [Train] step: 37599, loss_G: 10.788388, lr_G: 0.000800
2023-06-19 13:15:31,829 - INFO - [Train] step: 37699, loss_D: 0.028728, lr_D: 0.000800
2023-06-19 13:15:31,910 - INFO - [Train] step: 37699, loss_G: 7.048328, lr_G: 0.000800
2023-06-19 13:16:28,142 - INFO - [Train] step: 37799, loss_D: 0.007823, lr_D: 0.000800
2023-06-19 13:16:28,223 - INFO - [Train] step: 37799, loss_G: 10.002163, lr_G: 0.000800
2023-06-19 13:17:24,469 - INFO - [Train] step: 37899, loss_D: 0.003296, lr_D: 0.000800
2023-06-19 13:17:24,551 - INFO - [Train] step: 37899, loss_G: 11.203371, lr_G: 0.000800
2023-06-19 13:18:20,788 - INFO - [Train] step: 37999, loss_D: 0.283328, lr_D: 0.000800
2023-06-19 13:18:20,870 - INFO - [Train] step: 37999, loss_G: 4.848392, lr_G: 0.000800
2023-06-19 13:19:25,027 - INFO - [Eval] step: 37999, fid: 25.891414
2023-06-19 13:20:21,126 - INFO - [Train] step: 38099, loss_D: 0.004343, lr_D: 0.000800
2023-06-19 13:20:21,207 - INFO - [Train] step: 38099, loss_G: 10.519534, lr_G: 0.000800
2023-06-19 13:21:17,646 - INFO - [Train] step: 38199, loss_D: 0.000419, lr_D: 0.000800
2023-06-19 13:21:17,728 - INFO - [Train] step: 38199, loss_G: 12.088963, lr_G: 0.000800
2023-06-19 13:22:13,989 - INFO - [Train] step: 38299, loss_D: 0.004370, lr_D: 0.000800
2023-06-19 13:22:14,070 - INFO - [Train] step: 38299, loss_G: 9.642187, lr_G: 0.000800
2023-06-19 13:23:10,323 - INFO - [Train] step: 38399, loss_D: 0.002401, lr_D: 0.000800
2023-06-19 13:23:10,404 - INFO - [Train] step: 38399, loss_G: 10.852392, lr_G: 0.000800
2023-06-19 13:24:06,637 - INFO - [Train] step: 38499, loss_D: 0.004303, lr_D: 0.000800
2023-06-19 13:24:06,719 - INFO - [Train] step: 38499, loss_G: 10.922167, lr_G: 0.000800
2023-06-19 13:25:02,935 - INFO - [Train] step: 38599, loss_D: 0.012653, lr_D: 0.000800
2023-06-19 13:25:03,016 - INFO - [Train] step: 38599, loss_G: 9.314722, lr_G: 0.000800
2023-06-19 13:25:59,287 - INFO - [Train] step: 38699, loss_D: 0.000816, lr_D: 0.000800
2023-06-19 13:25:59,369 - INFO - [Train] step: 38699, loss_G: 13.816206, lr_G: 0.000800
2023-06-19 13:26:55,656 - INFO - [Train] step: 38799, loss_D: 0.010324, lr_D: 0.000800
2023-06-19 13:26:55,737 - INFO - [Train] step: 38799, loss_G: 10.052263, lr_G: 0.000800
2023-06-19 13:27:52,138 - INFO - [Train] step: 38899, loss_D: 0.001867, lr_D: 0.000800
2023-06-19 13:27:52,219 - INFO - [Train] step: 38899, loss_G: 10.489579, lr_G: 0.000800
2023-06-19 13:28:48,484 - INFO - [Train] step: 38999, loss_D: 0.002701, lr_D: 0.000800
2023-06-19 13:28:48,565 - INFO - [Train] step: 38999, loss_G: 10.951891, lr_G: 0.000800
2023-06-19 13:29:52,710 - INFO - [Eval] step: 38999, fid: 26.558577
2023-06-19 13:30:48,837 - INFO - [Train] step: 39099, loss_D: 0.010560, lr_D: 0.000800
2023-06-19 13:30:48,918 - INFO - [Train] step: 39099, loss_G: 10.920807, lr_G: 0.000800
2023-06-19 13:31:45,282 - INFO - [Train] step: 39199, loss_D: 0.008760, lr_D: 0.000800
2023-06-19 13:31:45,364 - INFO - [Train] step: 39199, loss_G: 10.707663, lr_G: 0.000800
2023-06-19 13:32:41,719 - INFO - [Train] step: 39299, loss_D: 0.002721, lr_D: 0.000800
2023-06-19 13:32:41,800 - INFO - [Train] step: 39299, loss_G: 10.049867, lr_G: 0.000800
2023-06-19 13:33:38,184 - INFO - [Train] step: 39399, loss_D: 0.003659, lr_D: 0.000800
2023-06-19 13:33:38,265 - INFO - [Train] step: 39399, loss_G: 10.855670, lr_G: 0.000800
2023-06-19 13:34:34,787 - INFO - [Train] step: 39499, loss_D: 0.002592, lr_D: 0.000800
2023-06-19 13:34:34,868 - INFO - [Train] step: 39499, loss_G: 11.214542, lr_G: 0.000800
2023-06-19 13:35:31,249 - INFO - [Train] step: 39599, loss_D: 0.004099, lr_D: 0.000800
2023-06-19 13:35:31,330 - INFO - [Train] step: 39599, loss_G: 10.776371, lr_G: 0.000800
2023-06-19 13:36:27,686 - INFO - [Train] step: 39699, loss_D: 0.000111, lr_D: 0.000800
2023-06-19 13:36:27,768 - INFO - [Train] step: 39699, loss_G: 13.308062, lr_G: 0.000800
2023-06-19 13:37:24,131 - INFO - [Train] step: 39799, loss_D: 0.003119, lr_D: 0.000800
2023-06-19 13:37:24,213 - INFO - [Train] step: 39799, loss_G: 10.641691, lr_G: 0.000800
2023-06-19 13:38:20,565 - INFO - [Train] step: 39899, loss_D: 0.000917, lr_D: 0.000800
2023-06-19 13:38:20,646 - INFO - [Train] step: 39899, loss_G: 10.842619, lr_G: 0.000800
2023-06-19 13:39:17,004 - INFO - [Train] step: 39999, loss_D: 0.002382, lr_D: 0.000800
2023-06-19 13:39:17,086 - INFO - [Train] step: 39999, loss_G: 11.631386, lr_G: 0.000800
2023-06-19 13:40:21,221 - INFO - [Eval] step: 39999, fid: 27.618482
2023-06-19 13:40:21,391 - INFO - Best FID score: 24.723112072729634
2023-06-19 13:40:21,391 - INFO - End of training