GANs-Implementations / sngan_hinge_cifar10 /output-2023-06-24-05-59-05.log
2023-06-24 05:59:06,033 - INFO - Experiment directory: ./runs/sngan_hinge_cifar10/
2023-06-24 05:59:06,034 - INFO - Number of processes: 1
2023-06-24 05:59:06,034 - INFO - Distributed type: DistributedType.NO
2023-06-24 05:59:06,034 - INFO - Mixed precision: no
2023-06-24 05:59:06,034 - INFO - ==============================
2023-06-24 05:59:06,536 - INFO - Size of training set: 50000
2023-06-24 05:59:06,536 - INFO - Batch size per process: 512
2023-06-24 05:59:06,536 - INFO - Total batch size: 512
2023-06-24 05:59:06,536 - INFO - ==============================
2023-06-24 05:59:07,529 - INFO - Start training...
2023-06-24 06:00:06,199 - INFO - [Train] step: 99, loss_D: 0.040965, lr_D: 0.000800
2023-06-24 06:00:06,281 - INFO - [Train] step: 99, loss_G: 3.007102, lr_G: 0.000800
2023-06-24 06:01:03,649 - INFO - [Train] step: 199, loss_D: 0.008264, lr_D: 0.000800
2023-06-24 06:01:03,732 - INFO - [Train] step: 199, loss_G: 3.227652, lr_G: 0.000800
2023-06-24 06:02:01,248 - INFO - [Train] step: 299, loss_D: 0.005113, lr_D: 0.000800
2023-06-24 06:02:01,331 - INFO - [Train] step: 299, loss_G: 2.343576, lr_G: 0.000800
2023-06-24 06:02:58,889 - INFO - [Train] step: 399, loss_D: 0.013030, lr_D: 0.000800
2023-06-24 06:02:58,972 - INFO - [Train] step: 399, loss_G: 2.385419, lr_G: 0.000800
2023-06-24 06:03:56,515 - INFO - [Train] step: 499, loss_D: 0.004518, lr_D: 0.000800
2023-06-24 06:03:56,598 - INFO - [Train] step: 499, loss_G: 2.686864, lr_G: 0.000800
2023-06-24 06:04:54,158 - INFO - [Train] step: 599, loss_D: 0.109882, lr_D: 0.000800
2023-06-24 06:04:54,241 - INFO - [Train] step: 599, loss_G: 2.886936, lr_G: 0.000800
2023-06-24 06:05:51,959 - INFO - [Train] step: 699, loss_D: 0.021981, lr_D: 0.000800
2023-06-24 06:05:52,042 - INFO - [Train] step: 699, loss_G: 4.502441, lr_G: 0.000800
2023-06-24 06:06:49,657 - INFO - [Train] step: 799, loss_D: 0.000029, lr_D: 0.000800
2023-06-24 06:06:49,740 - INFO - [Train] step: 799, loss_G: 2.913847, lr_G: 0.000800
2023-06-24 06:07:47,353 - INFO - [Train] step: 899, loss_D: 0.002508, lr_D: 0.000800
2023-06-24 06:07:47,436 - INFO - [Train] step: 899, loss_G: 2.751092, lr_G: 0.000800
2023-06-24 06:08:45,020 - INFO - [Train] step: 999, loss_D: 0.006100, lr_D: 0.000800
2023-06-24 06:08:45,102 - INFO - [Train] step: 999, loss_G: 2.678624, lr_G: 0.000800
2023-06-24 06:09:49,386 - INFO - [Eval] step: 999, fid: 124.136445
2023-06-24 06:10:46,971 - INFO - [Train] step: 1099, loss_D: 0.012124, lr_D: 0.000800
2023-06-24 06:10:47,054 - INFO - [Train] step: 1099, loss_G: 2.428777, lr_G: 0.000800
2023-06-24 06:11:44,628 - INFO - [Train] step: 1199, loss_D: 0.010001, lr_D: 0.000800
2023-06-24 06:11:44,711 - INFO - [Train] step: 1199, loss_G: 2.660393, lr_G: 0.000800
2023-06-24 06:12:42,411 - INFO - [Train] step: 1299, loss_D: 0.028743, lr_D: 0.000800
2023-06-24 06:12:42,494 - INFO - [Train] step: 1299, loss_G: 2.515914, lr_G: 0.000800
2023-06-24 06:13:40,049 - INFO - [Train] step: 1399, loss_D: 0.004259, lr_D: 0.000800
2023-06-24 06:13:40,132 - INFO - [Train] step: 1399, loss_G: 2.510591, lr_G: 0.000800
2023-06-24 06:14:37,696 - INFO - [Train] step: 1499, loss_D: 0.000780, lr_D: 0.000800
2023-06-24 06:14:37,779 - INFO - [Train] step: 1499, loss_G: 2.704337, lr_G: 0.000800
2023-06-24 06:15:35,314 - INFO - [Train] step: 1599, loss_D: 0.027211, lr_D: 0.000800
2023-06-24 06:15:35,397 - INFO - [Train] step: 1599, loss_G: 2.262630, lr_G: 0.000800
2023-06-24 06:16:32,933 - INFO - [Train] step: 1699, loss_D: 0.038394, lr_D: 0.000800
2023-06-24 06:16:33,016 - INFO - [Train] step: 1699, loss_G: 2.879716, lr_G: 0.000800
2023-06-24 06:17:30,586 - INFO - [Train] step: 1799, loss_D: 0.003609, lr_D: 0.000800
2023-06-24 06:17:30,669 - INFO - [Train] step: 1799, loss_G: 2.244955, lr_G: 0.000800
2023-06-24 06:18:28,205 - INFO - [Train] step: 1899, loss_D: 0.028515, lr_D: 0.000800
2023-06-24 06:18:28,288 - INFO - [Train] step: 1899, loss_G: 2.068867, lr_G: 0.000800
2023-06-24 06:19:25,984 - INFO - [Train] step: 1999, loss_D: 0.069955, lr_D: 0.000800
2023-06-24 06:19:26,067 - INFO - [Train] step: 1999, loss_G: 3.311665, lr_G: 0.000800
2023-06-24 06:20:30,068 - INFO - [Eval] step: 1999, fid: 69.904567
2023-06-24 06:21:27,877 - INFO - [Train] step: 2099, loss_D: 0.002303, lr_D: 0.000800
2023-06-24 06:21:27,960 - INFO - [Train] step: 2099, loss_G: 2.851679, lr_G: 0.000800
2023-06-24 06:22:25,516 - INFO - [Train] step: 2199, loss_D: 0.010915, lr_D: 0.000800
2023-06-24 06:22:25,598 - INFO - [Train] step: 2199, loss_G: 2.089725, lr_G: 0.000800
2023-06-24 06:23:23,182 - INFO - [Train] step: 2299, loss_D: 0.047362, lr_D: 0.000800
2023-06-24 06:23:23,265 - INFO - [Train] step: 2299, loss_G: 2.335963, lr_G: 0.000800
2023-06-24 06:24:20,824 - INFO - [Train] step: 2399, loss_D: 0.115611, lr_D: 0.000800
2023-06-24 06:24:20,906 - INFO - [Train] step: 2399, loss_G: 3.034634, lr_G: 0.000800
2023-06-24 06:25:18,460 - INFO - [Train] step: 2499, loss_D: 0.049634, lr_D: 0.000800
2023-06-24 06:25:18,543 - INFO - [Train] step: 2499, loss_G: 2.224216, lr_G: 0.000800
2023-06-24 06:26:16,245 - INFO - [Train] step: 2599, loss_D: 0.004573, lr_D: 0.000800
2023-06-24 06:26:16,328 - INFO - [Train] step: 2599, loss_G: 2.044859, lr_G: 0.000800
2023-06-24 06:27:13,865 - INFO - [Train] step: 2699, loss_D: 0.013148, lr_D: 0.000800
2023-06-24 06:27:13,948 - INFO - [Train] step: 2699, loss_G: 4.107662, lr_G: 0.000800
2023-06-24 06:28:11,507 - INFO - [Train] step: 2799, loss_D: 0.030386, lr_D: 0.000800
2023-06-24 06:28:11,590 - INFO - [Train] step: 2799, loss_G: 3.404032, lr_G: 0.000800
2023-06-24 06:29:09,116 - INFO - [Train] step: 2899, loss_D: 0.013516, lr_D: 0.000800
2023-06-24 06:29:09,199 - INFO - [Train] step: 2899, loss_G: 2.009358, lr_G: 0.000800
2023-06-24 06:30:06,763 - INFO - [Train] step: 2999, loss_D: 0.005278, lr_D: 0.000800
2023-06-24 06:30:06,846 - INFO - [Train] step: 2999, loss_G: 2.194857, lr_G: 0.000800
2023-06-24 06:31:10,969 - INFO - [Eval] step: 2999, fid: 51.701543
2023-06-24 06:32:08,706 - INFO - [Train] step: 3099, loss_D: 0.006956, lr_D: 0.000800
2023-06-24 06:32:08,789 - INFO - [Train] step: 3099, loss_G: 2.212254, lr_G: 0.000800
2023-06-24 06:33:06,352 - INFO - [Train] step: 3199, loss_D: 0.000042, lr_D: 0.000800
2023-06-24 06:33:06,435 - INFO - [Train] step: 3199, loss_G: 2.443330, lr_G: 0.000800
2023-06-24 06:34:04,171 - INFO - [Train] step: 3299, loss_D: 0.083323, lr_D: 0.000800
2023-06-24 06:34:04,254 - INFO - [Train] step: 3299, loss_G: 2.901815, lr_G: 0.000800
2023-06-24 06:35:01,821 - INFO - [Train] step: 3399, loss_D: 0.099295, lr_D: 0.000800
2023-06-24 06:35:01,904 - INFO - [Train] step: 3399, loss_G: 2.065339, lr_G: 0.000800
2023-06-24 06:35:59,450 - INFO - [Train] step: 3499, loss_D: 0.088668, lr_D: 0.000800
2023-06-24 06:35:59,533 - INFO - [Train] step: 3499, loss_G: 2.072626, lr_G: 0.000800
2023-06-24 06:36:57,124 - INFO - [Train] step: 3599, loss_D: 0.055610, lr_D: 0.000800
2023-06-24 06:36:57,208 - INFO - [Train] step: 3599, loss_G: 2.126562, lr_G: 0.000800
2023-06-24 06:37:54,772 - INFO - [Train] step: 3699, loss_D: 0.056371, lr_D: 0.000800
2023-06-24 06:37:54,855 - INFO - [Train] step: 3699, loss_G: 2.710742, lr_G: 0.000800
2023-06-24 06:38:52,434 - INFO - [Train] step: 3799, loss_D: 0.003928, lr_D: 0.000800
2023-06-24 06:38:52,517 - INFO - [Train] step: 3799, loss_G: 2.618015, lr_G: 0.000800
2023-06-24 06:39:50,253 - INFO - [Train] step: 3899, loss_D: 0.028277, lr_D: 0.000800
2023-06-24 06:39:50,336 - INFO - [Train] step: 3899, loss_G: 2.101349, lr_G: 0.000800
2023-06-24 06:40:47,881 - INFO - [Train] step: 3999, loss_D: 0.001005, lr_D: 0.000800
2023-06-24 06:40:47,963 - INFO - [Train] step: 3999, loss_G: 2.902733, lr_G: 0.000800
2023-06-24 06:41:52,071 - INFO - [Eval] step: 3999, fid: 42.537853
2023-06-24 06:42:49,844 - INFO - [Train] step: 4099, loss_D: 0.076505, lr_D: 0.000800
2023-06-24 06:42:49,927 - INFO - [Train] step: 4099, loss_G: 1.539781, lr_G: 0.000800
2023-06-24 06:43:47,491 - INFO - [Train] step: 4199, loss_D: 0.042574, lr_D: 0.000800
2023-06-24 06:43:47,574 - INFO - [Train] step: 4199, loss_G: 1.767089, lr_G: 0.000800
2023-06-24 06:44:45,203 - INFO - [Train] step: 4299, loss_D: 0.000683, lr_D: 0.000800
2023-06-24 06:44:45,286 - INFO - [Train] step: 4299, loss_G: 2.226498, lr_G: 0.000800
2023-06-24 06:45:42,893 - INFO - [Train] step: 4399, loss_D: 0.076552, lr_D: 0.000800
2023-06-24 06:45:42,976 - INFO - [Train] step: 4399, loss_G: 1.658642, lr_G: 0.000800
2023-06-24 06:46:40,561 - INFO - [Train] step: 4499, loss_D: 0.066510, lr_D: 0.000800
2023-06-24 06:46:40,644 - INFO - [Train] step: 4499, loss_G: 1.831834, lr_G: 0.000800
2023-06-24 06:47:38,369 - INFO - [Train] step: 4599, loss_D: 0.048514, lr_D: 0.000800
2023-06-24 06:47:38,452 - INFO - [Train] step: 4599, loss_G: 1.586165, lr_G: 0.000800
2023-06-24 06:48:36,057 - INFO - [Train] step: 4699, loss_D: 0.006909, lr_D: 0.000800
2023-06-24 06:48:36,140 - INFO - [Train] step: 4699, loss_G: 2.376070, lr_G: 0.000800
2023-06-24 06:49:33,758 - INFO - [Train] step: 4799, loss_D: 0.009328, lr_D: 0.000800
2023-06-24 06:49:33,841 - INFO - [Train] step: 4799, loss_G: 1.971624, lr_G: 0.000800
2023-06-24 06:50:31,424 - INFO - [Train] step: 4899, loss_D: 0.125548, lr_D: 0.000800
2023-06-24 06:50:31,506 - INFO - [Train] step: 4899, loss_G: 1.502912, lr_G: 0.000800
2023-06-24 06:51:29,077 - INFO - [Train] step: 4999, loss_D: 0.017999, lr_D: 0.000800
2023-06-24 06:51:29,160 - INFO - [Train] step: 4999, loss_G: 1.522721, lr_G: 0.000800
2023-06-24 06:52:33,351 - INFO - [Eval] step: 4999, fid: 36.895958
2023-06-24 06:53:31,091 - INFO - [Train] step: 5099, loss_D: 0.017094, lr_D: 0.000800
2023-06-24 06:53:31,173 - INFO - [Train] step: 5099, loss_G: 2.372721, lr_G: 0.000800
2023-06-24 06:54:28,889 - INFO - [Train] step: 5199, loss_D: 0.001553, lr_D: 0.000800
2023-06-24 06:54:28,972 - INFO - [Train] step: 5199, loss_G: 2.257927, lr_G: 0.000800
2023-06-24 06:55:26,532 - INFO - [Train] step: 5299, loss_D: 0.592671, lr_D: 0.000800
2023-06-24 06:55:26,615 - INFO - [Train] step: 5299, loss_G: 2.676058, lr_G: 0.000800
2023-06-24 06:56:24,214 - INFO - [Train] step: 5399, loss_D: 0.043384, lr_D: 0.000800
2023-06-24 06:56:24,297 - INFO - [Train] step: 5399, loss_G: 1.096886, lr_G: 0.000800
2023-06-24 06:57:21,857 - INFO - [Train] step: 5499, loss_D: 0.224546, lr_D: 0.000800
2023-06-24 06:57:21,940 - INFO - [Train] step: 5499, loss_G: 2.683320, lr_G: 0.000800
2023-06-24 06:58:19,490 - INFO - [Train] step: 5599, loss_D: 0.028778, lr_D: 0.000800
2023-06-24 06:58:19,573 - INFO - [Train] step: 5599, loss_G: 1.815423, lr_G: 0.000800
2023-06-24 06:59:17,130 - INFO - [Train] step: 5699, loss_D: 0.027707, lr_D: 0.000800
2023-06-24 06:59:17,213 - INFO - [Train] step: 5699, loss_G: 1.822573, lr_G: 0.000800
2023-06-24 07:00:14,749 - INFO - [Train] step: 5799, loss_D: 0.001595, lr_D: 0.000800
2023-06-24 07:00:14,831 - INFO - [Train] step: 5799, loss_G: 2.149586, lr_G: 0.000800
2023-06-24 07:01:12,505 - INFO - [Train] step: 5899, loss_D: 0.044832, lr_D: 0.000800
2023-06-24 07:01:12,588 - INFO - [Train] step: 5899, loss_G: 1.750704, lr_G: 0.000800
2023-06-24 07:02:10,193 - INFO - [Train] step: 5999, loss_D: 0.318704, lr_D: 0.000800
2023-06-24 07:02:10,276 - INFO - [Train] step: 5999, loss_G: 1.062759, lr_G: 0.000800
2023-06-24 07:03:14,419 - INFO - [Eval] step: 5999, fid: 36.474227
2023-06-24 07:04:12,130 - INFO - [Train] step: 6099, loss_D: 0.100320, lr_D: 0.000800
2023-06-24 07:04:12,213 - INFO - [Train] step: 6099, loss_G: 1.011782, lr_G: 0.000800
2023-06-24 07:05:09,750 - INFO - [Train] step: 6199, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 07:05:09,832 - INFO - [Train] step: 6199, loss_G: 2.938220, lr_G: 0.000800
2023-06-24 07:06:07,389 - INFO - [Train] step: 6299, loss_D: 0.001982, lr_D: 0.000800
2023-06-24 07:06:07,472 - INFO - [Train] step: 6299, loss_G: 5.294497, lr_G: 0.000800
2023-06-24 07:07:05,028 - INFO - [Train] step: 6399, loss_D: 0.029834, lr_D: 0.000800
2023-06-24 07:07:05,111 - INFO - [Train] step: 6399, loss_G: 1.816101, lr_G: 0.000800
2023-06-24 07:08:02,865 - INFO - [Train] step: 6499, loss_D: 0.113520, lr_D: 0.000800
2023-06-24 07:08:02,948 - INFO - [Train] step: 6499, loss_G: 1.381809, lr_G: 0.000800
2023-06-24 07:09:00,519 - INFO - [Train] step: 6599, loss_D: 0.000711, lr_D: 0.000800
2023-06-24 07:09:00,602 - INFO - [Train] step: 6599, loss_G: 2.333935, lr_G: 0.000800
2023-06-24 07:09:58,148 - INFO - [Train] step: 6699, loss_D: 0.019381, lr_D: 0.000800
2023-06-24 07:09:58,230 - INFO - [Train] step: 6699, loss_G: 2.324218, lr_G: 0.000800
2023-06-24 07:10:55,773 - INFO - [Train] step: 6799, loss_D: 0.017052, lr_D: 0.000800
2023-06-24 07:10:55,855 - INFO - [Train] step: 6799, loss_G: 1.327323, lr_G: 0.000800
2023-06-24 07:11:53,418 - INFO - [Train] step: 6899, loss_D: 0.000079, lr_D: 0.000800
2023-06-24 07:11:53,501 - INFO - [Train] step: 6899, loss_G: 2.233047, lr_G: 0.000800
2023-06-24 07:12:51,032 - INFO - [Train] step: 6999, loss_D: 0.029508, lr_D: 0.000800
2023-06-24 07:12:51,115 - INFO - [Train] step: 6999, loss_G: 1.533518, lr_G: 0.000800
2023-06-24 07:13:55,366 - INFO - [Eval] step: 6999, fid: 32.723038
2023-06-24 07:14:53,072 - INFO - [Train] step: 7099, loss_D: 0.079937, lr_D: 0.000800
2023-06-24 07:14:53,155 - INFO - [Train] step: 7099, loss_G: 1.475883, lr_G: 0.000800
2023-06-24 07:15:50,912 - INFO - [Train] step: 7199, loss_D: 0.040170, lr_D: 0.000800
2023-06-24 07:15:50,996 - INFO - [Train] step: 7199, loss_G: 1.989485, lr_G: 0.000800
2023-06-24 07:16:48,515 - INFO - [Train] step: 7299, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 07:16:48,598 - INFO - [Train] step: 7299, loss_G: 2.006089, lr_G: 0.000800
2023-06-24 07:17:46,100 - INFO - [Train] step: 7399, loss_D: 0.099274, lr_D: 0.000800
2023-06-24 07:17:46,183 - INFO - [Train] step: 7399, loss_G: 2.125863, lr_G: 0.000800
2023-06-24 07:18:43,753 - INFO - [Train] step: 7499, loss_D: 0.002618, lr_D: 0.000800
2023-06-24 07:18:43,836 - INFO - [Train] step: 7499, loss_G: 1.789046, lr_G: 0.000800
2023-06-24 07:19:41,376 - INFO - [Train] step: 7599, loss_D: 0.035303, lr_D: 0.000800
2023-06-24 07:19:41,460 - INFO - [Train] step: 7599, loss_G: 1.709033, lr_G: 0.000800
2023-06-24 07:20:38,999 - INFO - [Train] step: 7699, loss_D: 0.004853, lr_D: 0.000800
2023-06-24 07:20:39,082 - INFO - [Train] step: 7699, loss_G: 2.452032, lr_G: 0.000800
2023-06-24 07:21:36,791 - INFO - [Train] step: 7799, loss_D: 0.017674, lr_D: 0.000800
2023-06-24 07:21:36,874 - INFO - [Train] step: 7799, loss_G: 2.216591, lr_G: 0.000800
2023-06-24 07:22:34,369 - INFO - [Train] step: 7899, loss_D: 0.001904, lr_D: 0.000800
2023-06-24 07:22:34,452 - INFO - [Train] step: 7899, loss_G: 2.137690, lr_G: 0.000800
2023-06-24 07:23:32,025 - INFO - [Train] step: 7999, loss_D: 0.024065, lr_D: 0.000800
2023-06-24 07:23:32,108 - INFO - [Train] step: 7999, loss_G: 1.749671, lr_G: 0.000800
2023-06-24 07:24:36,316 - INFO - [Eval] step: 7999, fid: 32.289921
2023-06-24 07:25:34,057 - INFO - [Train] step: 8099, loss_D: 0.142223, lr_D: 0.000800
2023-06-24 07:25:34,140 - INFO - [Train] step: 8099, loss_G: 2.030174, lr_G: 0.000800
2023-06-24 07:26:31,742 - INFO - [Train] step: 8199, loss_D: 0.150466, lr_D: 0.000800
2023-06-24 07:26:31,825 - INFO - [Train] step: 8199, loss_G: 1.422742, lr_G: 0.000800
2023-06-24 07:27:29,345 - INFO - [Train] step: 8299, loss_D: 0.000788, lr_D: 0.000800
2023-06-24 07:27:29,428 - INFO - [Train] step: 8299, loss_G: 2.359755, lr_G: 0.000800
2023-06-24 07:28:27,022 - INFO - [Train] step: 8399, loss_D: 0.112005, lr_D: 0.000800
2023-06-24 07:28:27,104 - INFO - [Train] step: 8399, loss_G: 1.622546, lr_G: 0.000800
2023-06-24 07:29:24,855 - INFO - [Train] step: 8499, loss_D: 0.027277, lr_D: 0.000800
2023-06-24 07:29:24,938 - INFO - [Train] step: 8499, loss_G: 1.569931, lr_G: 0.000800
2023-06-24 07:30:22,563 - INFO - [Train] step: 8599, loss_D: 0.003585, lr_D: 0.000800
2023-06-24 07:30:22,646 - INFO - [Train] step: 8599, loss_G: 2.128253, lr_G: 0.000800
2023-06-24 07:31:20,242 - INFO - [Train] step: 8699, loss_D: 0.003402, lr_D: 0.000800
2023-06-24 07:31:20,325 - INFO - [Train] step: 8699, loss_G: 1.730481, lr_G: 0.000800
2023-06-24 07:32:17,938 - INFO - [Train] step: 8799, loss_D: 0.030107, lr_D: 0.000800
2023-06-24 07:32:18,022 - INFO - [Train] step: 8799, loss_G: 1.502315, lr_G: 0.000800
2023-06-24 07:33:15,691 - INFO - [Train] step: 8899, loss_D: 0.031722, lr_D: 0.000800
2023-06-24 07:33:15,774 - INFO - [Train] step: 8899, loss_G: 1.771855, lr_G: 0.000800
2023-06-24 07:34:13,322 - INFO - [Train] step: 8999, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 07:34:13,405 - INFO - [Train] step: 8999, loss_G: 2.411224, lr_G: 0.000800
2023-06-24 07:35:17,955 - INFO - [Eval] step: 8999, fid: 59.801727
2023-06-24 07:36:15,468 - INFO - [Train] step: 9099, loss_D: 0.005382, lr_D: 0.000800
2023-06-24 07:36:15,551 - INFO - [Train] step: 9099, loss_G: 1.783913, lr_G: 0.000800
2023-06-24 07:37:13,105 - INFO - [Train] step: 9199, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 07:37:13,188 - INFO - [Train] step: 9199, loss_G: 3.186535, lr_G: 0.000800
2023-06-24 07:38:10,793 - INFO - [Train] step: 9299, loss_D: 0.280778, lr_D: 0.000800
2023-06-24 07:38:10,876 - INFO - [Train] step: 9299, loss_G: 1.494758, lr_G: 0.000800
2023-06-24 07:39:08,520 - INFO - [Train] step: 9399, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 07:39:08,603 - INFO - [Train] step: 9399, loss_G: 3.874270, lr_G: 0.000800
2023-06-24 07:40:06,110 - INFO - [Train] step: 9499, loss_D: 0.014312, lr_D: 0.000800
2023-06-24 07:40:06,193 - INFO - [Train] step: 9499, loss_G: 2.019365, lr_G: 0.000800
2023-06-24 07:41:03,781 - INFO - [Train] step: 9599, loss_D: 0.010850, lr_D: 0.000800
2023-06-24 07:41:03,864 - INFO - [Train] step: 9599, loss_G: 1.808839, lr_G: 0.000800
2023-06-24 07:42:01,518 - INFO - [Train] step: 9699, loss_D: 0.002291, lr_D: 0.000800
2023-06-24 07:42:01,601 - INFO - [Train] step: 9699, loss_G: 1.776842, lr_G: 0.000800
2023-06-24 07:42:59,356 - INFO - [Train] step: 9799, loss_D: 0.008143, lr_D: 0.000800
2023-06-24 07:42:59,438 - INFO - [Train] step: 9799, loss_G: 2.113112, lr_G: 0.000800
2023-06-24 07:43:57,079 - INFO - [Train] step: 9899, loss_D: 0.175233, lr_D: 0.000800
2023-06-24 07:43:57,161 - INFO - [Train] step: 9899, loss_G: 1.374557, lr_G: 0.000800
2023-06-24 07:44:54,729 - INFO - [Train] step: 9999, loss_D: 0.000740, lr_D: 0.000800
2023-06-24 07:44:54,812 - INFO - [Train] step: 9999, loss_G: 2.566845, lr_G: 0.000800
2023-06-24 07:45:59,320 - INFO - [Eval] step: 9999, fid: 32.194120
2023-06-24 07:46:57,327 - INFO - [Train] step: 10099, loss_D: 0.084756, lr_D: 0.000800
2023-06-24 07:46:57,410 - INFO - [Train] step: 10099, loss_G: 1.712602, lr_G: 0.000800
2023-06-24 07:47:55,080 - INFO - [Train] step: 10199, loss_D: 0.000806, lr_D: 0.000800
2023-06-24 07:47:55,163 - INFO - [Train] step: 10199, loss_G: 3.205657, lr_G: 0.000800
2023-06-24 07:48:52,685 - INFO - [Train] step: 10299, loss_D: 1.056300, lr_D: 0.000800
2023-06-24 07:48:52,767 - INFO - [Train] step: 10299, loss_G: 4.988603, lr_G: 0.000800
2023-06-24 07:49:50,578 - INFO - [Train] step: 10399, loss_D: 0.012509, lr_D: 0.000800
2023-06-24 07:49:50,661 - INFO - [Train] step: 10399, loss_G: 2.203422, lr_G: 0.000800
2023-06-24 07:50:48,255 - INFO - [Train] step: 10499, loss_D: 0.001243, lr_D: 0.000800
2023-06-24 07:50:48,337 - INFO - [Train] step: 10499, loss_G: 3.114604, lr_G: 0.000800
2023-06-24 07:51:45,961 - INFO - [Train] step: 10599, loss_D: 0.002230, lr_D: 0.000800
2023-06-24 07:51:46,044 - INFO - [Train] step: 10599, loss_G: 1.641430, lr_G: 0.000800
2023-06-24 07:52:43,995 - INFO - [Train] step: 10699, loss_D: 0.015711, lr_D: 0.000800
2023-06-24 07:52:44,078 - INFO - [Train] step: 10699, loss_G: 1.770595, lr_G: 0.000800
2023-06-24 07:53:41,729 - INFO - [Train] step: 10799, loss_D: 0.213274, lr_D: 0.000800
2023-06-24 07:53:41,812 - INFO - [Train] step: 10799, loss_G: 1.894109, lr_G: 0.000800
2023-06-24 07:54:39,431 - INFO - [Train] step: 10899, loss_D: 0.023109, lr_D: 0.000800
2023-06-24 07:54:39,514 - INFO - [Train] step: 10899, loss_G: 1.991219, lr_G: 0.000800
2023-06-24 07:55:37,307 - INFO - [Train] step: 10999, loss_D: 0.003403, lr_D: 0.000800
2023-06-24 07:55:37,390 - INFO - [Train] step: 10999, loss_G: 1.865720, lr_G: 0.000800
2023-06-24 07:56:51,398 - INFO - [Eval] step: 10999, fid: 29.340322
2023-06-24 07:57:55,314 - INFO - [Train] step: 11099, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 07:57:55,397 - INFO - [Train] step: 11099, loss_G: 4.105007, lr_G: 0.000800
2023-06-24 07:58:52,922 - INFO - [Train] step: 11199, loss_D: 0.081343, lr_D: 0.000800
2023-06-24 07:58:53,005 - INFO - [Train] step: 11199, loss_G: 1.918891, lr_G: 0.000800
2023-06-24 07:59:56,875 - INFO - [Train] step: 11299, loss_D: 0.004160, lr_D: 0.000800
2023-06-24 07:59:56,958 - INFO - [Train] step: 11299, loss_G: 1.746233, lr_G: 0.000800
2023-06-24 08:01:02,364 - INFO - [Train] step: 11399, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 08:01:02,446 - INFO - [Train] step: 11399, loss_G: 2.817867, lr_G: 0.000800
2023-06-24 08:02:00,026 - INFO - [Train] step: 11499, loss_D: 0.278458, lr_D: 0.000800
2023-06-24 08:02:00,109 - INFO - [Train] step: 11499, loss_G: 2.537877, lr_G: 0.000800
2023-06-24 08:02:57,729 - INFO - [Train] step: 11599, loss_D: 0.107359, lr_D: 0.000800
2023-06-24 08:02:57,811 - INFO - [Train] step: 11599, loss_G: 2.660808, lr_G: 0.000800
2023-06-24 08:03:55,613 - INFO - [Train] step: 11699, loss_D: 1.593971, lr_D: 0.000800
2023-06-24 08:03:55,696 - INFO - [Train] step: 11699, loss_G: -0.003739, lr_G: 0.000800
2023-06-24 08:04:53,196 - INFO - [Train] step: 11799, loss_D: 0.002705, lr_D: 0.000800
2023-06-24 08:04:53,279 - INFO - [Train] step: 11799, loss_G: 2.057855, lr_G: 0.000800
2023-06-24 08:05:50,907 - INFO - [Train] step: 11899, loss_D: 0.014260, lr_D: 0.000800
2023-06-24 08:05:50,990 - INFO - [Train] step: 11899, loss_G: 1.858308, lr_G: 0.000800
2023-06-24 08:06:59,712 - INFO - [Train] step: 11999, loss_D: 0.016784, lr_D: 0.000800
2023-06-24 08:06:59,795 - INFO - [Train] step: 11999, loss_G: 1.750255, lr_G: 0.000800
2023-06-24 08:08:11,111 - INFO - [Eval] step: 11999, fid: 28.736824
2023-06-24 08:09:18,859 - INFO - [Train] step: 12099, loss_D: 0.004806, lr_D: 0.000800
2023-06-24 08:09:18,942 - INFO - [Train] step: 12099, loss_G: 1.694473, lr_G: 0.000800
2023-06-24 08:10:23,309 - INFO - [Train] step: 12199, loss_D: 0.159157, lr_D: 0.000800
2023-06-24 08:10:23,399 - INFO - [Train] step: 12199, loss_G: 0.772779, lr_G: 0.000800
2023-06-24 08:11:21,173 - INFO - [Train] step: 12299, loss_D: 0.006041, lr_D: 0.000800
2023-06-24 08:11:21,256 - INFO - [Train] step: 12299, loss_G: 2.264250, lr_G: 0.000800
2023-06-24 08:12:18,778 - INFO - [Train] step: 12399, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 08:12:18,860 - INFO - [Train] step: 12399, loss_G: 2.324852, lr_G: 0.000800
2023-06-24 08:13:16,396 - INFO - [Train] step: 12499, loss_D: 0.017785, lr_D: 0.000800
2023-06-24 08:13:16,480 - INFO - [Train] step: 12499, loss_G: 2.097902, lr_G: 0.000800
2023-06-24 08:14:14,127 - INFO - [Train] step: 12599, loss_D: 0.002726, lr_D: 0.000800
2023-06-24 08:14:14,210 - INFO - [Train] step: 12599, loss_G: 2.704345, lr_G: 0.000800
2023-06-24 08:15:11,806 - INFO - [Train] step: 12699, loss_D: 0.000306, lr_D: 0.000800
2023-06-24 08:15:11,889 - INFO - [Train] step: 12699, loss_G: 2.193798, lr_G: 0.000800
2023-06-24 08:16:09,346 - INFO - [Train] step: 12799, loss_D: 0.006068, lr_D: 0.000800
2023-06-24 08:16:09,429 - INFO - [Train] step: 12799, loss_G: 2.409989, lr_G: 0.000800
2023-06-24 08:17:07,070 - INFO - [Train] step: 12899, loss_D: 0.008999, lr_D: 0.000800
2023-06-24 08:17:07,152 - INFO - [Train] step: 12899, loss_G: 2.025202, lr_G: 0.000800
2023-06-24 08:18:04,962 - INFO - [Train] step: 12999, loss_D: 0.012731, lr_D: 0.000800
2023-06-24 08:18:05,046 - INFO - [Train] step: 12999, loss_G: 1.731906, lr_G: 0.000800
2023-06-24 08:19:09,610 - INFO - [Eval] step: 12999, fid: 29.608720
2023-06-24 08:20:07,094 - INFO - [Train] step: 13099, loss_D: 0.000539, lr_D: 0.000800
2023-06-24 08:20:07,177 - INFO - [Train] step: 13099, loss_G: 1.770076, lr_G: 0.000800
2023-06-24 08:21:04,787 - INFO - [Train] step: 13199, loss_D: 0.002220, lr_D: 0.000800
2023-06-24 08:21:04,870 - INFO - [Train] step: 13199, loss_G: 2.129249, lr_G: 0.000800
2023-06-24 08:22:02,476 - INFO - [Train] step: 13299, loss_D: 0.008825, lr_D: 0.000800
2023-06-24 08:22:02,559 - INFO - [Train] step: 13299, loss_G: 2.028738, lr_G: 0.000800
2023-06-24 08:23:00,212 - INFO - [Train] step: 13399, loss_D: 0.000680, lr_D: 0.000800
2023-06-24 08:23:00,295 - INFO - [Train] step: 13399, loss_G: 1.941353, lr_G: 0.000800
2023-06-24 08:23:57,707 - INFO - [Train] step: 13499, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 08:23:57,790 - INFO - [Train] step: 13499, loss_G: 3.092343, lr_G: 0.000800
2023-06-24 08:24:55,506 - INFO - [Train] step: 13599, loss_D: 0.024727, lr_D: 0.000800
2023-06-24 08:24:55,589 - INFO - [Train] step: 13599, loss_G: 1.428987, lr_G: 0.000800
2023-06-24 08:25:53,257 - INFO - [Train] step: 13699, loss_D: 0.028166, lr_D: 0.000800
2023-06-24 08:25:53,340 - INFO - [Train] step: 13699, loss_G: 1.903133, lr_G: 0.000800
2023-06-24 08:26:51,007 - INFO - [Train] step: 13799, loss_D: 0.032669, lr_D: 0.000800
2023-06-24 08:26:51,089 - INFO - [Train] step: 13799, loss_G: 1.944023, lr_G: 0.000800
2023-06-24 08:27:48,615 - INFO - [Train] step: 13899, loss_D: 0.008512, lr_D: 0.000800
2023-06-24 08:27:48,698 - INFO - [Train] step: 13899, loss_G: 2.415901, lr_G: 0.000800
2023-06-24 08:28:49,586 - INFO - [Train] step: 13999, loss_D: 0.008177, lr_D: 0.000800
2023-06-24 08:28:49,669 - INFO - [Train] step: 13999, loss_G: 1.941721, lr_G: 0.000800
2023-06-24 08:29:54,357 - INFO - [Eval] step: 13999, fid: 29.089243
2023-06-24 08:30:51,790 - INFO - [Train] step: 14099, loss_D: 0.003099, lr_D: 0.000800
2023-06-24 08:30:51,873 - INFO - [Train] step: 14099, loss_G: 1.815368, lr_G: 0.000800
2023-06-24 08:31:49,444 - INFO - [Train] step: 14199, loss_D: 0.002758, lr_D: 0.000800
2023-06-24 08:31:49,527 - INFO - [Train] step: 14199, loss_G: 1.861000, lr_G: 0.000800
2023-06-24 08:32:47,320 - INFO - [Train] step: 14299, loss_D: 0.000515, lr_D: 0.000800
2023-06-24 08:32:47,402 - INFO - [Train] step: 14299, loss_G: 2.351765, lr_G: 0.000800
2023-06-24 08:33:45,003 - INFO - [Train] step: 14399, loss_D: 0.011560, lr_D: 0.000800
2023-06-24 08:33:45,085 - INFO - [Train] step: 14399, loss_G: 1.721178, lr_G: 0.000800
2023-06-24 08:34:42,696 - INFO - [Train] step: 14499, loss_D: 0.003565, lr_D: 0.000800
2023-06-24 08:34:42,779 - INFO - [Train] step: 14499, loss_G: 2.006239, lr_G: 0.000800
2023-06-24 08:35:47,137 - INFO - [Train] step: 14599, loss_D: 0.005746, lr_D: 0.000800
2023-06-24 08:35:47,227 - INFO - [Train] step: 14599, loss_G: 1.924257, lr_G: 0.000800
2023-06-24 08:36:44,801 - INFO - [Train] step: 14699, loss_D: 0.002951, lr_D: 0.000800
2023-06-24 08:36:44,884 - INFO - [Train] step: 14699, loss_G: 1.970745, lr_G: 0.000800
2023-06-24 08:37:55,294 - INFO - [Train] step: 14799, loss_D: 0.032895, lr_D: 0.000800
2023-06-24 08:37:55,377 - INFO - [Train] step: 14799, loss_G: 2.193574, lr_G: 0.000800
2023-06-24 08:39:04,227 - INFO - [Train] step: 14899, loss_D: 0.015084, lr_D: 0.000800
2023-06-24 08:39:04,310 - INFO - [Train] step: 14899, loss_G: 1.639083, lr_G: 0.000800
2023-06-24 08:40:04,655 - INFO - [Train] step: 14999, loss_D: 0.002974, lr_D: 0.000800
2023-06-24 08:40:04,738 - INFO - [Train] step: 14999, loss_G: 2.079194, lr_G: 0.000800
2023-06-24 08:41:09,371 - INFO - [Eval] step: 14999, fid: 30.129603
2023-06-24 08:42:06,716 - INFO - [Train] step: 15099, loss_D: 0.004970, lr_D: 0.000800
2023-06-24 08:42:06,799 - INFO - [Train] step: 15099, loss_G: 2.020197, lr_G: 0.000800
2023-06-24 08:43:04,421 - INFO - [Train] step: 15199, loss_D: 0.000542, lr_D: 0.000800
2023-06-24 08:43:04,504 - INFO - [Train] step: 15199, loss_G: 2.075629, lr_G: 0.000800
2023-06-24 08:44:07,460 - INFO - [Train] step: 15299, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 08:44:07,551 - INFO - [Train] step: 15299, loss_G: 3.688422, lr_G: 0.000800
2023-06-24 08:45:14,879 - INFO - [Train] step: 15399, loss_D: 0.031257, lr_D: 0.000800
2023-06-24 08:45:15,054 - INFO - [Train] step: 15399, loss_G: 1.888204, lr_G: 0.000800
2023-06-24 08:46:31,807 - INFO - [Train] step: 15499, loss_D: 0.004457, lr_D: 0.000800
2023-06-24 08:46:31,890 - INFO - [Train] step: 15499, loss_G: 1.946784, lr_G: 0.000800
2023-06-24 08:47:29,555 - INFO - [Train] step: 15599, loss_D: 0.007229, lr_D: 0.000800
2023-06-24 08:47:29,638 - INFO - [Train] step: 15599, loss_G: 2.277501, lr_G: 0.000800
2023-06-24 08:48:38,461 - INFO - [Train] step: 15699, loss_D: 0.026225, lr_D: 0.000800
2023-06-24 08:48:38,634 - INFO - [Train] step: 15699, loss_G: 2.520179, lr_G: 0.000800
2023-06-24 08:49:47,966 - INFO - [Train] step: 15799, loss_D: 0.004876, lr_D: 0.000800
2023-06-24 08:49:48,048 - INFO - [Train] step: 15799, loss_G: 1.800578, lr_G: 0.000800
2023-06-24 08:50:52,436 - INFO - [Train] step: 15899, loss_D: 0.016300, lr_D: 0.000800
2023-06-24 08:50:52,610 - INFO - [Train] step: 15899, loss_G: 1.847960, lr_G: 0.000800
2023-06-24 08:52:08,366 - INFO - [Train] step: 15999, loss_D: 0.002955, lr_D: 0.000800
2023-06-24 08:52:08,449 - INFO - [Train] step: 15999, loss_G: 1.990665, lr_G: 0.000800
2023-06-24 08:53:20,643 - INFO - [Eval] step: 15999, fid: 28.657544
2023-06-24 08:54:18,257 - INFO - [Train] step: 16099, loss_D: 0.052443, lr_D: 0.000800
2023-06-24 08:54:18,340 - INFO - [Train] step: 16099, loss_G: 1.827744, lr_G: 0.000800
2023-06-24 08:55:16,083 - INFO - [Train] step: 16199, loss_D: 0.004628, lr_D: 0.000800
2023-06-24 08:55:16,166 - INFO - [Train] step: 16199, loss_G: 1.892340, lr_G: 0.000800
2023-06-24 08:56:17,124 - INFO - [Train] step: 16299, loss_D: 0.000265, lr_D: 0.000800
2023-06-24 08:56:17,207 - INFO - [Train] step: 16299, loss_G: 1.965976, lr_G: 0.000800
2023-06-24 08:57:24,719 - INFO - [Train] step: 16399, loss_D: 0.000323, lr_D: 0.000800
2023-06-24 08:57:24,802 - INFO - [Train] step: 16399, loss_G: 2.907073, lr_G: 0.000800
2023-06-24 08:58:35,030 - INFO - [Train] step: 16499, loss_D: 0.001231, lr_D: 0.000800
2023-06-24 08:58:35,113 - INFO - [Train] step: 16499, loss_G: 2.047818, lr_G: 0.000800
2023-06-24 08:59:42,821 - INFO - [Train] step: 16599, loss_D: 0.011455, lr_D: 0.000800
2023-06-24 08:59:42,920 - INFO - [Train] step: 16599, loss_G: 1.865376, lr_G: 0.000800
2023-06-24 09:00:56,512 - INFO - [Train] step: 16699, loss_D: 0.008789, lr_D: 0.000800
2023-06-24 09:00:56,595 - INFO - [Train] step: 16699, loss_G: 2.123920, lr_G: 0.000800
2023-06-24 09:02:03,892 - INFO - [Train] step: 16799, loss_D: 0.001377, lr_D: 0.000800
2023-06-24 09:02:03,974 - INFO - [Train] step: 16799, loss_G: 1.811577, lr_G: 0.000800
2023-06-24 09:03:04,915 - INFO - [Train] step: 16899, loss_D: 0.002461, lr_D: 0.000800
2023-06-24 09:03:04,998 - INFO - [Train] step: 16899, loss_G: 1.913319, lr_G: 0.000800
2023-06-24 09:04:02,566 - INFO - [Train] step: 16999, loss_D: 0.000406, lr_D: 0.000800
2023-06-24 09:04:02,648 - INFO - [Train] step: 16999, loss_G: 2.206974, lr_G: 0.000800
2023-06-24 09:05:07,190 - INFO - [Eval] step: 16999, fid: 29.254861
2023-06-24 09:06:04,533 - INFO - [Train] step: 17099, loss_D: 0.003440, lr_D: 0.000800
2023-06-24 09:06:04,616 - INFO - [Train] step: 17099, loss_G: 2.092119, lr_G: 0.000800
2023-06-24 09:07:02,015 - INFO - [Train] step: 17199, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 09:07:02,098 - INFO - [Train] step: 17199, loss_G: 3.009755, lr_G: 0.000800
2023-06-24 09:07:59,593 - INFO - [Train] step: 17299, loss_D: 0.066875, lr_D: 0.000800
2023-06-24 09:07:59,676 - INFO - [Train] step: 17299, loss_G: 2.457030, lr_G: 0.000800
2023-06-24 09:08:57,226 - INFO - [Train] step: 17399, loss_D: 0.000726, lr_D: 0.000800
2023-06-24 09:08:57,309 - INFO - [Train] step: 17399, loss_G: 1.844542, lr_G: 0.000800
2023-06-24 09:09:59,128 - INFO - [Train] step: 17499, loss_D: 0.004878, lr_D: 0.000800
2023-06-24 09:09:59,210 - INFO - [Train] step: 17499, loss_G: 1.721754, lr_G: 0.000800
2023-06-24 09:11:08,879 - INFO - [Train] step: 17599, loss_D: 0.000041, lr_D: 0.000800
2023-06-24 09:11:08,970 - INFO - [Train] step: 17599, loss_G: 2.237502, lr_G: 0.000800
2023-06-24 09:12:06,474 - INFO - [Train] step: 17699, loss_D: 0.143837, lr_D: 0.000800
2023-06-24 09:12:06,557 - INFO - [Train] step: 17699, loss_G: 2.008480, lr_G: 0.000800
2023-06-24 09:13:04,147 - INFO - [Train] step: 17799, loss_D: 0.004315, lr_D: 0.000800
2023-06-24 09:13:04,229 - INFO - [Train] step: 17799, loss_G: 2.038524, lr_G: 0.000800
2023-06-24 09:14:01,789 - INFO - [Train] step: 17899, loss_D: 0.001881, lr_D: 0.000800
2023-06-24 09:14:01,872 - INFO - [Train] step: 17899, loss_G: 1.982356, lr_G: 0.000800
2023-06-24 09:14:59,393 - INFO - [Train] step: 17999, loss_D: 0.010101, lr_D: 0.000800
2023-06-24 09:14:59,476 - INFO - [Train] step: 17999, loss_G: 1.846895, lr_G: 0.000800
2023-06-24 09:16:03,835 - INFO - [Eval] step: 17999, fid: 28.840112
2023-06-24 09:17:01,236 - INFO - [Train] step: 18099, loss_D: 0.006433, lr_D: 0.000800
2023-06-24 09:17:01,318 - INFO - [Train] step: 18099, loss_G: 1.755970, lr_G: 0.000800
2023-06-24 09:17:59,021 - INFO - [Train] step: 18199, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 09:17:59,104 - INFO - [Train] step: 18199, loss_G: 2.447080, lr_G: 0.000800
2023-06-24 09:18:56,552 - INFO - [Train] step: 18299, loss_D: 0.179504, lr_D: 0.000800
2023-06-24 09:18:56,635 - INFO - [Train] step: 18299, loss_G: 2.290441, lr_G: 0.000800
2023-06-24 09:19:54,235 - INFO - [Train] step: 18399, loss_D: 0.004229, lr_D: 0.000800
2023-06-24 09:19:54,318 - INFO - [Train] step: 18399, loss_G: 2.054770, lr_G: 0.000800
2023-06-24 09:20:51,911 - INFO - [Train] step: 18499, loss_D: 0.003240, lr_D: 0.000800
2023-06-24 09:20:51,994 - INFO - [Train] step: 18499, loss_G: 2.118809, lr_G: 0.000800
2023-06-24 09:21:49,528 - INFO - [Train] step: 18599, loss_D: 0.001048, lr_D: 0.000800
2023-06-24 09:21:49,610 - INFO - [Train] step: 18599, loss_G: 2.044218, lr_G: 0.000800
2023-06-24 09:22:47,151 - INFO - [Train] step: 18699, loss_D: 0.097057, lr_D: 0.000800
2023-06-24 09:22:47,233 - INFO - [Train] step: 18699, loss_G: 2.777635, lr_G: 0.000800
2023-06-24 09:23:44,980 - INFO - [Train] step: 18799, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 09:23:45,063 - INFO - [Train] step: 18799, loss_G: 2.369704, lr_G: 0.000800
2023-06-24 09:24:42,648 - INFO - [Train] step: 18899, loss_D: 0.003971, lr_D: 0.000800
2023-06-24 09:24:42,730 - INFO - [Train] step: 18899, loss_G: 1.989398, lr_G: 0.000800
2023-06-24 09:25:40,269 - INFO - [Train] step: 18999, loss_D: 0.000247, lr_D: 0.000800
2023-06-24 09:25:40,352 - INFO - [Train] step: 18999, loss_G: 2.159938, lr_G: 0.000800
2023-06-24 09:26:44,540 - INFO - [Eval] step: 18999, fid: 30.849840
2023-06-24 09:27:41,951 - INFO - [Train] step: 19099, loss_D: 0.003133, lr_D: 0.000800
2023-06-24 09:27:42,034 - INFO - [Train] step: 19099, loss_G: 1.948247, lr_G: 0.000800
2023-06-24 09:28:39,592 - INFO - [Train] step: 19199, loss_D: 0.001542, lr_D: 0.000800
2023-06-24 09:28:39,676 - INFO - [Train] step: 19199, loss_G: 1.847425, lr_G: 0.000800
2023-06-24 09:29:37,199 - INFO - [Train] step: 19299, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 09:29:37,281 - INFO - [Train] step: 19299, loss_G: 2.027490, lr_G: 0.000800
2023-06-24 09:30:34,851 - INFO - [Train] step: 19399, loss_D: 0.004066, lr_D: 0.000800
2023-06-24 09:30:34,934 - INFO - [Train] step: 19399, loss_G: 1.978914, lr_G: 0.000800
2023-06-24 09:31:32,658 - INFO - [Train] step: 19499, loss_D: 0.002344, lr_D: 0.000800
2023-06-24 09:31:32,741 - INFO - [Train] step: 19499, loss_G: 2.099999, lr_G: 0.000800
2023-06-24 09:32:30,297 - INFO - [Train] step: 19599, loss_D: 0.000077, lr_D: 0.000800
2023-06-24 09:32:30,380 - INFO - [Train] step: 19599, loss_G: 2.083792, lr_G: 0.000800
2023-06-24 09:33:27,873 - INFO - [Train] step: 19699, loss_D: 0.002951, lr_D: 0.000800
2023-06-24 09:33:27,956 - INFO - [Train] step: 19699, loss_G: 2.121438, lr_G: 0.000800
2023-06-24 09:34:25,538 - INFO - [Train] step: 19799, loss_D: 0.015989, lr_D: 0.000800
2023-06-24 09:34:25,621 - INFO - [Train] step: 19799, loss_G: 3.025670, lr_G: 0.000800
2023-06-24 09:35:23,204 - INFO - [Train] step: 19899, loss_D: 0.010858, lr_D: 0.000800
2023-06-24 09:35:23,286 - INFO - [Train] step: 19899, loss_G: 1.874336, lr_G: 0.000800
2023-06-24 09:36:20,811 - INFO - [Train] step: 19999, loss_D: 0.084943, lr_D: 0.000800
2023-06-24 09:36:20,893 - INFO - [Train] step: 19999, loss_G: 1.991785, lr_G: 0.000800
2023-06-24 09:37:25,115 - INFO - [Eval] step: 19999, fid: 28.842768
2023-06-24 09:38:22,762 - INFO - [Train] step: 20099, loss_D: 0.000670, lr_D: 0.000800
2023-06-24 09:38:22,845 - INFO - [Train] step: 20099, loss_G: 1.650180, lr_G: 0.000800
2023-06-24 09:39:20,375 - INFO - [Train] step: 20199, loss_D: 0.002863, lr_D: 0.000800
2023-06-24 09:39:20,457 - INFO - [Train] step: 20199, loss_G: 2.149275, lr_G: 0.000800
2023-06-24 09:40:17,969 - INFO - [Train] step: 20299, loss_D: 0.000607, lr_D: 0.000800
2023-06-24 09:40:18,051 - INFO - [Train] step: 20299, loss_G: 1.985850, lr_G: 0.000800
2023-06-24 09:41:15,557 - INFO - [Train] step: 20399, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 09:41:15,639 - INFO - [Train] step: 20399, loss_G: 2.206936, lr_G: 0.000800
2023-06-24 09:42:13,217 - INFO - [Train] step: 20499, loss_D: 0.004406, lr_D: 0.000800
2023-06-24 09:42:13,300 - INFO - [Train] step: 20499, loss_G: 2.249248, lr_G: 0.000800
2023-06-24 09:43:10,852 - INFO - [Train] step: 20599, loss_D: 0.000709, lr_D: 0.000800
2023-06-24 09:43:10,934 - INFO - [Train] step: 20599, loss_G: 2.072781, lr_G: 0.000800
2023-06-24 09:44:08,622 - INFO - [Train] step: 20699, loss_D: 0.000077, lr_D: 0.000800
2023-06-24 09:44:08,705 - INFO - [Train] step: 20699, loss_G: 1.975131, lr_G: 0.000800
2023-06-24 09:45:06,247 - INFO - [Train] step: 20799, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 09:45:06,330 - INFO - [Train] step: 20799, loss_G: 2.299527, lr_G: 0.000800
2023-06-24 09:46:03,881 - INFO - [Train] step: 20899, loss_D: 0.002217, lr_D: 0.000800
2023-06-24 09:46:03,964 - INFO - [Train] step: 20899, loss_G: 1.960616, lr_G: 0.000800
2023-06-24 09:47:01,498 - INFO - [Train] step: 20999, loss_D: 0.000480, lr_D: 0.000800
2023-06-24 09:47:01,581 - INFO - [Train] step: 20999, loss_G: 2.055939, lr_G: 0.000800
2023-06-24 09:48:05,857 - INFO - [Eval] step: 20999, fid: 30.936698
2023-06-24 09:49:03,253 - INFO - [Train] step: 21099, loss_D: 0.003674, lr_D: 0.000800
2023-06-24 09:49:03,335 - INFO - [Train] step: 21099, loss_G: 2.010959, lr_G: 0.000800
2023-06-24 09:50:00,874 - INFO - [Train] step: 21199, loss_D: 0.004095, lr_D: 0.000800
2023-06-24 09:50:00,957 - INFO - [Train] step: 21199, loss_G: 2.256111, lr_G: 0.000800
2023-06-24 09:50:58,433 - INFO - [Train] step: 21299, loss_D: 0.003742, lr_D: 0.000800
2023-06-24 09:50:58,515 - INFO - [Train] step: 21299, loss_G: 2.509910, lr_G: 0.000800
2023-06-24 09:51:56,246 - INFO - [Train] step: 21399, loss_D: 0.000196, lr_D: 0.000800
2023-06-24 09:51:56,328 - INFO - [Train] step: 21399, loss_G: 2.213987, lr_G: 0.000800
2023-06-24 09:52:53,869 - INFO - [Train] step: 21499, loss_D: 0.001557, lr_D: 0.000800
2023-06-24 09:52:53,952 - INFO - [Train] step: 21499, loss_G: 2.237796, lr_G: 0.000800
2023-06-24 09:53:51,491 - INFO - [Train] step: 21599, loss_D: 0.001694, lr_D: 0.000800
2023-06-24 09:53:51,574 - INFO - [Train] step: 21599, loss_G: 2.207937, lr_G: 0.000800
2023-06-24 09:54:49,153 - INFO - [Train] step: 21699, loss_D: 0.003550, lr_D: 0.000800
2023-06-24 09:54:49,236 - INFO - [Train] step: 21699, loss_G: 1.939348, lr_G: 0.000800
2023-06-24 09:55:46,766 - INFO - [Train] step: 21799, loss_D: 0.003241, lr_D: 0.000800
2023-06-24 09:55:46,848 - INFO - [Train] step: 21799, loss_G: 2.103376, lr_G: 0.000800
2023-06-24 09:56:44,370 - INFO - [Train] step: 21899, loss_D: 0.006256, lr_D: 0.000800
2023-06-24 09:56:44,453 - INFO - [Train] step: 21899, loss_G: 1.915474, lr_G: 0.000800
2023-06-24 09:57:42,097 - INFO - [Train] step: 21999, loss_D: 1.105238, lr_D: 0.000800
2023-06-24 09:57:42,180 - INFO - [Train] step: 21999, loss_G: 2.050287, lr_G: 0.000800
2023-06-24 09:58:46,410 - INFO - [Eval] step: 21999, fid: 30.043119
2023-06-24 09:59:43,825 - INFO - [Train] step: 22099, loss_D: 0.013948, lr_D: 0.000800
2023-06-24 09:59:43,908 - INFO - [Train] step: 22099, loss_G: 2.004405, lr_G: 0.000800
2023-06-24 10:00:41,416 - INFO - [Train] step: 22199, loss_D: 0.003150, lr_D: 0.000800
2023-06-24 10:00:41,498 - INFO - [Train] step: 22199, loss_G: 1.937884, lr_G: 0.000800
2023-06-24 10:01:39,068 - INFO - [Train] step: 22299, loss_D: 0.008532, lr_D: 0.000800
2023-06-24 10:01:39,151 - INFO - [Train] step: 22299, loss_G: 1.958115, lr_G: 0.000800
2023-06-24 10:02:36,683 - INFO - [Train] step: 22399, loss_D: 0.000080, lr_D: 0.000800
2023-06-24 10:02:36,766 - INFO - [Train] step: 22399, loss_G: 2.072523, lr_G: 0.000800
2023-06-24 10:03:34,322 - INFO - [Train] step: 22499, loss_D: 0.022494, lr_D: 0.000800
2023-06-24 10:03:34,405 - INFO - [Train] step: 22499, loss_G: 2.564107, lr_G: 0.000800
2023-06-24 10:04:31,939 - INFO - [Train] step: 22599, loss_D: 0.002246, lr_D: 0.000800
2023-06-24 10:04:32,022 - INFO - [Train] step: 22599, loss_G: 1.998611, lr_G: 0.000800
2023-06-24 10:05:29,716 - INFO - [Train] step: 22699, loss_D: 0.002007, lr_D: 0.000800
2023-06-24 10:05:29,798 - INFO - [Train] step: 22699, loss_G: 1.959649, lr_G: 0.000800
2023-06-24 10:06:27,502 - INFO - [Train] step: 22799, loss_D: 0.001886, lr_D: 0.000800
2023-06-24 10:06:27,585 - INFO - [Train] step: 22799, loss_G: 1.876245, lr_G: 0.000800
2023-06-24 10:07:25,127 - INFO - [Train] step: 22899, loss_D: 0.000626, lr_D: 0.000800
2023-06-24 10:07:25,209 - INFO - [Train] step: 22899, loss_G: 2.174791, lr_G: 0.000800
2023-06-24 10:08:22,787 - INFO - [Train] step: 22999, loss_D: 0.003045, lr_D: 0.000800
2023-06-24 10:08:22,869 - INFO - [Train] step: 22999, loss_G: 2.127417, lr_G: 0.000800
2023-06-24 10:09:27,171 - INFO - [Eval] step: 22999, fid: 29.400759
2023-06-24 10:10:24,519 - INFO - [Train] step: 23099, loss_D: 0.002641, lr_D: 0.000800
2023-06-24 10:10:24,602 - INFO - [Train] step: 23099, loss_G: 2.669292, lr_G: 0.000800
2023-06-24 10:11:22,045 - INFO - [Train] step: 23199, loss_D: 0.003498, lr_D: 0.000800
2023-06-24 10:11:22,128 - INFO - [Train] step: 23199, loss_G: 1.887536, lr_G: 0.000800
2023-06-24 10:12:19,808 - INFO - [Train] step: 23299, loss_D: 0.007652, lr_D: 0.000800
2023-06-24 10:12:19,891 - INFO - [Train] step: 23299, loss_G: 2.382352, lr_G: 0.000800
2023-06-24 10:13:17,456 - INFO - [Train] step: 23399, loss_D: 0.002390, lr_D: 0.000800
2023-06-24 10:13:17,539 - INFO - [Train] step: 23399, loss_G: 1.931977, lr_G: 0.000800
2023-06-24 10:14:15,019 - INFO - [Train] step: 23499, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 10:14:15,101 - INFO - [Train] step: 23499, loss_G: 1.781476, lr_G: 0.000800
2023-06-24 10:15:12,656 - INFO - [Train] step: 23599, loss_D: 0.008210, lr_D: 0.000800
2023-06-24 10:15:12,739 - INFO - [Train] step: 23599, loss_G: 2.076257, lr_G: 0.000800
2023-06-24 10:16:10,238 - INFO - [Train] step: 23699, loss_D: 0.024216, lr_D: 0.000800
2023-06-24 10:16:10,321 - INFO - [Train] step: 23699, loss_G: 2.701771, lr_G: 0.000800
2023-06-24 10:17:07,802 - INFO - [Train] step: 23799, loss_D: 0.006594, lr_D: 0.000800
2023-06-24 10:17:07,885 - INFO - [Train] step: 23799, loss_G: 2.239897, lr_G: 0.000800
2023-06-24 10:18:05,419 - INFO - [Train] step: 23899, loss_D: 0.000315, lr_D: 0.000800
2023-06-24 10:18:05,502 - INFO - [Train] step: 23899, loss_G: 1.943853, lr_G: 0.000800
2023-06-24 10:19:03,175 - INFO - [Train] step: 23999, loss_D: 0.002781, lr_D: 0.000800
2023-06-24 10:19:03,257 - INFO - [Train] step: 23999, loss_G: 1.960666, lr_G: 0.000800
2023-06-24 10:20:07,477 - INFO - [Eval] step: 23999, fid: 33.312955
2023-06-24 10:21:04,725 - INFO - [Train] step: 24099, loss_D: 0.002459, lr_D: 0.000800
2023-06-24 10:21:04,807 - INFO - [Train] step: 24099, loss_G: 1.914969, lr_G: 0.000800
2023-06-24 10:22:02,302 - INFO - [Train] step: 24199, loss_D: 0.024366, lr_D: 0.000800
2023-06-24 10:22:02,384 - INFO - [Train] step: 24199, loss_G: 1.813182, lr_G: 0.000800
2023-06-24 10:22:59,923 - INFO - [Train] step: 24299, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 10:23:00,006 - INFO - [Train] step: 24299, loss_G: 2.249192, lr_G: 0.000800
2023-06-24 10:23:57,515 - INFO - [Train] step: 24399, loss_D: 0.000637, lr_D: 0.000800
2023-06-24 10:23:57,597 - INFO - [Train] step: 24399, loss_G: 2.192516, lr_G: 0.000800
2023-06-24 10:24:55,112 - INFO - [Train] step: 24499, loss_D: 0.008558, lr_D: 0.000800
2023-06-24 10:24:55,195 - INFO - [Train] step: 24499, loss_G: 2.016378, lr_G: 0.000800
2023-06-24 10:25:52,902 - INFO - [Train] step: 24599, loss_D: 0.001483, lr_D: 0.000800
2023-06-24 10:25:52,985 - INFO - [Train] step: 24599, loss_G: 2.296594, lr_G: 0.000800
2023-06-24 10:26:50,521 - INFO - [Train] step: 24699, loss_D: 0.000982, lr_D: 0.000800
2023-06-24 10:26:50,603 - INFO - [Train] step: 24699, loss_G: 2.078632, lr_G: 0.000800
2023-06-24 10:27:48,124 - INFO - [Train] step: 24799, loss_D: 0.001149, lr_D: 0.000800
2023-06-24 10:27:48,206 - INFO - [Train] step: 24799, loss_G: 2.187417, lr_G: 0.000800
2023-06-24 10:28:45,715 - INFO - [Train] step: 24899, loss_D: 0.001663, lr_D: 0.000800
2023-06-24 10:28:45,798 - INFO - [Train] step: 24899, loss_G: 1.847698, lr_G: 0.000800
2023-06-24 10:29:43,323 - INFO - [Train] step: 24999, loss_D: 0.021173, lr_D: 0.000800
2023-06-24 10:29:43,405 - INFO - [Train] step: 24999, loss_G: 2.107336, lr_G: 0.000800
2023-06-24 10:30:47,594 - INFO - [Eval] step: 24999, fid: 33.902752
2023-06-24 10:31:44,836 - INFO - [Train] step: 25099, loss_D: 0.003536, lr_D: 0.000800
2023-06-24 10:31:44,918 - INFO - [Train] step: 25099, loss_G: 2.157039, lr_G: 0.000800
2023-06-24 10:32:42,341 - INFO - [Train] step: 25199, loss_D: 0.009934, lr_D: 0.000800
2023-06-24 10:32:42,423 - INFO - [Train] step: 25199, loss_G: 2.219370, lr_G: 0.000800
2023-06-24 10:33:40,105 - INFO - [Train] step: 25299, loss_D: 0.001063, lr_D: 0.000800
2023-06-24 10:33:40,188 - INFO - [Train] step: 25299, loss_G: 2.402507, lr_G: 0.000800
2023-06-24 10:34:37,711 - INFO - [Train] step: 25399, loss_D: 0.038536, lr_D: 0.000800
2023-06-24 10:34:37,794 - INFO - [Train] step: 25399, loss_G: 2.075335, lr_G: 0.000800
2023-06-24 10:35:35,311 - INFO - [Train] step: 25499, loss_D: 0.013813, lr_D: 0.000800
2023-06-24 10:35:35,394 - INFO - [Train] step: 25499, loss_G: 2.639925, lr_G: 0.000800
2023-06-24 10:36:32,943 - INFO - [Train] step: 25599, loss_D: 0.006319, lr_D: 0.000800
2023-06-24 10:36:33,025 - INFO - [Train] step: 25599, loss_G: 2.032532, lr_G: 0.000800
2023-06-24 10:37:30,473 - INFO - [Train] step: 25699, loss_D: 0.000332, lr_D: 0.000800
2023-06-24 10:37:30,557 - INFO - [Train] step: 25699, loss_G: 2.397039, lr_G: 0.000800
2023-06-24 10:38:28,084 - INFO - [Train] step: 25799, loss_D: 0.001457, lr_D: 0.000800
2023-06-24 10:38:28,168 - INFO - [Train] step: 25799, loss_G: 2.063603, lr_G: 0.000800
2023-06-24 10:39:25,771 - INFO - [Train] step: 25899, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 10:39:25,854 - INFO - [Train] step: 25899, loss_G: 2.362744, lr_G: 0.000800
2023-06-24 10:40:23,400 - INFO - [Train] step: 25999, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 10:40:23,482 - INFO - [Train] step: 25999, loss_G: 2.065271, lr_G: 0.000800
2023-06-24 10:41:27,648 - INFO - [Eval] step: 25999, fid: 30.185325
2023-06-24 10:42:24,910 - INFO - [Train] step: 26099, loss_D: 0.000804, lr_D: 0.000800
2023-06-24 10:42:24,992 - INFO - [Train] step: 26099, loss_G: 2.050341, lr_G: 0.000800
2023-06-24 10:43:22,374 - INFO - [Train] step: 26199, loss_D: 0.013844, lr_D: 0.000800
2023-06-24 10:43:22,457 - INFO - [Train] step: 26199, loss_G: 1.910295, lr_G: 0.000800
2023-06-24 10:44:19,980 - INFO - [Train] step: 26299, loss_D: 0.006334, lr_D: 0.000800
2023-06-24 10:44:20,063 - INFO - [Train] step: 26299, loss_G: 2.439908, lr_G: 0.000800
2023-06-24 10:45:17,570 - INFO - [Train] step: 26399, loss_D: 0.001828, lr_D: 0.000800
2023-06-24 10:45:17,653 - INFO - [Train] step: 26399, loss_G: 2.060815, lr_G: 0.000800
2023-06-24 10:46:15,175 - INFO - [Train] step: 26499, loss_D: 0.000864, lr_D: 0.000800
2023-06-24 10:46:15,258 - INFO - [Train] step: 26499, loss_G: 2.323548, lr_G: 0.000800
2023-06-24 10:47:12,875 - INFO - [Train] step: 26599, loss_D: 0.092226, lr_D: 0.000800
2023-06-24 10:47:12,958 - INFO - [Train] step: 26599, loss_G: 1.815181, lr_G: 0.000800
2023-06-24 10:48:10,493 - INFO - [Train] step: 26699, loss_D: 0.001840, lr_D: 0.000800
2023-06-24 10:48:10,576 - INFO - [Train] step: 26699, loss_G: 2.081642, lr_G: 0.000800
2023-06-24 10:49:08,092 - INFO - [Train] step: 26799, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 10:49:08,174 - INFO - [Train] step: 26799, loss_G: 2.209961, lr_G: 0.000800
2023-06-24 10:50:05,618 - INFO - [Train] step: 26899, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 10:50:05,701 - INFO - [Train] step: 26899, loss_G: 2.509407, lr_G: 0.000800
2023-06-24 10:51:03,276 - INFO - [Train] step: 26999, loss_D: 0.000103, lr_D: 0.000800
2023-06-24 10:51:03,359 - INFO - [Train] step: 26999, loss_G: 2.196766, lr_G: 0.000800
2023-06-24 10:52:07,539 - INFO - [Eval] step: 26999, fid: 30.541323
2023-06-24 10:53:04,840 - INFO - [Train] step: 27099, loss_D: 0.001627, lr_D: 0.000800
2023-06-24 10:53:04,923 - INFO - [Train] step: 27099, loss_G: 2.271342, lr_G: 0.000800
2023-06-24 10:54:02,548 - INFO - [Train] step: 27199, loss_D: 0.001411, lr_D: 0.000800
2023-06-24 10:54:02,632 - INFO - [Train] step: 27199, loss_G: 1.977915, lr_G: 0.000800
2023-06-24 10:55:00,167 - INFO - [Train] step: 27299, loss_D: 0.019171, lr_D: 0.000800
2023-06-24 10:55:00,250 - INFO - [Train] step: 27299, loss_G: 2.095257, lr_G: 0.000800
2023-06-24 10:55:57,748 - INFO - [Train] step: 27399, loss_D: 0.002799, lr_D: 0.000800
2023-06-24 10:55:57,831 - INFO - [Train] step: 27399, loss_G: 2.490192, lr_G: 0.000800
2023-06-24 10:56:55,344 - INFO - [Train] step: 27499, loss_D: 0.005670, lr_D: 0.000800
2023-06-24 10:56:55,427 - INFO - [Train] step: 27499, loss_G: 2.288400, lr_G: 0.000800
2023-06-24 10:57:52,943 - INFO - [Train] step: 27599, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 10:57:53,026 - INFO - [Train] step: 27599, loss_G: 1.993601, lr_G: 0.000800
2023-06-24 10:58:50,485 - INFO - [Train] step: 27699, loss_D: 0.001580, lr_D: 0.000800
2023-06-24 10:58:50,569 - INFO - [Train] step: 27699, loss_G: 2.052696, lr_G: 0.000800
2023-06-24 10:59:48,120 - INFO - [Train] step: 27799, loss_D: 0.081174, lr_D: 0.000800
2023-06-24 10:59:48,202 - INFO - [Train] step: 27799, loss_G: 1.586314, lr_G: 0.000800
2023-06-24 11:00:45,893 - INFO - [Train] step: 27899, loss_D: 0.000740, lr_D: 0.000800
2023-06-24 11:00:45,976 - INFO - [Train] step: 27899, loss_G: 2.351087, lr_G: 0.000800
2023-06-24 11:01:43,463 - INFO - [Train] step: 27999, loss_D: 0.015762, lr_D: 0.000800
2023-06-24 11:01:43,546 - INFO - [Train] step: 27999, loss_G: 2.184485, lr_G: 0.000800
2023-06-24 11:02:47,764 - INFO - [Eval] step: 27999, fid: 34.748710
2023-06-24 11:03:45,030 - INFO - [Train] step: 28099, loss_D: 0.000874, lr_D: 0.000800
2023-06-24 11:03:45,113 - INFO - [Train] step: 28099, loss_G: 2.065600, lr_G: 0.000800
2023-06-24 11:04:42,648 - INFO - [Train] step: 28199, loss_D: 0.000561, lr_D: 0.000800
2023-06-24 11:04:42,730 - INFO - [Train] step: 28199, loss_G: 2.220154, lr_G: 0.000800
2023-06-24 11:05:40,218 - INFO - [Train] step: 28299, loss_D: 0.004155, lr_D: 0.000800
2023-06-24 11:05:40,301 - INFO - [Train] step: 28299, loss_G: 2.094390, lr_G: 0.000800
2023-06-24 11:06:37,780 - INFO - [Train] step: 28399, loss_D: 0.007095, lr_D: 0.000800
2023-06-24 11:06:37,863 - INFO - [Train] step: 28399, loss_G: 1.960853, lr_G: 0.000800
2023-06-24 11:07:35,506 - INFO - [Train] step: 28499, loss_D: 0.002951, lr_D: 0.000800
2023-06-24 11:07:35,589 - INFO - [Train] step: 28499, loss_G: 1.705821, lr_G: 0.000800
2023-06-24 11:08:33,048 - INFO - [Train] step: 28599, loss_D: 0.006198, lr_D: 0.000800
2023-06-24 11:08:33,131 - INFO - [Train] step: 28599, loss_G: 2.528619, lr_G: 0.000800
2023-06-24 11:09:30,697 - INFO - [Train] step: 28699, loss_D: 0.000684, lr_D: 0.000800
2023-06-24 11:09:30,780 - INFO - [Train] step: 28699, loss_G: 2.168381, lr_G: 0.000800
2023-06-24 11:10:28,245 - INFO - [Train] step: 28799, loss_D: 0.011288, lr_D: 0.000800
2023-06-24 11:10:28,328 - INFO - [Train] step: 28799, loss_G: 2.099341, lr_G: 0.000800
2023-06-24 11:11:25,843 - INFO - [Train] step: 28899, loss_D: 0.015581, lr_D: 0.000800
2023-06-24 11:11:25,926 - INFO - [Train] step: 28899, loss_G: 2.101902, lr_G: 0.000800
2023-06-24 11:12:23,390 - INFO - [Train] step: 28999, loss_D: 0.000474, lr_D: 0.000800
2023-06-24 11:12:23,473 - INFO - [Train] step: 28999, loss_G: 1.937342, lr_G: 0.000800
2023-06-24 11:13:27,709 - INFO - [Eval] step: 28999, fid: 34.010789
2023-06-24 11:14:25,015 - INFO - [Train] step: 29099, loss_D: 0.064161, lr_D: 0.000800
2023-06-24 11:14:25,098 - INFO - [Train] step: 29099, loss_G: 2.118757, lr_G: 0.000800
2023-06-24 11:15:22,517 - INFO - [Train] step: 29199, loss_D: 0.005770, lr_D: 0.000800
2023-06-24 11:15:22,600 - INFO - [Train] step: 29199, loss_G: 2.337688, lr_G: 0.000800
2023-06-24 11:16:19,870 - INFO - [Train] step: 29299, loss_D: 0.000136, lr_D: 0.000800
2023-06-24 11:16:19,952 - INFO - [Train] step: 29299, loss_G: 2.185154, lr_G: 0.000800
2023-06-24 11:17:17,212 - INFO - [Train] step: 29399, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 11:17:17,295 - INFO - [Train] step: 29399, loss_G: 2.223881, lr_G: 0.000800
2023-06-24 11:18:14,629 - INFO - [Train] step: 29499, loss_D: 0.004634, lr_D: 0.000800
2023-06-24 11:18:14,711 - INFO - [Train] step: 29499, loss_G: 2.087013, lr_G: 0.000800
2023-06-24 11:19:12,020 - INFO - [Train] step: 29599, loss_D: 0.001361, lr_D: 0.000800
2023-06-24 11:19:12,102 - INFO - [Train] step: 29599, loss_G: 2.330678, lr_G: 0.000800
2023-06-24 11:20:09,330 - INFO - [Train] step: 29699, loss_D: 0.051172, lr_D: 0.000800
2023-06-24 11:20:09,412 - INFO - [Train] step: 29699, loss_G: 2.551911, lr_G: 0.000800
2023-06-24 11:21:06,876 - INFO - [Train] step: 29799, loss_D: 0.001332, lr_D: 0.000800
2023-06-24 11:21:06,959 - INFO - [Train] step: 29799, loss_G: 2.231091, lr_G: 0.000800
2023-06-24 11:22:04,235 - INFO - [Train] step: 29899, loss_D: 0.000859, lr_D: 0.000800
2023-06-24 11:22:04,318 - INFO - [Train] step: 29899, loss_G: 2.165933, lr_G: 0.000800
2023-06-24 11:23:01,618 - INFO - [Train] step: 29999, loss_D: 0.006938, lr_D: 0.000800
2023-06-24 11:23:01,701 - INFO - [Train] step: 29999, loss_G: 2.302606, lr_G: 0.000800
2023-06-24 11:24:05,930 - INFO - [Eval] step: 29999, fid: 32.628818
2023-06-24 11:25:03,321 - INFO - [Train] step: 30099, loss_D: 0.019718, lr_D: 0.000800
2023-06-24 11:25:03,404 - INFO - [Train] step: 30099, loss_G: 2.016864, lr_G: 0.000800
2023-06-24 11:26:00,692 - INFO - [Train] step: 30199, loss_D: 0.001035, lr_D: 0.000800
2023-06-24 11:26:00,775 - INFO - [Train] step: 30199, loss_G: 2.276348, lr_G: 0.000800
2023-06-24 11:26:58,089 - INFO - [Train] step: 30299, loss_D: 0.010700, lr_D: 0.000800
2023-06-24 11:26:58,172 - INFO - [Train] step: 30299, loss_G: 1.918990, lr_G: 0.000800
2023-06-24 11:27:55,592 - INFO - [Train] step: 30399, loss_D: 0.001089, lr_D: 0.000800
2023-06-24 11:27:55,675 - INFO - [Train] step: 30399, loss_G: 2.162908, lr_G: 0.000800
2023-06-24 11:28:52,945 - INFO - [Train] step: 30499, loss_D: 0.028317, lr_D: 0.000800
2023-06-24 11:28:53,027 - INFO - [Train] step: 30499, loss_G: 3.335033, lr_G: 0.000800
2023-06-24 11:29:50,365 - INFO - [Train] step: 30599, loss_D: 0.001975, lr_D: 0.000800
2023-06-24 11:29:50,448 - INFO - [Train] step: 30599, loss_G: 2.477669, lr_G: 0.000800
2023-06-24 11:30:47,742 - INFO - [Train] step: 30699, loss_D: 0.001634, lr_D: 0.000800
2023-06-24 11:30:47,824 - INFO - [Train] step: 30699, loss_G: 2.143751, lr_G: 0.000800
2023-06-24 11:31:45,100 - INFO - [Train] step: 30799, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 11:31:45,182 - INFO - [Train] step: 30799, loss_G: 2.208802, lr_G: 0.000800
2023-06-24 11:32:42,494 - INFO - [Train] step: 30899, loss_D: 0.004879, lr_D: 0.000800
2023-06-24 11:32:42,577 - INFO - [Train] step: 30899, loss_G: 2.424818, lr_G: 0.000800
2023-06-24 11:33:39,873 - INFO - [Train] step: 30999, loss_D: 0.000457, lr_D: 0.000800
2023-06-24 11:33:39,955 - INFO - [Train] step: 30999, loss_G: 2.147835, lr_G: 0.000800
2023-06-24 11:34:44,131 - INFO - [Eval] step: 30999, fid: 29.325340
2023-06-24 11:35:41,484 - INFO - [Train] step: 31099, loss_D: 0.000011, lr_D: 0.000800
2023-06-24 11:35:41,566 - INFO - [Train] step: 31099, loss_G: 2.240451, lr_G: 0.000800
2023-06-24 11:36:38,786 - INFO - [Train] step: 31199, loss_D: 0.000497, lr_D: 0.000800
2023-06-24 11:36:38,869 - INFO - [Train] step: 31199, loss_G: 2.707833, lr_G: 0.000800
2023-06-24 11:37:36,180 - INFO - [Train] step: 31299, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 11:37:36,262 - INFO - [Train] step: 31299, loss_G: 2.415171, lr_G: 0.000800
2023-06-24 11:38:33,570 - INFO - [Train] step: 31399, loss_D: 0.002412, lr_D: 0.000800
2023-06-24 11:38:33,653 - INFO - [Train] step: 31399, loss_G: 2.328855, lr_G: 0.000800
2023-06-24 11:39:30,890 - INFO - [Train] step: 31499, loss_D: 0.000282, lr_D: 0.000800
2023-06-24 11:39:30,973 - INFO - [Train] step: 31499, loss_G: 2.316280, lr_G: 0.000800
2023-06-24 11:40:28,250 - INFO - [Train] step: 31599, loss_D: 0.001769, lr_D: 0.000800
2023-06-24 11:40:28,332 - INFO - [Train] step: 31599, loss_G: 2.461125, lr_G: 0.000800
2023-06-24 11:41:25,833 - INFO - [Train] step: 31699, loss_D: 0.006275, lr_D: 0.000800
2023-06-24 11:41:25,916 - INFO - [Train] step: 31699, loss_G: 1.972717, lr_G: 0.000800
2023-06-24 11:42:23,155 - INFO - [Train] step: 31799, loss_D: 0.000427, lr_D: 0.000800
2023-06-24 11:42:23,237 - INFO - [Train] step: 31799, loss_G: 2.444126, lr_G: 0.000800
2023-06-24 11:43:20,533 - INFO - [Train] step: 31899, loss_D: 0.009909, lr_D: 0.000800
2023-06-24 11:43:20,616 - INFO - [Train] step: 31899, loss_G: 2.585755, lr_G: 0.000800
2023-06-24 11:44:17,844 - INFO - [Train] step: 31999, loss_D: 0.000068, lr_D: 0.000800
2023-06-24 11:44:17,926 - INFO - [Train] step: 31999, loss_G: 1.908824, lr_G: 0.000800
2023-06-24 11:45:22,074 - INFO - [Eval] step: 31999, fid: 32.613787
2023-06-24 11:46:19,307 - INFO - [Train] step: 32099, loss_D: 0.128589, lr_D: 0.000800
2023-06-24 11:46:19,390 - INFO - [Train] step: 32099, loss_G: 2.456143, lr_G: 0.000800
2023-06-24 11:47:16,677 - INFO - [Train] step: 32199, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 11:47:16,759 - INFO - [Train] step: 32199, loss_G: 2.376409, lr_G: 0.000800
2023-06-24 11:48:14,051 - INFO - [Train] step: 32299, loss_D: 0.001111, lr_D: 0.000800
2023-06-24 11:48:14,133 - INFO - [Train] step: 32299, loss_G: 2.174556, lr_G: 0.000800
2023-06-24 11:49:11,568 - INFO - [Train] step: 32399, loss_D: 0.000267, lr_D: 0.000800
2023-06-24 11:49:11,650 - INFO - [Train] step: 32399, loss_G: 2.245586, lr_G: 0.000800
2023-06-24 11:50:08,953 - INFO - [Train] step: 32499, loss_D: 0.001059, lr_D: 0.000800
2023-06-24 11:50:09,035 - INFO - [Train] step: 32499, loss_G: 2.238291, lr_G: 0.000800
2023-06-24 11:51:06,338 - INFO - [Train] step: 32599, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 11:51:06,420 - INFO - [Train] step: 32599, loss_G: 1.882176, lr_G: 0.000800
2023-06-24 11:52:03,656 - INFO - [Train] step: 32699, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 11:52:03,738 - INFO - [Train] step: 32699, loss_G: 2.223881, lr_G: 0.000800
2023-06-24 11:53:01,031 - INFO - [Train] step: 32799, loss_D: 0.000588, lr_D: 0.000800
2023-06-24 11:53:01,114 - INFO - [Train] step: 32799, loss_G: 2.318131, lr_G: 0.000800
2023-06-24 11:53:58,384 - INFO - [Train] step: 32899, loss_D: 0.006438, lr_D: 0.000800
2023-06-24 11:53:58,474 - INFO - [Train] step: 32899, loss_G: 2.441819, lr_G: 0.000800
2023-06-24 11:54:55,918 - INFO - [Train] step: 32999, loss_D: 0.000367, lr_D: 0.000800
2023-06-24 11:54:56,001 - INFO - [Train] step: 32999, loss_G: 2.112702, lr_G: 0.000800
2023-06-24 11:56:00,132 - INFO - [Eval] step: 32999, fid: 30.122267
2023-06-24 11:56:57,357 - INFO - [Train] step: 33099, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 11:56:57,440 - INFO - [Train] step: 33099, loss_G: 2.385844, lr_G: 0.000800
2023-06-24 11:57:54,754 - INFO - [Train] step: 33199, loss_D: 0.000390, lr_D: 0.000800
2023-06-24 11:57:54,837 - INFO - [Train] step: 33199, loss_G: 2.143423, lr_G: 0.000800
2023-06-24 11:58:52,303 - INFO - [Train] step: 33299, loss_D: 0.000779, lr_D: 0.000800
2023-06-24 11:58:52,385 - INFO - [Train] step: 33299, loss_G: 1.906966, lr_G: 0.000800
2023-06-24 11:59:49,837 - INFO - [Train] step: 33399, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 11:59:49,920 - INFO - [Train] step: 33399, loss_G: 2.523175, lr_G: 0.000800
2023-06-24 12:00:47,351 - INFO - [Train] step: 33499, loss_D: 0.002218, lr_D: 0.000800
2023-06-24 12:00:47,434 - INFO - [Train] step: 33499, loss_G: 2.610176, lr_G: 0.000800
2023-06-24 12:01:44,994 - INFO - [Train] step: 33599, loss_D: 0.001383, lr_D: 0.000800
2023-06-24 12:01:45,076 - INFO - [Train] step: 33599, loss_G: 2.100917, lr_G: 0.000800
2023-06-24 12:02:42,719 - INFO - [Train] step: 33699, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 12:02:42,801 - INFO - [Train] step: 33699, loss_G: 2.111158, lr_G: 0.000800
2023-06-24 12:03:40,296 - INFO - [Train] step: 33799, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 12:03:40,379 - INFO - [Train] step: 33799, loss_G: 2.180873, lr_G: 0.000800
2023-06-24 12:04:37,811 - INFO - [Train] step: 33899, loss_D: 0.000966, lr_D: 0.000800
2023-06-24 12:04:37,894 - INFO - [Train] step: 33899, loss_G: 2.337418, lr_G: 0.000800
2023-06-24 12:05:35,195 - INFO - [Train] step: 33999, loss_D: 0.000286, lr_D: 0.000800
2023-06-24 12:05:35,277 - INFO - [Train] step: 33999, loss_G: 2.212513, lr_G: 0.000800
2023-06-24 12:06:39,495 - INFO - [Eval] step: 33999, fid: 31.679373
2023-06-24 12:07:36,730 - INFO - [Train] step: 34099, loss_D: 0.000974, lr_D: 0.000800
2023-06-24 12:07:36,813 - INFO - [Train] step: 34099, loss_G: 2.149127, lr_G: 0.000800
2023-06-24 12:08:34,109 - INFO - [Train] step: 34199, loss_D: 0.006572, lr_D: 0.000800
2023-06-24 12:08:34,192 - INFO - [Train] step: 34199, loss_G: 2.166146, lr_G: 0.000800
2023-06-24 12:09:31,786 - INFO - [Train] step: 34299, loss_D: 0.003095, lr_D: 0.000800
2023-06-24 12:09:31,868 - INFO - [Train] step: 34299, loss_G: 2.373413, lr_G: 0.000800
2023-06-24 12:10:29,328 - INFO - [Train] step: 34399, loss_D: 0.000082, lr_D: 0.000800
2023-06-24 12:10:29,410 - INFO - [Train] step: 34399, loss_G: 2.158774, lr_G: 0.000800
2023-06-24 12:11:26,907 - INFO - [Train] step: 34499, loss_D: 0.001837, lr_D: 0.000800
2023-06-24 12:11:26,990 - INFO - [Train] step: 34499, loss_G: 1.774444, lr_G: 0.000800
2023-06-24 12:12:24,522 - INFO - [Train] step: 34599, loss_D: 0.006414, lr_D: 0.000800
2023-06-24 12:12:24,605 - INFO - [Train] step: 34599, loss_G: 2.018260, lr_G: 0.000800
2023-06-24 12:13:21,923 - INFO - [Train] step: 34699, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 12:13:22,005 - INFO - [Train] step: 34699, loss_G: 2.378462, lr_G: 0.000800
2023-06-24 12:14:19,311 - INFO - [Train] step: 34799, loss_D: 0.001256, lr_D: 0.000800
2023-06-24 12:14:19,394 - INFO - [Train] step: 34799, loss_G: 2.124247, lr_G: 0.000800
2023-06-24 12:15:16,675 - INFO - [Train] step: 34899, loss_D: 0.021262, lr_D: 0.000800
2023-06-24 12:15:16,757 - INFO - [Train] step: 34899, loss_G: 2.308048, lr_G: 0.000800
2023-06-24 12:16:14,151 - INFO - [Train] step: 34999, loss_D: 0.002062, lr_D: 0.000800
2023-06-24 12:16:14,234 - INFO - [Train] step: 34999, loss_G: 2.323404, lr_G: 0.000800
2023-06-24 12:17:18,489 - INFO - [Eval] step: 34999, fid: 30.424349
2023-06-24 12:18:15,760 - INFO - [Train] step: 35099, loss_D: 0.002040, lr_D: 0.000800
2023-06-24 12:18:15,842 - INFO - [Train] step: 35099, loss_G: 2.308157, lr_G: 0.000800
2023-06-24 12:19:13,098 - INFO - [Train] step: 35199, loss_D: 0.000285, lr_D: 0.000800
2023-06-24 12:19:13,181 - INFO - [Train] step: 35199, loss_G: 2.240875, lr_G: 0.000800
2023-06-24 12:20:10,476 - INFO - [Train] step: 35299, loss_D: 0.000163, lr_D: 0.000800
2023-06-24 12:20:10,558 - INFO - [Train] step: 35299, loss_G: 2.147072, lr_G: 0.000800
2023-06-24 12:21:07,848 - INFO - [Train] step: 35399, loss_D: 0.000237, lr_D: 0.000800
2023-06-24 12:21:07,930 - INFO - [Train] step: 35399, loss_G: 1.942272, lr_G: 0.000800
2023-06-24 12:22:05,184 - INFO - [Train] step: 35499, loss_D: 0.000282, lr_D: 0.000800
2023-06-24 12:22:05,267 - INFO - [Train] step: 35499, loss_G: 2.126554, lr_G: 0.000800
2023-06-24 12:23:02,739 - INFO - [Train] step: 35599, loss_D: 0.000335, lr_D: 0.000800
2023-06-24 12:23:02,821 - INFO - [Train] step: 35599, loss_G: 2.170200, lr_G: 0.000800
2023-06-24 12:23:59,998 - INFO - [Train] step: 35699, loss_D: 0.000838, lr_D: 0.000800
2023-06-24 12:24:00,080 - INFO - [Train] step: 35699, loss_G: 2.444325, lr_G: 0.000800
2023-06-24 12:24:57,361 - INFO - [Train] step: 35799, loss_D: 0.005745, lr_D: 0.000800
2023-06-24 12:24:57,444 - INFO - [Train] step: 35799, loss_G: 2.083773, lr_G: 0.000800
2023-06-24 12:25:54,717 - INFO - [Train] step: 35899, loss_D: 0.000124, lr_D: 0.000800
2023-06-24 12:25:54,800 - INFO - [Train] step: 35899, loss_G: 2.398322, lr_G: 0.000800
2023-06-24 12:26:52,109 - INFO - [Train] step: 35999, loss_D: 0.000555, lr_D: 0.000800
2023-06-24 12:26:52,192 - INFO - [Train] step: 35999, loss_G: 2.151057, lr_G: 0.000800
2023-06-24 12:27:56,306 - INFO - [Eval] step: 35999, fid: 30.894534
2023-06-24 12:28:53,498 - INFO - [Train] step: 36099, loss_D: 0.000485, lr_D: 0.000800
2023-06-24 12:28:53,581 - INFO - [Train] step: 36099, loss_G: 2.092917, lr_G: 0.000800
2023-06-24 12:29:50,838 - INFO - [Train] step: 36199, loss_D: 0.001604, lr_D: 0.000800
2023-06-24 12:29:50,920 - INFO - [Train] step: 36199, loss_G: 2.403778, lr_G: 0.000800
2023-06-24 12:30:48,374 - INFO - [Train] step: 36299, loss_D: 0.006791, lr_D: 0.000800
2023-06-24 12:30:48,456 - INFO - [Train] step: 36299, loss_G: 2.248544, lr_G: 0.000800
2023-06-24 12:31:45,675 - INFO - [Train] step: 36399, loss_D: 0.000876, lr_D: 0.000800
2023-06-24 12:31:45,757 - INFO - [Train] step: 36399, loss_G: 2.223669, lr_G: 0.000800
2023-06-24 12:32:43,032 - INFO - [Train] step: 36499, loss_D: 0.001137, lr_D: 0.000800
2023-06-24 12:32:43,114 - INFO - [Train] step: 36499, loss_G: 1.920998, lr_G: 0.000800
2023-06-24 12:33:40,398 - INFO - [Train] step: 36599, loss_D: 0.000501, lr_D: 0.000800
2023-06-24 12:33:40,480 - INFO - [Train] step: 36599, loss_G: 2.273502, lr_G: 0.000800
2023-06-24 12:34:37,736 - INFO - [Train] step: 36699, loss_D: 0.000519, lr_D: 0.000800
2023-06-24 12:34:37,818 - INFO - [Train] step: 36699, loss_G: 2.388361, lr_G: 0.000800
2023-06-24 12:35:35,115 - INFO - [Train] step: 36799, loss_D: 0.005972, lr_D: 0.000800
2023-06-24 12:35:35,198 - INFO - [Train] step: 36799, loss_G: 3.191918, lr_G: 0.000800
2023-06-24 12:36:32,645 - INFO - [Train] step: 36899, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 12:36:32,728 - INFO - [Train] step: 36899, loss_G: 2.304679, lr_G: 0.000800
2023-06-24 12:37:29,955 - INFO - [Train] step: 36999, loss_D: 0.001113, lr_D: 0.000800
2023-06-24 12:37:30,037 - INFO - [Train] step: 36999, loss_G: 2.378490, lr_G: 0.000800
2023-06-24 12:38:34,266 - INFO - [Eval] step: 36999, fid: 33.652028
2023-06-24 12:39:31,549 - INFO - [Train] step: 37099, loss_D: 0.000209, lr_D: 0.000800
2023-06-24 12:39:31,632 - INFO - [Train] step: 37099, loss_G: 2.093653, lr_G: 0.000800
2023-06-24 12:40:28,889 - INFO - [Train] step: 37199, loss_D: 0.002477, lr_D: 0.000800
2023-06-24 12:40:28,972 - INFO - [Train] step: 37199, loss_G: 2.426035, lr_G: 0.000800
2023-06-24 12:41:26,253 - INFO - [Train] step: 37299, loss_D: 0.000762, lr_D: 0.000800
2023-06-24 12:41:26,336 - INFO - [Train] step: 37299, loss_G: 2.470501, lr_G: 0.000800
2023-06-24 12:42:23,668 - INFO - [Train] step: 37399, loss_D: 0.001697, lr_D: 0.000800
2023-06-24 12:42:23,751 - INFO - [Train] step: 37399, loss_G: 2.493346, lr_G: 0.000800
2023-06-24 12:43:20,981 - INFO - [Train] step: 37499, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 12:43:21,062 - INFO - [Train] step: 37499, loss_G: 2.280356, lr_G: 0.000800
2023-06-24 12:44:18,502 - INFO - [Train] step: 37599, loss_D: 0.001269, lr_D: 0.000800
2023-06-24 12:44:18,585 - INFO - [Train] step: 37599, loss_G: 2.217932, lr_G: 0.000800
2023-06-24 12:45:15,798 - INFO - [Train] step: 37699, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 12:45:15,880 - INFO - [Train] step: 37699, loss_G: 2.416156, lr_G: 0.000800
2023-06-24 12:46:13,157 - INFO - [Train] step: 37799, loss_D: 0.001168, lr_D: 0.000800
2023-06-24 12:46:13,240 - INFO - [Train] step: 37799, loss_G: 2.249489, lr_G: 0.000800
2023-06-24 12:47:10,488 - INFO - [Train] step: 37899, loss_D: 0.002675, lr_D: 0.000800
2023-06-24 12:47:10,570 - INFO - [Train] step: 37899, loss_G: 2.155948, lr_G: 0.000800
2023-06-24 12:48:07,844 - INFO - [Train] step: 37999, loss_D: 0.000571, lr_D: 0.000800
2023-06-24 12:48:07,927 - INFO - [Train] step: 37999, loss_G: 2.240991, lr_G: 0.000800
2023-06-24 12:49:12,149 - INFO - [Eval] step: 37999, fid: 29.063193
2023-06-24 12:50:09,338 - INFO - [Train] step: 38099, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 12:50:09,420 - INFO - [Train] step: 38099, loss_G: 2.276155, lr_G: 0.000800
2023-06-24 12:51:06,849 - INFO - [Train] step: 38199, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 12:51:06,931 - INFO - [Train] step: 38199, loss_G: 2.377781, lr_G: 0.000800
2023-06-24 12:52:04,155 - INFO - [Train] step: 38299, loss_D: 0.008478, lr_D: 0.000800
2023-06-24 12:52:04,237 - INFO - [Train] step: 38299, loss_G: 2.177687, lr_G: 0.000800
2023-06-24 12:53:01,526 - INFO - [Train] step: 38399, loss_D: 0.010802, lr_D: 0.000800
2023-06-24 12:53:01,609 - INFO - [Train] step: 38399, loss_G: 2.033100, lr_G: 0.000800
2023-06-24 12:53:58,873 - INFO - [Train] step: 38499, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 12:53:58,955 - INFO - [Train] step: 38499, loss_G: 1.901830, lr_G: 0.000800
2023-06-24 12:54:56,186 - INFO - [Train] step: 38599, loss_D: 0.001097, lr_D: 0.000800
2023-06-24 12:54:56,268 - INFO - [Train] step: 38599, loss_G: 2.622901, lr_G: 0.000800
2023-06-24 12:55:53,554 - INFO - [Train] step: 38699, loss_D: 0.003515, lr_D: 0.000800
2023-06-24 12:55:53,637 - INFO - [Train] step: 38699, loss_G: 2.151928, lr_G: 0.000800
2023-06-24 12:56:50,906 - INFO - [Train] step: 38799, loss_D: 0.000548, lr_D: 0.000800
2023-06-24 12:56:50,987 - INFO - [Train] step: 38799, loss_G: 2.323481, lr_G: 0.000800
2023-06-24 12:57:48,358 - INFO - [Train] step: 38899, loss_D: 0.000232, lr_D: 0.000800
2023-06-24 12:57:48,441 - INFO - [Train] step: 38899, loss_G: 2.472898, lr_G: 0.000800
2023-06-24 12:58:45,669 - INFO - [Train] step: 38999, loss_D: 0.021749, lr_D: 0.000800
2023-06-24 12:58:45,752 - INFO - [Train] step: 38999, loss_G: 2.574162, lr_G: 0.000800
2023-06-24 12:59:49,866 - INFO - [Eval] step: 38999, fid: 32.682053
2023-06-24 13:00:47,118 - INFO - [Train] step: 39099, loss_D: 0.024872, lr_D: 0.000800
2023-06-24 13:00:47,200 - INFO - [Train] step: 39099, loss_G: 1.967647, lr_G: 0.000800
2023-06-24 13:01:44,432 - INFO - [Train] step: 39199, loss_D: 0.000082, lr_D: 0.000800
2023-06-24 13:01:44,514 - INFO - [Train] step: 39199, loss_G: 2.308794, lr_G: 0.000800
2023-06-24 13:02:41,764 - INFO - [Train] step: 39299, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 13:02:41,846 - INFO - [Train] step: 39299, loss_G: 2.326561, lr_G: 0.000800
2023-06-24 13:03:39,115 - INFO - [Train] step: 39399, loss_D: 0.000517, lr_D: 0.000800
2023-06-24 13:03:39,197 - INFO - [Train] step: 39399, loss_G: 2.417474, lr_G: 0.000800
2023-06-24 13:04:36,664 - INFO - [Train] step: 39499, loss_D: 0.158072, lr_D: 0.000800
2023-06-24 13:04:36,747 - INFO - [Train] step: 39499, loss_G: 3.514744, lr_G: 0.000800
2023-06-24 13:05:33,984 - INFO - [Train] step: 39599, loss_D: 0.002005, lr_D: 0.000800
2023-06-24 13:05:34,066 - INFO - [Train] step: 39599, loss_G: 1.972220, lr_G: 0.000800
2023-06-24 13:06:31,256 - INFO - [Train] step: 39699, loss_D: 0.000051, lr_D: 0.000800
2023-06-24 13:06:31,339 - INFO - [Train] step: 39699, loss_G: 2.168754, lr_G: 0.000800
2023-06-24 13:07:28,622 - INFO - [Train] step: 39799, loss_D: 0.000554, lr_D: 0.000800
2023-06-24 13:07:28,705 - INFO - [Train] step: 39799, loss_G: 2.466946, lr_G: 0.000800
2023-06-24 13:08:25,947 - INFO - [Train] step: 39899, loss_D: 0.000000, lr_D: 0.000800
2023-06-24 13:08:26,029 - INFO - [Train] step: 39899, loss_G: 2.265419, lr_G: 0.000800
2023-06-24 13:09:23,260 - INFO - [Train] step: 39999, loss_D: 0.045577, lr_D: 0.000800
2023-06-24 13:09:23,342 - INFO - [Train] step: 39999, loss_G: 1.405491, lr_G: 0.000800
2023-06-24 13:10:27,522 - INFO - [Eval] step: 39999, fid: 31.766321
2023-06-24 13:10:27,702 - INFO - Best FID score: 28.65754368089648
2023-06-24 13:10:27,702 - INFO - End of training