2023-06-20 16:41:39,353 - INFO - Experiment directory: ./runs/sngan_cifar10/
2023-06-20 16:41:39,354 - INFO - Number of processes: 1
2023-06-20 16:41:39,354 - INFO - Distributed type: DistributedType.NO
2023-06-20 16:41:39,354 - INFO - Mixed precision: no
2023-06-20 16:41:39,354 - INFO - ==============================
2023-06-20 16:41:39,794 - INFO - Size of training set: 50000
2023-06-20 16:41:39,794 - INFO - Batch size per process: 512
2023-06-20 16:41:39,794 - INFO - Total batch size: 512
2023-06-20 16:41:39,795 - INFO - ==============================
2023-06-20 16:41:40,767 - INFO - Start training...
2023-06-20 16:42:39,787 - INFO - [Train] step: 99, loss_D: 0.003373, lr_D: 0.000800
2023-06-20 16:42:39,870 - INFO - [Train] step: 99, loss_G: 6.961355, lr_G: 0.000800
2023-06-20 16:43:37,263 - INFO - [Train] step: 199, loss_D: 0.014733, lr_D: 0.000800
2023-06-20 16:43:37,345 - INFO - [Train] step: 199, loss_G: 5.651341, lr_G: 0.000800
2023-06-20 16:44:34,768 - INFO - [Train] step: 299, loss_D: 0.015492, lr_D: 0.000800
2023-06-20 16:44:34,851 - INFO - [Train] step: 299, loss_G: 4.101239, lr_G: 0.000800
2023-06-20 16:45:32,299 - INFO - [Train] step: 399, loss_D: 0.031005, lr_D: 0.000800
2023-06-20 16:45:32,382 - INFO - [Train] step: 399, loss_G: 4.400643, lr_G: 0.000800
2023-06-20 16:46:29,850 - INFO - [Train] step: 499, loss_D: 0.010739, lr_D: 0.000800
2023-06-20 16:46:29,932 - INFO - [Train] step: 499, loss_G: 5.871856, lr_G: 0.000800
2023-06-20 16:47:27,400 - INFO - [Train] step: 599, loss_D: 0.009506, lr_D: 0.000800
2023-06-20 16:47:27,482 - INFO - [Train] step: 599, loss_G: 5.897307, lr_G: 0.000800
2023-06-20 16:48:25,092 - INFO - [Train] step: 699, loss_D: 0.011540, lr_D: 0.000800
2023-06-20 16:48:25,175 - INFO - [Train] step: 699, loss_G: 5.408056, lr_G: 0.000800
2023-06-20 16:49:22,629 - INFO - [Train] step: 799, loss_D: 0.020851, lr_D: 0.000800
2023-06-20 16:49:22,711 - INFO - [Train] step: 799, loss_G: 5.332964, lr_G: 0.000800
2023-06-20 16:50:20,166 - INFO - [Train] step: 899, loss_D: 0.021700, lr_D: 0.000800
2023-06-20 16:50:20,249 - INFO - [Train] step: 899, loss_G: 5.575073, lr_G: 0.000800
2023-06-20 16:51:17,713 - INFO - [Train] step: 999, loss_D: 0.005004, lr_D: 0.000800
2023-06-20 16:51:17,795 - INFO - [Train] step: 999, loss_G: 6.107444, lr_G: 0.000800
2023-06-20 16:52:21,786 - INFO - [Eval] step: 999, fid: 137.491986
2023-06-20 16:53:19,417 - INFO - [Train] step: 1099, loss_D: 0.015063, lr_D: 0.000800
2023-06-20 16:53:19,500 - INFO - [Train] step: 1099, loss_G: 4.441256, lr_G: 0.000800
2023-06-20 16:54:17,040 - INFO - [Train] step: 1199, loss_D: 0.016915, lr_D: 0.000800
2023-06-20 16:54:17,123 - INFO - [Train] step: 1199, loss_G: 4.694931, lr_G: 0.000800
2023-06-20 16:55:14,854 - INFO - [Train] step: 1299, loss_D: 0.021883, lr_D: 0.000800
2023-06-20 16:55:14,937 - INFO - [Train] step: 1299, loss_G: 4.051022, lr_G: 0.000800
2023-06-20 16:56:12,462 - INFO - [Train] step: 1399, loss_D: 0.018794, lr_D: 0.000800
2023-06-20 16:56:12,544 - INFO - [Train] step: 1399, loss_G: 6.741831, lr_G: 0.000800
2023-06-20 16:57:10,079 - INFO - [Train] step: 1499, loss_D: 0.021743, lr_D: 0.000800
2023-06-20 16:57:10,161 - INFO - [Train] step: 1499, loss_G: 4.321163, lr_G: 0.000800
2023-06-20 16:58:07,699 - INFO - [Train] step: 1599, loss_D: 0.028445, lr_D: 0.000800
2023-06-20 16:58:07,782 - INFO - [Train] step: 1599, loss_G: 4.884105, lr_G: 0.000800
2023-06-20 16:59:05,325 - INFO - [Train] step: 1699, loss_D: 0.034752, lr_D: 0.000800
2023-06-20 16:59:05,407 - INFO - [Train] step: 1699, loss_G: 4.409259, lr_G: 0.000800
2023-06-20 17:00:02,930 - INFO - [Train] step: 1799, loss_D: 0.015242, lr_D: 0.000800
2023-06-20 17:00:03,013 - INFO - [Train] step: 1799, loss_G: 4.640846, lr_G: 0.000800
2023-06-20 17:01:00,515 - INFO - [Train] step: 1899, loss_D: 0.040465, lr_D: 0.000800
2023-06-20 17:01:00,597 - INFO - [Train] step: 1899, loss_G: 4.490809, lr_G: 0.000800
2023-06-20 17:01:58,273 - INFO - [Train] step: 1999, loss_D: 0.024751, lr_D: 0.000800
2023-06-20 17:01:58,356 - INFO - [Train] step: 1999, loss_G: 5.270306, lr_G: 0.000800
2023-06-20 17:03:02,247 - INFO - [Eval] step: 1999, fid: 79.368236
2023-06-20 17:04:00,030 - INFO - [Train] step: 2099, loss_D: 0.041266, lr_D: 0.000800
2023-06-20 17:04:00,112 - INFO - [Train] step: 2099, loss_G: 4.233968, lr_G: 0.000800
2023-06-20 17:04:57,639 - INFO - [Train] step: 2199, loss_D: 0.012923, lr_D: 0.000800
2023-06-20 17:04:57,722 - INFO - [Train] step: 2199, loss_G: 4.992471, lr_G: 0.000800
2023-06-20 17:05:55,267 - INFO - [Train] step: 2299, loss_D: 0.024256, lr_D: 0.000800
2023-06-20 17:05:55,349 - INFO - [Train] step: 2299, loss_G: 4.308054, lr_G: 0.000800
2023-06-20 17:06:52,877 - INFO - [Train] step: 2399, loss_D: 0.029725, lr_D: 0.000800
2023-06-20 17:06:52,960 - INFO - [Train] step: 2399, loss_G: 4.494064, lr_G: 0.000800
2023-06-20 17:07:50,472 - INFO - [Train] step: 2499, loss_D: 0.029456, lr_D: 0.000800
2023-06-20 17:07:50,555 - INFO - [Train] step: 2499, loss_G: 4.415339, lr_G: 0.000800
2023-06-20 17:08:48,248 - INFO - [Train] step: 2599, loss_D: 0.029419, lr_D: 0.000800
2023-06-20 17:08:48,331 - INFO - [Train] step: 2599, loss_G: 3.812042, lr_G: 0.000800
2023-06-20 17:09:45,864 - INFO - [Train] step: 2699, loss_D: 0.030265, lr_D: 0.000800
2023-06-20 17:09:45,946 - INFO - [Train] step: 2699, loss_G: 4.691515, lr_G: 0.000800
2023-06-20 17:10:43,439 - INFO - [Train] step: 2799, loss_D: 0.024228, lr_D: 0.000800
2023-06-20 17:10:43,522 - INFO - [Train] step: 2799, loss_G: 4.210952, lr_G: 0.000800
2023-06-20 17:11:41,040 - INFO - [Train] step: 2899, loss_D: 0.040463, lr_D: 0.000800
2023-06-20 17:11:41,123 - INFO - [Train] step: 2899, loss_G: 4.063533, lr_G: 0.000800
2023-06-20 17:12:38,620 - INFO - [Train] step: 2999, loss_D: 0.007912, lr_D: 0.000800
2023-06-20 17:12:38,703 - INFO - [Train] step: 2999, loss_G: 5.529931, lr_G: 0.000800
2023-06-20 17:13:42,659 - INFO - [Eval] step: 2999, fid: 63.992565
2023-06-20 17:14:40,437 - INFO - [Train] step: 3099, loss_D: 0.074966, lr_D: 0.000800
2023-06-20 17:14:40,519 - INFO - [Train] step: 3099, loss_G: 3.893239, lr_G: 0.000800
2023-06-20 17:15:38,030 - INFO - [Train] step: 3199, loss_D: 0.026799, lr_D: 0.000800
2023-06-20 17:15:38,112 - INFO - [Train] step: 3199, loss_G: 4.824360, lr_G: 0.000800
2023-06-20 17:16:35,831 - INFO - [Train] step: 3299, loss_D: 0.067274, lr_D: 0.000800
2023-06-20 17:16:35,914 - INFO - [Train] step: 3299, loss_G: 3.970190, lr_G: 0.000800
2023-06-20 17:17:33,470 - INFO - [Train] step: 3399, loss_D: 0.028840, lr_D: 0.000800
2023-06-20 17:17:33,553 - INFO - [Train] step: 3399, loss_G: 4.565364, lr_G: 0.000800
2023-06-20 17:18:31,080 - INFO - [Train] step: 3499, loss_D: 0.041149, lr_D: 0.000800
2023-06-20 17:18:31,162 - INFO - [Train] step: 3499, loss_G: 4.091739, lr_G: 0.000800
2023-06-20 17:19:28,708 - INFO - [Train] step: 3599, loss_D: 0.095094, lr_D: 0.000800
2023-06-20 17:19:28,790 - INFO - [Train] step: 3599, loss_G: 3.502505, lr_G: 0.000800
2023-06-20 17:20:26,295 - INFO - [Train] step: 3699, loss_D: 0.033595, lr_D: 0.000800
2023-06-20 17:20:26,377 - INFO - [Train] step: 3699, loss_G: 3.606726, lr_G: 0.000800
2023-06-20 17:21:23,886 - INFO - [Train] step: 3799, loss_D: 0.076119, lr_D: 0.000800
2023-06-20 17:21:23,969 - INFO - [Train] step: 3799, loss_G: 3.477933, lr_G: 0.000800
2023-06-20 17:22:21,639 - INFO - [Train] step: 3899, loss_D: 0.028653, lr_D: 0.000800
2023-06-20 17:22:21,722 - INFO - [Train] step: 3899, loss_G: 4.161845, lr_G: 0.000800
2023-06-20 17:23:19,221 - INFO - [Train] step: 3999, loss_D: 0.020372, lr_D: 0.000800
2023-06-20 17:23:19,304 - INFO - [Train] step: 3999, loss_G: 4.183169, lr_G: 0.000800
2023-06-20 17:24:23,187 - INFO - [Eval] step: 3999, fid: 44.852359
2023-06-20 17:25:20,952 - INFO - [Train] step: 4099, loss_D: 0.027073, lr_D: 0.000800
2023-06-20 17:25:21,035 - INFO - [Train] step: 4099, loss_G: 4.307897, lr_G: 0.000800
2023-06-20 17:26:18,531 - INFO - [Train] step: 4199, loss_D: 0.067028, lr_D: 0.000800
2023-06-20 17:26:18,614 - INFO - [Train] step: 4199, loss_G: 3.484782, lr_G: 0.000800
2023-06-20 17:27:16,120 - INFO - [Train] step: 4299, loss_D: 0.038063, lr_D: 0.000800
2023-06-20 17:27:16,203 - INFO - [Train] step: 4299, loss_G: 3.977466, lr_G: 0.000800
2023-06-20 17:28:13,725 - INFO - [Train] step: 4399, loss_D: 0.256176, lr_D: 0.000800
2023-06-20 17:28:13,807 - INFO - [Train] step: 4399, loss_G: 2.317088, lr_G: 0.000800
2023-06-20 17:29:11,289 - INFO - [Train] step: 4499, loss_D: 0.015697, lr_D: 0.000800
2023-06-20 17:29:11,371 - INFO - [Train] step: 4499, loss_G: 4.672412, lr_G: 0.000800
2023-06-20 17:30:09,028 - INFO - [Train] step: 4599, loss_D: 0.054706, lr_D: 0.000800
2023-06-20 17:30:09,110 - INFO - [Train] step: 4599, loss_G: 3.268631, lr_G: 0.000800
2023-06-20 17:31:06,600 - INFO - [Train] step: 4699, loss_D: 0.026348, lr_D: 0.000800
2023-06-20 17:31:06,682 - INFO - [Train] step: 4699, loss_G: 4.173070, lr_G: 0.000800
2023-06-20 17:32:04,156 - INFO - [Train] step: 4799, loss_D: 0.005408, lr_D: 0.000800
2023-06-20 17:32:04,239 - INFO - [Train] step: 4799, loss_G: 5.550467, lr_G: 0.000800
2023-06-20 17:33:01,740 - INFO - [Train] step: 4899, loss_D: 0.069578, lr_D: 0.000800
2023-06-20 17:33:01,822 - INFO - [Train] step: 4899, loss_G: 2.943511, lr_G: 0.000800
2023-06-20 17:33:59,307 - INFO - [Train] step: 4999, loss_D: 0.348759, lr_D: 0.000800
2023-06-20 17:33:59,389 - INFO - [Train] step: 4999, loss_G: 1.688903, lr_G: 0.000800
2023-06-20 17:35:03,484 - INFO - [Eval] step: 4999, fid: 38.655528
2023-06-20 17:36:01,299 - INFO - [Train] step: 5099, loss_D: 0.011742, lr_D: 0.000800
2023-06-20 17:36:01,381 - INFO - [Train] step: 5099, loss_G: 5.048460, lr_G: 0.000800
2023-06-20 17:36:59,124 - INFO - [Train] step: 5199, loss_D: 0.051565, lr_D: 0.000800
2023-06-20 17:36:59,207 - INFO - [Train] step: 5199, loss_G: 3.915628, lr_G: 0.000800
2023-06-20 17:37:56,771 - INFO - [Train] step: 5299, loss_D: 0.050301, lr_D: 0.000800
2023-06-20 17:37:56,854 - INFO - [Train] step: 5299, loss_G: 3.415740, lr_G: 0.000800
2023-06-20 17:38:54,420 - INFO - [Train] step: 5399, loss_D: 0.158297, lr_D: 0.000800
2023-06-20 17:38:54,502 - INFO - [Train] step: 5399, loss_G: 2.275898, lr_G: 0.000800
2023-06-20 17:39:52,075 - INFO - [Train] step: 5499, loss_D: 0.055285, lr_D: 0.000800
2023-06-20 17:39:52,157 - INFO - [Train] step: 5499, loss_G: 3.145108, lr_G: 0.000800
2023-06-20 17:40:49,753 - INFO - [Train] step: 5599, loss_D: 0.068192, lr_D: 0.000800
2023-06-20 17:40:49,836 - INFO - [Train] step: 5599, loss_G: 3.757272, lr_G: 0.000800
2023-06-20 17:41:47,415 - INFO - [Train] step: 5699, loss_D: 0.203779, lr_D: 0.000800
2023-06-20 17:41:47,497 - INFO - [Train] step: 5699, loss_G: 2.264235, lr_G: 0.000800
2023-06-20 17:42:45,049 - INFO - [Train] step: 5799, loss_D: 0.060837, lr_D: 0.000800
2023-06-20 17:42:45,132 - INFO - [Train] step: 5799, loss_G: 4.192678, lr_G: 0.000800
2023-06-20 17:43:42,872 - INFO - [Train] step: 5899, loss_D: 0.055329, lr_D: 0.000800
2023-06-20 17:43:42,954 - INFO - [Train] step: 5899, loss_G: 3.235473, lr_G: 0.000800
2023-06-20 17:44:40,527 - INFO - [Train] step: 5999, loss_D: 0.015543, lr_D: 0.000800
2023-06-20 17:44:40,610 - INFO - [Train] step: 5999, loss_G: 3.392641, lr_G: 0.000800
2023-06-20 17:45:44,593 - INFO - [Eval] step: 5999, fid: 35.951343
2023-06-20 17:46:42,434 - INFO - [Train] step: 6099, loss_D: 0.004670, lr_D: 0.000800
2023-06-20 17:46:42,516 - INFO - [Train] step: 6099, loss_G: 5.575942, lr_G: 0.000800
2023-06-20 17:47:40,066 - INFO - [Train] step: 6199, loss_D: 0.015802, lr_D: 0.000800
2023-06-20 17:47:40,148 - INFO - [Train] step: 6199, loss_G: 4.321508, lr_G: 0.000800
2023-06-20 17:48:37,695 - INFO - [Train] step: 6299, loss_D: 0.212002, lr_D: 0.000800
2023-06-20 17:48:37,778 - INFO - [Train] step: 6299, loss_G: 2.305909, lr_G: 0.000800
2023-06-20 17:49:35,327 - INFO - [Train] step: 6399, loss_D: 0.080969, lr_D: 0.000800
2023-06-20 17:49:35,410 - INFO - [Train] step: 6399, loss_G: 3.193407, lr_G: 0.000800
2023-06-20 17:50:33,157 - INFO - [Train] step: 6499, loss_D: 0.077497, lr_D: 0.000800
2023-06-20 17:50:33,240 - INFO - [Train] step: 6499, loss_G: 4.134596, lr_G: 0.000800
2023-06-20 17:51:30,803 - INFO - [Train] step: 6599, loss_D: 0.005917, lr_D: 0.000800
2023-06-20 17:51:30,885 - INFO - [Train] step: 6599, loss_G: 5.478795, lr_G: 0.000800
2023-06-20 17:52:28,469 - INFO - [Train] step: 6699, loss_D: 0.047538, lr_D: 0.000800
2023-06-20 17:52:28,552 - INFO - [Train] step: 6699, loss_G: 3.830052, lr_G: 0.000800
2023-06-20 17:53:26,092 - INFO - [Train] step: 6799, loss_D: 0.042657, lr_D: 0.000800
2023-06-20 17:53:26,175 - INFO - [Train] step: 6799, loss_G: 3.842157, lr_G: 0.000800
2023-06-20 17:54:23,712 - INFO - [Train] step: 6899, loss_D: 0.071364, lr_D: 0.000800
2023-06-20 17:54:23,794 - INFO - [Train] step: 6899, loss_G: 3.297514, lr_G: 0.000800
2023-06-20 17:55:21,377 - INFO - [Train] step: 6999, loss_D: 0.055405, lr_D: 0.000800
2023-06-20 17:55:21,459 - INFO - [Train] step: 6999, loss_G: 3.644185, lr_G: 0.000800
2023-06-20 17:56:25,555 - INFO - [Eval] step: 6999, fid: 32.909344
2023-06-20 17:57:23,351 - INFO - [Train] step: 7099, loss_D: 0.075832, lr_D: 0.000800
2023-06-20 17:57:23,433 - INFO - [Train] step: 7099, loss_G: 3.750996, lr_G: 0.000800
2023-06-20 17:58:21,197 - INFO - [Train] step: 7199, loss_D: 0.053445, lr_D: 0.000800
2023-06-20 17:58:21,280 - INFO - [Train] step: 7199, loss_G: 3.472252, lr_G: 0.000800
2023-06-20 17:59:18,835 - INFO - [Train] step: 7299, loss_D: 0.012912, lr_D: 0.000800
2023-06-20 17:59:18,917 - INFO - [Train] step: 7299, loss_G: 4.942597, lr_G: 0.000800
2023-06-20 18:00:16,465 - INFO - [Train] step: 7399, loss_D: 0.005171, lr_D: 0.000800
2023-06-20 18:00:16,548 - INFO - [Train] step: 7399, loss_G: 5.817299, lr_G: 0.000800
2023-06-20 18:01:14,095 - INFO - [Train] step: 7499, loss_D: 0.044122, lr_D: 0.000800
2023-06-20 18:01:14,177 - INFO - [Train] step: 7499, loss_G: 3.847531, lr_G: 0.000800
2023-06-20 18:02:11,737 - INFO - [Train] step: 7599, loss_D: 0.057308, lr_D: 0.000800
2023-06-20 18:02:11,820 - INFO - [Train] step: 7599, loss_G: 3.519135, lr_G: 0.000800
2023-06-20 18:03:09,355 - INFO - [Train] step: 7699, loss_D: 0.043514, lr_D: 0.000800
2023-06-20 18:03:09,437 - INFO - [Train] step: 7699, loss_G: 3.879828, lr_G: 0.000800
2023-06-20 18:04:07,165 - INFO - [Train] step: 7799, loss_D: 0.014018, lr_D: 0.000800
2023-06-20 18:04:07,247 - INFO - [Train] step: 7799, loss_G: 5.051363, lr_G: 0.000800
2023-06-20 18:05:04,813 - INFO - [Train] step: 7899, loss_D: 0.073231, lr_D: 0.000800
2023-06-20 18:05:04,896 - INFO - [Train] step: 7899, loss_G: 2.965210, lr_G: 0.000800
2023-06-20 18:06:02,440 - INFO - [Train] step: 7999, loss_D: 0.042420, lr_D: 0.000800
2023-06-20 18:06:02,523 - INFO - [Train] step: 7999, loss_G: 3.934088, lr_G: 0.000800
2023-06-20 18:07:06,563 - INFO - [Eval] step: 7999, fid: 31.470606
2023-06-20 18:08:04,263 - INFO - [Train] step: 8099, loss_D: 0.034912, lr_D: 0.000800
2023-06-20 18:08:04,345 - INFO - [Train] step: 8099, loss_G: 3.644726, lr_G: 0.000800
2023-06-20 18:09:01,764 - INFO - [Train] step: 8199, loss_D: 0.180154, lr_D: 0.000800
2023-06-20 18:09:01,847 - INFO - [Train] step: 8199, loss_G: 4.534343, lr_G: 0.000800
2023-06-20 18:09:59,377 - INFO - [Train] step: 8299, loss_D: 0.060114, lr_D: 0.000800
2023-06-20 18:09:59,460 - INFO - [Train] step: 8299, loss_G: 3.470419, lr_G: 0.000800
2023-06-20 18:10:57,013 - INFO - [Train] step: 8399, loss_D: 0.051882, lr_D: 0.000800
2023-06-20 18:10:57,095 - INFO - [Train] step: 8399, loss_G: 3.473439, lr_G: 0.000800
2023-06-20 18:11:54,818 - INFO - [Train] step: 8499, loss_D: 0.024281, lr_D: 0.000800
2023-06-20 18:11:54,901 - INFO - [Train] step: 8499, loss_G: 3.960746, lr_G: 0.000800
2023-06-20 18:12:52,457 - INFO - [Train] step: 8599, loss_D: 0.044740, lr_D: 0.000800
2023-06-20 18:12:52,539 - INFO - [Train] step: 8599, loss_G: 3.373656, lr_G: 0.000800
2023-06-20 18:13:50,099 - INFO - [Train] step: 8699, loss_D: 0.134878, lr_D: 0.000800
2023-06-20 18:13:50,182 - INFO - [Train] step: 8699, loss_G: 2.556164, lr_G: 0.000800
2023-06-20 18:14:47,735 - INFO - [Train] step: 8799, loss_D: 0.013849, lr_D: 0.000800
2023-06-20 18:14:47,817 - INFO - [Train] step: 8799, loss_G: 4.678804, lr_G: 0.000800
2023-06-20 18:15:45,375 - INFO - [Train] step: 8899, loss_D: 0.108809, lr_D: 0.000800
2023-06-20 18:15:45,457 - INFO - [Train] step: 8899, loss_G: 3.459222, lr_G: 0.000800
2023-06-20 18:16:43,014 - INFO - [Train] step: 8999, loss_D: 0.045173, lr_D: 0.000800
2023-06-20 18:16:43,096 - INFO - [Train] step: 8999, loss_G: 3.468558, lr_G: 0.000800
2023-06-20 18:17:47,231 - INFO - [Eval] step: 8999, fid: 31.278803
2023-06-20 18:18:45,193 - INFO - [Train] step: 9099, loss_D: 0.100650, lr_D: 0.000800
2023-06-20 18:18:45,276 - INFO - [Train] step: 9099, loss_G: 2.805575, lr_G: 0.000800
2023-06-20 18:19:42,840 - INFO - [Train] step: 9199, loss_D: 0.037195, lr_D: 0.000800
2023-06-20 18:19:42,922 - INFO - [Train] step: 9199, loss_G: 3.895432, lr_G: 0.000800
2023-06-20 18:20:40,490 - INFO - [Train] step: 9299, loss_D: 0.044144, lr_D: 0.000800
2023-06-20 18:20:40,573 - INFO - [Train] step: 9299, loss_G: 3.488558, lr_G: 0.000800
2023-06-20 18:21:38,114 - INFO - [Train] step: 9399, loss_D: 0.036586, lr_D: 0.000800
2023-06-20 18:21:38,196 - INFO - [Train] step: 9399, loss_G: 3.361644, lr_G: 0.000800
2023-06-20 18:22:35,757 - INFO - [Train] step: 9499, loss_D: 0.013038, lr_D: 0.000800
2023-06-20 18:22:35,839 - INFO - [Train] step: 9499, loss_G: 5.251925, lr_G: 0.000800
2023-06-20 18:23:33,427 - INFO - [Train] step: 9599, loss_D: 0.003294, lr_D: 0.000800
2023-06-20 18:23:33,510 - INFO - [Train] step: 9599, loss_G: 6.217975, lr_G: 0.000800
2023-06-20 18:24:31,107 - INFO - [Train] step: 9699, loss_D: 0.186978, lr_D: 0.000800
2023-06-20 18:24:31,189 - INFO - [Train] step: 9699, loss_G: 6.398161, lr_G: 0.000800
2023-06-20 18:25:28,932 - INFO - [Train] step: 9799, loss_D: 0.031792, lr_D: 0.000800
2023-06-20 18:25:29,014 - INFO - [Train] step: 9799, loss_G: 3.361913, lr_G: 0.000800
2023-06-20 18:26:26,562 - INFO - [Train] step: 9899, loss_D: 0.081867, lr_D: 0.000800
2023-06-20 18:26:26,644 - INFO - [Train] step: 9899, loss_G: 3.246875, lr_G: 0.000800
2023-06-20 18:27:24,243 - INFO - [Train] step: 9999, loss_D: 0.060253, lr_D: 0.000800
2023-06-20 18:27:24,325 - INFO - [Train] step: 9999, loss_G: 2.944768, lr_G: 0.000800
2023-06-20 18:28:28,453 - INFO - [Eval] step: 9999, fid: 29.399864
2023-06-20 18:29:26,314 - INFO - [Train] step: 10099, loss_D: 0.010109, lr_D: 0.000800
2023-06-20 18:29:26,397 - INFO - [Train] step: 10099, loss_G: 5.046092, lr_G: 0.000800
2023-06-20 18:30:23,833 - INFO - [Train] step: 10199, loss_D: 0.041413, lr_D: 0.000800
2023-06-20 18:30:23,915 - INFO - [Train] step: 10199, loss_G: 3.696713, lr_G: 0.000800
2023-06-20 18:31:21,376 - INFO - [Train] step: 10299, loss_D: 0.068789, lr_D: 0.000800
2023-06-20 18:31:21,457 - INFO - [Train] step: 10299, loss_G: 2.549175, lr_G: 0.000800
2023-06-20 18:32:19,088 - INFO - [Train] step: 10399, loss_D: 0.004832, lr_D: 0.000800
2023-06-20 18:32:19,171 - INFO - [Train] step: 10399, loss_G: 6.080018, lr_G: 0.000800
2023-06-20 18:33:16,604 - INFO - [Train] step: 10499, loss_D: 0.040449, lr_D: 0.000800
2023-06-20 18:33:16,686 - INFO - [Train] step: 10499, loss_G: 3.563605, lr_G: 0.000800
2023-06-20 18:34:14,120 - INFO - [Train] step: 10599, loss_D: 0.029143, lr_D: 0.000800
2023-06-20 18:34:14,202 - INFO - [Train] step: 10599, loss_G: 3.478001, lr_G: 0.000800
2023-06-20 18:35:11,649 - INFO - [Train] step: 10699, loss_D: 0.033750, lr_D: 0.000800
2023-06-20 18:35:11,731 - INFO - [Train] step: 10699, loss_G: 3.946764, lr_G: 0.000800
2023-06-20 18:36:09,174 - INFO - [Train] step: 10799, loss_D: 0.044910, lr_D: 0.000800
2023-06-20 18:36:09,257 - INFO - [Train] step: 10799, loss_G: 4.267330, lr_G: 0.000800
2023-06-20 18:37:06,743 - INFO - [Train] step: 10899, loss_D: 0.042916, lr_D: 0.000800
2023-06-20 18:37:06,825 - INFO - [Train] step: 10899, loss_G: 3.568595, lr_G: 0.000800
2023-06-20 18:38:04,469 - INFO - [Train] step: 10999, loss_D: 0.019066, lr_D: 0.000800
2023-06-20 18:38:04,552 - INFO - [Train] step: 10999, loss_G: 4.571453, lr_G: 0.000800
2023-06-20 18:39:08,551 - INFO - [Eval] step: 10999, fid: 30.685652
2023-06-20 18:40:05,953 - INFO - [Train] step: 11099, loss_D: 0.032275, lr_D: 0.000800
2023-06-20 18:40:06,036 - INFO - [Train] step: 11099, loss_G: 4.216719, lr_G: 0.000800
2023-06-20 18:41:03,461 - INFO - [Train] step: 11199, loss_D: 0.022420, lr_D: 0.000800
2023-06-20 18:41:03,543 - INFO - [Train] step: 11199, loss_G: 3.793826, lr_G: 0.000800
2023-06-20 18:42:00,998 - INFO - [Train] step: 11299, loss_D: 0.035346, lr_D: 0.000800
2023-06-20 18:42:01,081 - INFO - [Train] step: 11299, loss_G: 3.861779, lr_G: 0.000800
2023-06-20 18:42:58,539 - INFO - [Train] step: 11399, loss_D: 0.082192, lr_D: 0.000800
2023-06-20 18:42:58,621 - INFO - [Train] step: 11399, loss_G: 3.591079, lr_G: 0.000800
2023-06-20 18:43:56,078 - INFO - [Train] step: 11499, loss_D: 0.049160, lr_D: 0.000800
2023-06-20 18:43:56,161 - INFO - [Train] step: 11499, loss_G: 4.039227, lr_G: 0.000800
2023-06-20 18:44:53,637 - INFO - [Train] step: 11599, loss_D: 0.032215, lr_D: 0.000800
2023-06-20 18:44:53,719 - INFO - [Train] step: 11599, loss_G: 4.083654, lr_G: 0.000800
2023-06-20 18:45:51,371 - INFO - [Train] step: 11699, loss_D: 0.024630, lr_D: 0.000800
2023-06-20 18:45:51,453 - INFO - [Train] step: 11699, loss_G: 4.305667, lr_G: 0.000800
2023-06-20 18:46:48,899 - INFO - [Train] step: 11799, loss_D: 0.048161, lr_D: 0.000800
2023-06-20 18:46:48,981 - INFO - [Train] step: 11799, loss_G: 3.717360, lr_G: 0.000800
2023-06-20 18:47:46,431 - INFO - [Train] step: 11899, loss_D: 0.069436, lr_D: 0.000800
2023-06-20 18:47:46,514 - INFO - [Train] step: 11899, loss_G: 3.379128, lr_G: 0.000800
2023-06-20 18:48:43,963 - INFO - [Train] step: 11999, loss_D: 0.683973, lr_D: 0.000800
2023-06-20 18:48:44,046 - INFO - [Train] step: 11999, loss_G: 1.404226, lr_G: 0.000800
2023-06-20 18:49:48,107 - INFO - [Eval] step: 11999, fid: 29.007294
2023-06-20 18:50:45,863 - INFO - [Train] step: 12099, loss_D: 0.055991, lr_D: 0.000800
2023-06-20 18:50:45,945 - INFO - [Train] step: 12099, loss_G: 2.857309, lr_G: 0.000800
2023-06-20 18:51:43,411 - INFO - [Train] step: 12199, loss_D: 0.301637, lr_D: 0.000800
2023-06-20 18:51:43,493 - INFO - [Train] step: 12199, loss_G: 1.195544, lr_G: 0.000800
2023-06-20 18:52:41,107 - INFO - [Train] step: 12299, loss_D: 0.043649, lr_D: 0.000800
2023-06-20 18:52:41,189 - INFO - [Train] step: 12299, loss_G: 3.626347, lr_G: 0.000800
2023-06-20 18:53:38,635 - INFO - [Train] step: 12399, loss_D: 0.006682, lr_D: 0.000800
2023-06-20 18:53:38,717 - INFO - [Train] step: 12399, loss_G: 6.106777, lr_G: 0.000800
2023-06-20 18:54:36,183 - INFO - [Train] step: 12499, loss_D: 0.425387, lr_D: 0.000800
2023-06-20 18:54:36,266 - INFO - [Train] step: 12499, loss_G: 0.925449, lr_G: 0.000800
2023-06-20 18:55:33,699 - INFO - [Train] step: 12599, loss_D: 0.072830, lr_D: 0.000800
2023-06-20 18:55:33,782 - INFO - [Train] step: 12599, loss_G: 3.093293, lr_G: 0.000800
2023-06-20 18:56:31,231 - INFO - [Train] step: 12699, loss_D: 0.025211, lr_D: 0.000800
2023-06-20 18:56:31,314 - INFO - [Train] step: 12699, loss_G: 3.743865, lr_G: 0.000800
2023-06-20 18:57:28,773 - INFO - [Train] step: 12799, loss_D: 0.014446, lr_D: 0.000800
2023-06-20 18:57:28,855 - INFO - [Train] step: 12799, loss_G: 5.094555, lr_G: 0.000800
2023-06-20 18:58:26,287 - INFO - [Train] step: 12899, loss_D: 0.026383, lr_D: 0.000800
2023-06-20 18:58:26,369 - INFO - [Train] step: 12899, loss_G: 4.194902, lr_G: 0.000800
2023-06-20 18:59:24,002 - INFO - [Train] step: 12999, loss_D: 0.141212, lr_D: 0.000800
2023-06-20 18:59:24,085 - INFO - [Train] step: 12999, loss_G: 2.385814, lr_G: 0.000800
2023-06-20 19:00:28,214 - INFO - [Eval] step: 12999, fid: 27.512611
2023-06-20 19:01:25,929 - INFO - [Train] step: 13099, loss_D: 0.042086, lr_D: 0.000800
2023-06-20 19:01:26,011 - INFO - [Train] step: 13099, loss_G: 4.142386, lr_G: 0.000800
2023-06-20 19:02:23,508 - INFO - [Train] step: 13199, loss_D: 0.011729, lr_D: 0.000800
2023-06-20 19:02:23,590 - INFO - [Train] step: 13199, loss_G: 4.984503, lr_G: 0.000800
2023-06-20 19:03:21,081 - INFO - [Train] step: 13299, loss_D: 0.022402, lr_D: 0.000800
2023-06-20 19:03:21,164 - INFO - [Train] step: 13299, loss_G: 5.160158, lr_G: 0.000800
2023-06-20 19:04:18,665 - INFO - [Train] step: 13399, loss_D: 0.005368, lr_D: 0.000800
2023-06-20 19:04:18,747 - INFO - [Train] step: 13399, loss_G: 6.016826, lr_G: 0.000800
2023-06-20 19:05:16,232 - INFO - [Train] step: 13499, loss_D: 0.003723, lr_D: 0.000800
2023-06-20 19:05:16,314 - INFO - [Train] step: 13499, loss_G: 6.381131, lr_G: 0.000800
2023-06-20 19:06:13,972 - INFO - [Train] step: 13599, loss_D: 0.020226, lr_D: 0.000800
2023-06-20 19:06:14,054 - INFO - [Train] step: 13599, loss_G: 4.380473, lr_G: 0.000800
2023-06-20 19:07:11,522 - INFO - [Train] step: 13699, loss_D: 0.017999, lr_D: 0.000800
2023-06-20 19:07:11,604 - INFO - [Train] step: 13699, loss_G: 4.562202, lr_G: 0.000800
2023-06-20 19:08:09,101 - INFO - [Train] step: 13799, loss_D: 0.020610, lr_D: 0.000800
2023-06-20 19:08:09,191 - INFO - [Train] step: 13799, loss_G: 4.485044, lr_G: 0.000800
2023-06-20 19:09:06,686 - INFO - [Train] step: 13899, loss_D: 0.074970, lr_D: 0.000800
2023-06-20 19:09:06,768 - INFO - [Train] step: 13899, loss_G: 3.238503, lr_G: 0.000800
2023-06-20 19:10:04,260 - INFO - [Train] step: 13999, loss_D: 0.001252, lr_D: 0.000800
2023-06-20 19:10:04,342 - INFO - [Train] step: 13999, loss_G: 7.165111, lr_G: 0.000800
2023-06-20 19:11:08,397 - INFO - [Eval] step: 13999, fid: 27.391182
2023-06-20 19:12:06,129 - INFO - [Train] step: 14099, loss_D: 0.060901, lr_D: 0.000800
2023-06-20 19:12:06,212 - INFO - [Train] step: 14099, loss_G: 3.020299, lr_G: 0.000800
2023-06-20 19:13:03,683 - INFO - [Train] step: 14199, loss_D: 0.028133, lr_D: 0.000800
2023-06-20 19:13:03,765 - INFO - [Train] step: 14199, loss_G: 4.077332, lr_G: 0.000800
2023-06-20 19:14:01,442 - INFO - [Train] step: 14299, loss_D: 0.029699, lr_D: 0.000800
2023-06-20 19:14:01,525 - INFO - [Train] step: 14299, loss_G: 4.205725, lr_G: 0.000800
2023-06-20 19:14:59,067 - INFO - [Train] step: 14399, loss_D: 0.027483, lr_D: 0.000800
2023-06-20 19:14:59,150 - INFO - [Train] step: 14399, loss_G: 4.017812, lr_G: 0.000800
2023-06-20 19:15:56,655 - INFO - [Train] step: 14499, loss_D: 0.023228, lr_D: 0.000800
2023-06-20 19:15:56,738 - INFO - [Train] step: 14499, loss_G: 3.771870, lr_G: 0.000800
2023-06-20 19:16:54,244 - INFO - [Train] step: 14599, loss_D: 0.050018, lr_D: 0.000800
2023-06-20 19:16:54,326 - INFO - [Train] step: 14599, loss_G: 3.405501, lr_G: 0.000800
2023-06-20 19:17:51,803 - INFO - [Train] step: 14699, loss_D: 0.012700, lr_D: 0.000800
2023-06-20 19:17:51,885 - INFO - [Train] step: 14699, loss_G: 4.813305, lr_G: 0.000800
2023-06-20 19:18:49,335 - INFO - [Train] step: 14799, loss_D: 0.084879, lr_D: 0.000800
2023-06-20 19:18:49,417 - INFO - [Train] step: 14799, loss_G: 3.528526, lr_G: 0.000800
2023-06-20 19:19:47,050 - INFO - [Train] step: 14899, loss_D: 0.003490, lr_D: 0.000800
2023-06-20 19:19:47,133 - INFO - [Train] step: 14899, loss_G: 6.505185, lr_G: 0.000800
2023-06-20 19:20:44,601 - INFO - [Train] step: 14999, loss_D: 0.006495, lr_D: 0.000800
2023-06-20 19:20:44,684 - INFO - [Train] step: 14999, loss_G: 6.447754, lr_G: 0.000800
2023-06-20 19:21:48,740 - INFO - [Eval] step: 14999, fid: 26.221094
2023-06-20 19:22:46,363 - INFO - [Train] step: 15099, loss_D: 0.016115, lr_D: 0.000800
2023-06-20 19:22:46,446 - INFO - [Train] step: 15099, loss_G: 4.760447, lr_G: 0.000800
2023-06-20 19:23:43,829 - INFO - [Train] step: 15199, loss_D: 0.013620, lr_D: 0.000800
2023-06-20 19:23:43,912 - INFO - [Train] step: 15199, loss_G: 4.941197, lr_G: 0.000800
2023-06-20 19:24:41,277 - INFO - [Train] step: 15299, loss_D: 0.007907, lr_D: 0.000800
2023-06-20 19:24:41,358 - INFO - [Train] step: 15299, loss_G: 5.390777, lr_G: 0.000800
2023-06-20 19:25:38,740 - INFO - [Train] step: 15399, loss_D: 0.075361, lr_D: 0.000800
2023-06-20 19:25:38,822 - INFO - [Train] step: 15399, loss_G: 2.280646, lr_G: 0.000800
2023-06-20 19:26:36,189 - INFO - [Train] step: 15499, loss_D: 0.017524, lr_D: 0.000800
2023-06-20 19:26:36,271 - INFO - [Train] step: 15499, loss_G: 4.809962, lr_G: 0.000800
2023-06-20 19:27:33,842 - INFO - [Train] step: 15599, loss_D: 0.008891, lr_D: 0.000800
2023-06-20 19:27:33,924 - INFO - [Train] step: 15599, loss_G: 4.788931, lr_G: 0.000800
2023-06-20 19:28:31,307 - INFO - [Train] step: 15699, loss_D: 0.004728, lr_D: 0.000800
2023-06-20 19:28:31,389 - INFO - [Train] step: 15699, loss_G: 6.328642, lr_G: 0.000800
2023-06-20 19:29:28,757 - INFO - [Train] step: 15799, loss_D: 0.018661, lr_D: 0.000800
2023-06-20 19:29:28,840 - INFO - [Train] step: 15799, loss_G: 4.092789, lr_G: 0.000800
2023-06-20 19:30:26,208 - INFO - [Train] step: 15899, loss_D: 0.029518, lr_D: 0.000800
2023-06-20 19:30:26,290 - INFO - [Train] step: 15899, loss_G: 3.635034, lr_G: 0.000800
2023-06-20 19:31:23,698 - INFO - [Train] step: 15999, loss_D: 0.008346, lr_D: 0.000800
2023-06-20 19:31:23,780 - INFO - [Train] step: 15999, loss_G: 5.080687, lr_G: 0.000800
2023-06-20 19:32:27,898 - INFO - [Eval] step: 15999, fid: 26.892873
2023-06-20 19:33:25,229 - INFO - [Train] step: 16099, loss_D: 0.003758, lr_D: 0.000800
2023-06-20 19:33:25,311 - INFO - [Train] step: 16099, loss_G: 6.190856, lr_G: 0.000800
2023-06-20 19:34:22,872 - INFO - [Train] step: 16199, loss_D: 0.017040, lr_D: 0.000800
2023-06-20 19:34:22,955 - INFO - [Train] step: 16199, loss_G: 5.007480, lr_G: 0.000800
2023-06-20 19:35:20,317 - INFO - [Train] step: 16299, loss_D: 0.011828, lr_D: 0.000800
2023-06-20 19:35:20,399 - INFO - [Train] step: 16299, loss_G: 4.986794, lr_G: 0.000800
2023-06-20 19:36:17,772 - INFO - [Train] step: 16399, loss_D: 0.046568, lr_D: 0.000800
2023-06-20 19:36:17,854 - INFO - [Train] step: 16399, loss_G: 3.820261, lr_G: 0.000800
2023-06-20 19:37:15,233 - INFO - [Train] step: 16499, loss_D: 0.003703, lr_D: 0.000800
2023-06-20 19:37:15,315 - INFO - [Train] step: 16499, loss_G: 6.416226, lr_G: 0.000800
2023-06-20 19:38:12,692 - INFO - [Train] step: 16599, loss_D: 0.015892, lr_D: 0.000800
2023-06-20 19:38:12,774 - INFO - [Train] step: 16599, loss_G: 4.495651, lr_G: 0.000800
2023-06-20 19:39:10,139 - INFO - [Train] step: 16699, loss_D: 0.010688, lr_D: 0.000800
2023-06-20 19:39:10,221 - INFO - [Train] step: 16699, loss_G: 4.937366, lr_G: 0.000800
2023-06-20 19:40:07,591 - INFO - [Train] step: 16799, loss_D: 0.023170, lr_D: 0.000800
2023-06-20 19:40:07,673 - INFO - [Train] step: 16799, loss_G: 4.051939, lr_G: 0.000800
2023-06-20 19:41:05,211 - INFO - [Train] step: 16899, loss_D: 0.008578, lr_D: 0.000800
2023-06-20 19:41:05,294 - INFO - [Train] step: 16899, loss_G: 5.037694, lr_G: 0.000800
2023-06-20 19:42:02,636 - INFO - [Train] step: 16999, loss_D: 0.008577, lr_D: 0.000800
2023-06-20 19:42:02,718 - INFO - [Train] step: 16999, loss_G: 5.178760, lr_G: 0.000800
2023-06-20 19:43:06,825 - INFO - [Eval] step: 16999, fid: 26.448220
2023-06-20 19:44:04,167 - INFO - [Train] step: 17099, loss_D: 0.010184, lr_D: 0.000800
2023-06-20 19:44:04,249 - INFO - [Train] step: 17099, loss_G: 4.061676, lr_G: 0.000800
2023-06-20 19:45:01,620 - INFO - [Train] step: 17199, loss_D: 0.024492, lr_D: 0.000800
2023-06-20 19:45:01,702 - INFO - [Train] step: 17199, loss_G: 4.277374, lr_G: 0.000800
2023-06-20 19:45:59,075 - INFO - [Train] step: 17299, loss_D: 0.136733, lr_D: 0.000800
2023-06-20 19:45:59,157 - INFO - [Train] step: 17299, loss_G: 3.530210, lr_G: 0.000800
2023-06-20 19:46:56,511 - INFO - [Train] step: 17399, loss_D: 0.045074, lr_D: 0.000800
2023-06-20 19:46:56,593 - INFO - [Train] step: 17399, loss_G: 4.529955, lr_G: 0.000800
2023-06-20 19:47:54,143 - INFO - [Train] step: 17499, loss_D: 0.023395, lr_D: 0.000800
2023-06-20 19:47:54,226 - INFO - [Train] step: 17499, loss_G: 4.381906, lr_G: 0.000800
2023-06-20 19:48:51,613 - INFO - [Train] step: 17599, loss_D: 0.017775, lr_D: 0.000800
2023-06-20 19:48:51,695 - INFO - [Train] step: 17599, loss_G: 4.811211, lr_G: 0.000800
2023-06-20 19:49:49,075 - INFO - [Train] step: 17699, loss_D: 0.007035, lr_D: 0.000800
2023-06-20 19:49:49,157 - INFO - [Train] step: 17699, loss_G: 5.286721, lr_G: 0.000800
2023-06-20 19:50:46,521 - INFO - [Train] step: 17799, loss_D: 0.011250, lr_D: 0.000800
2023-06-20 19:50:46,603 - INFO - [Train] step: 17799, loss_G: 4.750059, lr_G: 0.000800
2023-06-20 19:51:43,955 - INFO - [Train] step: 17899, loss_D: 0.065488, lr_D: 0.000800
2023-06-20 19:51:44,038 - INFO - [Train] step: 17899, loss_G: 3.741673, lr_G: 0.000800
2023-06-20 19:52:41,397 - INFO - [Train] step: 17999, loss_D: 0.002580, lr_D: 0.000800
2023-06-20 19:52:41,479 - INFO - [Train] step: 17999, loss_G: 6.290638, lr_G: 0.000800
2023-06-20 19:53:45,512 - INFO - [Eval] step: 17999, fid: 26.559749
2023-06-20 19:54:42,941 - INFO - [Train] step: 18099, loss_D: 0.025685, lr_D: 0.000800
2023-06-20 19:54:43,023 - INFO - [Train] step: 18099, loss_G: 3.815683, lr_G: 0.000800
2023-06-20 19:55:40,634 - INFO - [Train] step: 18199, loss_D: 0.020652, lr_D: 0.000800
2023-06-20 19:55:40,716 - INFO - [Train] step: 18199, loss_G: 4.609305, lr_G: 0.000800
2023-06-20 19:56:38,172 - INFO - [Train] step: 18299, loss_D: 0.002548, lr_D: 0.000800
2023-06-20 19:56:38,255 - INFO - [Train] step: 18299, loss_G: 6.744552, lr_G: 0.000800
2023-06-20 19:57:35,710 - INFO - [Train] step: 18399, loss_D: 0.007309, lr_D: 0.000800
2023-06-20 19:57:35,792 - INFO - [Train] step: 18399, loss_G: 5.584167, lr_G: 0.000800
2023-06-20 19:58:33,244 - INFO - [Train] step: 18499, loss_D: 0.020682, lr_D: 0.000800
2023-06-20 19:58:33,327 - INFO - [Train] step: 18499, loss_G: 5.756335, lr_G: 0.000800
2023-06-20 19:59:30,759 - INFO - [Train] step: 18599, loss_D: 0.015026, lr_D: 0.000800
2023-06-20 19:59:30,841 - INFO - [Train] step: 18599, loss_G: 4.711163, lr_G: 0.000800
2023-06-20 20:00:28,302 - INFO - [Train] step: 18699, loss_D: 0.009148, lr_D: 0.000800
2023-06-20 20:00:28,384 - INFO - [Train] step: 18699, loss_G: 5.264832, lr_G: 0.000800
2023-06-20 20:01:26,001 - INFO - [Train] step: 18799, loss_D: 0.086122, lr_D: 0.000800
2023-06-20 20:01:26,083 - INFO - [Train] step: 18799, loss_G: 3.431658, lr_G: 0.000800
2023-06-20 20:02:23,500 - INFO - [Train] step: 18899, loss_D: 0.013240, lr_D: 0.000800
2023-06-20 20:02:23,582 - INFO - [Train] step: 18899, loss_G: 4.414982, lr_G: 0.000800
2023-06-20 20:03:21,018 - INFO - [Train] step: 18999, loss_D: 0.007951, lr_D: 0.000800
2023-06-20 20:03:21,100 - INFO - [Train] step: 18999, loss_G: 5.538564, lr_G: 0.000800
2023-06-20 20:04:25,184 - INFO - [Eval] step: 18999, fid: 26.039452
2023-06-20 20:05:22,924 - INFO - [Train] step: 19099, loss_D: 0.036986, lr_D: 0.000800
2023-06-20 20:05:23,006 - INFO - [Train] step: 19099, loss_G: 3.364111, lr_G: 0.000800
2023-06-20 20:06:20,460 - INFO - [Train] step: 19199, loss_D: 0.015688, lr_D: 0.000800
2023-06-20 20:06:20,542 - INFO - [Train] step: 19199, loss_G: 4.603758, lr_G: 0.000800
2023-06-20 20:07:18,002 - INFO - [Train] step: 19299, loss_D: 0.114450, lr_D: 0.000800
2023-06-20 20:07:18,085 - INFO - [Train] step: 19299, loss_G: 2.677832, lr_G: 0.000800
2023-06-20 20:08:15,573 - INFO - [Train] step: 19399, loss_D: 0.012232, lr_D: 0.000800
2023-06-20 20:08:15,655 - INFO - [Train] step: 19399, loss_G: 5.086120, lr_G: 0.000800
2023-06-20 20:09:13,257 - INFO - [Train] step: 19499, loss_D: 0.001888, lr_D: 0.000800
2023-06-20 20:09:13,339 - INFO - [Train] step: 19499, loss_G: 6.516899, lr_G: 0.000800
2023-06-20 20:10:10,782 - INFO - [Train] step: 19599, loss_D: 0.011665, lr_D: 0.000800
2023-06-20 20:10:10,864 - INFO - [Train] step: 19599, loss_G: 4.700666, lr_G: 0.000800
2023-06-20 20:11:08,297 - INFO - [Train] step: 19699, loss_D: 0.071011, lr_D: 0.000800
2023-06-20 20:11:08,380 - INFO - [Train] step: 19699, loss_G: 3.423562,
lr_G: 0.000800 2023-06-20 20:12:05,815 - INFO - [Train] step: 19799, loss_D: 0.051582, lr_D: 0.000800 2023-06-20 20:12:05,897 - INFO - [Train] step: 19799, loss_G: 3.791520, lr_G: 0.000800 2023-06-20 20:13:03,332 - INFO - [Train] step: 19899, loss_D: 0.010322, lr_D: 0.000800 2023-06-20 20:13:03,415 - INFO - [Train] step: 19899, loss_G: 5.491682, lr_G: 0.000800 2023-06-20 20:14:00,864 - INFO - [Train] step: 19999, loss_D: 0.038203, lr_D: 0.000800 2023-06-20 20:14:00,947 - INFO - [Train] step: 19999, loss_G: 3.532337, lr_G: 0.000800 2023-06-20 20:15:05,034 - INFO - [Eval] step: 19999, fid: 25.522555 2023-06-20 20:16:03,086 - INFO - [Train] step: 20099, loss_D: 0.003751, lr_D: 0.000800 2023-06-20 20:16:03,169 - INFO - [Train] step: 20099, loss_G: 6.373525, lr_G: 0.000800 2023-06-20 20:17:00,625 - INFO - [Train] step: 20199, loss_D: 0.004873, lr_D: 0.000800 2023-06-20 20:17:00,707 - INFO - [Train] step: 20199, loss_G: 5.589916, lr_G: 0.000800 2023-06-20 20:17:58,163 - INFO - [Train] step: 20299, loss_D: 0.007372, lr_D: 0.000800 2023-06-20 20:17:58,246 - INFO - [Train] step: 20299, loss_G: 5.577626, lr_G: 0.000800 2023-06-20 20:18:55,710 - INFO - [Train] step: 20399, loss_D: 0.127482, lr_D: 0.000800 2023-06-20 20:18:55,793 - INFO - [Train] step: 20399, loss_G: 2.450526, lr_G: 0.000800 2023-06-20 20:19:53,256 - INFO - [Train] step: 20499, loss_D: 0.001982, lr_D: 0.000800 2023-06-20 20:19:53,339 - INFO - [Train] step: 20499, loss_G: 6.489986, lr_G: 0.000800 2023-06-20 20:20:50,839 - INFO - [Train] step: 20599, loss_D: 0.003483, lr_D: 0.000800 2023-06-20 20:20:50,921 - INFO - [Train] step: 20599, loss_G: 5.928268, lr_G: 0.000800 2023-06-20 20:21:48,589 - INFO - [Train] step: 20699, loss_D: 0.195238, lr_D: 0.000800 2023-06-20 20:21:48,672 - INFO - [Train] step: 20699, loss_G: 3.412206, lr_G: 0.000800 2023-06-20 20:22:46,152 - INFO - [Train] step: 20799, loss_D: 0.010172, lr_D: 0.000800 2023-06-20 20:22:46,234 - INFO - [Train] step: 20799, loss_G: 4.993641, lr_G: 0.000800 
2023-06-20 20:23:43,686 - INFO - [Train] step: 20899, loss_D: 0.001974, lr_D: 0.000800 2023-06-20 20:23:43,768 - INFO - [Train] step: 20899, loss_G: 6.781989, lr_G: 0.000800 2023-06-20 20:24:41,217 - INFO - [Train] step: 20999, loss_D: 0.017884, lr_D: 0.000800 2023-06-20 20:24:41,300 - INFO - [Train] step: 20999, loss_G: 4.525687, lr_G: 0.000800 2023-06-20 20:25:45,300 - INFO - [Eval] step: 20999, fid: 26.091555 2023-06-20 20:26:42,734 - INFO - [Train] step: 21099, loss_D: 0.006503, lr_D: 0.000800 2023-06-20 20:26:42,816 - INFO - [Train] step: 21099, loss_G: 4.889439, lr_G: 0.000800 2023-06-20 20:27:40,263 - INFO - [Train] step: 21199, loss_D: 0.333188, lr_D: 0.000800 2023-06-20 20:27:40,345 - INFO - [Train] step: 21199, loss_G: 1.722939, lr_G: 0.000800 2023-06-20 20:28:37,806 - INFO - [Train] step: 21299, loss_D: 0.081368, lr_D: 0.000800 2023-06-20 20:28:37,888 - INFO - [Train] step: 21299, loss_G: 1.402308, lr_G: 0.000800 2023-06-20 20:29:35,539 - INFO - [Train] step: 21399, loss_D: 0.168612, lr_D: 0.000800 2023-06-20 20:29:35,621 - INFO - [Train] step: 21399, loss_G: 1.910682, lr_G: 0.000800 2023-06-20 20:30:33,092 - INFO - [Train] step: 21499, loss_D: 0.015338, lr_D: 0.000800 2023-06-20 20:30:33,174 - INFO - [Train] step: 21499, loss_G: 5.397229, lr_G: 0.000800 2023-06-20 20:31:30,635 - INFO - [Train] step: 21599, loss_D: 0.011413, lr_D: 0.000800 2023-06-20 20:31:30,717 - INFO - [Train] step: 21599, loss_G: 5.255408, lr_G: 0.000800 2023-06-20 20:32:28,190 - INFO - [Train] step: 21699, loss_D: 0.074606, lr_D: 0.000800 2023-06-20 20:32:28,273 - INFO - [Train] step: 21699, loss_G: 3.385686, lr_G: 0.000800 2023-06-20 20:33:25,725 - INFO - [Train] step: 21799, loss_D: 0.015428, lr_D: 0.000800 2023-06-20 20:33:25,807 - INFO - [Train] step: 21799, loss_G: 4.840065, lr_G: 0.000800 2023-06-20 20:34:23,273 - INFO - [Train] step: 21899, loss_D: 0.005044, lr_D: 0.000800 2023-06-20 20:34:23,356 - INFO - [Train] step: 21899, loss_G: 6.017883, lr_G: 0.000800 2023-06-20 
20:35:21,017 - INFO - [Train] step: 21999, loss_D: 0.012010, lr_D: 0.000800 2023-06-20 20:35:21,099 - INFO - [Train] step: 21999, loss_G: 5.229033, lr_G: 0.000800 2023-06-20 20:36:25,242 - INFO - [Eval] step: 21999, fid: 25.191562 2023-06-20 20:37:23,100 - INFO - [Train] step: 22099, loss_D: 0.010556, lr_D: 0.000800 2023-06-20 20:37:23,183 - INFO - [Train] step: 22099, loss_G: 5.178079, lr_G: 0.000800 2023-06-20 20:38:20,634 - INFO - [Train] step: 22199, loss_D: 0.015245, lr_D: 0.000800 2023-06-20 20:38:20,716 - INFO - [Train] step: 22199, loss_G: 4.640171, lr_G: 0.000800 2023-06-20 20:39:18,183 - INFO - [Train] step: 22299, loss_D: 0.006905, lr_D: 0.000800 2023-06-20 20:39:18,265 - INFO - [Train] step: 22299, loss_G: 5.663626, lr_G: 0.000800 2023-06-20 20:40:15,704 - INFO - [Train] step: 22399, loss_D: 0.036224, lr_D: 0.000800 2023-06-20 20:40:15,786 - INFO - [Train] step: 22399, loss_G: 4.468874, lr_G: 0.000800 2023-06-20 20:41:13,243 - INFO - [Train] step: 22499, loss_D: 0.010740, lr_D: 0.000800 2023-06-20 20:41:13,325 - INFO - [Train] step: 22499, loss_G: 5.223192, lr_G: 0.000800 2023-06-20 20:42:10,803 - INFO - [Train] step: 22599, loss_D: 0.007376, lr_D: 0.000800 2023-06-20 20:42:10,885 - INFO - [Train] step: 22599, loss_G: 5.512803, lr_G: 0.000800 2023-06-20 20:43:08,537 - INFO - [Train] step: 22699, loss_D: 0.033011, lr_D: 0.000800 2023-06-20 20:43:08,619 - INFO - [Train] step: 22699, loss_G: 4.133174, lr_G: 0.000800 2023-06-20 20:44:06,112 - INFO - [Train] step: 22799, loss_D: 0.001000, lr_D: 0.000800 2023-06-20 20:44:06,202 - INFO - [Train] step: 22799, loss_G: 7.687656, lr_G: 0.000800 2023-06-20 20:45:03,678 - INFO - [Train] step: 22899, loss_D: 0.005652, lr_D: 0.000800 2023-06-20 20:45:03,760 - INFO - [Train] step: 22899, loss_G: 5.800527, lr_G: 0.000800 2023-06-20 20:46:01,240 - INFO - [Train] step: 22999, loss_D: 0.018649, lr_D: 0.000800 2023-06-20 20:46:01,322 - INFO - [Train] step: 22999, loss_G: 4.034059, lr_G: 0.000800 2023-06-20 20:47:05,576 - 
INFO - [Eval] step: 22999, fid: 26.844406 2023-06-20 20:48:03,008 - INFO - [Train] step: 23099, loss_D: 0.002030, lr_D: 0.000800 2023-06-20 20:48:03,091 - INFO - [Train] step: 23099, loss_G: 6.818055, lr_G: 0.000800 2023-06-20 20:49:00,545 - INFO - [Train] step: 23199, loss_D: 0.011784, lr_D: 0.000800 2023-06-20 20:49:00,628 - INFO - [Train] step: 23199, loss_G: 5.344277, lr_G: 0.000800 2023-06-20 20:49:58,259 - INFO - [Train] step: 23299, loss_D: 0.017223, lr_D: 0.000800 2023-06-20 20:49:58,342 - INFO - [Train] step: 23299, loss_G: 4.875279, lr_G: 0.000800 2023-06-20 20:50:55,801 - INFO - [Train] step: 23399, loss_D: 0.008644, lr_D: 0.000800 2023-06-20 20:50:55,883 - INFO - [Train] step: 23399, loss_G: 5.472541, lr_G: 0.000800 2023-06-20 20:51:53,317 - INFO - [Train] step: 23499, loss_D: 0.021989, lr_D: 0.000800 2023-06-20 20:51:53,399 - INFO - [Train] step: 23499, loss_G: 4.892233, lr_G: 0.000800 2023-06-20 20:52:50,853 - INFO - [Train] step: 23599, loss_D: 0.152204, lr_D: 0.000800 2023-06-20 20:52:50,935 - INFO - [Train] step: 23599, loss_G: 2.960115, lr_G: 0.000800 2023-06-20 20:53:48,384 - INFO - [Train] step: 23699, loss_D: 0.012804, lr_D: 0.000800 2023-06-20 20:53:48,467 - INFO - [Train] step: 23699, loss_G: 5.126138, lr_G: 0.000800 2023-06-20 20:54:45,917 - INFO - [Train] step: 23799, loss_D: 0.055835, lr_D: 0.000800 2023-06-20 20:54:46,000 - INFO - [Train] step: 23799, loss_G: 3.215057, lr_G: 0.000800 2023-06-20 20:55:43,448 - INFO - [Train] step: 23899, loss_D: 0.011134, lr_D: 0.000800 2023-06-20 20:55:43,530 - INFO - [Train] step: 23899, loss_G: 5.365072, lr_G: 0.000800 2023-06-20 20:56:41,167 - INFO - [Train] step: 23999, loss_D: 0.001943, lr_D: 0.000800 2023-06-20 20:56:41,249 - INFO - [Train] step: 23999, loss_G: 6.923909, lr_G: 0.000800 2023-06-20 20:57:45,375 - INFO - [Eval] step: 23999, fid: 25.264452 2023-06-20 20:58:42,792 - INFO - [Train] step: 24099, loss_D: 0.013769, lr_D: 0.000800 2023-06-20 20:58:42,875 - INFO - [Train] step: 24099, loss_G: 
5.861617, lr_G: 0.000800 2023-06-20 20:59:40,373 - INFO - [Train] step: 24199, loss_D: 0.020027, lr_D: 0.000800 2023-06-20 20:59:40,455 - INFO - [Train] step: 24199, loss_G: 4.452385, lr_G: 0.000800 2023-06-20 21:00:37,954 - INFO - [Train] step: 24299, loss_D: 0.003401, lr_D: 0.000800 2023-06-20 21:00:38,036 - INFO - [Train] step: 24299, loss_G: 5.995618, lr_G: 0.000800 2023-06-20 21:01:35,539 - INFO - [Train] step: 24399, loss_D: 0.019958, lr_D: 0.000800 2023-06-20 21:01:35,621 - INFO - [Train] step: 24399, loss_G: 4.311007, lr_G: 0.000800 2023-06-20 21:02:33,076 - INFO - [Train] step: 24499, loss_D: 0.007779, lr_D: 0.000800 2023-06-20 21:02:33,158 - INFO - [Train] step: 24499, loss_G: 5.375480, lr_G: 0.000800 2023-06-20 21:03:30,817 - INFO - [Train] step: 24599, loss_D: 0.100000, lr_D: 0.000800 2023-06-20 21:03:30,899 - INFO - [Train] step: 24599, loss_G: 2.964385, lr_G: 0.000800 2023-06-20 21:04:28,329 - INFO - [Train] step: 24699, loss_D: 0.004223, lr_D: 0.000800 2023-06-20 21:04:28,412 - INFO - [Train] step: 24699, loss_G: 6.441453, lr_G: 0.000800 2023-06-20 21:05:25,865 - INFO - [Train] step: 24799, loss_D: 0.007910, lr_D: 0.000800 2023-06-20 21:05:25,948 - INFO - [Train] step: 24799, loss_G: 5.555909, lr_G: 0.000800 2023-06-20 21:06:23,379 - INFO - [Train] step: 24899, loss_D: 0.004477, lr_D: 0.000800 2023-06-20 21:06:23,461 - INFO - [Train] step: 24899, loss_G: 5.320503, lr_G: 0.000800 2023-06-20 21:07:20,894 - INFO - [Train] step: 24999, loss_D: 0.005713, lr_D: 0.000800 2023-06-20 21:07:20,976 - INFO - [Train] step: 24999, loss_G: 5.784454, lr_G: 0.000800 2023-06-20 21:08:25,075 - INFO - [Eval] step: 24999, fid: 26.167765 2023-06-20 21:09:22,524 - INFO - [Train] step: 25099, loss_D: 0.000615, lr_D: 0.000800 2023-06-20 21:09:22,606 - INFO - [Train] step: 25099, loss_G: 7.635631, lr_G: 0.000800 2023-06-20 21:10:20,066 - INFO - [Train] step: 25199, loss_D: 0.009024, lr_D: 0.000800 2023-06-20 21:10:20,148 - INFO - [Train] step: 25199, loss_G: 4.953653, lr_G: 
0.000800 2023-06-20 21:11:17,795 - INFO - [Train] step: 25299, loss_D: 0.011211, lr_D: 0.000800 2023-06-20 21:11:17,878 - INFO - [Train] step: 25299, loss_G: 5.590804, lr_G: 0.000800 2023-06-20 21:12:15,316 - INFO - [Train] step: 25399, loss_D: 0.008275, lr_D: 0.000800 2023-06-20 21:12:15,398 - INFO - [Train] step: 25399, loss_G: 5.805459, lr_G: 0.000800 2023-06-20 21:13:12,845 - INFO - [Train] step: 25499, loss_D: 0.004602, lr_D: 0.000800 2023-06-20 21:13:12,927 - INFO - [Train] step: 25499, loss_G: 6.732613, lr_G: 0.000800 2023-06-20 21:14:10,378 - INFO - [Train] step: 25599, loss_D: 0.022863, lr_D: 0.000800 2023-06-20 21:14:10,460 - INFO - [Train] step: 25599, loss_G: 5.009499, lr_G: 0.000800 2023-06-20 21:15:07,909 - INFO - [Train] step: 25699, loss_D: 0.005245, lr_D: 0.000800 2023-06-20 21:15:07,992 - INFO - [Train] step: 25699, loss_G: 5.995178, lr_G: 0.000800 2023-06-20 21:16:05,463 - INFO - [Train] step: 25799, loss_D: 0.002739, lr_D: 0.000800 2023-06-20 21:16:05,546 - INFO - [Train] step: 25799, loss_G: 6.778831, lr_G: 0.000800 2023-06-20 21:17:03,213 - INFO - [Train] step: 25899, loss_D: 0.016181, lr_D: 0.000800 2023-06-20 21:17:03,296 - INFO - [Train] step: 25899, loss_G: 4.959986, lr_G: 0.000800 2023-06-20 21:18:00,785 - INFO - [Train] step: 25999, loss_D: 0.004177, lr_D: 0.000800 2023-06-20 21:18:00,868 - INFO - [Train] step: 25999, loss_G: 5.615060, lr_G: 0.000800 2023-06-20 21:19:04,996 - INFO - [Eval] step: 25999, fid: 25.281998 2023-06-20 21:20:02,457 - INFO - [Train] step: 26099, loss_D: 0.025516, lr_D: 0.000800 2023-06-20 21:20:02,540 - INFO - [Train] step: 26099, loss_G: 3.670362, lr_G: 0.000800 2023-06-20 21:21:00,043 - INFO - [Train] step: 26199, loss_D: 0.021325, lr_D: 0.000800 2023-06-20 21:21:00,125 - INFO - [Train] step: 26199, loss_G: 4.669709, lr_G: 0.000800 2023-06-20 21:21:57,620 - INFO - [Train] step: 26299, loss_D: 0.005857, lr_D: 0.000800 2023-06-20 21:21:57,703 - INFO - [Train] step: 26299, loss_G: 5.837061, lr_G: 0.000800 
2023-06-20 21:22:55,190 - INFO - [Train] step: 26399, loss_D: 0.002703, lr_D: 0.000800 2023-06-20 21:22:55,272 - INFO - [Train] step: 26399, loss_G: 7.080052, lr_G: 0.000800 2023-06-20 21:23:52,759 - INFO - [Train] step: 26499, loss_D: 0.011359, lr_D: 0.000800 2023-06-20 21:23:52,841 - INFO - [Train] step: 26499, loss_G: 5.376851, lr_G: 0.000800 2023-06-20 21:24:50,498 - INFO - [Train] step: 26599, loss_D: 0.011306, lr_D: 0.000800 2023-06-20 21:24:50,580 - INFO - [Train] step: 26599, loss_G: 5.589914, lr_G: 0.000800 2023-06-20 21:25:48,056 - INFO - [Train] step: 26699, loss_D: 0.001192, lr_D: 0.000800 2023-06-20 21:25:48,138 - INFO - [Train] step: 26699, loss_G: 7.472775, lr_G: 0.000800 2023-06-20 21:26:45,601 - INFO - [Train] step: 26799, loss_D: 0.010616, lr_D: 0.000800 2023-06-20 21:26:45,684 - INFO - [Train] step: 26799, loss_G: 5.512984, lr_G: 0.000800 2023-06-20 21:27:43,143 - INFO - [Train] step: 26899, loss_D: 0.002421, lr_D: 0.000800 2023-06-20 21:27:43,225 - INFO - [Train] step: 26899, loss_G: 7.072251, lr_G: 0.000800 2023-06-20 21:28:40,697 - INFO - [Train] step: 26999, loss_D: 0.325729, lr_D: 0.000800 2023-06-20 21:28:40,779 - INFO - [Train] step: 26999, loss_G: 2.300820, lr_G: 0.000800 2023-06-20 21:29:44,865 - INFO - [Eval] step: 26999, fid: 24.899292 2023-06-20 21:30:42,628 - INFO - [Train] step: 27099, loss_D: 0.020761, lr_D: 0.000800 2023-06-20 21:30:42,710 - INFO - [Train] step: 27099, loss_G: 4.347462, lr_G: 0.000800 2023-06-20 21:31:40,353 - INFO - [Train] step: 27199, loss_D: 0.001261, lr_D: 0.000800 2023-06-20 21:31:40,436 - INFO - [Train] step: 27199, loss_G: 7.359065, lr_G: 0.000800 2023-06-20 21:32:37,899 - INFO - [Train] step: 27299, loss_D: 0.002147, lr_D: 0.000800 2023-06-20 21:32:37,981 - INFO - [Train] step: 27299, loss_G: 7.699656, lr_G: 0.000800 2023-06-20 21:33:35,435 - INFO - [Train] step: 27399, loss_D: 0.011163, lr_D: 0.000800 2023-06-20 21:33:35,517 - INFO - [Train] step: 27399, loss_G: 5.632913, lr_G: 0.000800 2023-06-20 
21:34:32,950 - INFO - [Train] step: 27499, loss_D: 0.184047, lr_D: 0.000800 2023-06-20 21:34:33,032 - INFO - [Train] step: 27499, loss_G: 2.798954, lr_G: 0.000800 2023-06-20 21:35:30,483 - INFO - [Train] step: 27599, loss_D: 0.006202, lr_D: 0.000800 2023-06-20 21:35:30,565 - INFO - [Train] step: 27599, loss_G: 5.398118, lr_G: 0.000800 2023-06-20 21:36:28,020 - INFO - [Train] step: 27699, loss_D: 0.004869, lr_D: 0.000800 2023-06-20 21:36:28,102 - INFO - [Train] step: 27699, loss_G: 5.833999, lr_G: 0.000800 2023-06-20 21:37:25,533 - INFO - [Train] step: 27799, loss_D: 0.001517, lr_D: 0.000800 2023-06-20 21:37:25,615 - INFO - [Train] step: 27799, loss_G: 7.029740, lr_G: 0.000800 2023-06-20 21:38:23,241 - INFO - [Train] step: 27899, loss_D: 0.199942, lr_D: 0.000800 2023-06-20 21:38:23,323 - INFO - [Train] step: 27899, loss_G: 2.403128, lr_G: 0.000800 2023-06-20 21:39:20,793 - INFO - [Train] step: 27999, loss_D: 0.413004, lr_D: 0.000800 2023-06-20 21:39:20,876 - INFO - [Train] step: 27999, loss_G: 1.771097, lr_G: 0.000800 2023-06-20 21:40:25,002 - INFO - [Eval] step: 27999, fid: 25.545286 2023-06-20 21:41:22,452 - INFO - [Train] step: 28099, loss_D: 0.004248, lr_D: 0.000800 2023-06-20 21:41:22,534 - INFO - [Train] step: 28099, loss_G: 5.959349, lr_G: 0.000800 2023-06-20 21:42:19,999 - INFO - [Train] step: 28199, loss_D: 0.011772, lr_D: 0.000800 2023-06-20 21:42:20,082 - INFO - [Train] step: 28199, loss_G: 4.810378, lr_G: 0.000800 2023-06-20 21:43:17,569 - INFO - [Train] step: 28299, loss_D: 0.002744, lr_D: 0.000800 2023-06-20 21:43:17,652 - INFO - [Train] step: 28299, loss_G: 5.658389, lr_G: 0.000800 2023-06-20 21:44:15,256 - INFO - [Train] step: 28399, loss_D: 0.016061, lr_D: 0.000800 2023-06-20 21:44:15,338 - INFO - [Train] step: 28399, loss_G: 5.201997, lr_G: 0.000800 2023-06-20 21:45:13,095 - INFO - [Train] step: 28499, loss_D: 0.001799, lr_D: 0.000800 2023-06-20 21:45:13,177 - INFO - [Train] step: 28499, loss_G: 6.934212, lr_G: 0.000800 2023-06-20 21:46:10,763 - 
INFO - [Train] step: 28599, loss_D: 0.009509, lr_D: 0.000800 2023-06-20 21:46:10,845 - INFO - [Train] step: 28599, loss_G: 5.597548, lr_G: 0.000800 2023-06-20 21:47:08,411 - INFO - [Train] step: 28699, loss_D: 0.004966, lr_D: 0.000800 2023-06-20 21:47:08,494 - INFO - [Train] step: 28699, loss_G: 6.422562, lr_G: 0.000800 2023-06-20 21:48:06,050 - INFO - [Train] step: 28799, loss_D: 0.005451, lr_D: 0.000800 2023-06-20 21:48:06,132 - INFO - [Train] step: 28799, loss_G: 6.875413, lr_G: 0.000800 2023-06-20 21:49:03,695 - INFO - [Train] step: 28899, loss_D: 0.007226, lr_D: 0.000800 2023-06-20 21:49:03,778 - INFO - [Train] step: 28899, loss_G: 7.504360, lr_G: 0.000800 2023-06-20 21:50:01,330 - INFO - [Train] step: 28999, loss_D: 0.004669, lr_D: 0.000800 2023-06-20 21:50:01,412 - INFO - [Train] step: 28999, loss_G: 5.975873, lr_G: 0.000800 2023-06-20 21:51:05,586 - INFO - [Eval] step: 28999, fid: 25.512529 2023-06-20 21:52:03,078 - INFO - [Train] step: 29099, loss_D: 0.008837, lr_D: 0.000800 2023-06-20 21:52:03,160 - INFO - [Train] step: 29099, loss_G: 5.909943, lr_G: 0.000800 2023-06-20 21:53:00,840 - INFO - [Train] step: 29199, loss_D: 0.056548, lr_D: 0.000800 2023-06-20 21:53:00,922 - INFO - [Train] step: 29199, loss_G: 4.301550, lr_G: 0.000800 2023-06-20 21:53:58,509 - INFO - [Train] step: 29299, loss_D: 0.003898, lr_D: 0.000800 2023-06-20 21:53:58,592 - INFO - [Train] step: 29299, loss_G: 6.295677, lr_G: 0.000800 2023-06-20 21:54:56,179 - INFO - [Train] step: 29399, loss_D: 0.000931, lr_D: 0.000800 2023-06-20 21:54:56,261 - INFO - [Train] step: 29399, loss_G: 7.287049, lr_G: 0.000800 2023-06-20 21:55:53,830 - INFO - [Train] step: 29499, loss_D: 0.042233, lr_D: 0.000800 2023-06-20 21:55:53,912 - INFO - [Train] step: 29499, loss_G: 3.906187, lr_G: 0.000800 2023-06-20 21:56:51,503 - INFO - [Train] step: 29599, loss_D: 0.215042, lr_D: 0.000800 2023-06-20 21:56:51,585 - INFO - [Train] step: 29599, loss_G: 2.617697, lr_G: 0.000800 2023-06-20 21:57:49,175 - INFO - [Train] 
step: 29699, loss_D: 0.003476, lr_D: 0.000800 2023-06-20 21:57:49,257 - INFO - [Train] step: 29699, loss_G: 5.971042, lr_G: 0.000800 2023-06-20 21:58:46,999 - INFO - [Train] step: 29799, loss_D: 0.003699, lr_D: 0.000800 2023-06-20 21:58:47,082 - INFO - [Train] step: 29799, loss_G: 6.705812, lr_G: 0.000800 2023-06-20 21:59:44,677 - INFO - [Train] step: 29899, loss_D: 0.007730, lr_D: 0.000800 2023-06-20 21:59:44,759 - INFO - [Train] step: 29899, loss_G: 5.622865, lr_G: 0.000800 2023-06-20 22:00:42,322 - INFO - [Train] step: 29999, loss_D: 0.010752, lr_D: 0.000800 2023-06-20 22:00:42,404 - INFO - [Train] step: 29999, loss_G: 4.860030, lr_G: 0.000800 2023-06-20 22:01:46,556 - INFO - [Eval] step: 29999, fid: 25.527844 2023-06-20 22:02:44,213 - INFO - [Train] step: 30099, loss_D: 0.006241, lr_D: 0.000800 2023-06-20 22:02:44,296 - INFO - [Train] step: 30099, loss_G: 5.771519, lr_G: 0.000800 2023-06-20 22:03:41,868 - INFO - [Train] step: 30199, loss_D: 0.423494, lr_D: 0.000800 2023-06-20 22:03:41,951 - INFO - [Train] step: 30199, loss_G: 3.906664, lr_G: 0.000800 2023-06-20 22:04:39,542 - INFO - [Train] step: 30299, loss_D: 0.021318, lr_D: 0.000800 2023-06-20 22:04:39,625 - INFO - [Train] step: 30299, loss_G: 4.601987, lr_G: 0.000800 2023-06-20 22:05:37,409 - INFO - [Train] step: 30399, loss_D: 0.005734, lr_D: 0.000800 2023-06-20 22:05:37,492 - INFO - [Train] step: 30399, loss_G: 5.764622, lr_G: 0.000800 2023-06-20 22:06:35,066 - INFO - [Train] step: 30499, loss_D: 0.002714, lr_D: 0.000800 2023-06-20 22:06:35,148 - INFO - [Train] step: 30499, loss_G: 6.505485, lr_G: 0.000800 2023-06-20 22:07:32,746 - INFO - [Train] step: 30599, loss_D: 0.093519, lr_D: 0.000800 2023-06-20 22:07:32,829 - INFO - [Train] step: 30599, loss_G: 3.998865, lr_G: 0.000800 2023-06-20 22:08:30,402 - INFO - [Train] step: 30699, loss_D: 0.013326, lr_D: 0.000800 2023-06-20 22:08:30,485 - INFO - [Train] step: 30699, loss_G: 4.743144, lr_G: 0.000800 2023-06-20 22:09:28,080 - INFO - [Train] step: 30799, 
loss_D: 0.195429, lr_D: 0.000800 2023-06-20 22:09:28,163 - INFO - [Train] step: 30799, loss_G: 2.172445, lr_G: 0.000800 2023-06-20 22:10:25,750 - INFO - [Train] step: 30899, loss_D: 0.003579, lr_D: 0.000800 2023-06-20 22:10:25,833 - INFO - [Train] step: 30899, loss_G: 7.010318, lr_G: 0.000800 2023-06-20 22:11:23,411 - INFO - [Train] step: 30999, loss_D: 0.003529, lr_D: 0.000800 2023-06-20 22:11:23,493 - INFO - [Train] step: 30999, loss_G: 6.782528, lr_G: 0.000800 2023-06-20 22:12:27,566 - INFO - [Eval] step: 30999, fid: 25.458165 2023-06-20 22:13:25,212 - INFO - [Train] step: 31099, loss_D: 0.093581, lr_D: 0.000800 2023-06-20 22:13:25,295 - INFO - [Train] step: 31099, loss_G: 2.681876, lr_G: 0.000800 2023-06-20 22:14:22,881 - INFO - [Train] step: 31199, loss_D: 0.008077, lr_D: 0.000800 2023-06-20 22:14:22,963 - INFO - [Train] step: 31199, loss_G: 6.172701, lr_G: 0.000800 2023-06-20 22:15:20,545 - INFO - [Train] step: 31299, loss_D: 0.003267, lr_D: 0.000800 2023-06-20 22:15:20,628 - INFO - [Train] step: 31299, loss_G: 6.453305, lr_G: 0.000800 2023-06-20 22:16:18,196 - INFO - [Train] step: 31399, loss_D: 0.008948, lr_D: 0.000800 2023-06-20 22:16:18,279 - INFO - [Train] step: 31399, loss_G: 4.827834, lr_G: 0.000800 2023-06-20 22:17:15,873 - INFO - [Train] step: 31499, loss_D: 0.859273, lr_D: 0.000800 2023-06-20 22:17:15,956 - INFO - [Train] step: 31499, loss_G: 3.480356, lr_G: 0.000800 2023-06-20 22:18:13,516 - INFO - [Train] step: 31599, loss_D: 0.003260, lr_D: 0.000800 2023-06-20 22:18:13,599 - INFO - [Train] step: 31599, loss_G: 6.631991, lr_G: 0.000800 2023-06-20 22:19:11,346 - INFO - [Train] step: 31699, loss_D: 0.001322, lr_D: 0.000800 2023-06-20 22:19:11,429 - INFO - [Train] step: 31699, loss_G: 8.001046, lr_G: 0.000800 2023-06-20 22:20:08,998 - INFO - [Train] step: 31799, loss_D: 0.021174, lr_D: 0.000800 2023-06-20 22:20:09,081 - INFO - [Train] step: 31799, loss_G: 4.666618, lr_G: 0.000800 2023-06-20 22:21:06,673 - INFO - [Train] step: 31899, loss_D: 0.007832, 
lr_D: 0.000800 2023-06-20 22:21:06,756 - INFO - [Train] step: 31899, loss_G: 5.668548, lr_G: 0.000800 2023-06-20 22:22:04,328 - INFO - [Train] step: 31999, loss_D: 0.003817, lr_D: 0.000800 2023-06-20 22:22:04,411 - INFO - [Train] step: 31999, loss_G: 6.591322, lr_G: 0.000800 2023-06-20 22:23:08,455 - INFO - [Eval] step: 31999, fid: 25.265765 2023-06-20 22:24:05,897 - INFO - [Train] step: 32099, loss_D: 0.006593, lr_D: 0.000800 2023-06-20 22:24:05,979 - INFO - [Train] step: 32099, loss_G: 6.646133, lr_G: 0.000800 2023-06-20 22:25:03,445 - INFO - [Train] step: 32199, loss_D: 0.005324, lr_D: 0.000800 2023-06-20 22:25:03,528 - INFO - [Train] step: 32199, loss_G: 6.060289, lr_G: 0.000800 2023-06-20 22:26:01,023 - INFO - [Train] step: 32299, loss_D: 0.004750, lr_D: 0.000800 2023-06-20 22:26:01,105 - INFO - [Train] step: 32299, loss_G: 6.356856, lr_G: 0.000800 2023-06-20 22:26:58,766 - INFO - [Train] step: 32399, loss_D: 0.023967, lr_D: 0.000800 2023-06-20 22:26:58,849 - INFO - [Train] step: 32399, loss_G: 5.371526, lr_G: 0.000800 2023-06-20 22:27:56,317 - INFO - [Train] step: 32499, loss_D: 0.001810, lr_D: 0.000800 2023-06-20 22:27:56,399 - INFO - [Train] step: 32499, loss_G: 6.695422, lr_G: 0.000800 2023-06-20 22:28:53,860 - INFO - [Train] step: 32599, loss_D: 0.012608, lr_D: 0.000800 2023-06-20 22:28:53,943 - INFO - [Train] step: 32599, loss_G: 5.649142, lr_G: 0.000800 2023-06-20 22:29:51,409 - INFO - [Train] step: 32699, loss_D: 0.005204, lr_D: 0.000800 2023-06-20 22:29:51,492 - INFO - [Train] step: 32699, loss_G: 5.518159, lr_G: 0.000800 2023-06-20 22:30:48,936 - INFO - [Train] step: 32799, loss_D: 0.226376, lr_D: 0.000800 2023-06-20 22:30:49,018 - INFO - [Train] step: 32799, loss_G: 1.736880, lr_G: 0.000800 2023-06-20 22:31:46,501 - INFO - [Train] step: 32899, loss_D: 0.003372, lr_D: 0.000800 2023-06-20 22:31:46,584 - INFO - [Train] step: 32899, loss_G: 5.742576, lr_G: 0.000800 2023-06-20 22:32:44,224 - INFO - [Train] step: 32999, loss_D: 0.018129, lr_D: 0.000800 
2023-06-20 22:32:44,307 - INFO - [Train] step: 32999, loss_G: 4.696163, lr_G: 0.000800 2023-06-20 22:33:48,356 - INFO - [Eval] step: 32999, fid: 25.226989 2023-06-20 22:34:45,793 - INFO - [Train] step: 33099, loss_D: 0.003303, lr_D: 0.000800 2023-06-20 22:34:45,876 - INFO - [Train] step: 33099, loss_G: 7.189727, lr_G: 0.000800 2023-06-20 22:35:43,358 - INFO - [Train] step: 33199, loss_D: 0.002585, lr_D: 0.000800 2023-06-20 22:35:43,440 - INFO - [Train] step: 33199, loss_G: 5.980851, lr_G: 0.000800 2023-06-20 22:36:40,937 - INFO - [Train] step: 33299, loss_D: 0.002094, lr_D: 0.000800 2023-06-20 22:36:41,020 - INFO - [Train] step: 33299, loss_G: 7.288518, lr_G: 0.000800 2023-06-20 22:37:38,520 - INFO - [Train] step: 33399, loss_D: 0.014458, lr_D: 0.000800 2023-06-20 22:37:38,603 - INFO - [Train] step: 33399, loss_G: 4.699426, lr_G: 0.000800 2023-06-20 22:38:36,069 - INFO - [Train] step: 33499, loss_D: 0.002952, lr_D: 0.000800 2023-06-20 22:38:36,152 - INFO - [Train] step: 33499, loss_G: 6.823879, lr_G: 0.000800 2023-06-20 22:39:33,636 - INFO - [Train] step: 33599, loss_D: 0.011904, lr_D: 0.000800 2023-06-20 22:39:33,717 - INFO - [Train] step: 33599, loss_G: 5.475242, lr_G: 0.000800 2023-06-20 22:40:31,378 - INFO - [Train] step: 33699, loss_D: 0.001174, lr_D: 0.000800 2023-06-20 22:40:31,460 - INFO - [Train] step: 33699, loss_G: 6.914921, lr_G: 0.000800 2023-06-20 22:41:28,931 - INFO - [Train] step: 33799, loss_D: 0.005395, lr_D: 0.000800 2023-06-20 22:41:29,014 - INFO - [Train] step: 33799, loss_G: 6.230540, lr_G: 0.000800 2023-06-20 22:42:26,487 - INFO - [Train] step: 33899, loss_D: 0.005059, lr_D: 0.000800 2023-06-20 22:42:26,569 - INFO - [Train] step: 33899, loss_G: 6.182581, lr_G: 0.000800 2023-06-20 22:43:24,056 - INFO - [Train] step: 33999, loss_D: 0.004559, lr_D: 0.000800 2023-06-20 22:43:24,139 - INFO - [Train] step: 33999, loss_G: 6.003775, lr_G: 0.000800 2023-06-20 22:44:28,256 - INFO - [Eval] step: 33999, fid: 25.541488 2023-06-20 22:45:25,710 - INFO - 
[Train] step: 34099, loss_D: 0.004716, lr_D: 0.000800 2023-06-20 22:45:25,792 - INFO - [Train] step: 34099, loss_G: 5.984061, lr_G: 0.000800 2023-06-20 22:46:23,266 - INFO - [Train] step: 34199, loss_D: 0.003180, lr_D: 0.000800 2023-06-20 22:46:23,349 - INFO - [Train] step: 34199, loss_G: 7.138955, lr_G: 0.000800 2023-06-20 22:47:21,006 - INFO - [Train] step: 34299, loss_D: 0.000865, lr_D: 0.000800 2023-06-20 22:47:21,089 - INFO - [Train] step: 34299, loss_G: 7.670335, lr_G: 0.000800 2023-06-20 22:48:18,569 - INFO - [Train] step: 34399, loss_D: 0.002306, lr_D: 0.000800 2023-06-20 22:48:18,651 - INFO - [Train] step: 34399, loss_G: 6.767827, lr_G: 0.000800 2023-06-20 22:49:16,133 - INFO - [Train] step: 34499, loss_D: 0.016624, lr_D: 0.000800 2023-06-20 22:49:16,216 - INFO - [Train] step: 34499, loss_G: 4.652476, lr_G: 0.000800 2023-06-20 22:50:13,691 - INFO - [Train] step: 34599, loss_D: 0.003859, lr_D: 0.000800 2023-06-20 22:50:13,773 - INFO - [Train] step: 34599, loss_G: 6.019648, lr_G: 0.000800 2023-06-20 22:51:11,234 - INFO - [Train] step: 34699, loss_D: 0.046591, lr_D: 0.000800 2023-06-20 22:51:11,316 - INFO - [Train] step: 34699, loss_G: 4.152708, lr_G: 0.000800 2023-06-20 22:52:08,770 - INFO - [Train] step: 34799, loss_D: 0.010200, lr_D: 0.000800 2023-06-20 22:52:08,853 - INFO - [Train] step: 34799, loss_G: 5.465000, lr_G: 0.000800 2023-06-20 22:53:06,317 - INFO - [Train] step: 34899, loss_D: 0.004638, lr_D: 0.000800 2023-06-20 22:53:06,399 - INFO - [Train] step: 34899, loss_G: 5.314668, lr_G: 0.000800 2023-06-20 22:54:04,028 - INFO - [Train] step: 34999, loss_D: 0.001488, lr_D: 0.000800 2023-06-20 22:54:04,110 - INFO - [Train] step: 34999, loss_G: 6.951237, lr_G: 0.000800 2023-06-20 22:55:08,185 - INFO - [Eval] step: 34999, fid: 25.759812 2023-06-20 22:56:05,612 - INFO - [Train] step: 35099, loss_D: 0.304674, lr_D: 0.000800 2023-06-20 22:56:05,695 - INFO - [Train] step: 35099, loss_G: 1.742073, lr_G: 0.000800 2023-06-20 22:57:03,165 - INFO - [Train] step: 
35199, loss_D: 0.005304, lr_D: 0.000800
2023-06-20 22:57:03,248 - INFO - [Train] step: 35199, loss_G: 6.403793, lr_G: 0.000800
2023-06-20 22:58:00,712 - INFO - [Train] step: 35299, loss_D: 0.008708, lr_D: 0.000800
2023-06-20 22:58:00,795 - INFO - [Train] step: 35299, loss_G: 5.703621, lr_G: 0.000800
2023-06-20 22:58:58,267 - INFO - [Train] step: 35399, loss_D: 0.000984, lr_D: 0.000800
2023-06-20 22:58:58,349 - INFO - [Train] step: 35399, loss_G: 7.664944, lr_G: 0.000800
2023-06-20 22:59:55,829 - INFO - [Train] step: 35499, loss_D: 0.003193, lr_D: 0.000800
2023-06-20 22:59:55,912 - INFO - [Train] step: 35499, loss_G: 6.703268, lr_G: 0.000800
2023-06-20 23:00:53,564 - INFO - [Train] step: 35599, loss_D: 0.008560, lr_D: 0.000800
2023-06-20 23:00:53,646 - INFO - [Train] step: 35599, loss_G: 5.999189, lr_G: 0.000800
2023-06-20 23:01:51,117 - INFO - [Train] step: 35699, loss_D: 0.002047, lr_D: 0.000800
2023-06-20 23:01:51,200 - INFO - [Train] step: 35699, loss_G: 6.507125, lr_G: 0.000800
2023-06-20 23:02:48,683 - INFO - [Train] step: 35799, loss_D: 0.009309, lr_D: 0.000800
2023-06-20 23:02:48,766 - INFO - [Train] step: 35799, loss_G: 5.194795, lr_G: 0.000800
2023-06-20 23:03:46,236 - INFO - [Train] step: 35899, loss_D: 0.005663, lr_D: 0.000800
2023-06-20 23:03:46,319 - INFO - [Train] step: 35899, loss_G: 5.485140, lr_G: 0.000800
2023-06-20 23:04:43,795 - INFO - [Train] step: 35999, loss_D: 0.001752, lr_D: 0.000800
2023-06-20 23:04:43,877 - INFO - [Train] step: 35999, loss_G: 7.093441, lr_G: 0.000800
2023-06-20 23:05:48,033 - INFO - [Eval] step: 35999, fid: 26.256785
2023-06-20 23:06:45,492 - INFO - [Train] step: 36099, loss_D: 0.002911, lr_D: 0.000800
2023-06-20 23:06:45,575 - INFO - [Train] step: 36099, loss_G: 6.017653, lr_G: 0.000800
2023-06-20 23:07:43,054 - INFO - [Train] step: 36199, loss_D: 0.006131, lr_D: 0.000800
2023-06-20 23:07:43,136 - INFO - [Train] step: 36199, loss_G: 6.950406, lr_G: 0.000800
2023-06-20 23:08:40,802 - INFO - [Train] step: 36299, loss_D: 0.001763, lr_D: 0.000800
2023-06-20 23:08:40,885 - INFO - [Train] step: 36299, loss_G: 6.156127, lr_G: 0.000800
2023-06-20 23:09:38,373 - INFO - [Train] step: 36399, loss_D: 0.002217, lr_D: 0.000800
2023-06-20 23:09:38,455 - INFO - [Train] step: 36399, loss_G: 6.696917, lr_G: 0.000800
2023-06-20 23:10:35,955 - INFO - [Train] step: 36499, loss_D: 0.367182, lr_D: 0.000800
2023-06-20 23:10:36,037 - INFO - [Train] step: 36499, loss_G: 1.694500, lr_G: 0.000800
2023-06-20 23:11:33,527 - INFO - [Train] step: 36599, loss_D: 0.005957, lr_D: 0.000800
2023-06-20 23:11:33,609 - INFO - [Train] step: 36599, loss_G: 5.863786, lr_G: 0.000800
2023-06-20 23:12:31,109 - INFO - [Train] step: 36699, loss_D: 0.002731, lr_D: 0.000800
2023-06-20 23:12:31,191 - INFO - [Train] step: 36699, loss_G: 7.294751, lr_G: 0.000800
2023-06-20 23:13:28,665 - INFO - [Train] step: 36799, loss_D: 0.046961, lr_D: 0.000800
2023-06-20 23:13:28,748 - INFO - [Train] step: 36799, loss_G: 4.408784, lr_G: 0.000800
2023-06-20 23:14:26,406 - INFO - [Train] step: 36899, loss_D: 0.005921, lr_D: 0.000800
2023-06-20 23:14:26,488 - INFO - [Train] step: 36899, loss_G: 6.682759, lr_G: 0.000800
2023-06-20 23:15:23,930 - INFO - [Train] step: 36999, loss_D: 0.001430, lr_D: 0.000800
2023-06-20 23:15:24,012 - INFO - [Train] step: 36999, loss_G: 7.716665, lr_G: 0.000800
2023-06-20 23:16:28,111 - INFO - [Eval] step: 36999, fid: 25.155429
2023-06-20 23:17:25,544 - INFO - [Train] step: 37099, loss_D: 0.019604, lr_D: 0.000800
2023-06-20 23:17:25,627 - INFO - [Train] step: 37099, loss_G: 5.001217, lr_G: 0.000800
2023-06-20 23:18:23,079 - INFO - [Train] step: 37199, loss_D: 0.001813, lr_D: 0.000800
2023-06-20 23:18:23,162 - INFO - [Train] step: 37199, loss_G: 7.110353, lr_G: 0.000800
2023-06-20 23:19:20,617 - INFO - [Train] step: 37299, loss_D: 0.002856, lr_D: 0.000800
2023-06-20 23:19:20,700 - INFO - [Train] step: 37299, loss_G: 6.629059, lr_G: 0.000800
2023-06-20 23:20:18,164 - INFO - [Train] step: 37399, loss_D: 0.004231, lr_D: 0.000800
2023-06-20 23:20:18,247 - INFO - [Train] step: 37399, loss_G: 6.176263, lr_G: 0.000800
2023-06-20 23:21:15,699 - INFO - [Train] step: 37499, loss_D: 0.001328, lr_D: 0.000800
2023-06-20 23:21:15,781 - INFO - [Train] step: 37499, loss_G: 7.244343, lr_G: 0.000800
2023-06-20 23:22:13,449 - INFO - [Train] step: 37599, loss_D: 0.001734, lr_D: 0.000800
2023-06-20 23:22:13,531 - INFO - [Train] step: 37599, loss_G: 6.850110, lr_G: 0.000800
2023-06-20 23:23:11,000 - INFO - [Train] step: 37699, loss_D: 0.009562, lr_D: 0.000800
2023-06-20 23:23:11,082 - INFO - [Train] step: 37699, loss_G: 4.957561, lr_G: 0.000800
2023-06-20 23:24:08,541 - INFO - [Train] step: 37799, loss_D: 0.001873, lr_D: 0.000800
2023-06-20 23:24:08,624 - INFO - [Train] step: 37799, loss_G: 7.260859, lr_G: 0.000800
2023-06-20 23:25:06,082 - INFO - [Train] step: 37899, loss_D: 0.001138, lr_D: 0.000800
2023-06-20 23:25:06,165 - INFO - [Train] step: 37899, loss_G: 8.025046, lr_G: 0.000800
2023-06-20 23:26:03,632 - INFO - [Train] step: 37999, loss_D: 0.006886, lr_D: 0.000800
2023-06-20 23:26:03,715 - INFO - [Train] step: 37999, loss_G: 6.441565, lr_G: 0.000800
2023-06-20 23:27:07,858 - INFO - [Eval] step: 37999, fid: 25.590791
2023-06-20 23:28:05,313 - INFO - [Train] step: 38099, loss_D: 0.004456, lr_D: 0.000800
2023-06-20 23:28:05,395 - INFO - [Train] step: 38099, loss_G: 5.917461, lr_G: 0.000800
2023-06-20 23:29:03,044 - INFO - [Train] step: 38199, loss_D: 0.028038, lr_D: 0.000800
2023-06-20 23:29:03,127 - INFO - [Train] step: 38199, loss_G: 3.713384, lr_G: 0.000800
2023-06-20 23:30:00,583 - INFO - [Train] step: 38299, loss_D: 0.003077, lr_D: 0.000800
2023-06-20 23:30:00,666 - INFO - [Train] step: 38299, loss_G: 6.926782, lr_G: 0.000800
2023-06-20 23:30:58,132 - INFO - [Train] step: 38399, loss_D: 0.007003, lr_D: 0.000800
2023-06-20 23:30:58,215 - INFO - [Train] step: 38399, loss_G: 6.217010, lr_G: 0.000800
2023-06-20 23:31:55,654 - INFO - [Train] step: 38499, loss_D: 0.022324, lr_D: 0.000800
2023-06-20 23:31:55,737 - INFO - [Train] step: 38499, loss_G: 4.749743, lr_G: 0.000800
2023-06-20 23:32:53,185 - INFO - [Train] step: 38599, loss_D: 0.002229, lr_D: 0.000800
2023-06-20 23:32:53,268 - INFO - [Train] step: 38599, loss_G: 7.160068, lr_G: 0.000800
2023-06-20 23:33:50,746 - INFO - [Train] step: 38699, loss_D: 0.003003, lr_D: 0.000800
2023-06-20 23:33:50,828 - INFO - [Train] step: 38699, loss_G: 6.808163, lr_G: 0.000800
2023-06-20 23:34:48,295 - INFO - [Train] step: 38799, loss_D: 0.002326, lr_D: 0.000800
2023-06-20 23:34:48,376 - INFO - [Train] step: 38799, loss_G: 6.654353, lr_G: 0.000800
2023-06-20 23:35:45,961 - INFO - [Train] step: 38899, loss_D: 0.002200, lr_D: 0.000800
2023-06-20 23:35:46,044 - INFO - [Train] step: 38899, loss_G: 6.955550, lr_G: 0.000800
2023-06-20 23:36:43,500 - INFO - [Train] step: 38999, loss_D: 0.014081, lr_D: 0.000800
2023-06-20 23:36:43,582 - INFO - [Train] step: 38999, loss_G: 5.478423, lr_G: 0.000800
2023-06-20 23:37:47,733 - INFO - [Eval] step: 38999, fid: 25.652421
2023-06-20 23:38:45,189 - INFO - [Train] step: 39099, loss_D: 0.001823, lr_D: 0.000800
2023-06-20 23:38:45,272 - INFO - [Train] step: 39099, loss_G: 6.203312, lr_G: 0.000800
2023-06-20 23:39:42,806 - INFO - [Train] step: 39199, loss_D: 0.006968, lr_D: 0.000800
2023-06-20 23:39:42,888 - INFO - [Train] step: 39199, loss_G: 6.224435, lr_G: 0.000800
2023-06-20 23:40:40,409 - INFO - [Train] step: 39299, loss_D: 0.003028, lr_D: 0.000800
2023-06-20 23:40:40,492 - INFO - [Train] step: 39299, loss_G: 6.717517, lr_G: 0.000800
2023-06-20 23:41:38,050 - INFO - [Train] step: 39399, loss_D: 0.001922, lr_D: 0.000800
2023-06-20 23:41:38,132 - INFO - [Train] step: 39399, loss_G: 7.657255, lr_G: 0.000800
2023-06-20 23:42:35,850 - INFO - [Train] step: 39499, loss_D: 0.001143, lr_D: 0.000800
2023-06-20 23:42:35,932 - INFO - [Train] step: 39499, loss_G: 6.409805, lr_G: 0.000800
2023-06-20 23:43:33,476 - INFO - [Train] step: 39599, loss_D: 0.016445, lr_D: 0.000800
2023-06-20 23:43:33,558 - INFO - [Train] step: 39599, loss_G: 5.113930, lr_G: 0.000800
2023-06-20 23:44:31,063 - INFO - [Train] step: 39699, loss_D: 0.002409, lr_D: 0.000800
2023-06-20 23:44:31,146 - INFO - [Train] step: 39699, loss_G: 6.956886, lr_G: 0.000800
2023-06-20 23:45:28,627 - INFO - [Train] step: 39799, loss_D: 0.039306, lr_D: 0.000800
2023-06-20 23:45:28,710 - INFO - [Train] step: 39799, loss_G: 3.923831, lr_G: 0.000800
2023-06-20 23:46:26,214 - INFO - [Train] step: 39899, loss_D: 0.004975, lr_D: 0.000800
2023-06-20 23:46:26,296 - INFO - [Train] step: 39899, loss_G: 5.772897, lr_G: 0.000800
2023-06-20 23:47:23,814 - INFO - [Train] step: 39999, loss_D: 0.004397, lr_D: 0.000800
2023-06-20 23:47:23,897 - INFO - [Train] step: 39999, loss_G: 6.648211, lr_G: 0.000800
2023-06-20 23:48:28,042 - INFO - [Eval] step: 39999, fid: 25.403855
2023-06-20 23:48:28,213 - INFO - Best FID score: 24.899292394803524
2023-06-20 23:48:28,213 - INFO - End of training
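The [Eval] records above follow a fixed `[Eval] step: N, fid: F` pattern, so the FID curve can be recovered from the raw log with a simple regular expression. A minimal sketch (the `parse_fid` helper is our own for illustration, not part of the training script that produced this log):

```python
import re

# Matches evaluation records like:
#   2023-06-20 23:48:28,042 - INFO - [Eval] step: 39999, fid: 25.403855
EVAL_RE = re.compile(r"\[Eval\] step: (\d+), fid: ([\d.]+)")

def parse_fid(log_text: str) -> list[tuple[int, float]]:
    """Return (step, fid) pairs in the order they appear in the log."""
    return [(int(step), float(fid)) for step, fid in EVAL_RE.findall(log_text)]

sample = (
    "2023-06-20 23:48:28,042 - INFO - [Eval] step: 39999, fid: 25.403855\n"
    "2023-06-20 23:48:28,213 - INFO - Best FID score: 24.899292394803524\n"
)
print(parse_fid(sample))  # -> [(39999, 25.403855)]
```

The same approach extends to the `loss_D`/`loss_G` records by adjusting the pattern; note the "Best FID score" summary line intentionally does not match the per-step pattern.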