2023-07-11 11:32:41,232 - INFO - Experiment directory: ./runs/cgan_cifar10/
2023-07-11 11:32:41,232 - INFO - Number of processes: 1
2023-07-11 11:32:41,232 - INFO - Distributed type: DistributedType.NO
2023-07-11 11:32:41,232 - INFO - Mixed precision: no
2023-07-11 11:32:41,232 - INFO - ==============================
2023-07-11 11:32:41,737 - INFO - Size of training set: 50000
2023-07-11 11:32:41,738 - INFO - Batch size per process: 512
2023-07-11 11:32:41,738 - INFO - Total batch size: 512
2023-07-11 11:32:41,738 - INFO - ==============================
2023-07-11 11:32:42,715 - INFO - Start training...
2023-07-11 11:33:40,187 - INFO - [Train] step: 99, loss_D: 0.001627, lr_D: 0.000800
2023-07-11 11:33:40,271 - INFO - [Train] step: 99, loss_G: 8.997356, lr_G: 0.000800
2023-07-11 11:34:36,377 - INFO - [Train] step: 199, loss_D: 0.032951, lr_D: 0.000800
2023-07-11 11:34:36,461 - INFO - [Train] step: 199, loss_G: 4.610237, lr_G: 0.000800
2023-07-11 11:35:32,676 - INFO - [Train] step: 299, loss_D: 0.014280, lr_D: 0.000800
2023-07-11 11:35:32,759 - INFO - [Train] step: 299, loss_G: 5.337371, lr_G: 0.000800
2023-07-11 11:36:28,994 - INFO - [Train] step: 399, loss_D: 0.009693, lr_D: 0.000800
2023-07-11 11:36:29,078 - INFO - [Train] step: 399, loss_G: 5.491062, lr_G: 0.000800
2023-07-11 11:37:25,366 - INFO - [Train] step: 499, loss_D: 0.033736, lr_D: 0.000800
2023-07-11 11:37:25,450 - INFO - [Train] step: 499, loss_G: 4.679213, lr_G: 0.000800
2023-07-11 11:38:21,736 - INFO - [Train] step: 599, loss_D: 0.011892, lr_D: 0.000800
2023-07-11 11:38:21,820 - INFO - [Train] step: 599, loss_G: 4.816560, lr_G: 0.000800
2023-07-11 11:39:18,235 - INFO - [Train] step: 699, loss_D: 0.031111, lr_D: 0.000800
2023-07-11 11:39:18,319 - INFO - [Train] step: 699, loss_G: 5.084354, lr_G: 0.000800
2023-07-11 11:40:14,587 - INFO - [Train] step: 799, loss_D: 0.023191, lr_D: 0.000800
2023-07-11 11:40:14,671 - INFO - [Train] step: 799, loss_G: 4.651308, lr_G: 0.000800
2023-07-11 11:41:10,961 - INFO - [Train] step: 899, loss_D: 0.015397, lr_D: 0.000800
2023-07-11 11:41:11,045 - INFO - [Train] step: 899, loss_G: 5.959859, lr_G: 0.000800
2023-07-11 11:42:07,312 - INFO - [Train] step: 999, loss_D: 0.026454, lr_D: 0.000800
2023-07-11 11:42:07,396 - INFO - [Train] step: 999, loss_G: 4.833085, lr_G: 0.000800
2023-07-11 11:43:11,563 - INFO - [Eval] step: 999, fid: 128.158505
2023-07-11 11:44:08,003 - INFO - [Train] step: 1099, loss_D: 0.008656, lr_D: 0.000800
2023-07-11 11:44:08,087 - INFO - [Train] step: 1099, loss_G: 6.271304, lr_G: 0.000800
2023-07-11 11:45:04,400 - INFO - [Train] step: 1199, loss_D: 0.021059, lr_D: 0.000800
2023-07-11 11:45:04,484 - INFO - [Train] step: 1199, loss_G: 5.619766, lr_G: 0.000800
2023-07-11 11:46:00,938 - INFO - [Train] step: 1299, loss_D: 0.014379, lr_D: 0.000800
2023-07-11 11:46:01,022 - INFO - [Train] step: 1299, loss_G: 5.458647, lr_G: 0.000800
2023-07-11 11:46:57,347 - INFO - [Train] step: 1399, loss_D: 0.026529, lr_D: 0.000800
2023-07-11 11:46:57,431 - INFO - [Train] step: 1399, loss_G: 5.784266, lr_G: 0.000800
2023-07-11 11:47:53,788 - INFO - [Train] step: 1499, loss_D: 0.013564, lr_D: 0.000800
2023-07-11 11:47:53,872 - INFO - [Train] step: 1499, loss_G: 5.812286, lr_G: 0.000800
2023-07-11 11:48:50,200 - INFO - [Train] step: 1599, loss_D: 0.032264, lr_D: 0.000800
2023-07-11 11:48:50,284 - INFO - [Train] step: 1599, loss_G: 5.074686, lr_G: 0.000800
2023-07-11 11:49:46,612 - INFO - [Train] step: 1699, loss_D: 0.008307, lr_D: 0.000800
2023-07-11 11:49:46,696 - INFO - [Train] step: 1699, loss_G: 6.551968, lr_G: 0.000800
2023-07-11 11:50:43,011 - INFO - [Train] step: 1799, loss_D: 0.026635, lr_D: 0.000800
2023-07-11 11:50:43,095 - INFO - [Train] step: 1799, loss_G: 5.197975, lr_G: 0.000800
2023-07-11 11:51:39,396 - INFO - [Train] step: 1899, loss_D: 0.063450, lr_D: 0.000800
2023-07-11 11:51:39,479 - INFO - [Train] step: 1899, loss_G: 3.990579, lr_G: 0.000800
2023-07-11 11:52:35,925 - INFO - [Train] step: 1999, loss_D: 0.029807, lr_D: 0.000800
2023-07-11 11:52:36,009 - INFO - [Train] step: 1999, loss_G: 5.723417, lr_G: 0.000800
2023-07-11 11:53:39,937 - INFO - [Eval] step: 1999, fid: 79.083606
2023-07-11 11:54:36,480 - INFO - [Train] step: 2099, loss_D: 0.028993, lr_D: 0.000800
2023-07-11 11:54:36,565 - INFO - [Train] step: 2099, loss_G: 4.681474, lr_G: 0.000800
2023-07-11 11:55:32,907 - INFO - [Train] step: 2199, loss_D: 0.022322, lr_D: 0.000800
2023-07-11 11:55:32,991 - INFO - [Train] step: 2199, loss_G: 4.906097, lr_G: 0.000800
2023-07-11 11:56:29,327 - INFO - [Train] step: 2299, loss_D: 0.115781, lr_D: 0.000800
2023-07-11 11:56:29,411 - INFO - [Train] step: 2299, loss_G: 5.437107, lr_G: 0.000800
2023-07-11 11:57:25,744 - INFO - [Train] step: 2399, loss_D: 0.032049, lr_D: 0.000800
2023-07-11 11:57:25,828 - INFO - [Train] step: 2399, loss_G: 5.646049, lr_G: 0.000800
2023-07-11 11:58:22,156 - INFO - [Train] step: 2499, loss_D: 0.016577, lr_D: 0.000800
2023-07-11 11:58:22,240 - INFO - [Train] step: 2499, loss_G: 5.342746, lr_G: 0.000800
2023-07-11 11:59:18,711 - INFO - [Train] step: 2599, loss_D: 0.024346, lr_D: 0.000800
2023-07-11 11:59:18,795 - INFO - [Train] step: 2599, loss_G: 5.324674, lr_G: 0.000800
2023-07-11 12:00:15,116 - INFO - [Train] step: 2699, loss_D: 0.065964, lr_D: 0.000800
2023-07-11 12:00:15,200 - INFO - [Train] step: 2699, loss_G: 4.916304, lr_G: 0.000800
2023-07-11 12:01:11,518 - INFO - [Train] step: 2799, loss_D: 0.075717, lr_D: 0.000800
2023-07-11 12:01:11,603 - INFO - [Train] step: 2799, loss_G: 4.272255, lr_G: 0.000800
2023-07-11 12:02:07,921 - INFO - [Train] step: 2899, loss_D: 0.041682, lr_D: 0.000800
2023-07-11 12:02:08,006 - INFO - [Train] step: 2899, loss_G: 4.634606, lr_G: 0.000800
2023-07-11 12:03:04,320 - INFO - [Train] step: 2999, loss_D: 0.085065, lr_D: 0.000800
2023-07-11 12:03:04,404 - INFO - [Train] step: 2999, loss_G: 4.004360, lr_G: 0.000800
2023-07-11 12:04:08,493 - INFO - [Eval] step: 2999, fid: 58.172546
2023-07-11 12:05:05,031 - INFO - [Train] step: 3099, loss_D: 0.159297, lr_D: 0.000800
2023-07-11 12:05:05,115 - INFO - [Train] step: 3099, loss_G: 4.403972, lr_G: 0.000800
2023-07-11 12:06:01,508 - INFO - [Train] step: 3199, loss_D: 0.029541, lr_D: 0.000800
2023-07-11 12:06:01,593 - INFO - [Train] step: 3199, loss_G: 4.741822, lr_G: 0.000800
2023-07-11 12:06:58,227 - INFO - [Train] step: 3299, loss_D: 0.054262, lr_D: 0.000800
2023-07-11 12:06:58,312 - INFO - [Train] step: 3299, loss_G: 4.842682, lr_G: 0.000800
2023-07-11 12:07:54,800 - INFO - [Train] step: 3399, loss_D: 0.030436, lr_D: 0.000800
2023-07-11 12:07:54,884 - INFO - [Train] step: 3399, loss_G: 5.227071, lr_G: 0.000800
2023-07-11 12:08:51,363 - INFO - [Train] step: 3499, loss_D: 0.037491, lr_D: 0.000800
2023-07-11 12:08:51,447 - INFO - [Train] step: 3499, loss_G: 5.755805, lr_G: 0.000800
2023-07-11 12:09:47,921 - INFO - [Train] step: 3599, loss_D: 0.047046, lr_D: 0.000800
2023-07-11 12:09:48,005 - INFO - [Train] step: 3599, loss_G: 4.720422, lr_G: 0.000800
2023-07-11 12:10:44,467 - INFO - [Train] step: 3699, loss_D: 0.055668, lr_D: 0.000800
2023-07-11 12:10:44,551 - INFO - [Train] step: 3699, loss_G: 3.561963, lr_G: 0.000800
2023-07-11 12:11:41,006 - INFO - [Train] step: 3799, loss_D: 0.056802, lr_D: 0.000800
2023-07-11 12:11:41,091 - INFO - [Train] step: 3799, loss_G: 5.189616, lr_G: 0.000800
2023-07-11 12:12:37,681 - INFO - [Train] step: 3899, loss_D: 0.032559, lr_D: 0.000800
2023-07-11 12:12:37,765 - INFO - [Train] step: 3899, loss_G: 6.290652, lr_G: 0.000800
2023-07-11 12:13:34,217 - INFO - [Train] step: 3999, loss_D: 0.039090, lr_D: 0.000800
2023-07-11 12:13:34,301 - INFO - [Train] step: 3999, loss_G: 4.379457, lr_G: 0.000800
2023-07-11 12:14:38,292 - INFO - [Eval] step: 3999, fid: 44.029579
2023-07-11 12:15:34,705 - INFO - [Train] step: 4099, loss_D: 0.050695, lr_D: 0.000800
2023-07-11 12:15:34,789 - INFO - [Train] step: 4099, loss_G: 4.415603, lr_G: 0.000800
2023-07-11 12:16:30,916 - INFO - [Train] step: 4199, loss_D: 0.073558, lr_D: 0.000800
2023-07-11 12:16:30,999 - INFO - [Train] step: 4199, loss_G: 4.393678, lr_G: 0.000800
2023-07-11 12:17:27,145 - INFO - [Train] step: 4299, loss_D: 0.001833, lr_D: 0.000800
2023-07-11 12:17:27,229 - INFO - [Train] step: 4299, loss_G: 7.804952, lr_G: 0.000800
2023-07-11 12:18:23,353 - INFO - [Train] step: 4399, loss_D: 0.079716, lr_D: 0.000800
2023-07-11 12:18:23,436 - INFO - [Train] step: 4399, loss_G: 4.749404, lr_G: 0.000800
2023-07-11 12:19:19,549 - INFO - [Train] step: 4499, loss_D: 0.129455, lr_D: 0.000800
2023-07-11 12:19:19,632 - INFO - [Train] step: 4499, loss_G: 4.564599, lr_G: 0.000800
2023-07-11 12:20:15,908 - INFO - [Train] step: 4599, loss_D: 0.001934, lr_D: 0.000800
2023-07-11 12:20:15,992 - INFO - [Train] step: 4599, loss_G: 8.201621, lr_G: 0.000800
2023-07-11 12:21:12,108 - INFO - [Train] step: 4699, loss_D: 0.097809, lr_D: 0.000800
2023-07-11 12:21:12,192 - INFO - [Train] step: 4699, loss_G: 3.315424, lr_G: 0.000800
2023-07-11 12:22:08,337 - INFO - [Train] step: 4799, loss_D: 0.009425, lr_D: 0.000800
2023-07-11 12:22:08,421 - INFO - [Train] step: 4799, loss_G: 6.000487, lr_G: 0.000800
2023-07-11 12:23:04,554 - INFO - [Train] step: 4899, loss_D: 0.010454, lr_D: 0.000800
2023-07-11 12:23:04,637 - INFO - [Train] step: 4899, loss_G: 5.965707, lr_G: 0.000800
2023-07-11 12:24:00,754 - INFO - [Train] step: 4999, loss_D: 0.017259, lr_D: 0.000800
2023-07-11 12:24:00,837 - INFO - [Train] step: 4999, loss_G: 6.603471, lr_G: 0.000800
2023-07-11 12:25:04,819 - INFO - [Eval] step: 4999, fid: 43.723356
2023-07-11 12:26:01,289 - INFO - [Train] step: 5099, loss_D: 0.064753, lr_D: 0.000800
2023-07-11 12:26:01,373 - INFO - [Train] step: 5099, loss_G: 4.521230, lr_G: 0.000800
2023-07-11 12:26:57,690 - INFO - [Train] step: 5199, loss_D: 0.075535, lr_D: 0.000800
2023-07-11 12:26:57,774 - INFO - [Train] step: 5199, loss_G: 4.297753, lr_G: 0.000800
2023-07-11 12:27:53,955 - INFO - [Train] step: 5299, loss_D: 0.058116, lr_D: 0.000800
2023-07-11 12:27:54,039 - INFO - [Train] step: 5299, loss_G: 5.717802, lr_G: 0.000800
2023-07-11 12:28:50,193 - INFO - [Train] step: 5399, loss_D: 0.025380, lr_D: 0.000800
2023-07-11 12:28:50,277 - INFO - [Train] step: 5399, loss_G: 5.295789, lr_G: 0.000800
2023-07-11 12:29:46,411 - INFO - [Train] step: 5499, loss_D: 0.056128, lr_D: 0.000800
2023-07-11 12:29:46,494 - INFO - [Train] step: 5499, loss_G: 6.103138, lr_G: 0.000800
2023-07-11 12:30:42,646 - INFO - [Train] step: 5599, loss_D: 0.098828, lr_D: 0.000800
2023-07-11 12:30:42,729 - INFO - [Train] step: 5599, loss_G: 4.553205, lr_G: 0.000800
2023-07-11 12:31:38,895 - INFO - [Train] step: 5699, loss_D: 0.020657, lr_D: 0.000800
2023-07-11 12:31:38,979 - INFO - [Train] step: 5699, loss_G: 5.729069, lr_G: 0.000800
2023-07-11 12:32:35,107 - INFO - [Train] step: 5799, loss_D: 0.001277, lr_D: 0.000800
2023-07-11 12:32:35,191 - INFO - [Train] step: 5799, loss_G: 9.115376, lr_G: 0.000800
2023-07-11 12:33:31,470 - INFO - [Train] step: 5899, loss_D: 0.048693, lr_D: 0.000800
2023-07-11 12:33:31,554 - INFO - [Train] step: 5899, loss_G: 4.729529, lr_G: 0.000800
2023-07-11 12:34:27,700 - INFO - [Train] step: 5999, loss_D: 0.704835, lr_D: 0.000800
2023-07-11 12:34:27,783 - INFO - [Train] step: 5999, loss_G: 1.363551, lr_G: 0.000800
2023-07-11 12:35:31,857 - INFO - [Eval] step: 5999, fid: 34.022448
2023-07-11 12:36:28,276 - INFO - [Train] step: 6099, loss_D: 0.210027, lr_D: 0.000800
2023-07-11 12:36:28,359 - INFO - [Train] step: 6099, loss_G: 3.310365, lr_G: 0.000800
2023-07-11 12:37:24,510 - INFO - [Train] step: 6199, loss_D: 0.020091, lr_D: 0.000800
2023-07-11 12:37:24,593 - INFO - [Train] step: 6199, loss_G: 4.991518, lr_G: 0.000800
2023-07-11 12:38:20,780 - INFO - [Train] step: 6299, loss_D: 0.056295, lr_D: 0.000800
2023-07-11 12:38:20,864 - INFO - [Train] step: 6299, loss_G: 5.282824, lr_G: 0.000800
2023-07-11 12:39:17,031 - INFO - [Train] step: 6399, loss_D: 0.051391, lr_D: 0.000800
2023-07-11 12:39:17,115 - INFO - [Train] step: 6399, loss_G: 4.308142, lr_G: 0.000800
2023-07-11 12:40:13,412 - INFO - [Train] step: 6499, loss_D: 0.011647, lr_D: 0.000800
2023-07-11 12:40:13,496 - INFO - [Train] step: 6499, loss_G: 6.623148, lr_G: 0.000800
2023-07-11 12:41:09,676 - INFO - [Train] step: 6599, loss_D: 0.032993, lr_D: 0.000800
2023-07-11 12:41:09,759 - INFO - [Train] step: 6599, loss_G: 5.757629, lr_G: 0.000800
2023-07-11 12:42:05,905 - INFO - [Train] step: 6699, loss_D: 0.007489, lr_D: 0.000800
2023-07-11 12:42:05,989 - INFO - [Train] step: 6699, loss_G: 7.138266, lr_G: 0.000800
2023-07-11 12:43:02,151 - INFO - [Train] step: 6799, loss_D: 0.010770, lr_D: 0.000800
2023-07-11 12:43:02,235 - INFO - [Train] step: 6799, loss_G: 5.752389, lr_G: 0.000800
2023-07-11 12:43:58,402 - INFO - [Train] step: 6899, loss_D: 0.038809, lr_D: 0.000800
2023-07-11 12:43:58,486 - INFO - [Train] step: 6899, loss_G: 5.210541, lr_G: 0.000800
2023-07-11 12:44:54,656 - INFO - [Train] step: 6999, loss_D: 0.017666, lr_D: 0.000800
2023-07-11 12:44:54,740 - INFO - [Train] step: 6999, loss_G: 5.626320, lr_G: 0.000800
2023-07-11 12:45:58,784 - INFO - [Eval] step: 6999, fid: 33.058625
2023-07-11 12:46:55,267 - INFO - [Train] step: 7099, loss_D: 0.091532, lr_D: 0.000800
2023-07-11 12:46:55,350 - INFO - [Train] step: 7099, loss_G: 5.154208, lr_G: 0.000800
2023-07-11 12:47:51,675 - INFO - [Train] step: 7199, loss_D: 0.490369, lr_D: 0.000800
2023-07-11 12:47:51,758 - INFO - [Train] step: 7199, loss_G: 2.006269, lr_G: 0.000800
2023-07-11 12:48:47,931 - INFO - [Train] step: 7299, loss_D: 0.014261, lr_D: 0.000800
2023-07-11 12:48:48,014 - INFO - [Train] step: 7299, loss_G: 5.799419, lr_G: 0.000800
2023-07-11 12:49:44,218 - INFO - [Train] step: 7399, loss_D: 0.032790, lr_D: 0.000800
2023-07-11 12:49:44,302 - INFO - [Train] step: 7399, loss_G: 5.115887, lr_G: 0.000800
2023-07-11 12:50:40,481 - INFO - [Train] step: 7499, loss_D: 0.024623, lr_D: 0.000800
2023-07-11 12:50:40,564 - INFO - [Train] step: 7499, loss_G: 5.416479, lr_G: 0.000800
2023-07-11 12:51:36,754 - INFO - [Train] step: 7599, loss_D: 0.022694, lr_D: 0.000800
2023-07-11 12:51:36,838 - INFO - [Train] step: 7599, loss_G: 4.639852, lr_G: 0.000800
2023-07-11 12:52:32,991 - INFO - [Train] step: 7699, loss_D: 0.020744, lr_D: 0.000800
2023-07-11 12:52:33,075 - INFO - [Train] step: 7699, loss_G: 6.460749, lr_G: 0.000800
2023-07-11 12:53:29,378 - INFO - [Train] step: 7799, loss_D: 0.013265, lr_D: 0.000800
2023-07-11 12:53:29,462 - INFO - [Train] step: 7799, loss_G: 6.499502, lr_G: 0.000800
2023-07-11 12:54:25,638 - INFO - [Train] step: 7899, loss_D: 0.010489, lr_D: 0.000800
2023-07-11 12:54:25,722 - INFO - [Train] step: 7899, loss_G: 6.309391, lr_G: 0.000800
2023-07-11 12:55:21,889 - INFO - [Train] step: 7999, loss_D: 0.011140, lr_D: 0.000800
2023-07-11 12:55:21,973 - INFO - [Train] step: 7999, loss_G: 5.999728, lr_G: 0.000800
2023-07-11 12:56:25,962 - INFO - [Eval] step: 7999, fid: 35.348561
2023-07-11 12:57:22,101 - INFO - [Train] step: 8099, loss_D: 0.003056, lr_D: 0.000800
2023-07-11 12:57:22,185 - INFO - [Train] step: 8099, loss_G: 7.496294, lr_G: 0.000800
2023-07-11 12:58:18,348 - INFO - [Train] step: 8199, loss_D: 0.000705, lr_D: 0.000800
2023-07-11 12:58:18,432 - INFO - [Train] step: 8199, loss_G: 11.359093, lr_G: 0.000800
2023-07-11 12:59:14,608 - INFO - [Train] step: 8299, loss_D: 0.025060, lr_D: 0.000800
2023-07-11 12:59:14,692 - INFO - [Train] step: 8299, loss_G: 6.100769, lr_G: 0.000800
2023-07-11 13:00:10,868 - INFO - [Train] step: 8399, loss_D: 0.029371, lr_D: 0.000800
2023-07-11 13:00:10,951 - INFO - [Train] step: 8399, loss_G: 5.231078, lr_G: 0.000800
2023-07-11 13:01:07,253 - INFO - [Train] step: 8499, loss_D: 0.019515, lr_D: 0.000800
2023-07-11 13:01:07,337 - INFO - [Train] step: 8499, loss_G: 6.361640, lr_G: 0.000800
2023-07-11 13:02:03,498 - INFO - [Train] step: 8599, loss_D: 0.071161, lr_D: 0.000800
2023-07-11 13:02:03,581 - INFO - [Train] step: 8599, loss_G: 4.338478, lr_G: 0.000800
2023-07-11 13:02:59,709 - INFO - [Train] step: 8699, loss_D: 0.040499, lr_D: 0.000800
2023-07-11 13:02:59,793 - INFO - [Train] step: 8699, loss_G: 5.633650, lr_G: 0.000800
2023-07-11 13:03:55,911 - INFO - [Train] step: 8799, loss_D: 0.011353, lr_D: 0.000800
2023-07-11 13:03:55,995 - INFO - [Train] step: 8799, loss_G: 5.696527, lr_G: 0.000800
2023-07-11 13:04:52,159 - INFO - [Train] step: 8899, loss_D: 0.065517, lr_D: 0.000800
2023-07-11 13:04:52,243 - INFO - [Train] step: 8899, loss_G: 4.653852, lr_G: 0.000800
2023-07-11 13:05:48,380 - INFO - [Train] step: 8999, loss_D: 0.018624, lr_D: 0.000800
2023-07-11 13:05:48,463 - INFO - [Train] step: 8999, loss_G: 5.674169, lr_G: 0.000800
2023-07-11 13:06:52,552 - INFO - [Eval] step: 8999, fid: 30.621362
2023-07-11 13:07:49,109 - INFO - [Train] step: 9099, loss_D: 0.028935, lr_D: 0.000800
2023-07-11 13:07:49,193 - INFO - [Train] step: 9099, loss_G: 5.719067, lr_G: 0.000800
2023-07-11 13:08:45,357 - INFO - [Train] step: 9199, loss_D: 0.020407, lr_D: 0.000800
2023-07-11 13:08:45,440 - INFO - [Train] step: 9199, loss_G: 5.857404, lr_G: 0.000800
2023-07-11 13:09:41,616 - INFO - [Train] step: 9299, loss_D: 0.044054, lr_D: 0.000800
2023-07-11 13:09:41,700 - INFO - [Train] step: 9299, loss_G: 4.781527, lr_G: 0.000800
2023-07-11 13:10:37,882 - INFO - [Train] step: 9399, loss_D: 0.023854, lr_D: 0.000800
2023-07-11 13:10:37,966 - INFO - [Train] step: 9399, loss_G: 4.744816, lr_G: 0.000800
2023-07-11 13:11:34,145 - INFO - [Train] step: 9499, loss_D: 0.023837, lr_D: 0.000800
2023-07-11 13:11:34,229 - INFO - [Train] step: 9499, loss_G: 6.172730, lr_G: 0.000800
2023-07-11 13:12:30,406 - INFO - [Train] step: 9599, loss_D: 0.468906, lr_D: 0.000800
2023-07-11 13:12:30,490 - INFO - [Train] step: 9599, loss_G: 4.899601, lr_G: 0.000800
2023-07-11 13:13:26,673 - INFO - [Train] step: 9699, loss_D: 0.001909, lr_D: 0.000800
2023-07-11 13:13:26,757 - INFO - [Train] step: 9699, loss_G: 8.229082, lr_G: 0.000800
2023-07-11 13:14:23,049 - INFO - [Train] step: 9799, loss_D: 0.058375, lr_D: 0.000800
2023-07-11 13:14:23,133 - INFO - [Train] step: 9799, loss_G: 4.454073, lr_G: 0.000800
2023-07-11 13:15:19,297 - INFO - [Train] step: 9899, loss_D: 0.020404, lr_D: 0.000800
2023-07-11 13:15:19,381 - INFO - [Train] step: 9899, loss_G: 6.260936, lr_G: 0.000800
2023-07-11 13:16:15,525 - INFO - [Train] step: 9999, loss_D: 0.022226, lr_D: 0.000800
2023-07-11 13:16:15,609 - INFO - [Train] step: 9999, loss_G: 6.552267, lr_G: 0.000800
2023-07-11 13:17:19,649 - INFO - [Eval] step: 9999, fid: 31.728809
2023-07-11 13:18:15,942 - INFO - [Train] step: 10099, loss_D: 0.013560, lr_D: 0.000800
2023-07-11 13:18:16,026 - INFO - [Train] step: 10099, loss_G: 6.183954, lr_G: 0.000800
2023-07-11 13:19:12,202 - INFO - [Train] step: 10199, loss_D: 0.301225, lr_D: 0.000800
2023-07-11 13:19:12,286 - INFO - [Train] step: 10199, loss_G: 3.235008, lr_G: 0.000800
2023-07-11 13:20:08,467 - INFO - [Train] step: 10299, loss_D: 0.015755, lr_D: 0.000800
2023-07-11 13:20:08,550 - INFO - [Train] step: 10299, loss_G: 5.884558, lr_G: 0.000800
2023-07-11 13:21:04,888 - INFO - [Train] step: 10399, loss_D: 0.033889, lr_D: 0.000800
2023-07-11 13:21:04,972 - INFO - [Train] step: 10399, loss_G: 6.869841, lr_G: 0.000800
2023-07-11 13:22:01,153 - INFO - [Train] step: 10499, loss_D: 0.017708, lr_D: 0.000800
2023-07-11 13:22:01,236 - INFO - [Train] step: 10499, loss_G: 7.041650, lr_G: 0.000800
2023-07-11 13:22:57,410 - INFO - [Train] step: 10599, loss_D: 0.017894, lr_D: 0.000800
2023-07-11 13:22:57,494 - INFO - [Train] step: 10599, loss_G: 6.538893, lr_G: 0.000800
2023-07-11 13:23:53,722 - INFO - [Train] step: 10699, loss_D: 0.001114, lr_D: 0.000800
2023-07-11 13:23:53,806 - INFO - [Train] step: 10699, loss_G: 8.781487, lr_G: 0.000800
2023-07-11 13:24:49,988 - INFO - [Train] step: 10799, loss_D: 0.027659, lr_D: 0.000800
2023-07-11 13:24:50,072 - INFO - [Train] step: 10799, loss_G: 6.549875, lr_G: 0.000800
2023-07-11 13:25:46,269 - INFO - [Train] step: 10899, loss_D: 0.031984, lr_D: 0.000800
2023-07-11 13:25:46,353 - INFO - [Train] step: 10899, loss_G: 6.759466, lr_G: 0.000800
2023-07-11 13:26:42,669 - INFO - [Train] step: 10999, loss_D: 0.006145, lr_D: 0.000800
2023-07-11 13:26:42,753 - INFO - [Train] step: 10999, loss_G: 7.112816, lr_G: 0.000800
2023-07-11 13:27:46,758 - INFO - [Eval] step: 10999, fid: 30.327368
2023-07-11 13:28:43,254 - INFO - [Train] step: 11099, loss_D: 0.003850, lr_D: 0.000800
2023-07-11 13:28:43,337 - INFO - [Train] step: 11099, loss_G: 7.746220, lr_G: 0.000800
2023-07-11 13:29:39,518 - INFO - [Train] step: 11199, loss_D: 0.067815, lr_D: 0.000800
2023-07-11 13:29:39,602 - INFO - [Train] step: 11199, loss_G: 5.223897, lr_G: 0.000800
2023-07-11 13:30:35,813 - INFO - [Train] step: 11299, loss_D: 0.002763, lr_D: 0.000800
2023-07-11 13:30:35,896 - INFO - [Train] step: 11299, loss_G: 8.084093, lr_G: 0.000800
2023-07-11 13:31:32,110 - INFO - [Train] step: 11399, loss_D: 0.012797, lr_D: 0.000800
2023-07-11 13:31:32,193 - INFO - [Train] step: 11399, loss_G: 6.798376, lr_G: 0.000800
2023-07-11 13:32:28,392 - INFO - [Train] step: 11499, loss_D: 0.009886, lr_D: 0.000800
2023-07-11 13:32:28,476 - INFO - [Train] step: 11499, loss_G: 7.411889, lr_G: 0.000800
2023-07-11 13:33:24,688 - INFO - [Train] step: 11599, loss_D: 0.011459, lr_D: 0.000800
2023-07-11 13:33:24,771 - INFO - [Train] step: 11599, loss_G: 6.962460, lr_G: 0.000800
2023-07-11 13:34:21,090 - INFO - [Train] step: 11699, loss_D: 0.024009, lr_D: 0.000800
2023-07-11 13:34:21,174 - INFO - [Train] step: 11699, loss_G: 7.555803, lr_G: 0.000800
2023-07-11 13:35:17,359 - INFO - [Train] step: 11799, loss_D: 0.015610, lr_D: 0.000800
2023-07-11 13:35:17,443 - INFO - [Train] step: 11799, loss_G: 6.682260, lr_G: 0.000800
2023-07-11 13:36:13,643 - INFO - [Train] step: 11899, loss_D: 3.467544, lr_D: 0.000800
2023-07-11 13:36:13,727 - INFO - [Train] step: 11899, loss_G: 3.764105, lr_G: 0.000800
2023-07-11 13:37:09,918 - INFO - [Train] step: 11999, loss_D: 0.004966, lr_D: 0.000800
2023-07-11 13:37:10,002 - INFO - [Train] step: 11999, loss_G: 8.321743, lr_G: 0.000800
2023-07-11 13:38:14,082 - INFO - [Eval] step: 11999, fid: 29.305782
2023-07-11 13:39:10,580 - INFO - [Train] step: 12099, loss_D: 0.009038, lr_D: 0.000800
2023-07-11 13:39:10,664 - INFO - [Train] step: 12099, loss_G: 8.330870, lr_G: 0.000800
2023-07-11 13:40:06,889 - INFO - [Train] step: 12199, loss_D: 0.023681, lr_D: 0.000800
2023-07-11 13:40:06,973 - INFO - [Train] step: 12199, loss_G: 6.975578, lr_G: 0.000800
2023-07-11 13:41:03,360 - INFO - [Train] step: 12299, loss_D: 0.004188, lr_D: 0.000800
2023-07-11 13:41:03,444 - INFO - [Train] step: 12299, loss_G: 7.406260, lr_G: 0.000800
2023-07-11 13:41:59,658 - INFO - [Train] step: 12399, loss_D: 0.086447, lr_D: 0.000800
2023-07-11 13:41:59,742 - INFO - [Train] step: 12399, loss_G: 5.287652, lr_G: 0.000800
2023-07-11 13:42:55,956 - INFO - [Train] step: 12499, loss_D: 0.000366, lr_D: 0.000800
2023-07-11 13:42:56,040 - INFO - [Train] step: 12499, loss_G: 11.014597, lr_G: 0.000800
2023-07-11 13:43:52,269 - INFO - [Train] step: 12599, loss_D: 0.013249, lr_D: 0.000800
2023-07-11 13:43:52,352 - INFO - [Train] step: 12599, loss_G: 7.486347, lr_G: 0.000800
2023-07-11 13:44:48,578 - INFO - [Train] step: 12699, loss_D: 0.008541, lr_D: 0.000800
2023-07-11 13:44:48,662 - INFO - [Train] step: 12699, loss_G: 7.364436, lr_G: 0.000800
2023-07-11 13:45:44,859 - INFO - [Train] step: 12799, loss_D: 0.101281, lr_D: 0.000800
2023-07-11 13:45:44,943 - INFO - [Train] step: 12799, loss_G: 4.887404, lr_G: 0.000800
2023-07-11 13:46:41,150 - INFO - [Train] step: 12899, loss_D: 0.046161, lr_D: 0.000800
2023-07-11 13:46:41,234 - INFO - [Train] step: 12899, loss_G: 6.781134, lr_G: 0.000800
2023-07-11 13:47:37,593 - INFO - [Train] step: 12999, loss_D: 0.003869, lr_D: 0.000800
2023-07-11 13:47:37,677 - INFO - [Train] step: 12999, loss_G: 7.521938, lr_G: 0.000800
2023-07-11 13:48:41,726 - INFO - [Eval] step: 12999, fid: 31.145196
2023-07-11 13:49:37,899 - INFO - [Train] step: 13099, loss_D: 0.010063, lr_D: 0.000800
2023-07-11 13:49:37,983 - INFO - [Train] step: 13099, loss_G: 6.120701, lr_G: 0.000800
2023-07-11 13:50:34,174 - INFO - [Train] step: 13199, loss_D: 0.003973, lr_D: 0.000800
2023-07-11 13:50:34,258 - INFO - [Train] step: 13199, loss_G: 7.335663, lr_G: 0.000800
2023-07-11 13:51:30,410 - INFO - [Train] step: 13299, loss_D: 0.105317, lr_D: 0.000800
2023-07-11 13:51:30,494 - INFO - [Train] step: 13299, loss_G: 3.828239, lr_G: 0.000800
2023-07-11 13:52:26,637 - INFO - [Train] step: 13399, loss_D: 0.010208, lr_D: 0.000800
2023-07-11 13:52:26,721 - INFO - [Train] step: 13399, loss_G: 8.042526, lr_G: 0.000800
2023-07-11 13:53:22,877 - INFO - [Train] step: 13499, loss_D: 0.027225, lr_D: 0.000800
2023-07-11 13:53:22,960 - INFO - [Train] step: 13499, loss_G: 6.426303, lr_G: 0.000800
2023-07-11 13:54:19,215 - INFO - [Train] step: 13599, loss_D: 0.007277, lr_D: 0.000800
2023-07-11 13:54:19,300 - INFO - [Train] step: 13599, loss_G: 7.899753, lr_G: 0.000800
2023-07-11 13:55:15,450 - INFO - [Train] step: 13699, loss_D: 0.002837, lr_D: 0.000800
2023-07-11 13:55:15,534 - INFO - [Train] step: 13699, loss_G: 8.021205, lr_G: 0.000800
2023-07-11 13:56:11,641 - INFO - [Train] step: 13799, loss_D: 0.240107, lr_D: 0.000800
2023-07-11 13:56:11,725 - INFO - [Train] step: 13799, loss_G: 3.456906, lr_G: 0.000800
2023-07-11 13:57:07,841 - INFO - [Train] step: 13899, loss_D: 0.003996, lr_D: 0.000800
2023-07-11 13:57:07,925 - INFO - [Train] step: 13899, loss_G: 8.493683, lr_G: 0.000800
2023-07-11 13:58:04,072 - INFO - [Train] step: 13999, loss_D: 0.014825, lr_D: 0.000800
2023-07-11 13:58:04,156 - INFO - [Train] step: 13999, loss_G: 6.566381, lr_G: 0.000800
2023-07-11 13:59:08,169 - INFO - [Eval] step: 13999, fid: 29.862665
2023-07-11 14:00:04,339 - INFO - [Train] step: 14099, loss_D: 0.007222, lr_D: 0.000800
2023-07-11 14:00:04,423 - INFO - [Train] step: 14099, loss_G: 7.093710, lr_G: 0.000800
2023-07-11 14:01:00,637 - INFO - [Train] step: 14199, loss_D: 0.007936, lr_D: 0.000800
2023-07-11 14:01:00,721 - INFO - [Train] step: 14199, loss_G: 7.119807, lr_G: 0.000800
2023-07-11 14:01:57,085 - INFO - [Train] step: 14299, loss_D: 0.003573, lr_D: 0.000800
2023-07-11 14:01:57,169 - INFO - [Train] step: 14299, loss_G: 9.360214, lr_G: 0.000800
2023-07-11 14:02:53,362 - INFO - [Train] step: 14399, loss_D: 0.051941, lr_D: 0.000800
2023-07-11 14:02:53,446 - INFO - [Train] step: 14399, loss_G: 6.090578, lr_G: 0.000800
2023-07-11 14:03:49,602 - INFO - [Train] step: 14499, loss_D: 0.007884, lr_D: 0.000800
2023-07-11 14:03:49,686 - INFO - [Train] step: 14499, loss_G: 6.830848, lr_G: 0.000800
2023-07-11 14:04:45,814 - INFO - [Train] step: 14599, loss_D: 0.008319, lr_D: 0.000800
2023-07-11 14:04:45,898 - INFO - [Train] step: 14599, loss_G: 7.291912, lr_G: 0.000800
2023-07-11 14:05:42,016 - INFO - [Train] step: 14699, loss_D: 0.003675, lr_D: 0.000800
2023-07-11 14:05:42,100 - INFO - [Train] step: 14699, loss_G: 8.325859, lr_G: 0.000800
2023-07-11 14:06:38,219 - INFO - [Train] step: 14799, loss_D: 0.028460, lr_D: 0.000800
2023-07-11 14:06:38,303 - INFO - [Train] step: 14799, loss_G: 6.904002, lr_G: 0.000800
2023-07-11 14:07:34,540 - INFO - [Train] step: 14899, loss_D: 0.010624, lr_D: 0.000800
2023-07-11 14:07:34,624 - INFO - [Train] step: 14899, loss_G: 7.824240, lr_G: 0.000800
2023-07-11 14:08:30,729 - INFO - [Train] step: 14999, loss_D: 0.000340, lr_D: 0.000800
2023-07-11 14:08:30,813 - INFO - [Train] step: 14999, loss_G: 10.208112, lr_G: 0.000800
2023-07-11 14:09:34,758 - INFO - [Eval] step: 14999, fid: 29.454847
2023-07-11 14:10:30,878 - INFO - [Train] step: 15099, loss_D: 0.001534, lr_D: 0.000800
2023-07-11 14:10:30,961 - INFO - [Train] step: 15099, loss_G: 8.764018, lr_G: 0.000800
2023-07-11 14:11:27,105 - INFO - [Train] step: 15199, loss_D: 0.011419, lr_D: 0.000800
2023-07-11 14:11:27,189 - INFO - [Train] step: 15199, loss_G: 7.759914, lr_G: 0.000800
2023-07-11 14:12:23,394 - INFO - [Train] step: 15299, loss_D: 0.008819, lr_D: 0.000800
2023-07-11 14:12:23,478 - INFO - [Train] step: 15299, loss_G: 8.962471, lr_G: 0.000800
2023-07-11 14:13:19,693 - INFO - [Train] step: 15399, loss_D: 0.020425, lr_D: 0.000800
2023-07-11 14:13:19,777 - INFO - [Train] step: 15399, loss_G: 7.879280, lr_G: 0.000800
2023-07-11 14:14:15,986 - INFO - [Train] step: 15499, loss_D: 0.006072, lr_D: 0.000800
2023-07-11 14:14:16,069 - INFO - [Train] step: 15499, loss_G: 7.211442, lr_G: 0.000800
2023-07-11 14:15:12,433 - INFO - [Train] step: 15599, loss_D: 0.002150, lr_D: 0.000800
2023-07-11 14:15:12,516 - INFO - [Train] step: 15599, loss_G: 8.549648, lr_G: 0.000800
2023-07-11 14:16:08,707 - INFO - [Train] step: 15699, loss_D: 0.187069, lr_D: 0.000800
2023-07-11 14:16:08,790 - INFO - [Train] step: 15699, loss_G: 4.555290, lr_G: 0.000800
2023-07-11 14:17:04,922 - INFO - [Train] step: 15799, loss_D: 0.006582, lr_D: 0.000800
2023-07-11 14:17:05,005 - INFO - [Train] step: 15799, loss_G: 7.604278, lr_G: 0.000800
2023-07-11 14:18:01,177 - INFO - [Train] step: 15899, loss_D: 0.007547, lr_D: 0.000800
2023-07-11 14:18:01,261 - INFO - [Train] step: 15899, loss_G: 7.642211, lr_G: 0.000800
2023-07-11 14:18:57,369 - INFO - [Train] step: 15999, loss_D: 2.618884, lr_D: 0.000800
2023-07-11 14:18:57,453 - INFO - [Train] step: 15999, loss_G: 5.931014, lr_G: 0.000800
2023-07-11 14:20:01,404 - INFO - [Eval] step: 15999, fid: 30.130104
2023-07-11 14:20:57,507 - INFO - [Train] step: 16099, loss_D: 0.007411, lr_D: 0.000800
2023-07-11 14:20:57,590 - INFO - [Train] step: 16099, loss_G: 8.274845, lr_G: 0.000800
2023-07-11 14:21:53,873 - INFO - [Train] step: 16199, loss_D: 0.042619, lr_D: 0.000800
2023-07-11 14:21:53,957 - INFO - [Train] step: 16199, loss_G: 6.934038, lr_G: 0.000800
2023-07-11 14:22:50,088 - INFO - [Train] step: 16299, loss_D: 0.000987, lr_D: 0.000800
2023-07-11 14:22:50,172 - INFO - [Train] step: 16299, loss_G: 9.771923, lr_G: 0.000800
2023-07-11 14:23:46,302 - INFO - [Train] step: 16399, loss_D: 0.009464, lr_D: 0.000800
2023-07-11 14:23:46,386 - INFO - [Train] step: 16399, loss_G: 7.184815, lr_G: 0.000800
2023-07-11 14:24:42,507 - INFO - [Train] step: 16499, loss_D: 0.005335, lr_D: 0.000800
2023-07-11 14:24:42,591 - INFO - [Train] step: 16499, loss_G: 8.571962, lr_G: 0.000800
2023-07-11 14:25:38,697 - INFO - [Train] step: 16599, loss_D: 2.028174, lr_D: 0.000800
2023-07-11 14:25:38,780 - INFO - [Train] step: 16599, loss_G: 3.879996, lr_G: 0.000800
2023-07-11 14:26:34,941 - INFO - [Train] step: 16699, loss_D: 0.007578, lr_D: 0.000800
2023-07-11 14:26:35,025 - INFO - [Train] step: 16699, loss_G: 6.618587, lr_G: 0.000800
2023-07-11 14:27:31,231 - INFO - [Train] step: 16799, loss_D: 0.011151, lr_D: 0.000800
2023-07-11 14:27:31,314 - INFO - [Train] step: 16799, loss_G: 7.938923, lr_G: 0.000800
2023-07-11 14:28:27,679 - INFO - [Train] step: 16899, loss_D: 0.007307, lr_D: 0.000800
2023-07-11 14:28:27,763 - INFO - [Train] step: 16899, loss_G: 8.276512, lr_G: 0.000800
2023-07-11 14:29:23,964 - INFO - [Train] step: 16999, loss_D: 0.112136, lr_D: 0.000800
2023-07-11 14:29:24,048 - INFO - [Train] step: 16999, loss_G: 4.869758, lr_G: 0.000800
2023-07-11 14:30:28,192 - INFO - [Eval] step: 16999, fid: 28.441528
2023-07-11 14:31:24,660 - INFO - [Train] step: 17099, loss_D: 0.011750, lr_D: 0.000800
2023-07-11 14:31:24,744 - INFO - [Train] step: 17099, loss_G: 7.757559, lr_G: 0.000800
2023-07-11 14:32:20,933 - INFO - [Train] step: 17199, loss_D: 0.003757, lr_D: 0.000800
2023-07-11 14:32:21,017 - INFO - [Train] step: 17199, loss_G: 7.643621, lr_G: 0.000800
2023-07-11 14:33:17,206 - INFO - [Train] step: 17299, loss_D: 0.006172, lr_D: 0.000800
2023-07-11 14:33:17,289 - INFO - [Train] step: 17299, loss_G: 8.143820, lr_G: 0.000800
2023-07-11 14:34:13,464 - INFO - [Train] step: 17399, loss_D: 0.045059, lr_D: 0.000800
2023-07-11 14:34:13,548 - INFO - [Train] step: 17399, loss_G: 7.102817, lr_G: 0.000800
2023-07-11 14:35:09,859 - INFO - [Train] step: 17499, loss_D: 0.008678, lr_D: 0.000800
2023-07-11 14:35:09,943 - INFO - [Train] step: 17499, loss_G: 8.260544, lr_G: 0.000800
2023-07-11 14:36:06,086 - INFO - [Train] step: 17599, loss_D: 0.001129, lr_D: 0.000800
2023-07-11 14:36:06,169 - INFO - [Train] step: 17599, loss_G: 9.633171, lr_G: 0.000800
2023-07-11 14:37:02,313 - INFO - [Train] step: 17699, loss_D: 0.002065, lr_D: 0.000800
2023-07-11 14:37:02,397 - INFO - [Train] step: 17699, loss_G: 8.164239, lr_G: 0.000800
2023-07-11 14:37:58,535 - INFO - [Train] step: 17799, loss_D: 0.003847, lr_D: 0.000800
2023-07-11 14:37:58,619 - INFO - [Train] step: 17799, loss_G: 10.379612, lr_G: 0.000800
2023-07-11 14:38:54,750 - INFO - [Train] step: 17899, loss_D: 0.029135, lr_D: 0.000800
2023-07-11 14:38:54,834 - INFO - [Train] step: 17899, loss_G: 8.965570, lr_G: 0.000800
2023-07-11 14:39:50,977 - INFO - [Train] step: 17999, loss_D: 0.016164, lr_D: 0.000800
2023-07-11 14:39:51,061 - INFO - [Train] step: 17999, loss_G: 7.357069, lr_G: 0.000800
2023-07-11 14:40:55,161 - INFO - [Eval] step: 17999, fid: 27.078599
2023-07-11 14:41:51,551 - INFO - [Train] step: 18099, loss_D: 0.001106, lr_D: 0.000800
2023-07-11 14:41:51,634 - INFO - [Train] step: 18099, loss_G: 10.010357, lr_G: 0.000800
2023-07-11 14:42:47,921 - INFO - [Train] step: 18199, loss_D: 0.233529, lr_D: 0.000800
2023-07-11 14:42:48,005 - INFO - [Train] step: 18199, loss_G: 3.818181, lr_G: 0.000800
2023-07-11 14:43:44,156 - INFO - [Train] step: 18299, loss_D: 0.004615, lr_D: 0.000800
2023-07-11 14:43:44,240 - INFO - [Train] step: 18299, loss_G: 9.297379, lr_G: 0.000800
2023-07-11 14:44:40,425 - INFO - [Train] step: 18399, loss_D: 0.001716, lr_D: 0.000800
2023-07-11 14:44:40,509 - INFO - [Train] step: 18399, loss_G: 9.390348, lr_G: 0.000800
2023-07-11 14:45:36,726 - INFO - [Train] step: 18499, loss_D: 0.003523, lr_D: 0.000800
2023-07-11 14:45:36,810 - INFO - [Train] step: 18499, loss_G: 9.848138, lr_G: 0.000800
2023-07-11 14:46:33,029 - INFO - [Train] step: 18599, loss_D: 0.018795, lr_D: 0.000800
2023-07-11 14:46:33,113 - INFO - [Train] step: 18599, loss_G: 8.085789, lr_G: 0.000800
2023-07-11 14:47:29,340 - INFO - [Train] step: 18699, loss_D: 0.145551, lr_D: 0.000800
2023-07-11 14:47:29,423 - INFO - [Train] step: 18699, loss_G: 6.390501, lr_G: 0.000800
2023-07-11 14:48:25,761 - INFO - [Train] step: 18799, loss_D: 0.004858, lr_D: 0.000800
2023-07-11 14:48:25,844 - INFO - [Train] step: 18799, loss_G: 8.706554, lr_G: 0.000800
2023-07-11 14:49:22,023 - INFO - [Train] step: 18899, loss_D: 0.000915, lr_D: 0.000800
2023-07-11 14:49:22,107 - INFO - [Train] step: 18899, loss_G: 11.438213, lr_G: 0.000800
2023-07-11 14:50:18,280 - INFO - [Train] step: 18999, loss_D: 0.005635, lr_D: 0.000800
2023-07-11 14:50:18,364 - INFO - [Train] step: 18999, loss_G: 8.079836, lr_G: 0.000800
2023-07-11 14:51:22,382 - INFO - [Eval] step: 18999, fid: 30.342733
2023-07-11 14:52:18,512 - INFO - [Train] step: 19099, loss_D: 0.495687, lr_D: 0.000800
2023-07-11 14:52:18,596 - INFO - [Train] step: 19099, loss_G: 2.701498, lr_G: 0.000800
2023-07-11 14:53:14,777 - INFO - [Train] step: 19199, loss_D: 0.010718, lr_D: 0.000800
2023-07-11 14:53:14,861 - INFO - [Train] step: 19199, loss_G: 8.037912, lr_G: 0.000800
2023-07-11 14:54:11,025 - INFO - [Train] step: 19299, loss_D: 0.001784, lr_D: 0.000800
2023-07-11 14:54:11,109 - INFO - [Train] step: 19299, loss_G: 9.724527, lr_G: 0.000800
2023-07-11 14:55:07,284 - INFO - [Train] step: 19399, loss_D: 0.000590, lr_D: 0.000800
2023-07-11 14:55:07,368 - INFO - [Train] step: 19399, loss_G: 10.502163, lr_G: 0.000800
2023-07-11 14:56:03,628 - INFO - [Train] step: 19499, loss_D: 0.012816, lr_D: 0.000800
2023-07-11 14:56:03,711 - INFO - [Train] step: 19499, loss_G: 7.751922, lr_G: 0.000800
2023-07-11 14:56:59,856 - INFO - [Train] step: 19599, loss_D: 0.005400, lr_D: 0.000800
2023-07-11 14:56:59,939 - INFO - [Train] step: 19599, loss_G: 8.535320, lr_G: 0.000800
2023-07-11 14:57:56,065 - INFO - [Train] step: 19699, loss_D: 0.002196, lr_D: 0.000800
2023-07-11 14:57:56,149 - INFO - [Train] step: 19699, loss_G: 10.263439, lr_G: 0.000800
2023-07-11 14:58:52,317 - INFO - [Train] step: 19799, loss_D: 0.021812, lr_D: 0.000800
2023-07-11 14:58:52,401 - INFO - [Train] step: 19799, loss_G: 7.788321, lr_G: 0.000800
2023-07-11 14:59:48,604 - INFO - [Train] step: 19899, loss_D: 0.003188, lr_D: 0.000800
2023-07-11 14:59:48,688 - INFO - [Train] step: 19899, loss_G: 10.278534, lr_G: 0.000800
2023-07-11 15:00:44,920 - INFO - [Train] step: 19999, loss_D: 0.360845, lr_D: 0.000800
2023-07-11 15:00:45,004 - INFO - [Train] step: 19999, loss_G: 4.274442, lr_G: 0.000800
2023-07-11 15:01:49,053 - INFO - [Eval] step: 19999, fid: 27.008773
2023-07-11 15:02:45,811 - INFO - [Train] step: 20099, loss_D: 0.006600, lr_D: 0.000800
2023-07-11 15:02:45,895 - INFO - [Train] step: 20099, loss_G: 8.291388, lr_G: 0.000800
2023-07-11 15:03:42,117 - INFO - [Train] step: 20199, loss_D: 0.004649, lr_D: 0.000800
2023-07-11 15:03:42,200 - INFO - [Train] step: 20199, loss_G: 8.462478, lr_G: 0.000800
2023-07-11 15:04:38,385 - INFO - [Train] step: 20299, loss_D: 0.002957, lr_D: 0.000800
2023-07-11 15:04:38,468 - INFO - [Train] step: 20299, loss_G: 9.305570, lr_G: 0.000800
2023-07-11 15:05:34,636 - INFO - [Train] step: 20399, loss_D: 0.158959, lr_D: 0.000800
2023-07-11 15:05:34,720 - INFO - [Train] step: 20399, loss_G: 11.202217, lr_G: 0.000800
2023-07-11 15:06:30,882 - INFO - [Train] step: 20499, loss_D: 1.208866, lr_D: 0.000800
2023-07-11 15:06:30,966 - INFO - [Train] step: 20499, loss_G: 9.777437, lr_G: 0.000800
2023-07-11 15:07:27,109 - INFO - [Train] step: 20599, loss_D: 0.000840, lr_D: 0.000800
2023-07-11 15:07:27,193 - INFO - [Train] step: 20599, loss_G: 11.558846, lr_G: 0.000800
2023-07-11 15:08:23,454 - INFO - [Train] step: 20699, loss_D: 0.003777, lr_D: 0.000800
2023-07-11 15:08:23,538 - INFO - [Train] step: 20699, loss_G: 9.691971, lr_G: 0.000800
2023-07-11 15:09:19,662 - INFO - [Train] step: 20799, loss_D: 0.003992, lr_D: 0.000800
2023-07-11 15:09:19,746 - INFO - [Train] step: 20799, loss_G: 7.911158,
lr_G: 0.000800 2023-07-11 15:10:15,899 - INFO - [Train] step: 20899, loss_D: 0.005836, lr_D: 0.000800 2023-07-11 15:10:15,982 - INFO - [Train] step: 20899, loss_G: 7.689766, lr_G: 0.000800 2023-07-11 15:11:12,147 - INFO - [Train] step: 20999, loss_D: 0.003656, lr_D: 0.000800 2023-07-11 15:11:12,230 - INFO - [Train] step: 20999, loss_G: 9.950825, lr_G: 0.000800 2023-07-11 15:12:16,243 - INFO - [Eval] step: 20999, fid: 25.323420 2023-07-11 15:13:12,716 - INFO - [Train] step: 21099, loss_D: 0.002893, lr_D: 0.000800 2023-07-11 15:13:12,800 - INFO - [Train] step: 21099, loss_G: 11.599640, lr_G: 0.000800 2023-07-11 15:14:08,986 - INFO - [Train] step: 21199, loss_D: 0.049744, lr_D: 0.000800 2023-07-11 15:14:09,070 - INFO - [Train] step: 21199, loss_G: 6.993627, lr_G: 0.000800 2023-07-11 15:15:05,248 - INFO - [Train] step: 21299, loss_D: 0.008271, lr_D: 0.000800 2023-07-11 15:15:05,332 - INFO - [Train] step: 21299, loss_G: 8.385923, lr_G: 0.000800 2023-07-11 15:16:01,663 - INFO - [Train] step: 21399, loss_D: 0.002594, lr_D: 0.000800 2023-07-11 15:16:01,746 - INFO - [Train] step: 21399, loss_G: 8.507057, lr_G: 0.000800 2023-07-11 15:16:57,916 - INFO - [Train] step: 21499, loss_D: 0.002587, lr_D: 0.000800 2023-07-11 15:16:58,000 - INFO - [Train] step: 21499, loss_G: 10.397579, lr_G: 0.000800 2023-07-11 15:17:54,167 - INFO - [Train] step: 21599, loss_D: 0.015398, lr_D: 0.000800 2023-07-11 15:17:54,251 - INFO - [Train] step: 21599, loss_G: 7.068629, lr_G: 0.000800 2023-07-11 15:18:50,386 - INFO - [Train] step: 21699, loss_D: 0.005399, lr_D: 0.000800 2023-07-11 15:18:50,470 - INFO - [Train] step: 21699, loss_G: 7.438874, lr_G: 0.000800 2023-07-11 15:19:46,641 - INFO - [Train] step: 21799, loss_D: 0.003675, lr_D: 0.000800 2023-07-11 15:19:46,725 - INFO - [Train] step: 21799, loss_G: 9.521066, lr_G: 0.000800 2023-07-11 15:20:42,865 - INFO - [Train] step: 21899, loss_D: 0.006019, lr_D: 0.000800 2023-07-11 15:20:42,949 - INFO - [Train] step: 21899, loss_G: 8.693350, lr_G: 0.000800 
2023-07-11 15:21:39,236 - INFO - [Train] step: 21999, loss_D: 0.007369, lr_D: 0.000800 2023-07-11 15:21:39,319 - INFO - [Train] step: 21999, loss_G: 9.730040, lr_G: 0.000800 2023-07-11 15:22:43,392 - INFO - [Eval] step: 21999, fid: 27.574415 2023-07-11 15:23:39,529 - INFO - [Train] step: 22099, loss_D: 0.007360, lr_D: 0.000800 2023-07-11 15:23:39,613 - INFO - [Train] step: 22099, loss_G: 7.748243, lr_G: 0.000800 2023-07-11 15:24:35,779 - INFO - [Train] step: 22199, loss_D: 0.000811, lr_D: 0.000800 2023-07-11 15:24:35,862 - INFO - [Train] step: 22199, loss_G: 10.671335, lr_G: 0.000800 2023-07-11 15:25:32,048 - INFO - [Train] step: 22299, loss_D: 0.001396, lr_D: 0.000800 2023-07-11 15:25:32,132 - INFO - [Train] step: 22299, loss_G: 10.386802, lr_G: 0.000800 2023-07-11 15:26:28,291 - INFO - [Train] step: 22399, loss_D: 0.288791, lr_D: 0.000800 2023-07-11 15:26:28,374 - INFO - [Train] step: 22399, loss_G: 3.395542, lr_G: 0.000800 2023-07-11 15:27:24,528 - INFO - [Train] step: 22499, loss_D: 0.011659, lr_D: 0.000800 2023-07-11 15:27:24,613 - INFO - [Train] step: 22499, loss_G: 8.573141, lr_G: 0.000800 2023-07-11 15:28:20,788 - INFO - [Train] step: 22599, loss_D: 0.003066, lr_D: 0.000800 2023-07-11 15:28:20,872 - INFO - [Train] step: 22599, loss_G: 8.549353, lr_G: 0.000800 2023-07-11 15:29:17,190 - INFO - [Train] step: 22699, loss_D: 0.004410, lr_D: 0.000800 2023-07-11 15:29:17,274 - INFO - [Train] step: 22699, loss_G: 9.382833, lr_G: 0.000800 2023-07-11 15:30:13,405 - INFO - [Train] step: 22799, loss_D: 0.037453, lr_D: 0.000800 2023-07-11 15:30:13,489 - INFO - [Train] step: 22799, loss_G: 7.477693, lr_G: 0.000800 2023-07-11 15:31:09,624 - INFO - [Train] step: 22899, loss_D: 0.109032, lr_D: 0.000800 2023-07-11 15:31:09,708 - INFO - [Train] step: 22899, loss_G: 5.265463, lr_G: 0.000800 2023-07-11 15:32:05,851 - INFO - [Train] step: 22999, loss_D: 0.003763, lr_D: 0.000800 2023-07-11 15:32:05,934 - INFO - [Train] step: 22999, loss_G: 9.396175, lr_G: 0.000800 2023-07-11 
15:33:09,889 - INFO - [Eval] step: 22999, fid: 29.516297 2023-07-11 15:34:06,058 - INFO - [Train] step: 23099, loss_D: 0.003253, lr_D: 0.000800 2023-07-11 15:34:06,142 - INFO - [Train] step: 23099, loss_G: 9.109964, lr_G: 0.000800 2023-07-11 15:35:02,365 - INFO - [Train] step: 23199, loss_D: 0.000366, lr_D: 0.000800 2023-07-11 15:35:02,449 - INFO - [Train] step: 23199, loss_G: 11.528555, lr_G: 0.000800 2023-07-11 15:35:58,768 - INFO - [Train] step: 23299, loss_D: 0.079994, lr_D: 0.000800 2023-07-11 15:35:58,852 - INFO - [Train] step: 23299, loss_G: 5.743853, lr_G: 0.000800 2023-07-11 15:36:54,983 - INFO - [Train] step: 23399, loss_D: 0.002688, lr_D: 0.000800 2023-07-11 15:36:55,067 - INFO - [Train] step: 23399, loss_G: 11.234402, lr_G: 0.000800 2023-07-11 15:37:51,260 - INFO - [Train] step: 23499, loss_D: 0.148203, lr_D: 0.000800 2023-07-11 15:37:51,344 - INFO - [Train] step: 23499, loss_G: 4.712931, lr_G: 0.000800 2023-07-11 15:38:47,518 - INFO - [Train] step: 23599, loss_D: 0.001541, lr_D: 0.000800 2023-07-11 15:38:47,602 - INFO - [Train] step: 23599, loss_G: 9.591396, lr_G: 0.000800 2023-07-11 15:39:43,792 - INFO - [Train] step: 23699, loss_D: 0.002986, lr_D: 0.000800 2023-07-11 15:39:43,877 - INFO - [Train] step: 23699, loss_G: 9.700461, lr_G: 0.000800 2023-07-11 15:40:40,038 - INFO - [Train] step: 23799, loss_D: 0.010304, lr_D: 0.000800 2023-07-11 15:40:40,122 - INFO - [Train] step: 23799, loss_G: 8.965643, lr_G: 0.000800 2023-07-11 15:41:36,319 - INFO - [Train] step: 23899, loss_D: 0.001316, lr_D: 0.000800 2023-07-11 15:41:36,402 - INFO - [Train] step: 23899, loss_G: 9.987736, lr_G: 0.000800 2023-07-11 15:42:32,766 - INFO - [Train] step: 23999, loss_D: 0.001019, lr_D: 0.000800 2023-07-11 15:42:32,850 - INFO - [Train] step: 23999, loss_G: 11.542675, lr_G: 0.000800 2023-07-11 15:43:36,944 - INFO - [Eval] step: 23999, fid: 28.743679 2023-07-11 15:44:33,120 - INFO - [Train] step: 24099, loss_D: 0.209719, lr_D: 0.000800 2023-07-11 15:44:33,204 - INFO - [Train] 
step: 24099, loss_G: 4.494919, lr_G: 0.000800 2023-07-11 15:45:29,426 - INFO - [Train] step: 24199, loss_D: 0.007436, lr_D: 0.000800 2023-07-11 15:45:29,510 - INFO - [Train] step: 24199, loss_G: 9.161270, lr_G: 0.000800 2023-07-11 15:46:25,708 - INFO - [Train] step: 24299, loss_D: 0.005351, lr_D: 0.000800 2023-07-11 15:46:25,792 - INFO - [Train] step: 24299, loss_G: 9.379765, lr_G: 0.000800 2023-07-11 15:47:22,009 - INFO - [Train] step: 24399, loss_D: 0.002128, lr_D: 0.000800 2023-07-11 15:47:22,092 - INFO - [Train] step: 24399, loss_G: 9.378587, lr_G: 0.000800 2023-07-11 15:48:18,317 - INFO - [Train] step: 24499, loss_D: 0.003422, lr_D: 0.000800 2023-07-11 15:48:18,401 - INFO - [Train] step: 24499, loss_G: 9.513985, lr_G: 0.000800 2023-07-11 15:49:14,735 - INFO - [Train] step: 24599, loss_D: 0.006654, lr_D: 0.000800 2023-07-11 15:49:14,819 - INFO - [Train] step: 24599, loss_G: 9.272728, lr_G: 0.000800 2023-07-11 15:50:11,028 - INFO - [Train] step: 24699, loss_D: 0.002792, lr_D: 0.000800 2023-07-11 15:50:11,111 - INFO - [Train] step: 24699, loss_G: 9.233520, lr_G: 0.000800 2023-07-11 15:51:07,306 - INFO - [Train] step: 24799, loss_D: 0.004219, lr_D: 0.000800 2023-07-11 15:51:07,389 - INFO - [Train] step: 24799, loss_G: 9.143127, lr_G: 0.000800 2023-07-11 15:52:03,599 - INFO - [Train] step: 24899, loss_D: 0.160338, lr_D: 0.000800 2023-07-11 15:52:03,683 - INFO - [Train] step: 24899, loss_G: 4.610417, lr_G: 0.000800 2023-07-11 15:52:59,895 - INFO - [Train] step: 24999, loss_D: 0.015072, lr_D: 0.000800 2023-07-11 15:52:59,979 - INFO - [Train] step: 24999, loss_G: 8.884357, lr_G: 0.000800 2023-07-11 15:54:04,042 - INFO - [Eval] step: 24999, fid: 28.120965 2023-07-11 15:55:00,217 - INFO - [Train] step: 25099, loss_D: 0.003968, lr_D: 0.000800 2023-07-11 15:55:00,301 - INFO - [Train] step: 25099, loss_G: 10.388786, lr_G: 0.000800 2023-07-11 15:55:56,534 - INFO - [Train] step: 25199, loss_D: 0.203878, lr_D: 0.000800 2023-07-11 15:55:56,618 - INFO - [Train] step: 25199, 
loss_G: 5.433864, lr_G: 0.000800 2023-07-11 15:56:52,988 - INFO - [Train] step: 25299, loss_D: 0.001827, lr_D: 0.000800 2023-07-11 15:56:53,072 - INFO - [Train] step: 25299, loss_G: 10.073749, lr_G: 0.000800 2023-07-11 15:57:49,282 - INFO - [Train] step: 25399, loss_D: 0.001059, lr_D: 0.000800 2023-07-11 15:57:49,366 - INFO - [Train] step: 25399, loss_G: 11.256536, lr_G: 0.000800 2023-07-11 15:58:45,566 - INFO - [Train] step: 25499, loss_D: 0.028165, lr_D: 0.000800 2023-07-11 15:58:45,650 - INFO - [Train] step: 25499, loss_G: 7.335596, lr_G: 0.000800 2023-07-11 15:59:41,867 - INFO - [Train] step: 25599, loss_D: 0.013606, lr_D: 0.000800 2023-07-11 15:59:41,950 - INFO - [Train] step: 25599, loss_G: 8.838924, lr_G: 0.000800 2023-07-11 16:00:38,189 - INFO - [Train] step: 25699, loss_D: 0.010900, lr_D: 0.000800 2023-07-11 16:00:38,272 - INFO - [Train] step: 25699, loss_G: 7.900687, lr_G: 0.000800 2023-07-11 16:01:34,495 - INFO - [Train] step: 25799, loss_D: 0.001805, lr_D: 0.000800 2023-07-11 16:01:34,579 - INFO - [Train] step: 25799, loss_G: 8.660160, lr_G: 0.000800 2023-07-11 16:02:30,939 - INFO - [Train] step: 25899, loss_D: 0.001893, lr_D: 0.000800 2023-07-11 16:02:31,023 - INFO - [Train] step: 25899, loss_G: 11.859098, lr_G: 0.000800 2023-07-11 16:03:27,226 - INFO - [Train] step: 25999, loss_D: 0.003861, lr_D: 0.000800 2023-07-11 16:03:27,310 - INFO - [Train] step: 25999, loss_G: 10.326664, lr_G: 0.000800 2023-07-11 16:04:31,317 - INFO - [Eval] step: 25999, fid: 29.586841 2023-07-11 16:05:27,502 - INFO - [Train] step: 26099, loss_D: 0.012158, lr_D: 0.000800 2023-07-11 16:05:27,586 - INFO - [Train] step: 26099, loss_G: 8.987256, lr_G: 0.000800 2023-07-11 16:06:23,733 - INFO - [Train] step: 26199, loss_D: 0.002671, lr_D: 0.000800 2023-07-11 16:06:23,817 - INFO - [Train] step: 26199, loss_G: 11.424615, lr_G: 0.000800 2023-07-11 16:07:20,007 - INFO - [Train] step: 26299, loss_D: 0.015238, lr_D: 0.000800 2023-07-11 16:07:20,091 - INFO - [Train] step: 26299, loss_G: 
8.505151, lr_G: 0.000800 2023-07-11 16:08:16,254 - INFO - [Train] step: 26399, loss_D: 0.004004, lr_D: 0.000800 2023-07-11 16:08:16,338 - INFO - [Train] step: 26399, loss_G: 9.692331, lr_G: 0.000800 2023-07-11 16:09:12,517 - INFO - [Train] step: 26499, loss_D: 0.000488, lr_D: 0.000800 2023-07-11 16:09:12,600 - INFO - [Train] step: 26499, loss_G: 13.278828, lr_G: 0.000800 2023-07-11 16:10:08,957 - INFO - [Train] step: 26599, loss_D: 0.123925, lr_D: 0.000800 2023-07-11 16:10:09,041 - INFO - [Train] step: 26599, loss_G: 6.979785, lr_G: 0.000800 2023-07-11 16:11:05,249 - INFO - [Train] step: 26699, loss_D: 0.007197, lr_D: 0.000800 2023-07-11 16:11:05,333 - INFO - [Train] step: 26699, loss_G: 9.265303, lr_G: 0.000800 2023-07-11 16:12:01,561 - INFO - [Train] step: 26799, loss_D: 0.002805, lr_D: 0.000800 2023-07-11 16:12:01,646 - INFO - [Train] step: 26799, loss_G: 10.261986, lr_G: 0.000800 2023-07-11 16:12:57,866 - INFO - [Train] step: 26899, loss_D: 0.002603, lr_D: 0.000800 2023-07-11 16:12:57,950 - INFO - [Train] step: 26899, loss_G: 9.968462, lr_G: 0.000800 2023-07-11 16:13:54,138 - INFO - [Train] step: 26999, loss_D: 0.270974, lr_D: 0.000800 2023-07-11 16:13:54,222 - INFO - [Train] step: 26999, loss_G: 4.476009, lr_G: 0.000800 2023-07-11 16:14:58,262 - INFO - [Eval] step: 26999, fid: 28.261105 2023-07-11 16:15:54,401 - INFO - [Train] step: 27099, loss_D: 0.005396, lr_D: 0.000800 2023-07-11 16:15:54,485 - INFO - [Train] step: 27099, loss_G: 9.020653, lr_G: 0.000800 2023-07-11 16:16:50,800 - INFO - [Train] step: 27199, loss_D: 0.002239, lr_D: 0.000800 2023-07-11 16:16:50,884 - INFO - [Train] step: 27199, loss_G: 10.092320, lr_G: 0.000800 2023-07-11 16:17:47,071 - INFO - [Train] step: 27299, loss_D: 0.001601, lr_D: 0.000800 2023-07-11 16:17:47,154 - INFO - [Train] step: 27299, loss_G: 10.914949, lr_G: 0.000800 2023-07-11 16:18:43,343 - INFO - [Train] step: 27399, loss_D: 0.022859, lr_D: 0.000800 2023-07-11 16:18:43,426 - INFO - [Train] step: 27399, loss_G: 9.357937, 
lr_G: 0.000800 2023-07-11 16:19:39,639 - INFO - [Train] step: 27499, loss_D: 0.003744, lr_D: 0.000800 2023-07-11 16:19:39,723 - INFO - [Train] step: 27499, loss_G: 9.579683, lr_G: 0.000800 2023-07-11 16:20:35,927 - INFO - [Train] step: 27599, loss_D: 0.008932, lr_D: 0.000800 2023-07-11 16:20:36,010 - INFO - [Train] step: 27599, loss_G: 9.413516, lr_G: 0.000800 2023-07-11 16:21:32,233 - INFO - [Train] step: 27699, loss_D: 0.002887, lr_D: 0.000800 2023-07-11 16:21:32,316 - INFO - [Train] step: 27699, loss_G: 9.666740, lr_G: 0.000800 2023-07-11 16:22:28,504 - INFO - [Train] step: 27799, loss_D: 0.076950, lr_D: 0.000800 2023-07-11 16:22:28,587 - INFO - [Train] step: 27799, loss_G: 7.720177, lr_G: 0.000800 2023-07-11 16:23:24,915 - INFO - [Train] step: 27899, loss_D: 0.003379, lr_D: 0.000800 2023-07-11 16:23:24,999 - INFO - [Train] step: 27899, loss_G: 9.764106, lr_G: 0.000800 2023-07-11 16:24:21,160 - INFO - [Train] step: 27999, loss_D: 0.002163, lr_D: 0.000800 2023-07-11 16:24:21,244 - INFO - [Train] step: 27999, loss_G: 9.900660, lr_G: 0.000800 2023-07-11 16:25:25,168 - INFO - [Eval] step: 27999, fid: 31.858242 2023-07-11 16:26:21,310 - INFO - [Train] step: 28099, loss_D: 0.005257, lr_D: 0.000800 2023-07-11 16:26:21,394 - INFO - [Train] step: 28099, loss_G: 10.851337, lr_G: 0.000800 2023-07-11 16:27:17,562 - INFO - [Train] step: 28199, loss_D: 0.017695, lr_D: 0.000800 2023-07-11 16:27:17,646 - INFO - [Train] step: 28199, loss_G: 9.165024, lr_G: 0.000800 2023-07-11 16:28:13,841 - INFO - [Train] step: 28299, loss_D: 0.003889, lr_D: 0.000800 2023-07-11 16:28:13,925 - INFO - [Train] step: 28299, loss_G: 9.393435, lr_G: 0.000800 2023-07-11 16:29:10,157 - INFO - [Train] step: 28399, loss_D: 0.005450, lr_D: 0.000800 2023-07-11 16:29:10,241 - INFO - [Train] step: 28399, loss_G: 10.037657, lr_G: 0.000800 2023-07-11 16:30:06,672 - INFO - [Train] step: 28499, loss_D: 0.081876, lr_D: 0.000800 2023-07-11 16:30:06,756 - INFO - [Train] step: 28499, loss_G: 6.137455, lr_G: 0.000800 
2023-07-11 16:31:03,178 - INFO - [Train] step: 28599, loss_D: 0.009347, lr_D: 0.000800 2023-07-11 16:31:03,262 - INFO - [Train] step: 28599, loss_G: 8.486110, lr_G: 0.000800 2023-07-11 16:31:59,783 - INFO - [Train] step: 28699, loss_D: 0.003453, lr_D: 0.000800 2023-07-11 16:31:59,867 - INFO - [Train] step: 28699, loss_G: 9.374587, lr_G: 0.000800 2023-07-11 16:32:56,372 - INFO - [Train] step: 28799, loss_D: 0.007726, lr_D: 0.000800 2023-07-11 16:32:56,457 - INFO - [Train] step: 28799, loss_G: 9.320904, lr_G: 0.000800 2023-07-11 16:33:52,947 - INFO - [Train] step: 28899, loss_D: 2.428608, lr_D: 0.000800 2023-07-11 16:33:53,031 - INFO - [Train] step: 28899, loss_G: 16.081322, lr_G: 0.000800 2023-07-11 16:34:49,510 - INFO - [Train] step: 28999, loss_D: 0.006730, lr_D: 0.000800 2023-07-11 16:34:49,594 - INFO - [Train] step: 28999, loss_G: 10.405189, lr_G: 0.000800 2023-07-11 16:35:53,637 - INFO - [Eval] step: 28999, fid: 28.514210 2023-07-11 16:36:50,081 - INFO - [Train] step: 29099, loss_D: 0.000476, lr_D: 0.000800 2023-07-11 16:36:50,165 - INFO - [Train] step: 29099, loss_G: 11.495661, lr_G: 0.000800 2023-07-11 16:37:46,916 - INFO - [Train] step: 29199, loss_D: 0.021919, lr_D: 0.000800 2023-07-11 16:37:47,001 - INFO - [Train] step: 29199, loss_G: 7.978640, lr_G: 0.000800 2023-07-11 16:38:43,615 - INFO - [Train] step: 29299, loss_D: 0.005424, lr_D: 0.000800 2023-07-11 16:38:43,700 - INFO - [Train] step: 29299, loss_G: 10.110363, lr_G: 0.000800 2023-07-11 16:39:40,317 - INFO - [Train] step: 29399, loss_D: 0.001903, lr_D: 0.000800 2023-07-11 16:39:40,401 - INFO - [Train] step: 29399, loss_G: 10.540430, lr_G: 0.000800 2023-07-11 16:40:37,009 - INFO - [Train] step: 29499, loss_D: 0.001573, lr_D: 0.000800 2023-07-11 16:40:37,094 - INFO - [Train] step: 29499, loss_G: 10.988358, lr_G: 0.000800 2023-07-11 16:41:33,715 - INFO - [Train] step: 29599, loss_D: 0.006324, lr_D: 0.000800 2023-07-11 16:41:33,799 - INFO - [Train] step: 29599, loss_G: 8.904610, lr_G: 0.000800 2023-07-11 
16:42:30,448 - INFO - [Train] step: 29699, loss_D: 0.002145, lr_D: 0.000800 2023-07-11 16:42:30,532 - INFO - [Train] step: 29699, loss_G: 10.345118, lr_G: 0.000800 2023-07-11 16:43:27,268 - INFO - [Train] step: 29799, loss_D: 0.001737, lr_D: 0.000800 2023-07-11 16:43:27,352 - INFO - [Train] step: 29799, loss_G: 10.607989, lr_G: 0.000800 2023-07-11 16:44:23,956 - INFO - [Train] step: 29899, loss_D: 0.014888, lr_D: 0.000800 2023-07-11 16:44:24,041 - INFO - [Train] step: 29899, loss_G: 8.812881, lr_G: 0.000800 2023-07-11 16:45:20,649 - INFO - [Train] step: 29999, loss_D: 0.003554, lr_D: 0.000800 2023-07-11 16:45:20,733 - INFO - [Train] step: 29999, loss_G: 10.038866, lr_G: 0.000800 2023-07-11 16:46:24,833 - INFO - [Eval] step: 29999, fid: 31.425861 2023-07-11 16:47:21,431 - INFO - [Train] step: 30099, loss_D: 0.001823, lr_D: 0.000800 2023-07-11 16:47:21,516 - INFO - [Train] step: 30099, loss_G: 10.117667, lr_G: 0.000800 2023-07-11 16:48:18,135 - INFO - [Train] step: 30199, loss_D: 0.009171, lr_D: 0.000800 2023-07-11 16:48:18,220 - INFO - [Train] step: 30199, loss_G: 11.614449, lr_G: 0.000800 2023-07-11 16:49:14,819 - INFO - [Train] step: 30299, loss_D: 0.002240, lr_D: 0.000800 2023-07-11 16:49:14,903 - INFO - [Train] step: 30299, loss_G: 10.939562, lr_G: 0.000800 2023-07-11 16:50:11,654 - INFO - [Train] step: 30399, loss_D: 0.001206, lr_D: 0.000800 2023-07-11 16:50:11,739 - INFO - [Train] step: 30399, loss_G: 11.431597, lr_G: 0.000800 2023-07-11 16:51:08,375 - INFO - [Train] step: 30499, loss_D: 0.002355, lr_D: 0.000800 2023-07-11 16:51:08,459 - INFO - [Train] step: 30499, loss_G: 10.659374, lr_G: 0.000800 2023-07-11 16:52:05,088 - INFO - [Train] step: 30599, loss_D: 0.001289, lr_D: 0.000800 2023-07-11 16:52:05,172 - INFO - [Train] step: 30599, loss_G: 10.584681, lr_G: 0.000800 2023-07-11 16:53:01,794 - INFO - [Train] step: 30699, loss_D: 0.018748, lr_D: 0.000800 2023-07-11 16:53:01,878 - INFO - [Train] step: 30699, loss_G: 11.520538, lr_G: 0.000800 2023-07-11 
16:53:58,477 - INFO - [Train] step: 30799, loss_D: 0.000794, lr_D: 0.000800 2023-07-11 16:53:58,561 - INFO - [Train] step: 30799, loss_G: 12.048307, lr_G: 0.000800 2023-07-11 16:54:55,161 - INFO - [Train] step: 30899, loss_D: 0.017922, lr_D: 0.000800 2023-07-11 16:54:55,245 - INFO - [Train] step: 30899, loss_G: 8.961665, lr_G: 0.000800 2023-07-11 16:55:51,858 - INFO - [Train] step: 30999, loss_D: 0.003480, lr_D: 0.000800 2023-07-11 16:55:51,942 - INFO - [Train] step: 30999, loss_G: 10.417709, lr_G: 0.000800 2023-07-11 16:56:55,954 - INFO - [Eval] step: 30999, fid: 31.124594 2023-07-11 16:57:52,496 - INFO - [Train] step: 31099, loss_D: 0.004688, lr_D: 0.000800 2023-07-11 16:57:52,581 - INFO - [Train] step: 31099, loss_G: 11.631746, lr_G: 0.000800 2023-07-11 16:58:49,209 - INFO - [Train] step: 31199, loss_D: 0.005057, lr_D: 0.000800 2023-07-11 16:58:49,294 - INFO - [Train] step: 31199, loss_G: 12.383636, lr_G: 0.000800 2023-07-11 16:59:45,910 - INFO - [Train] step: 31299, loss_D: 0.018138, lr_D: 0.000800 2023-07-11 16:59:45,995 - INFO - [Train] step: 31299, loss_G: 9.527775, lr_G: 0.000800 2023-07-11 17:00:42,623 - INFO - [Train] step: 31399, loss_D: 0.008496, lr_D: 0.000800 2023-07-11 17:00:42,707 - INFO - [Train] step: 31399, loss_G: 8.838847, lr_G: 0.000800 2023-07-11 17:01:39,307 - INFO - [Train] step: 31499, loss_D: 0.000398, lr_D: 0.000800 2023-07-11 17:01:39,391 - INFO - [Train] step: 31499, loss_G: 11.916531, lr_G: 0.000800 2023-07-11 17:02:36,018 - INFO - [Train] step: 31599, loss_D: 0.000577, lr_D: 0.000800 2023-07-11 17:02:36,102 - INFO - [Train] step: 31599, loss_G: 12.267941, lr_G: 0.000800 2023-07-11 17:03:32,842 - INFO - [Train] step: 31699, loss_D: 0.001245, lr_D: 0.000800 2023-07-11 17:03:32,926 - INFO - [Train] step: 31699, loss_G: 11.535707, lr_G: 0.000800 2023-07-11 17:04:29,560 - INFO - [Train] step: 31799, loss_D: 0.007009, lr_D: 0.000800 2023-07-11 17:04:29,644 - INFO - [Train] step: 31799, loss_G: 9.988715, lr_G: 0.000800 2023-07-11 
17:05:26,260 - INFO - [Train] step: 31899, loss_D: 0.002843, lr_D: 0.000800 2023-07-11 17:05:26,344 - INFO - [Train] step: 31899, loss_G: 11.165958, lr_G: 0.000800 2023-07-11 17:06:22,959 - INFO - [Train] step: 31999, loss_D: 0.001144, lr_D: 0.000800 2023-07-11 17:06:23,044 - INFO - [Train] step: 31999, loss_G: 11.805136, lr_G: 0.000800 2023-07-11 17:07:27,156 - INFO - [Eval] step: 31999, fid: 28.932226 2023-07-11 17:08:23,529 - INFO - [Train] step: 32099, loss_D: 0.015821, lr_D: 0.000800 2023-07-11 17:08:23,613 - INFO - [Train] step: 32099, loss_G: 9.272867, lr_G: 0.000800 2023-07-11 17:09:20,136 - INFO - [Train] step: 32199, loss_D: 0.004023, lr_D: 0.000800 2023-07-11 17:09:20,221 - INFO - [Train] step: 32199, loss_G: 9.855824, lr_G: 0.000800 2023-07-11 17:10:16,768 - INFO - [Train] step: 32299, loss_D: 0.002352, lr_D: 0.000800 2023-07-11 17:10:16,852 - INFO - [Train] step: 32299, loss_G: 10.211768, lr_G: 0.000800 2023-07-11 17:11:13,525 - INFO - [Train] step: 32399, loss_D: 0.002465, lr_D: 0.000800 2023-07-11 17:11:13,609 - INFO - [Train] step: 32399, loss_G: 10.366522, lr_G: 0.000800 2023-07-11 17:12:10,126 - INFO - [Train] step: 32499, loss_D: 0.040789, lr_D: 0.000800 2023-07-11 17:12:10,210 - INFO - [Train] step: 32499, loss_G: 6.815432, lr_G: 0.000800 2023-07-11 17:13:06,741 - INFO - [Train] step: 32599, loss_D: 0.004047, lr_D: 0.000800 2023-07-11 17:13:06,825 - INFO - [Train] step: 32599, loss_G: 10.256836, lr_G: 0.000800 2023-07-11 17:14:03,348 - INFO - [Train] step: 32699, loss_D: 0.002375, lr_D: 0.000800 2023-07-11 17:14:03,433 - INFO - [Train] step: 32699, loss_G: 10.991968, lr_G: 0.000800 2023-07-11 17:14:59,968 - INFO - [Train] step: 32799, loss_D: 0.196997, lr_D: 0.000800 2023-07-11 17:15:00,053 - INFO - [Train] step: 32799, loss_G: 5.590190, lr_G: 0.000800 2023-07-11 17:15:56,560 - INFO - [Train] step: 32899, loss_D: 0.003315, lr_D: 0.000800 2023-07-11 17:15:56,644 - INFO - [Train] step: 32899, loss_G: 10.307394, lr_G: 0.000800 2023-07-11 
17:16:53,330 - INFO - [Train] step: 32999, loss_D: 0.003419, lr_D: 0.000800 2023-07-11 17:16:53,414 - INFO - [Train] step: 32999, loss_G: 10.800374, lr_G: 0.000800 2023-07-11 17:17:57,613 - INFO - [Eval] step: 32999, fid: 34.970089 2023-07-11 17:18:54,095 - INFO - [Train] step: 33099, loss_D: 0.001734, lr_D: 0.000800 2023-07-11 17:18:54,179 - INFO - [Train] step: 33099, loss_G: 11.038864, lr_G: 0.000800 2023-07-11 17:19:50,801 - INFO - [Train] step: 33199, loss_D: 0.068180, lr_D: 0.000800 2023-07-11 17:19:50,885 - INFO - [Train] step: 33199, loss_G: 8.782141, lr_G: 0.000800 2023-07-11 17:20:47,517 - INFO - [Train] step: 33299, loss_D: 0.007126, lr_D: 0.000800 2023-07-11 17:20:47,601 - INFO - [Train] step: 33299, loss_G: 10.125813, lr_G: 0.000800 2023-07-11 17:21:44,231 - INFO - [Train] step: 33399, loss_D: 0.002722, lr_D: 0.000800 2023-07-11 17:21:44,315 - INFO - [Train] step: 33399, loss_G: 10.206334, lr_G: 0.000800 2023-07-11 17:22:40,947 - INFO - [Train] step: 33499, loss_D: 0.001919, lr_D: 0.000800 2023-07-11 17:22:41,032 - INFO - [Train] step: 33499, loss_G: 11.129574, lr_G: 0.000800 2023-07-11 17:23:37,660 - INFO - [Train] step: 33599, loss_D: 0.010751, lr_D: 0.000800 2023-07-11 17:23:37,744 - INFO - [Train] step: 33599, loss_G: 9.334590, lr_G: 0.000800 2023-07-11 17:24:34,507 - INFO - [Train] step: 33699, loss_D: 0.003594, lr_D: 0.000800 2023-07-11 17:24:34,592 - INFO - [Train] step: 33699, loss_G: 10.852534, lr_G: 0.000800 2023-07-11 17:25:31,200 - INFO - [Train] step: 33799, loss_D: 0.002941, lr_D: 0.000800 2023-07-11 17:25:31,285 - INFO - [Train] step: 33799, loss_G: 10.506541, lr_G: 0.000800 2023-07-11 17:26:27,895 - INFO - [Train] step: 33899, loss_D: 0.001559, lr_D: 0.000800 2023-07-11 17:26:27,979 - INFO - [Train] step: 33899, loss_G: 11.605881, lr_G: 0.000800 2023-07-11 17:27:24,603 - INFO - [Train] step: 33999, loss_D: 0.010180, lr_D: 0.000800 2023-07-11 17:27:24,687 - INFO - [Train] step: 33999, loss_G: 9.430072, lr_G: 0.000800 2023-07-11 
17:28:28,800 - INFO - [Eval] step: 33999, fid: 26.479848 2023-07-11 17:29:25,216 - INFO - [Train] step: 34099, loss_D: 0.088297, lr_D: 0.000800 2023-07-11 17:29:25,301 - INFO - [Train] step: 34099, loss_G: 6.721412, lr_G: 0.000800 2023-07-11 17:30:21,940 - INFO - [Train] step: 34199, loss_D: 0.001062, lr_D: 0.000800 2023-07-11 17:30:22,024 - INFO - [Train] step: 34199, loss_G: 11.332446, lr_G: 0.000800 2023-07-11 17:31:18,784 - INFO - [Train] step: 34299, loss_D: 0.001985, lr_D: 0.000800 2023-07-11 17:31:18,869 - INFO - [Train] step: 34299, loss_G: 12.944086, lr_G: 0.000800 2023-07-11 17:32:15,489 - INFO - [Train] step: 34399, loss_D: 0.001062, lr_D: 0.000800 2023-07-11 17:32:15,574 - INFO - [Train] step: 34399, loss_G: 11.664156, lr_G: 0.000800 2023-07-11 17:33:12,188 - INFO - [Train] step: 34499, loss_D: 0.052018, lr_D: 0.000800 2023-07-11 17:33:12,273 - INFO - [Train] step: 34499, loss_G: 6.532789, lr_G: 0.000800 2023-07-11 17:34:08,895 - INFO - [Train] step: 34599, loss_D: 0.007955, lr_D: 0.000800 2023-07-11 17:34:08,979 - INFO - [Train] step: 34599, loss_G: 11.164508, lr_G: 0.000800 2023-07-11 17:35:05,598 - INFO - [Train] step: 34699, loss_D: 0.003598, lr_D: 0.000800 2023-07-11 17:35:05,682 - INFO - [Train] step: 34699, loss_G: 9.783045, lr_G: 0.000800 2023-07-11 17:36:02,320 - INFO - [Train] step: 34799, loss_D: 0.001351, lr_D: 0.000800 2023-07-11 17:36:02,404 - INFO - [Train] step: 34799, loss_G: 10.909458, lr_G: 0.000800 2023-07-11 17:36:58,998 - INFO - [Train] step: 34899, loss_D: 0.027341, lr_D: 0.000800 2023-07-11 17:36:59,082 - INFO - [Train] step: 34899, loss_G: 8.103909, lr_G: 0.000800 2023-07-11 17:37:55,873 - INFO - [Train] step: 34999, loss_D: 0.003608, lr_D: 0.000800 2023-07-11 17:37:55,957 - INFO - [Train] step: 34999, loss_G: 10.162874, lr_G: 0.000800 2023-07-11 17:39:00,132 - INFO - [Eval] step: 34999, fid: 30.250333 2023-07-11 17:39:56,520 - INFO - [Train] step: 35099, loss_D: 0.002374, lr_D: 0.000800 2023-07-11 17:39:56,605 - INFO - [Train] 
step: 35099, loss_G: 12.336600, lr_G: 0.000800
2023-07-11 17:40:53,151 - INFO - [Train] step: 35199, loss_D: 0.000961, lr_D: 0.000800
2023-07-11 17:40:53,235 - INFO - [Train] step: 35199, loss_G: 12.600614, lr_G: 0.000800
2023-07-11 17:41:49,767 - INFO - [Train] step: 35299, loss_D: 0.014972, lr_D: 0.000800
2023-07-11 17:41:49,851 - INFO - [Train] step: 35299, loss_G: 9.577575, lr_G: 0.000800
2023-07-11 17:42:46,374 - INFO - [Train] step: 35399, loss_D: 0.006320, lr_D: 0.000800
2023-07-11 17:42:46,458 - INFO - [Train] step: 35399, loss_G: 10.768517, lr_G: 0.000800
2023-07-11 17:43:43,005 - INFO - [Train] step: 35499, loss_D: 0.001321, lr_D: 0.000800
2023-07-11 17:43:43,089 - INFO - [Train] step: 35499, loss_G: 10.944354, lr_G: 0.000800
2023-07-11 17:44:39,774 - INFO - [Train] step: 35599, loss_D: 0.001155, lr_D: 0.000800
2023-07-11 17:44:39,858 - INFO - [Train] step: 35599, loss_G: 11.848622, lr_G: 0.000800
2023-07-11 17:45:36,395 - INFO - [Train] step: 35699, loss_D: 0.004498, lr_D: 0.000800
2023-07-11 17:45:36,479 - INFO - [Train] step: 35699, loss_G: 10.615336, lr_G: 0.000800
2023-07-11 17:46:33,017 - INFO - [Train] step: 35799, loss_D: 0.002211, lr_D: 0.000800
2023-07-11 17:46:33,102 - INFO - [Train] step: 35799, loss_G: 11.308196, lr_G: 0.000800
2023-07-11 17:47:29,620 - INFO - [Train] step: 35899, loss_D: 0.112851, lr_D: 0.000800
2023-07-11 17:47:29,705 - INFO - [Train] step: 35899, loss_G: 5.648012, lr_G: 0.000800
2023-07-11 17:48:26,224 - INFO - [Train] step: 35999, loss_D: 0.003291, lr_D: 0.000800
2023-07-11 17:48:26,309 - INFO - [Train] step: 35999, loss_G: 11.379570, lr_G: 0.000800
2023-07-11 17:49:30,385 - INFO - [Eval] step: 35999, fid: 29.220229
2023-07-11 17:50:26,725 - INFO - [Train] step: 36099, loss_D: 0.002197, lr_D: 0.000800
2023-07-11 17:50:26,809 - INFO - [Train] step: 36099, loss_G: 12.284069, lr_G: 0.000800
2023-07-11 17:51:23,326 - INFO - [Train] step: 36199, loss_D: 0.001456, lr_D: 0.000800
2023-07-11 17:51:23,410 - INFO - [Train] step: 36199, loss_G: 13.320007, lr_G: 0.000800
2023-07-11 17:52:20,128 - INFO - [Train] step: 36299, loss_D: 0.014802, lr_D: 0.000800
2023-07-11 17:52:20,213 - INFO - [Train] step: 36299, loss_G: 9.526157, lr_G: 0.000800
2023-07-11 17:53:16,737 - INFO - [Train] step: 36399, loss_D: 0.004870, lr_D: 0.000800
2023-07-11 17:53:16,822 - INFO - [Train] step: 36399, loss_G: 10.647831, lr_G: 0.000800
2023-07-11 17:54:13,356 - INFO - [Train] step: 36499, loss_D: 0.002279, lr_D: 0.000800
2023-07-11 17:54:13,441 - INFO - [Train] step: 36499, loss_G: 10.133535, lr_G: 0.000800
2023-07-11 17:55:09,987 - INFO - [Train] step: 36599, loss_D: 0.003005, lr_D: 0.000800
2023-07-11 17:55:10,071 - INFO - [Train] step: 36599, loss_G: 11.771160, lr_G: 0.000800
2023-07-11 17:56:06,598 - INFO - [Train] step: 36699, loss_D: 0.034414, lr_D: 0.000800
2023-07-11 17:56:06,682 - INFO - [Train] step: 36699, loss_G: 8.730915, lr_G: 0.000800
2023-07-11 17:57:03,224 - INFO - [Train] step: 36799, loss_D: 0.001324, lr_D: 0.000800
2023-07-11 17:57:03,308 - INFO - [Train] step: 36799, loss_G: 11.915476, lr_G: 0.000800
2023-07-11 17:58:00,002 - INFO - [Train] step: 36899, loss_D: 0.001386, lr_D: 0.000800
2023-07-11 17:58:00,086 - INFO - [Train] step: 36899, loss_G: 11.080511, lr_G: 0.000800
2023-07-11 17:58:56,607 - INFO - [Train] step: 36999, loss_D: 0.002617, lr_D: 0.000800
2023-07-11 17:58:56,691 - INFO - [Train] step: 36999, loss_G: 12.768784, lr_G: 0.000800
2023-07-11 18:00:00,808 - INFO - [Eval] step: 36999, fid: 29.393289
2023-07-11 18:00:57,169 - INFO - [Train] step: 37099, loss_D: 0.005409, lr_D: 0.000800
2023-07-11 18:00:57,254 - INFO - [Train] step: 37099, loss_G: 10.404194, lr_G: 0.000800
2023-07-11 18:01:53,793 - INFO - [Train] step: 37199, loss_D: 0.000680, lr_D: 0.000800
2023-07-11 18:01:53,877 - INFO - [Train] step: 37199, loss_G: 11.750727, lr_G: 0.000800
2023-07-11 18:02:50,408 - INFO - [Train] step: 37299, loss_D: 0.001456, lr_D: 0.000800
2023-07-11 18:02:50,492 - INFO - [Train] step: 37299, loss_G: 11.308887, lr_G: 0.000800
2023-07-11 18:03:47,008 - INFO - [Train] step: 37399, loss_D: 2.423407, lr_D: 0.000800
2023-07-11 18:03:47,092 - INFO - [Train] step: 37399, loss_G: 9.273409, lr_G: 0.000800
2023-07-11 18:04:43,623 - INFO - [Train] step: 37499, loss_D: 0.002194, lr_D: 0.000800
2023-07-11 18:04:43,707 - INFO - [Train] step: 37499, loss_G: 10.400314, lr_G: 0.000800
2023-07-11 18:05:40,382 - INFO - [Train] step: 37599, loss_D: 0.001529, lr_D: 0.000800
2023-07-11 18:05:40,466 - INFO - [Train] step: 37599, loss_G: 11.470114, lr_G: 0.000800
2023-07-11 18:06:36,980 - INFO - [Train] step: 37699, loss_D: 0.000489, lr_D: 0.000800
2023-07-11 18:06:37,064 - INFO - [Train] step: 37699, loss_G: 13.643311, lr_G: 0.000800
2023-07-11 18:07:33,588 - INFO - [Train] step: 37799, loss_D: 0.018753, lr_D: 0.000800
2023-07-11 18:07:33,672 - INFO - [Train] step: 37799, loss_G: 10.013208, lr_G: 0.000800
2023-07-11 18:08:30,190 - INFO - [Train] step: 37899, loss_D: 0.000898, lr_D: 0.000800
2023-07-11 18:08:30,275 - INFO - [Train] step: 37899, loss_G: 11.948532, lr_G: 0.000800
2023-07-11 18:09:26,780 - INFO - [Train] step: 37999, loss_D: 0.000186, lr_D: 0.000800
2023-07-11 18:09:26,865 - INFO - [Train] step: 37999, loss_G: 12.791275, lr_G: 0.000800
2023-07-11 18:10:30,975 - INFO - [Eval] step: 37999, fid: 32.256753
2023-07-11 18:11:27,328 - INFO - [Train] step: 38099, loss_D: 0.001841, lr_D: 0.000800
2023-07-11 18:11:27,411 - INFO - [Train] step: 38099, loss_G: 11.418566, lr_G: 0.000800
2023-07-11 18:12:24,071 - INFO - [Train] step: 38199, loss_D: 0.001069, lr_D: 0.000800
2023-07-11 18:12:24,155 - INFO - [Train] step: 38199, loss_G: 13.107159, lr_G: 0.000800
2023-07-11 18:13:20,709 - INFO - [Train] step: 38299, loss_D: 0.004591, lr_D: 0.000800
2023-07-11 18:13:20,793 - INFO - [Train] step: 38299, loss_G: 14.307920, lr_G: 0.000800
2023-07-11 18:14:17,324 - INFO - [Train] step: 38399, loss_D: 0.015494, lr_D: 0.000800
2023-07-11 18:14:17,409 - INFO - [Train] step: 38399, loss_G: 9.397242, lr_G: 0.000800
2023-07-11 18:15:13,933 - INFO - [Train] step: 38499, loss_D: 0.001532, lr_D: 0.000800
2023-07-11 18:15:14,017 - INFO - [Train] step: 38499, loss_G: 11.354094, lr_G: 0.000800
2023-07-11 18:16:10,539 - INFO - [Train] step: 38599, loss_D: 0.002174, lr_D: 0.000800
2023-07-11 18:16:10,623 - INFO - [Train] step: 38599, loss_G: 11.296826, lr_G: 0.000800
2023-07-11 18:17:07,146 - INFO - [Train] step: 38699, loss_D: 0.122292, lr_D: 0.000800
2023-07-11 18:17:07,231 - INFO - [Train] step: 38699, loss_G: 6.858611, lr_G: 0.000800
2023-07-11 18:18:03,759 - INFO - [Train] step: 38799, loss_D: 0.002347, lr_D: 0.000800
2023-07-11 18:18:03,843 - INFO - [Train] step: 38799, loss_G: 11.651567, lr_G: 0.000800
2023-07-11 18:19:00,491 - INFO - [Train] step: 38899, loss_D: 0.000883, lr_D: 0.000800
2023-07-11 18:19:00,576 - INFO - [Train] step: 38899, loss_G: 11.204956, lr_G: 0.000800
2023-07-11 18:19:57,079 - INFO - [Train] step: 38999, loss_D: 0.005888, lr_D: 0.000800
2023-07-11 18:19:57,163 - INFO - [Train] step: 38999, loss_G: 11.321920, lr_G: 0.000800
2023-07-11 18:21:01,292 - INFO - [Eval] step: 38999, fid: 30.338257
2023-07-11 18:21:57,630 - INFO - [Train] step: 39099, loss_D: 0.009130, lr_D: 0.000800
2023-07-11 18:21:57,713 - INFO - [Train] step: 39099, loss_G: 11.101431, lr_G: 0.000800
2023-07-11 18:22:54,062 - INFO - [Train] step: 39199, loss_D: 0.001463, lr_D: 0.000800
2023-07-11 18:22:54,146 - INFO - [Train] step: 39199, loss_G: 11.452006, lr_G: 0.000800
2023-07-11 18:23:50,514 - INFO - [Train] step: 39299, loss_D: 0.000898, lr_D: 0.000800
2023-07-11 18:23:50,598 - INFO - [Train] step: 39299, loss_G: 11.766135, lr_G: 0.000800
2023-07-11 18:24:46,944 - INFO - [Train] step: 39399, loss_D: 0.031015, lr_D: 0.000800
2023-07-11 18:24:47,028 - INFO - [Train] step: 39399, loss_G: 8.246071, lr_G: 0.000800
2023-07-11 18:25:43,511 - INFO - [Train] step: 39499, loss_D: 0.006617, lr_D: 0.000800
2023-07-11 18:25:43,596 - INFO - [Train] step: 39499, loss_G: 10.835472, lr_G: 0.000800
2023-07-11 18:26:39,936 - INFO - [Train] step: 39599, loss_D: 0.000876, lr_D: 0.000800
2023-07-11 18:26:40,020 - INFO - [Train] step: 39599, loss_G: 10.974045, lr_G: 0.000800
2023-07-11 18:27:36,382 - INFO - [Train] step: 39699, loss_D: 0.000632, lr_D: 0.000800
2023-07-11 18:27:36,466 - INFO - [Train] step: 39699, loss_G: 12.644368, lr_G: 0.000800
2023-07-11 18:28:32,806 - INFO - [Train] step: 39799, loss_D: 0.027200, lr_D: 0.000800
2023-07-11 18:28:32,890 - INFO - [Train] step: 39799, loss_G: 11.607603, lr_G: 0.000800
2023-07-11 18:29:29,246 - INFO - [Train] step: 39899, loss_D: 0.004701, lr_D: 0.000800
2023-07-11 18:29:29,330 - INFO - [Train] step: 39899, loss_G: 11.546965, lr_G: 0.000800
2023-07-11 18:30:25,704 - INFO - [Train] step: 39999, loss_D: 0.002906, lr_D: 0.000800
2023-07-11 18:30:25,788 - INFO - [Train] step: 39999, loss_G: 10.704437, lr_G: 0.000800
2023-07-11 18:31:29,906 - INFO - [Eval] step: 39999, fid: 34.076071
2023-07-11 18:31:30,079 - INFO - End of training