2023-02-05 02:44:32,354 - INFO - Experiment directory: runs/exp-2023-02-05-02-44-31
2023-02-05 02:44:32,354 - INFO - Device: cuda
2023-02-05 02:44:32,354 - INFO - Number of devices: 1
2023-02-05 02:44:32,495 - INFO - Size of training set: 24183
2023-02-05 02:44:32,496 - INFO - Size of validation set: 2993
2023-02-05 02:44:32,496 - INFO - Batch size per device: 256
2023-02-05 02:44:32,496 - INFO - Effective batch size: 256
2023-02-05 02:44:33,574 - INFO - Start training...
2023-02-05 02:45:09,328 - INFO - [Train] step: 99, loss_advD: 0.004236, lr_D: 0.000200
2023-02-05 02:45:09,399 - INFO - [Train] step: 99, loss_advG: 8.841215, loss_rec: 0.065191, lr_G: 0.002000
2023-02-05 02:45:43,448 - INFO - [Train] step: 199, loss_advD: 0.002633, lr_D: 0.000200
2023-02-05 02:45:43,520 - INFO - [Train] step: 199, loss_advG: 9.334277, loss_rec: 0.053929, lr_G: 0.002000
2023-02-05 02:46:19,008 - INFO - [Train] step: 299, loss_advD: 0.001042, lr_D: 0.000200
2023-02-05 02:46:19,080 - INFO - [Train] step: 299, loss_advG: 9.800692, loss_rec: 0.043716, lr_G: 0.002000
2023-02-05 02:46:52,703 - INFO - [Train] step: 399, loss_advD: 0.000655, lr_D: 0.000200
2023-02-05 02:46:52,775 - INFO - [Train] step: 399, loss_advG: 10.199879, loss_rec: 0.047668, lr_G: 0.002000
2023-02-05 02:47:02,117 - INFO - [Eval] step: 399, psnr: 19.303057, ssim: 0.600675, l1: 0.077202
2023-02-05 02:47:30,603 - INFO - [Train] step: 499, loss_advD: 0.000314, lr_D: 0.000200
2023-02-05 02:47:30,676 - INFO - [Train] step: 499, loss_advG: 11.545143, loss_rec: 0.044224, lr_G: 0.002000
2023-02-05 02:48:05,395 - INFO - [Train] step: 599, loss_advD: 0.000355, lr_D: 0.000200
2023-02-05 02:48:05,467 - INFO - [Train] step: 599, loss_advG: 12.403799, loss_rec: 0.041540, lr_G: 0.002000
2023-02-05 02:48:39,175 - INFO - [Train] step: 699, loss_advD: 0.000163, lr_D: 0.000200
2023-02-05 02:48:39,246 - INFO - [Train] step: 699, loss_advG: 12.309727, loss_rec: 0.040358, lr_G: 0.002000
2023-02-05 02:49:13,238 - INFO - [Train] step: 799, loss_advD: 0.000160, lr_D: 0.000200
2023-02-05 02:49:13,310 - INFO - [Train] step: 799, loss_advG: 11.204846, loss_rec: 0.035781, lr_G: 0.002000
2023-02-05 02:49:22,239 - INFO - [Eval] step: 799, psnr: 19.704002, ssim: 0.626610, l1: 0.071904
2023-02-05 02:49:51,234 - INFO - [Train] step: 899, loss_advD: 0.009815, lr_D: 0.000200
2023-02-05 02:49:51,306 - INFO - [Train] step: 899, loss_advG: 8.964287, loss_rec: 0.034490, lr_G: 0.002000
2023-02-05 02:50:25,291 - INFO - [Train] step: 999, loss_advD: 0.001038, lr_D: 0.000200
2023-02-05 02:50:25,363 - INFO - [Train] step: 999, loss_advG: 9.776756, loss_rec: 0.030339, lr_G: 0.002000
2023-02-05 02:51:00,113 - INFO - [Train] step: 1099, loss_advD: 0.001086, lr_D: 0.000200
2023-02-05 02:51:00,184 - INFO - [Train] step: 1099, loss_advG: 9.421621, loss_rec: 0.029719, lr_G: 0.002000
2023-02-05 02:51:33,839 - INFO - [Train] step: 1199, loss_advD: 0.000948, lr_D: 0.000200
2023-02-05 02:51:33,911 - INFO - [Train] step: 1199, loss_advG: 11.210997, loss_rec: 0.029902, lr_G: 0.002000
2023-02-05 02:51:41,841 - INFO - [Eval] step: 1199, psnr: 19.753839, ssim: 0.638024, l1: 0.070382
2023-02-05 02:52:13,583 - INFO - [Train] step: 1299, loss_advD: 0.000633, lr_D: 0.000200
2023-02-05 02:52:13,655 - INFO - [Train] step: 1299, loss_advG: 10.546900, loss_rec: 0.027522, lr_G: 0.002000
2023-02-05 02:52:47,958 - INFO - [Train] step: 1399, loss_advD: 0.000433, lr_D: 0.000200
2023-02-05 02:52:48,029 - INFO - [Train] step: 1399, loss_advG: 12.136958, loss_rec: 0.026269, lr_G: 0.002000
2023-02-05 02:53:22,499 - INFO - [Train] step: 1499, loss_advD: 33.222527, lr_D: 0.000200
2023-02-05 02:53:22,569 - INFO - [Train] step: 1499, loss_advG: 0.663852, loss_rec: 0.023117, lr_G: 0.002000
2023-02-05 02:53:58,715 - INFO - [Train] step: 1599, loss_advD: 0.010084, lr_D: 0.000200
2023-02-05 02:53:58,786 - INFO - [Train] step: 1599, loss_advG: 6.838681, loss_rec: 0.021032, lr_G: 0.002000
2023-02-05 02:54:07,758 - INFO - [Eval] step: 1599, psnr: 19.801996, ssim: 0.645922, l1: 0.068897
2023-02-05 02:54:35,659 - INFO - [Train] step: 1699, loss_advD: 0.002627, lr_D: 0.000200
2023-02-05 02:54:35,730 - INFO - [Train] step: 1699, loss_advG: 8.249655, loss_rec: 0.020625, lr_G: 0.002000
2023-02-05 02:55:09,976 - INFO - [Train] step: 1799, loss_advD: 0.002406, lr_D: 0.000200
2023-02-05 02:55:10,048 - INFO - [Train] step: 1799, loss_advG: 9.596415, loss_rec: 0.019060, lr_G: 0.002000
2023-02-05 02:55:44,622 - INFO - [Train] step: 1899, loss_advD: 0.001832, lr_D: 0.000200
2023-02-05 02:55:44,694 - INFO - [Train] step: 1899, loss_advG: 8.849054, loss_rec: 0.017149, lr_G: 0.002000
2023-02-05 02:56:19,580 - INFO - [Train] step: 1999, loss_advD: 0.000453, lr_D: 0.000200
2023-02-05 02:56:19,652 - INFO - [Train] step: 1999, loss_advG: 10.581168, loss_rec: 0.016474, lr_G: 0.002000
2023-02-05 02:56:28,714 - INFO - [Eval] step: 1999, psnr: 19.759607, ssim: 0.640241, l1: 0.068659
2023-02-05 02:56:55,627 - INFO - [Train] step: 2099, loss_advD: 0.001079, lr_D: 0.000200
2023-02-05 02:56:55,698 - INFO - [Train] step: 2099, loss_advG: 9.387468, loss_rec: 0.015656, lr_G: 0.002000
2023-02-05 02:57:31,273 - INFO - [Train] step: 2199, loss_advD: 0.000592, lr_D: 0.000200
2023-02-05 02:57:31,345 - INFO - [Train] step: 2199, loss_advG: 10.402010, loss_rec: 0.015673, lr_G: 0.002000
2023-02-05 02:58:05,502 - INFO - [Train] step: 2299, loss_advD: 0.000334, lr_D: 0.000200
2023-02-05 02:58:05,574 - INFO - [Train] step: 2299, loss_advG: 10.992560, loss_rec: 0.015339, lr_G: 0.002000
2023-02-05 02:58:39,611 - INFO - [Train] step: 2399, loss_advD: 0.057976, lr_D: 0.000200
2023-02-05 02:58:39,682 - INFO - [Train] step: 2399, loss_advG: 4.322793, loss_rec: 0.016230, lr_G: 0.002000
2023-02-05 02:58:48,379 - INFO - [Eval] step: 2399, psnr: 19.645548, ssim: 0.635985, l1: 0.070579
2023-02-05 02:59:15,904 - INFO - [Train] step: 2499, loss_advD: 0.005028, lr_D: 0.000200
2023-02-05 02:59:15,976 - INFO - [Train] step: 2499, loss_advG: 7.413066, loss_rec: 0.014326, lr_G: 0.002000
2023-02-05 02:59:50,288 - INFO - [Train] step: 2599, loss_advD: 0.003219, lr_D: 0.000200
2023-02-05 02:59:50,359 - INFO - [Train] step: 2599, loss_advG: 8.856352, loss_rec: 0.013697, lr_G: 0.002000
2023-02-05 03:00:24,982 - INFO - [Train] step: 2699, loss_advD: 0.001754, lr_D: 0.000200
2023-02-05 03:00:25,054 - INFO - [Train] step: 2699, loss_advG: 9.388381, loss_rec: 0.012103, lr_G: 0.002000
2023-02-05 03:00:59,423 - INFO - [Train] step: 2799, loss_advD: 0.000814, lr_D: 0.000200
2023-02-05 03:00:59,494 - INFO - [Train] step: 2799, loss_advG: 9.338200, loss_rec: 0.012384, lr_G: 0.002000
2023-02-05 03:01:07,432 - INFO - [Eval] step: 2799, psnr: 19.655394, ssim: 0.636449, l1: 0.070222
2023-02-05 03:01:37,439 - INFO - [Train] step: 2899, loss_advD: 0.001416, lr_D: 0.000200
2023-02-05 03:01:37,511 - INFO - [Train] step: 2899, loss_advG: 9.875728, loss_rec: 0.013151, lr_G: 0.002000
2023-02-05 03:02:13,116 - INFO - [Train] step: 2999, loss_advD: 0.000500, lr_D: 0.000200
2023-02-05 03:02:13,188 - INFO - [Train] step: 2999, loss_advG: 10.212419, loss_rec: 0.012056, lr_G: 0.002000
2023-02-05 03:02:47,712 - INFO - [Train] step: 3099, loss_advD: 0.000460, lr_D: 0.000200
2023-02-05 03:02:47,783 - INFO - [Train] step: 3099, loss_advG: 10.264345, loss_rec: 0.011568, lr_G: 0.002000
2023-02-05 03:03:23,980 - INFO - [Train] step: 3199, loss_advD: 0.000597, lr_D: 0.000200
2023-02-05 03:03:24,052 - INFO - [Train] step: 3199, loss_advG: 10.028607, loss_rec: 0.011195, lr_G: 0.002000
2023-02-05 03:03:32,941 - INFO - [Eval] step: 3199, psnr: 19.710810, ssim: 0.634687, l1: 0.069024
2023-02-05 03:04:01,895 - INFO - [Train] step: 3299, loss_advD: 0.000807, lr_D: 0.000200
2023-02-05 03:04:01,967 - INFO - [Train] step: 3299, loss_advG: 10.411753, loss_rec: 0.010301, lr_G: 0.002000
2023-02-05 03:04:34,528 - INFO - [Train] step: 3399, loss_advD: 0.000353, lr_D: 0.000200
2023-02-05 03:04:34,600 - INFO - [Train] step: 3399, loss_advG: 11.656036, loss_rec: 0.009968, lr_G: 0.002000
2023-02-05 03:05:10,126 - INFO - [Train] step: 3499, loss_advD: 0.000527, lr_D: 0.000200
2023-02-05 03:05:10,198 - INFO - [Train] step: 3499, loss_advG: 11.721670, loss_rec: 0.009337, lr_G: 0.002000
2023-02-05 03:05:44,281 - INFO - [Train] step: 3599, loss_advD: 1.247373, lr_D: 0.000200
2023-02-05 03:05:44,353 - INFO - [Train] step: 3599, loss_advG: 0.734195, loss_rec: 0.010463, lr_G: 0.002000
2023-02-05 03:05:53,428 - INFO - [Eval] step: 3599, psnr: 19.687819, ssim: 0.632273, l1: 0.069020
2023-02-05 03:06:22,622 - INFO - [Train] step: 3699, loss_advD: 0.222994, lr_D: 0.000200
2023-02-05 03:06:22,696 - INFO - [Train] step: 3699, loss_advG: 2.964945, loss_rec: 0.009144, lr_G: 0.002000
2023-02-05 03:06:55,248 - INFO - [Train] step: 3799, loss_advD: 0.005277, lr_D: 0.000200
2023-02-05 03:06:55,320 - INFO - [Train] step: 3799, loss_advG: 7.934243, loss_rec: 0.008816, lr_G: 0.002000
2023-02-05 03:07:30,193 - INFO - [Train] step: 3899, loss_advD: 0.002168, lr_D: 0.000200
2023-02-05 03:07:30,265 - INFO - [Train] step: 3899, loss_advG: 8.597430, loss_rec: 0.009144, lr_G: 0.002000
2023-02-05 03:08:04,855 - INFO - [Train] step: 3999, loss_advD: 0.001897, lr_D: 0.000200
2023-02-05 03:08:04,927 - INFO - [Train] step: 3999, loss_advG: 9.116765, loss_rec: 0.009247, lr_G: 0.002000
2023-02-05 03:08:13,974 - INFO - [Eval] step: 3999, psnr: 19.678066, ssim: 0.630547, l1: 0.069829
2023-02-05 03:08:42,816 - INFO - [Train] step: 4099, loss_advD: 0.001331, lr_D: 0.000200
2023-02-05 03:08:42,888 - INFO - [Train] step: 4099, loss_advG: 9.342583, loss_rec: 0.008989, lr_G: 0.002000
2023-02-05 03:09:15,804 - INFO - [Train] step: 4199, loss_advD: 0.000668, lr_D: 0.000200
2023-02-05 03:09:15,876 - INFO - [Train] step: 4199, loss_advG: 10.298091, loss_rec: 0.008580, lr_G: 0.002000
2023-02-05 03:09:50,486 - INFO - [Train] step: 4299, loss_advD: 0.000886, lr_D: 0.000200
2023-02-05 03:09:50,558 - INFO - [Train] step: 4299, loss_advG: 9.433932, loss_rec: 0.008425, lr_G: 0.002000
2023-02-05 03:10:25,317 - INFO - [Train] step: 4399, loss_advD: 0.000446, lr_D: 0.000200
2023-02-05 03:10:25,389 - INFO - [Train] step: 4399, loss_advG: 11.063719, loss_rec: 0.008059, lr_G: 0.002000
2023-02-05 03:10:32,582 - INFO - [Eval] step: 4399, psnr: 19.694334, ssim: 0.630657, l1: 0.069027
2023-02-05 03:11:03,882 - INFO - [Train] step: 4499, loss_advD: 0.000545, lr_D: 0.000200
2023-02-05 03:11:03,953 - INFO - [Train] step: 4499, loss_advG: 10.442323, loss_rec: 0.007858, lr_G: 0.002000
2023-02-05 03:11:36,996 - INFO - [Train] step: 4599, loss_advD: 0.000274, lr_D: 0.000200
2023-02-05 03:11:37,068 - INFO - [Train] step: 4599, loss_advG: 10.579621, loss_rec: 0.008826, lr_G: 0.002000
2023-02-05 03:12:11,382 - INFO - [Train] step: 4699, loss_advD: 0.000694, lr_D: 0.000200
2023-02-05 03:12:11,453 - INFO - [Train] step: 4699, loss_advG: 10.163288, loss_rec: 0.007892, lr_G: 0.002000
2023-02-05 03:12:47,734 - INFO - [Train] step: 4799, loss_advD: 0.000465, lr_D: 0.000200
2023-02-05 03:12:47,806 - INFO - [Train] step: 4799, loss_advG: 11.316122, loss_rec: 0.008258, lr_G: 0.002000
2023-02-05 03:12:56,125 - INFO - [Eval] step: 4799, psnr: 19.624626, ssim: 0.630958, l1: 0.069593
2023-02-05 03:13:25,744 - INFO - [Train] step: 4899, loss_advD: 0.000660, lr_D: 0.000200
2023-02-05 03:13:25,816 - INFO - [Train] step: 4899, loss_advG: 10.668560, loss_rec: 0.007540, lr_G: 0.002000
2023-02-05 03:14:00,699 - INFO - [Train] step: 4999, loss_advD: 0.000491, lr_D: 0.000200
2023-02-05 03:14:00,770 - INFO - [Train] step: 4999, loss_advG: 10.861528, loss_rec: 0.008622, lr_G: 0.002000
2023-02-05 03:14:34,668 - INFO - [Train] step: 5099, loss_advD: 0.000400, lr_D: 0.000200
2023-02-05 03:14:34,739 - INFO - [Train] step: 5099, loss_advG: 11.291448, loss_rec: 0.007136, lr_G: 0.002000
2023-02-05 03:15:09,264 - INFO - [Train] step: 5199, loss_advD: 0.000258, lr_D: 0.000200
2023-02-05 03:15:09,336 - INFO - [Train] step: 5199, loss_advG: 11.613050, loss_rec: 0.007741, lr_G: 0.002000
2023-02-05 03:15:18,368 - INFO - [Eval] step: 5199, psnr: 19.615789, ssim: 0.630001, l1: 0.069536
2023-02-05 03:15:46,887 - INFO - [Train] step: 5299, loss_advD: 0.004664, lr_D: 0.000200
2023-02-05 03:15:46,959 - INFO - [Train] step: 5299, loss_advG: 2.645979, loss_rec: 0.006973, lr_G: 0.002000
2023-02-05 03:16:22,027 - INFO - [Train] step: 5399, loss_advD: 0.126145, lr_D: 0.000200
2023-02-05 03:16:22,099 - INFO - [Train] step: 5399, loss_advG: 3.852100, loss_rec: 0.007155, lr_G: 0.002000
2023-02-05 03:16:54,911 - INFO - [Train] step: 5499, loss_advD: 0.008191, lr_D: 0.000200
2023-02-05 03:16:54,983 - INFO - [Train] step: 5499, loss_advG: 7.517085, loss_rec: 0.008129, lr_G: 0.002000
2023-02-05 03:17:30,654 - INFO - [Train] step: 5599, loss_advD: 0.001083, lr_D: 0.000200
2023-02-05 03:17:30,726 - INFO - [Train] step: 5599, loss_advG: 9.739358, loss_rec: 0.007496, lr_G: 0.002000
2023-02-05 03:17:39,734 - INFO - [Eval] step: 5599, psnr: 19.674599, ssim: 0.625275, l1: 0.069445
2023-02-05 03:18:07,956 - INFO - [Train] step: 5699, loss_advD: 0.002800, lr_D: 0.000200
2023-02-05 03:18:08,027 - INFO - [Train] step: 5699, loss_advG: 10.629820, loss_rec: 0.007541, lr_G: 0.002000
2023-02-05 03:18:42,950 - INFO - [Train] step: 5799, loss_advD: 0.000612, lr_D: 0.000200
2023-02-05 03:18:43,025 - INFO - [Train] step: 5799, loss_advG: 10.468132, loss_rec: 0.006545, lr_G: 0.002000
2023-02-05 03:19:17,199 - INFO - [Train] step: 5899, loss_advD: 0.000338, lr_D: 0.000200
2023-02-05 03:19:17,271 - INFO - [Train] step: 5899, loss_advG: 10.040205, loss_rec: 0.007551, lr_G: 0.002000
2023-02-05 03:19:51,691 - INFO - [Train] step: 5999, loss_advD: 0.000685, lr_D: 0.000200
2023-02-05 03:19:51,763 - INFO - [Train] step: 5999, loss_advG: 9.942476, loss_rec: 0.006859, lr_G: 0.002000
2023-02-05 03:19:59,356 - INFO - [Eval] step: 5999, psnr: 19.699810, ssim: 0.628562, l1: 0.069296
2023-02-05 03:20:30,947 - INFO - [Train] step: 6099, loss_advD: 0.000242, lr_D: 0.000200
2023-02-05 03:20:31,019 - INFO - [Train] step: 6099, loss_advG: 11.505769, loss_rec: 0.006505, lr_G: 0.002000
2023-02-05 03:21:04,715 - INFO - [Train] step: 6199, loss_advD: 0.000132, lr_D: 0.000200
2023-02-05 03:21:04,786 - INFO - [Train] step: 6199, loss_advG: 11.357449, loss_rec: 0.006899, lr_G: 0.002000
2023-02-05 03:21:40,454 - INFO - [Train] step: 6299, loss_advD: 0.000208, lr_D: 0.000200
2023-02-05 03:21:40,526 - INFO - [Train] step: 6299, loss_advG: 10.447915, loss_rec: 0.006133, lr_G: 0.002000
2023-02-05 03:22:13,743 - INFO - [Train] step: 6399, loss_advD: 0.000497, lr_D: 0.000200
2023-02-05 03:22:13,815 - INFO - [Train] step: 6399, loss_advG: 10.310512, loss_rec: 0.006256, lr_G: 0.002000
2023-02-05 03:22:22,890 - INFO - [Eval] step: 6399, psnr: 19.727221, ssim: 0.629508, l1: 0.069097
2023-02-05 03:22:51,479 - INFO - [Train] step: 6499, loss_advD: 0.000163, lr_D: 0.000200
2023-02-05 03:22:51,551 - INFO - [Train] step: 6499, loss_advG: 11.736381, loss_rec: 0.006770, lr_G: 0.002000
2023-02-05 03:23:26,177 - INFO - [Train] step: 6599, loss_advD: 0.000196, lr_D: 0.000200
2023-02-05 03:23:26,249 - INFO - [Train] step: 6599, loss_advG: 11.299549, loss_rec: 0.006962, lr_G: 0.002000
2023-02-05 03:24:01,011 - INFO - [Train] step: 6699, loss_advD: 0.000169, lr_D: 0.000200
2023-02-05 03:24:01,082 - INFO - [Train] step: 6699, loss_advG: 11.554781, loss_rec: 0.006398, lr_G: 0.002000
2023-02-05 03:24:34,515 - INFO - [Train] step: 6799, loss_advD: 0.000388, lr_D: 0.000200
2023-02-05 03:24:34,587 - INFO - [Train] step: 6799, loss_advG: 10.279026, loss_rec: 0.005789, lr_G: 0.002000
2023-02-05 03:24:43,645 - INFO - [Eval] step: 6799, psnr: 19.687290, ssim: 0.625953, l1: 0.069517
2023-02-05 03:25:12,616 - INFO - [Train] step: 6899, loss_advD: 0.000436, lr_D: 0.000200
2023-02-05 03:25:12,687 - INFO - [Train] step: 6899, loss_advG: 10.553237, loss_rec: 0.005911, lr_G: 0.002000
2023-02-05 03:25:46,801 - INFO - [Train] step: 6999, loss_advD: 0.000220, lr_D: 0.000200
2023-02-05 03:25:46,873 - INFO - [Train] step: 6999, loss_advG: 12.450248, loss_rec: 0.005755, lr_G: 0.002000
2023-02-05 03:26:22,635 - INFO - [Train] step: 7099, loss_advD: 0.000098, lr_D: 0.000200
2023-02-05 03:26:22,709 - INFO - [Train] step: 7099, loss_advG: 12.586573, loss_rec: 0.006488, lr_G: 0.002000
2023-02-05 03:26:56,118 - INFO - [Train] step: 7199, loss_advD: 0.000234, lr_D: 0.000200
2023-02-05 03:26:56,190 - INFO - [Train] step: 7199, loss_advG: 11.692434, loss_rec: 0.005480, lr_G: 0.002000
2023-02-05 03:27:05,275 - INFO - [Eval] step: 7199, psnr: 19.672794, ssim: 0.624957, l1: 0.069866
2023-02-05 03:27:34,095 - INFO - [Train] step: 7299, loss_advD: 0.000109, lr_D: 0.000200
2023-02-05 03:27:34,167 - INFO - [Train] step: 7299, loss_advG: 11.974163, loss_rec: 0.005687, lr_G: 0.002000
2023-02-05 03:28:08,241 - INFO - [Train] step: 7399, loss_advD: 0.000200, lr_D: 0.000200
2023-02-05 03:28:08,313 - INFO - [Train] step: 7399, loss_advG: 13.628834, loss_rec: 0.005640, lr_G: 0.002000
2023-02-05 03:28:42,467 - INFO - [Train] step: 7499, loss_advD: 1.666471, lr_D: 0.000200
2023-02-05 03:28:42,539 - INFO - [Train] step: 7499, loss_advG: 0.265937, loss_rec: 0.007443, lr_G: 0.002000
2023-02-05 03:29:16,014 - INFO - [Train] step: 7599, loss_advD: 0.866305, lr_D: 0.000200
2023-02-05 03:29:16,086 - INFO - [Train] step: 7599, loss_advG: 1.282902, loss_rec: 0.005651, lr_G: 0.002000
2023-02-05 03:29:23,647 - INFO - [Eval] step: 7599, psnr: 19.679535, ssim: 0.626192, l1: 0.069336
2023-02-05 03:29:55,308 - INFO - [Train] step: 7699, loss_advD: 0.005697, lr_D: 0.000200
2023-02-05 03:29:55,380 - INFO - [Train] step: 7699, loss_advG: 7.827493, loss_rec: 0.005932, lr_G: 0.002000
2023-02-05 03:30:29,505 - INFO - [Train] step: 7799, loss_advD: 0.002610, lr_D: 0.000200
2023-02-05 03:30:29,576 - INFO - [Train] step: 7799, loss_advG: 8.615435, loss_rec: 0.005201, lr_G: 0.002000
2023-02-05 03:31:05,109 - INFO - [Train] step: 7899, loss_advD: 0.001319, lr_D: 0.000200
2023-02-05 03:31:05,180 - INFO - [Train] step: 7899, loss_advG: 8.740076, loss_rec: 0.005343, lr_G: 0.002000
2023-02-05 03:31:40,614 - INFO - [Train] step: 7999, loss_advD: 0.000466, lr_D: 0.000200
2023-02-05 03:31:40,686 - INFO - [Train] step: 7999, loss_advG: 9.387142, loss_rec: 0.005185, lr_G: 0.002000
2023-02-05 03:31:48,776 - INFO - [Eval] step: 7999, psnr: 19.702988, ssim: 0.626845, l1: 0.069229
2023-02-05 03:32:16,911 - INFO - [Train] step: 8099, loss_advD: 0.000457, lr_D: 0.000200
2023-02-05 03:32:16,983 - INFO - [Train] step: 8099, loss_advG: 11.156689, loss_rec: 0.005334, lr_G: 0.002000
2023-02-05 03:32:52,817 - INFO - [Train] step: 8199, loss_advD: 0.002096, lr_D: 0.000200
2023-02-05 03:32:52,889 - INFO - [Train] step: 8199, loss_advG: 8.575163, loss_rec: 0.004970, lr_G: 0.002000
2023-02-05 03:33:27,159 - INFO - [Train] step: 8299, loss_advD: 0.000714, lr_D: 0.000200
2023-02-05 03:33:27,231 - INFO - [Train] step: 8299, loss_advG: 9.134371, loss_rec: 0.005641, lr_G: 0.002000
2023-02-05 03:34:01,492 - INFO - [Train] step: 8399, loss_advD: 0.000155, lr_D: 0.000200
2023-02-05 03:34:01,564 - INFO - [Train] step: 8399, loss_advG: 11.639325, loss_rec: 0.005901, lr_G: 0.002000
2023-02-05 03:34:10,525 - INFO - [Eval] step: 8399, psnr: 19.704407, ssim: 0.626307, l1: 0.069162
2023-02-05 03:34:38,052 - INFO - [Train] step: 8499, loss_advD: 0.000239, lr_D: 0.000200
2023-02-05 03:34:38,124 - INFO - [Train] step: 8499, loss_advG: 10.419901, loss_rec: 0.005192, lr_G: 0.002000
2023-02-05 03:35:12,545 - INFO - [Train] step: 8599, loss_advD: 0.000367, lr_D: 0.000200
2023-02-05 03:35:12,616 - INFO - [Train] step: 8599, loss_advG: 10.411985, loss_rec: 0.004840, lr_G: 0.002000
2023-02-05 03:35:47,501 - INFO - [Train] step: 8699, loss_advD: 0.000284, lr_D: 0.000200
2023-02-05 03:35:47,572 - INFO - [Train] step: 8699, loss_advG: 10.234587, loss_rec: 0.005267, lr_G: 0.002000
2023-02-05 03:36:22,111 - INFO - [Train] step: 8799, loss_advD: 0.000619, lr_D: 0.000200
2023-02-05 03:36:22,183 - INFO - [Train] step: 8799, loss_advG: 9.793444, loss_rec: 0.004985, lr_G: 0.002000
2023-02-05 03:36:31,170 - INFO - [Eval] step: 8799, psnr: 19.731133, ssim: 0.626663, l1: 0.069008
2023-02-05 03:36:58,545 - INFO - [Train] step: 8899, loss_advD: 0.000901, lr_D: 0.000200
2023-02-05 03:36:58,617 - INFO - [Train] step: 8899, loss_advG: 11.384820, loss_rec: 0.005492, lr_G: 0.002000
2023-02-05 03:37:33,045 - INFO - [Train] step: 8999, loss_advD: 0.000135, lr_D: 0.000200
2023-02-05 03:37:33,117 - INFO - [Train] step: 8999, loss_advG: 12.514279, loss_rec: 0.005427, lr_G: 0.002000
2023-02-05 03:38:07,596 - INFO - [Train] step: 9099, loss_advD: 0.000310, lr_D: 0.000200
2023-02-05 03:38:07,668 - INFO - [Train] step: 9099, loss_advG: 12.564369, loss_rec: 0.004984, lr_G: 0.002000
2023-02-05 03:38:42,429 - INFO - [Train] step: 9199, loss_advD: 0.000312, lr_D: 0.000200
2023-02-05 03:38:42,500 - INFO - [Train] step: 9199, loss_advG: 10.814880, loss_rec: 0.004957, lr_G: 0.002000
2023-02-05 03:38:48,704 - INFO - [Eval] step: 9199, psnr: 19.730156, ssim: 0.625459, l1: 0.069412
2023-02-05 03:39:21,888 - INFO - [Train] step: 9299, loss_advD: 0.000047, lr_D: 0.000200
2023-02-05 03:39:21,958 - INFO - [Train] step: 9299, loss_advG: 13.654831, loss_rec: 0.004693, lr_G: 0.002000
2023-02-05 03:39:56,025 - INFO - [Train] step: 9399, loss_advD: 0.466755, lr_D: 0.000200
2023-02-05 03:39:56,095 - INFO - [Train] step: 9399, loss_advG: 4.125422, loss_rec: 0.005382, lr_G: 0.002000
2023-02-05 03:40:32,648 - INFO - [Train] step: 9499, loss_advD: 0.010376, lr_D: 0.000200
2023-02-05 03:40:32,720 - INFO - [Train] step: 9499, loss_advG: 8.084116, loss_rec: 0.004683, lr_G: 0.002000
2023-02-05 03:41:07,728 - INFO - [Train] step: 9599, loss_advD: 0.002480, lr_D: 0.000200
2023-02-05 03:41:07,800 - INFO - [Train] step: 9599, loss_advG: 8.993934, loss_rec: 0.004760, lr_G: 0.002000
2023-02-05 03:41:16,711 - INFO - [Eval] step: 9599, psnr: 19.704975, ssim: 0.626789, l1: 0.069246
2023-02-05 03:41:45,277 - INFO - [Train] step: 9699, loss_advD: 0.003267, lr_D: 0.000200
2023-02-05 03:41:45,348 - INFO - [Train] step: 9699, loss_advG: 10.609365, loss_rec: 0.004467, lr_G: 0.002000
2023-02-05 03:42:18,484 - INFO - [Train] step: 9799, loss_advD: 0.000715, lr_D: 0.000200
2023-02-05 03:42:18,556 - INFO - [Train] step: 9799, loss_advG: 10.877984, loss_rec: 0.004624, lr_G: 0.002000
2023-02-05 03:42:53,647 - INFO - [Train] step: 9899, loss_advD: 0.001474, lr_D: 0.000200
2023-02-05 03:42:53,719 - INFO - [Train] step: 9899, loss_advG: 12.033285, loss_rec: 0.004497, lr_G: 0.002000
2023-02-05 03:43:28,866 - INFO - [Train] step: 9999, loss_advD: 0.000762, lr_D: 0.000200
2023-02-05 03:43:28,937 - INFO - [Train] step: 9999, loss_advG: 8.995229, loss_rec: 0.004613, lr_G: 0.002000
2023-02-05 03:43:37,957 - INFO - [Eval] step: 9999, psnr: 19.706984, ssim: 0.628017, l1: 0.069347
2023-02-05 03:44:06,613 - INFO - [Train] step: 10099, loss_advD: 0.000278, lr_D: 0.000200
2023-02-05 03:44:06,685 - INFO - [Train] step: 10099, loss_advG: 9.586099, loss_rec: 0.004806, lr_G: 0.002000
2023-02-05 03:44:38,986 - INFO - [Train] step: 10199, loss_advD: 0.000650, lr_D: 0.000200
2023-02-05 03:44:39,058 - INFO - [Train] step: 10199, loss_advG: 11.392051, loss_rec: 0.004559, lr_G: 0.002000
2023-02-05 03:45:13,332 - INFO - [Train] step: 10299, loss_advD: 0.000267, lr_D: 0.000200
2023-02-05 03:45:13,404 - INFO - [Train] step: 10299, loss_advG: 10.305177, loss_rec: 0.004830, lr_G: 0.002000
2023-02-05 03:45:47,899 - INFO - [Train] step: 10399, loss_advD: 0.000188, lr_D: 0.000200
2023-02-05 03:45:47,971 - INFO - [Train] step: 10399, loss_advG: 12.032029, loss_rec: 0.004768, lr_G: 0.002000
2023-02-05 03:45:57,016 - INFO - [Eval] step: 10399, psnr: 19.699211, ssim: 0.622629, l1: 0.069304
2023-02-05 03:46:24,974 - INFO - [Train] step: 10499, loss_advD: 0.000141, lr_D: 0.000200
2023-02-05 03:46:25,044 - INFO - [Train] step: 10499, loss_advG: 11.786245, loss_rec: 0.004898, lr_G: 0.002000
2023-02-05 03:46:58,073 - INFO - [Train] step: 10599, loss_advD: 0.000451, lr_D: 0.000200
2023-02-05 03:46:58,145 - INFO - [Train] step: 10599, loss_advG: 10.243942, loss_rec: 0.004277, lr_G: 0.002000
2023-02-05 03:47:32,789 - INFO - [Train] step: 10699, loss_advD: 0.000122, lr_D: 0.000200
2023-02-05 03:47:32,860 - INFO - [Train] step: 10699, loss_advG: 12.313167, loss_rec: 0.004334, lr_G: 0.002000
2023-02-05 03:48:06,941 - INFO - [Train] step: 10799, loss_advD: 0.000230, lr_D: 0.000200
2023-02-05 03:48:07,012 - INFO - [Train] step: 10799, loss_advG: 11.187414, loss_rec: 0.004331, lr_G: 0.002000
2023-02-05 03:48:13,483 - INFO - [Eval] step: 10799, psnr: 19.695419, ssim: 0.622167, l1: 0.069325
2023-02-05 03:48:46,059 - INFO - [Train] step: 10899, loss_advD: 0.000147, lr_D: 0.000200
2023-02-05 03:48:46,130 - INFO - [Train] step: 10899, loss_advG: 12.769921, loss_rec: 0.004233, lr_G: 0.002000
2023-02-05 03:49:23,036 - INFO - [Train] step: 10999, loss_advD: 0.000152, lr_D: 0.000200
2023-02-05 03:49:23,109 - INFO - [Train] step: 10999, loss_advG: 11.394972, loss_rec: 0.004229, lr_G: 0.002000
2023-02-05 03:49:55,926 - INFO - [Train] step: 11099, loss_advD: 0.000073, lr_D: 0.000200
2023-02-05 03:49:55,998 - INFO - [Train] step: 11099, loss_advG: 13.711700, loss_rec: 0.004587, lr_G: 0.002000
2023-02-05 03:50:30,380 - INFO - [Train] step: 11199, loss_advD: 0.960989, lr_D: 0.000200
2023-02-05 03:50:30,452 - INFO - [Train] step: 11199, loss_advG: 0.456974, loss_rec: 0.004355, lr_G: 0.002000
2023-02-05 03:50:39,366 - INFO - [Eval] step: 11199, psnr: 19.714510, ssim: 0.623692, l1: 0.069660
2023-02-05 03:51:08,721 - INFO - [Train] step: 11299, loss_advD: 0.005987, lr_D: 0.000200
2023-02-05 03:51:08,793 - INFO - [Train] step: 11299, loss_advG: 7.825544, loss_rec: 0.004291, lr_G: 0.002000
2023-02-05 03:51:43,228 - INFO - [Train] step: 11399, loss_advD: 0.000713, lr_D: 0.000200
2023-02-05 03:51:43,300 - INFO - [Train] step: 11399, loss_advG: 9.464728, loss_rec: 0.005075, lr_G: 0.002000
2023-02-05 03:52:16,578 - INFO - [Train] step: 11499, loss_advD: 0.001885, lr_D: 0.000200
2023-02-05 03:52:16,650 - INFO - [Train] step: 11499, loss_advG: 8.319887, loss_rec: 0.004210, lr_G: 0.002000
2023-02-05 03:52:51,822 - INFO - [Train] step: 11599, loss_advD: 0.000462, lr_D: 0.000200
2023-02-05 03:52:51,893 - INFO - [Train] step: 11599, loss_advG: 9.905111, loss_rec: 0.004051, lr_G: 0.002000
2023-02-05 03:53:00,222 - INFO - [Eval] step: 11599, psnr: 19.762608, ssim: 0.625073, l1: 0.069015
2023-02-05 03:53:28,825 - INFO - [Train] step: 11699, loss_advD: 0.000783, lr_D: 0.000200
2023-02-05 03:53:28,897 - INFO - [Train] step: 11699, loss_advG: 8.908133, loss_rec: 0.003982, lr_G: 0.002000
2023-02-05 03:54:03,293 - INFO - [Train] step: 11799, loss_advD: 0.000617, lr_D: 0.000200
2023-02-05 03:54:03,365 - INFO - [Train] step: 11799, loss_advG: 13.265127, loss_rec: 0.004480, lr_G: 0.002000
2023-02-05 03:54:35,878 - INFO - [Train] step: 11899, loss_advD: 0.000239, lr_D: 0.000200
2023-02-05 03:54:35,950 - INFO - [Train] step: 11899, loss_advG: 10.388914, loss_rec: 0.004112, lr_G: 0.002000
2023-02-05 03:55:10,635 - INFO - [Train] step: 11999, loss_advD: 0.000122, lr_D: 0.000200
2023-02-05 03:55:10,707 - INFO - [Train] step: 11999, loss_advG: 11.453613, loss_rec: 0.004361, lr_G: 0.002000
2023-02-05 03:55:19,759 - INFO - [Eval] step: 11999, psnr: 19.739294, ssim: 0.622623, l1: 0.069386
2023-02-05 03:55:47,697 - INFO - [Train] step: 12099, loss_advD: 0.000116, lr_D: 0.000200
2023-02-05 03:55:47,768 - INFO - [Train] step: 12099, loss_advG: 13.666401, loss_rec: 0.004082, lr_G: 0.002000
2023-02-05 03:56:22,009 - INFO - [Train] step: 12199, loss_advD: 0.000105, lr_D: 0.000200
2023-02-05 03:56:22,080 - INFO - [Train] step: 12199, loss_advG: 12.925543, loss_rec: 0.003967, lr_G: 0.002000
2023-02-05 03:56:55,410 - INFO - [Train] step: 12299, loss_advD: 0.000146, lr_D: 0.000200
2023-02-05 03:56:55,482 - INFO - [Train] step: 12299, loss_advG: 12.149482, loss_rec: 0.005269, lr_G: 0.002000
2023-02-05 03:57:30,161 - INFO - [Train] step: 12399, loss_advD: 0.000068, lr_D: 0.000200
2023-02-05 03:57:30,232 - INFO - [Train] step: 12399, loss_advG: 12.339405, loss_rec: 0.004191, lr_G: 0.002000
2023-02-05 03:57:36,529 - INFO - [Eval] step: 12399, psnr: 19.694405, ssim: 0.623653, l1: 0.069263
2023-02-05 03:58:10,079 - INFO - [Train] step: 12499, loss_advD: 0.000175, lr_D: 0.000200
2023-02-05 03:58:10,150 - INFO - [Train] step: 12499, loss_advG: 12.073913, loss_rec: 0.004112, lr_G: 0.002000
2023-02-05 03:58:46,401 - INFO - [Train] step: 12599, loss_advD: 0.000051, lr_D: 0.000200
2023-02-05 03:58:46,473 - INFO - [Train] step: 12599, loss_advG: 12.417063, loss_rec: 0.003911, lr_G: 0.002000
2023-02-05 03:59:21,602 - INFO - [Train] step: 12699, loss_advD: 0.000116, lr_D: 0.000200
2023-02-05 03:59:21,677 - INFO - [Train] step: 12699, loss_advG: 11.097054, loss_rec: 0.004020, lr_G: 0.002000
2023-02-05 03:59:55,506 - INFO - [Train] step: 12799, loss_advD: 0.000068, lr_D: 0.000200
2023-02-05 03:59:55,578 - INFO - [Train] step: 12799, loss_advG: 12.116154, loss_rec: 0.003936, lr_G: 0.002000
2023-02-05 04:00:04,402 - INFO - [Eval] step: 12799, psnr: 19.713369, ssim: 0.623697, l1: 0.069205
2023-02-05 04:00:33,343 - INFO - [Train] step: 12899, loss_advD: 0.000151, lr_D: 0.000200
2023-02-05 04:00:33,415 - INFO - [Train] step: 12899, loss_advG: 10.464741, loss_rec: 0.003939, lr_G: 0.002000
2023-02-05 04:01:08,484 - INFO - [Train] step: 12999, loss_advD: 0.000064, lr_D: 0.000200
2023-02-05 04:01:08,556 - INFO - [Train] step: 12999, loss_advG: 15.081207, loss_rec: 0.003929, lr_G: 0.002000
2023-02-05 04:01:42,452 - INFO - [Train] step: 13099, loss_advD: 0.000147, lr_D: 0.000200
2023-02-05 04:01:42,523 - INFO - [Train] step: 13099, loss_advG: 12.457805, loss_rec: 0.004143, lr_G: 0.002000
2023-02-05 04:02:15,288 - INFO - [Train] step: 13199, loss_advD: 0.000026, lr_D: 0.000200
2023-02-05 04:02:15,360 - INFO - [Train] step: 13199, loss_advG: 14.074402, loss_rec: 0.003826, lr_G: 0.002000
2023-02-05 04:02:23,998 - INFO - [Eval] step: 13199, psnr: 19.708006, ssim: 0.624745, l1: 0.069261
2023-02-05 04:02:53,217 - INFO - [Train] step: 13299, loss_advD: 0.000020, lr_D: 0.000200
2023-02-05 04:02:53,289 - INFO - [Train] step: 13299, loss_advG: 13.767408, loss_rec: 0.003963, lr_G: 0.002000
2023-02-05 04:03:28,169 - INFO - [Train] step: 13399, loss_advD: 0.000348, lr_D: 0.000200
2023-02-05 04:03:28,240 - INFO - [Train] step: 13399, loss_advG: 14.418949, loss_rec: 0.003730, lr_G: 0.002000
2023-02-05 04:04:02,276 - INFO - [Train] step: 13499, loss_advD: 0.000035, lr_D: 0.000200
2023-02-05 04:04:02,348 - INFO - [Train] step: 13499, loss_advG: 14.339125, loss_rec: 0.003853, lr_G: 0.002000
2023-02-05 04:04:36,156 - INFO - [Train] step: 13599, loss_advD: 0.000042, lr_D: 0.000200
2023-02-05 04:04:36,227 - INFO - [Train] step: 13599, loss_advG: 12.399504, loss_rec: 0.004353, lr_G: 0.002000
2023-02-05 04:04:45,211 - INFO - [Eval] step: 13599, psnr: 19.649940, ssim: 0.620742, l1: 0.069855
2023-02-05 04:05:14,378 - INFO - [Train] step: 13699, loss_advD: 0.000046, lr_D: 0.000200
2023-02-05 04:05:14,450 - INFO - [Train] step: 13699, loss_advG: 11.752925, loss_rec: 0.003873, lr_G: 0.002000
2023-02-05 04:05:48,945 - INFO - [Train] step: 13799, loss_advD: 0.000012, lr_D: 0.000200
2023-02-05 04:05:49,016 - INFO - [Train] step: 13799, loss_advG: 14.527533, loss_rec: 0.003830, lr_G: 0.002000
2023-02-05 04:06:23,754 - INFO - [Train] step: 13899, loss_advD: 0.000023, lr_D: 0.000200
2023-02-05 04:06:23,825 - INFO - [Train] step: 13899, loss_advG: 13.833992, loss_rec: 0.004294, lr_G: 0.002000
2023-02-05 04:06:56,671 - INFO - [Train] step: 13999, loss_advD: 0.000117, lr_D: 0.000200
2023-02-05 04:06:56,742 - INFO - [Train] step: 13999, loss_advG: 16.077839, loss_rec: 0.003977, lr_G: 0.002000
2023-02-05 04:07:02,719 - INFO - [Eval] step: 13999, psnr: 19.723738, ssim: 0.622631, l1: 0.069251
2023-02-05 04:07:36,069 - INFO - [Train] step: 14099, loss_advD: 0.000063, lr_D: 0.000200
2023-02-05 04:07:36,140 - INFO - [Train] step: 14099, loss_advG: 11.433438, loss_rec: 0.003838, lr_G: 0.002000
2023-02-05 04:08:12,052 - INFO - [Train] step: 14199, loss_advD: 0.787659, lr_D: 0.000200
2023-02-05 04:08:12,123 - INFO - [Train] step: 14199, loss_advG: 1.694443, loss_rec: 0.004007, lr_G: 0.002000
2023-02-05 04:08:46,627 - INFO - [Train] step: 14299, loss_advD: 0.014021, lr_D: 0.000200
2023-02-05 04:08:46,699 - INFO - [Train] step: 14299, loss_advG: 5.924788, loss_rec: 0.003646, lr_G: 0.002000
2023-02-05 04:09:21,116 - INFO - [Train] step: 14399, loss_advD: 0.002627, lr_D: 0.000200
2023-02-05 04:09:21,188 - INFO - [Train] step: 14399, loss_advG: 7.661895, loss_rec: 0.003804, lr_G: 0.002000
2023-02-05 04:09:30,118 - INFO - [Eval] step: 14399, psnr: 19.717094, ssim: 0.623622, l1: 0.069087
2023-02-05 04:09:57,394 - INFO - [Train] step: 14499, loss_advD: 0.002447, lr_D: 0.000200
2023-02-05 04:09:57,465 - INFO - [Train] step: 14499, loss_advG: 7.366674, loss_rec: 0.004682, lr_G: 0.002000
2023-02-05 04:10:32,011 - INFO - [Train] step: 14599, loss_advD: 0.000489, lr_D: 0.000200
2023-02-05 04:10:32,083 - INFO - [Train] step: 14599, loss_advG: 10.171965, loss_rec: 0.003704, lr_G: 0.002000
2023-02-05 04:11:07,443 - INFO - [Train] step: 14699, loss_advD: 0.000917, lr_D: 0.000200
2023-02-05 04:11:07,514 - INFO - [Train] step: 14699, loss_advG: 9.521008, loss_rec: 0.003861, lr_G: 0.002000
2023-02-05 04:11:41,710 - INFO - [Train] step: 14799, loss_advD: 0.000971, lr_D: 0.000200
2023-02-05 04:11:41,782 - INFO - [Train] step: 14799, loss_advG: 8.375810, loss_rec: 0.004045, lr_G: 0.002000
2023-02-05 04:11:50,853 - INFO - [Eval] step: 14799, psnr: 19.738028, ssim: 0.622594, l1: 0.069213
2023-02-05 04:12:18,450 - INFO - [Train] step: 14899, loss_advD: 0.000560, lr_D: 0.000200
2023-02-05 04:12:18,522 - INFO - [Train] step: 14899, loss_advG: 9.460289, loss_rec: 0.003608, lr_G: 0.002000
2023-02-05 04:12:53,022 - INFO - [Train] step: 14999, loss_advD: 0.000514, lr_D: 0.000200
2023-02-05 04:12:53,094 - INFO - [Train] step: 14999, loss_advG: 10.017071, loss_rec: 0.003330, lr_G: 0.002000
2023-02-05 04:13:27,870 - INFO - [Train] step: 15099, loss_advD: 0.000491, lr_D: 0.000200
2023-02-05 04:13:27,942 - INFO - [Train] step: 15099, loss_advG: 10.008247, loss_rec: 0.003412, lr_G: 0.002000
2023-02-05 04:14:02,734 - INFO - [Train] step: 15199, loss_advD: 0.000192, lr_D: 0.000200
2023-02-05 04:14:02,805 - INFO - [Train] step: 15199, loss_advG: 10.382524, loss_rec: 0.003939, lr_G: 0.002000
2023-02-05 04:14:11,282 - INFO - [Eval] step: 15199, psnr: 19.712259, ssim: 0.623037, l1: 0.069822
2023-02-05 04:14:39,939 - INFO - [Train] step: 15299, loss_advD: 0.000258, lr_D: 0.000200
2023-02-05 04:14:40,011 - INFO - [Train] step: 15299, loss_advG: 9.816211, loss_rec: 0.003950, lr_G: 0.002000
2023-02-05 04:15:14,409 - INFO - [Train] step: 15399, loss_advD: 0.000204, lr_D: 0.000200
2023-02-05 04:15:14,481 - INFO - [Train] step: 15399, loss_advG: 11.542559, loss_rec: 0.003773, lr_G: 0.002000
2023-02-05 04:15:49,028 - INFO - [Train] step: 15499, loss_advD: 0.000343, lr_D: 0.000200
2023-02-05 04:15:49,099 - INFO - [Train] step: 15499, loss_advG: 10.802710, loss_rec: 0.003661, lr_G: 0.002000
2023-02-05 04:16:23,386 - INFO - [Train] step: 15599, loss_advD: 0.000208, lr_D: 0.000200
2023-02-05 04:16:23,456 - INFO - [Train] step: 15599, loss_advG: 12.795889, loss_rec: 0.003612, lr_G: 0.002000
2023-02-05 04:16:28,690 - INFO - [Eval] step: 15599, psnr: 19.732052, ssim: 0.623473, l1: 0.069090
2023-02-05 04:17:04,082 - INFO - [Train] step: 15699, loss_advD: 0.000072, lr_D: 0.000200
2023-02-05 04:17:04,156 - INFO - [Train] step: 15699, loss_advG: 11.612257, loss_rec: 0.003696, lr_G: 0.002000
2023-02-05 04:17:36,972 - INFO - [Train] step: 15799, loss_advD: 0.000148, lr_D: 0.000200
2023-02-05 04:17:37,043 - INFO - [Train] step: 15799, loss_advG: 10.401220, loss_rec: 0.003588, lr_G: 0.002000
2023-02-05 04:18:10,996 - INFO - [Train] step: 15899, loss_advD: 0.000236, lr_D: 0.000200
2023-02-05 04:18:11,068 - INFO - [Train] step:
15899, loss_advG: 10.922489, loss_rec: 0.003645, lr_G: 0.002000 2023-02-05 04:18:45,654 - INFO - [Train] step: 15999, loss_advD: 0.000177, lr_D: 0.000200 2023-02-05 04:18:45,726 - INFO - [Train] step: 15999, loss_advG: 10.754025, loss_rec: 0.003331, lr_G: 0.002000 2023-02-05 04:18:54,796 - INFO - [Eval] step: 15999, psnr: 19.729275, ssim: 0.623625, l1: 0.069257 2023-02-05 04:19:23,266 - INFO - [Train] step: 16099, loss_advD: 0.000099, lr_D: 0.000200 2023-02-05 04:19:23,341 - INFO - [Train] step: 16099, loss_advG: 10.910352, loss_rec: 0.003663, lr_G: 0.002000 2023-02-05 04:19:55,692 - INFO - [Train] step: 16199, loss_advD: 0.000279, lr_D: 0.000200 2023-02-05 04:19:55,764 - INFO - [Train] step: 16199, loss_advG: 10.007302, loss_rec: 0.003361, lr_G: 0.002000 2023-02-05 04:20:31,236 - INFO - [Train] step: 16299, loss_advD: 0.000102, lr_D: 0.000200 2023-02-05 04:20:31,307 - INFO - [Train] step: 16299, loss_advG: 13.784885, loss_rec: 0.003203, lr_G: 0.002000 2023-02-05 04:21:06,175 - INFO - [Train] step: 16399, loss_advD: 0.000052, lr_D: 0.000200 2023-02-05 04:21:06,247 - INFO - [Train] step: 16399, loss_advG: 11.905515, loss_rec: 0.003318, lr_G: 0.002000 2023-02-05 04:21:14,886 - INFO - [Eval] step: 16399, psnr: 19.698275, ssim: 0.619515, l1: 0.069410 2023-02-05 04:21:44,159 - INFO - [Train] step: 16499, loss_advD: 0.000064, lr_D: 0.000200 2023-02-05 04:21:44,231 - INFO - [Train] step: 16499, loss_advG: 11.613094, loss_rec: 0.003249, lr_G: 0.002000 2023-02-05 04:22:17,524 - INFO - [Train] step: 16599, loss_advD: 0.000268, lr_D: 0.000200 2023-02-05 04:22:17,596 - INFO - [Train] step: 16599, loss_advG: 10.149599, loss_rec: 0.003943, lr_G: 0.002000 2023-02-05 04:22:52,809 - INFO - [Train] step: 16699, loss_advD: 0.001126, lr_D: 0.000200 2023-02-05 04:22:52,881 - INFO - [Train] step: 16699, loss_advG: 9.759493, loss_rec: 0.003508, lr_G: 0.002000 2023-02-05 04:23:28,168 - INFO - [Train] step: 16799, loss_advD: 0.000242, lr_D: 0.000200 2023-02-05 04:23:28,240 - INFO - [Train] 
step: 16799, loss_advG: 10.212954, loss_rec: 0.003974, lr_G: 0.002000 2023-02-05 04:23:36,613 - INFO - [Eval] step: 16799, psnr: 19.678785, ssim: 0.622471, l1: 0.069463 2023-02-05 04:24:06,242 - INFO - [Train] step: 16899, loss_advD: 0.000033, lr_D: 0.000200 2023-02-05 04:24:06,314 - INFO - [Train] step: 16899, loss_advG: 12.749300, loss_rec: 0.003270, lr_G: 0.002000 2023-02-05 04:24:39,921 - INFO - [Train] step: 16999, loss_advD: 0.000078, lr_D: 0.000200 2023-02-05 04:24:39,992 - INFO - [Train] step: 16999, loss_advG: 11.321468, loss_rec: 0.003831, lr_G: 0.002000 2023-02-05 04:25:14,569 - INFO - [Train] step: 17099, loss_advD: 0.000280, lr_D: 0.000200 2023-02-05 04:25:14,640 - INFO - [Train] step: 17099, loss_advG: 10.071532, loss_rec: 0.004248, lr_G: 0.002000 2023-02-05 04:25:49,211 - INFO - [Train] step: 17199, loss_advD: 0.000032, lr_D: 0.000200 2023-02-05 04:25:49,282 - INFO - [Train] step: 17199, loss_advG: 14.885925, loss_rec: 0.003223, lr_G: 0.002000 2023-02-05 04:25:54,508 - INFO - [Eval] step: 17199, psnr: 19.732506, ssim: 0.621099, l1: 0.069106 2023-02-05 04:26:30,847 - INFO - [Train] step: 17299, loss_advD: 0.000025, lr_D: 0.000200 2023-02-05 04:26:30,918 - INFO - [Train] step: 17299, loss_advG: 15.961886, loss_rec: 0.003582, lr_G: 0.002000 2023-02-05 04:27:05,059 - INFO - [Train] step: 17399, loss_advD: 0.000025, lr_D: 0.000200 2023-02-05 04:27:05,131 - INFO - [Train] step: 17399, loss_advG: 13.423854, loss_rec: 0.003446, lr_G: 0.002000 2023-02-05 04:27:38,143 - INFO - [Train] step: 17499, loss_advD: 0.000056, lr_D: 0.000200 2023-02-05 04:27:38,215 - INFO - [Train] step: 17499, loss_advG: 13.319838, loss_rec: 0.003115, lr_G: 0.002000 2023-02-05 04:28:12,769 - INFO - [Train] step: 17599, loss_advD: 1.473935, lr_D: 0.000200 2023-02-05 04:28:12,841 - INFO - [Train] step: 17599, loss_advG: 0.626415, loss_rec: 0.003746, lr_G: 0.002000 2023-02-05 04:28:21,122 - INFO - [Eval] step: 17599, psnr: 19.722540, ssim: 0.618800, l1: 0.069473 2023-02-05 04:28:50,057 - 
INFO - [Train] step: 17699, loss_advD: 0.068805, lr_D: 0.000200 2023-02-05 04:28:50,129 - INFO - [Train] step: 17699, loss_advG: 4.321709, loss_rec: 0.003104, lr_G: 0.002000 2023-02-05 04:29:24,110 - INFO - [Train] step: 17799, loss_advD: 0.035838, lr_D: 0.000200 2023-02-05 04:29:24,182 - INFO - [Train] step: 17799, loss_advG: 5.284463, loss_rec: 0.003034, lr_G: 0.002000 2023-02-05 04:29:57,045 - INFO - [Train] step: 17899, loss_advD: 0.023005, lr_D: 0.000200 2023-02-05 04:29:57,117 - INFO - [Train] step: 17899, loss_advG: 6.173244, loss_rec: 0.003343, lr_G: 0.002000 2023-02-05 04:30:31,267 - INFO - [Train] step: 17999, loss_advD: 0.004186, lr_D: 0.000200 2023-02-05 04:30:31,339 - INFO - [Train] step: 17999, loss_advG: 9.231730, loss_rec: 0.003049, lr_G: 0.002000 2023-02-05 04:30:40,306 - INFO - [Eval] step: 17999, psnr: 19.723183, ssim: 0.621981, l1: 0.069184 2023-02-05 04:31:08,933 - INFO - [Train] step: 18099, loss_advD: 0.001444, lr_D: 0.000200 2023-02-05 04:31:09,005 - INFO - [Train] step: 18099, loss_advG: 10.400887, loss_rec: 0.003266, lr_G: 0.002000 2023-02-05 04:31:43,997 - INFO - [Train] step: 18199, loss_advD: 0.001496, lr_D: 0.000200 2023-02-05 04:31:44,068 - INFO - [Train] step: 18199, loss_advG: 12.474937, loss_rec: 0.003284, lr_G: 0.002000 2023-02-05 04:32:16,702 - INFO - [Train] step: 18299, loss_advD: 0.000555, lr_D: 0.000200 2023-02-05 04:32:16,774 - INFO - [Train] step: 18299, loss_advG: 10.119998, loss_rec: 0.003124, lr_G: 0.002000 2023-02-05 04:32:52,142 - INFO - [Train] step: 18399, loss_advD: 0.000923, lr_D: 0.000200 2023-02-05 04:32:52,214 - INFO - [Train] step: 18399, loss_advG: 12.020157, loss_rec: 0.003546, lr_G: 0.002000 2023-02-05 04:33:00,492 - INFO - [Eval] step: 18399, psnr: 19.699955, ssim: 0.619563, l1: 0.069573 2023-02-05 04:33:30,525 - INFO - [Train] step: 18499, loss_advD: 0.000766, lr_D: 0.000200 2023-02-05 04:33:30,596 - INFO - [Train] step: 18499, loss_advG: 11.476164, loss_rec: 0.003463, lr_G: 0.002000 2023-02-05 
04:34:04,356 - INFO - [Train] step: 18599, loss_advD: 0.000171, lr_D: 0.000200 2023-02-05 04:34:04,428 - INFO - [Train] step: 18599, loss_advG: 12.585423, loss_rec: 0.003164, lr_G: 0.002000 2023-02-05 04:34:38,489 - INFO - [Train] step: 18699, loss_advD: 0.000351, lr_D: 0.000200 2023-02-05 04:34:38,561 - INFO - [Train] step: 18699, loss_advG: 10.476458, loss_rec: 0.003268, lr_G: 0.002000 2023-02-05 04:35:13,127 - INFO - [Train] step: 18799, loss_advD: 0.000558, lr_D: 0.000200 2023-02-05 04:35:13,198 - INFO - [Train] step: 18799, loss_advG: 12.567160, loss_rec: 0.003209, lr_G: 0.002000 2023-02-05 04:35:18,400 - INFO - [Eval] step: 18799, psnr: 19.695696, ssim: 0.622155, l1: 0.069423 2023-02-05 04:35:53,693 - INFO - [Train] step: 18899, loss_advD: 0.000862, lr_D: 0.000200 2023-02-05 04:35:53,765 - INFO - [Train] step: 18899, loss_advG: 8.435705, loss_rec: 0.003201, lr_G: 0.002000 2023-02-05 04:36:29,127 - INFO - [Train] step: 18999, loss_advD: 0.000137, lr_D: 0.000200 2023-02-05 04:36:29,198 - INFO - [Train] step: 18999, loss_advG: 11.603076, loss_rec: 0.003009, lr_G: 0.002000 2023-02-05 04:37:03,611 - INFO - [Train] step: 19099, loss_advD: 0.000300, lr_D: 0.000200 2023-02-05 04:37:03,683 - INFO - [Train] step: 19099, loss_advG: 9.522005, loss_rec: 0.003445, lr_G: 0.002000 2023-02-05 04:37:36,374 - INFO - [Train] step: 19199, loss_advD: 0.000219, lr_D: 0.000200 2023-02-05 04:37:36,446 - INFO - [Train] step: 19199, loss_advG: 12.784617, loss_rec: 0.003172, lr_G: 0.002000 2023-02-05 04:37:45,514 - INFO - [Eval] step: 19199, psnr: 19.700224, ssim: 0.619799, l1: 0.069657 2023-02-05 04:38:14,503 - INFO - [Train] step: 19299, loss_advD: 0.000205, lr_D: 0.000200 2023-02-05 04:38:14,575 - INFO - [Train] step: 19299, loss_advG: 10.134842, loss_rec: 0.002996, lr_G: 0.002000 2023-02-05 04:38:49,592 - INFO - [Train] step: 19399, loss_advD: 0.000137, lr_D: 0.000200 2023-02-05 04:38:49,664 - INFO - [Train] step: 19399, loss_advG: 13.998104, loss_rec: 0.003024, lr_G: 0.002000 
2023-02-05 04:39:23,887 - INFO - [Train] step: 19499, loss_advD: 0.000283, lr_D: 0.000200 2023-02-05 04:39:23,958 - INFO - [Train] step: 19499, loss_advG: 9.220377, loss_rec: 0.003494, lr_G: 0.002000 2023-02-05 04:39:57,248 - INFO - [Train] step: 19599, loss_advD: 0.000126, lr_D: 0.000200 2023-02-05 04:39:57,320 - INFO - [Train] step: 19599, loss_advG: 11.479361, loss_rec: 0.003021, lr_G: 0.002000 2023-02-05 04:40:06,318 - INFO - [Eval] step: 19599, psnr: 19.702301, ssim: 0.620703, l1: 0.069315 2023-02-05 04:40:36,644 - INFO - [Train] step: 19699, loss_advD: 0.000272, lr_D: 0.000200 2023-02-05 04:40:36,723 - INFO - [Train] step: 19699, loss_advG: 12.322084, loss_rec: 0.003144, lr_G: 0.002000 2023-02-05 04:41:11,746 - INFO - [Train] step: 19799, loss_advD: 0.000127, lr_D: 0.000200 2023-02-05 04:41:11,818 - INFO - [Train] step: 19799, loss_advG: 12.018551, loss_rec: 0.003213, lr_G: 0.002000 2023-02-05 04:41:46,116 - INFO - [Train] step: 19899, loss_advD: 0.000059, lr_D: 0.000200 2023-02-05 04:41:46,187 - INFO - [Train] step: 19899, loss_advG: 12.732761, loss_rec: 0.003273, lr_G: 0.002000 2023-02-05 04:42:19,680 - INFO - [Train] step: 19999, loss_advD: 0.000101, lr_D: 0.000200 2023-02-05 04:42:19,752 - INFO - [Train] step: 19999, loss_advG: 11.524728, loss_rec: 0.003002, lr_G: 0.002000 2023-02-05 04:42:28,115 - INFO - [Eval] step: 19999, psnr: 19.676723, ssim: 0.619740, l1: 0.069475 2023-02-05 04:42:58,861 - INFO - [Train] step: 20099, loss_advD: 0.000065, lr_D: 0.000200 2023-02-05 04:42:58,933 - INFO - [Train] step: 20099, loss_advG: 11.015472, loss_rec: 0.003269, lr_G: 0.002000 2023-02-05 04:43:33,694 - INFO - [Train] step: 20199, loss_advD: 0.000144, lr_D: 0.000200 2023-02-05 04:43:33,765 - INFO - [Train] step: 20199, loss_advG: 12.742013, loss_rec: 0.002652, lr_G: 0.002000 2023-02-05 04:44:07,326 - INFO - [Train] step: 20299, loss_advD: 0.000053, lr_D: 0.000200 2023-02-05 04:44:07,397 - INFO - [Train] step: 20299, loss_advG: 11.565813, loss_rec: 0.003664, lr_G: 
0.002000 2023-02-05 04:44:43,272 - INFO - [Train] step: 20399, loss_advD: 0.000042, lr_D: 0.000200 2023-02-05 04:44:43,344 - INFO - [Train] step: 20399, loss_advG: 12.732456, loss_rec: 0.003143, lr_G: 0.002000 2023-02-05 04:44:52,413 - INFO - [Eval] step: 20399, psnr: 19.723099, ssim: 0.621302, l1: 0.069263 2023-02-05 04:45:19,356 - INFO - [Train] step: 20499, loss_advD: 0.000040, lr_D: 0.000200 2023-02-05 04:45:19,428 - INFO - [Train] step: 20499, loss_advG: 14.117254, loss_rec: 0.002794, lr_G: 0.002000 2023-02-05 04:45:53,690 - INFO - [Train] step: 20599, loss_advD: 0.000189, lr_D: 0.000200 2023-02-05 04:45:53,762 - INFO - [Train] step: 20599, loss_advG: 9.951859, loss_rec: 0.002950, lr_G: 0.002000 2023-02-05 04:46:29,294 - INFO - [Train] step: 20699, loss_advD: 0.000044, lr_D: 0.000200 2023-02-05 04:46:29,365 - INFO - [Train] step: 20699, loss_advG: 14.832006, loss_rec: 0.002959, lr_G: 0.002000 2023-02-05 04:47:03,519 - INFO - [Train] step: 20799, loss_advD: 0.000020, lr_D: 0.000200 2023-02-05 04:47:03,591 - INFO - [Train] step: 20799, loss_advG: 13.109360, loss_rec: 0.002891, lr_G: 0.002000 2023-02-05 04:47:12,585 - INFO - [Eval] step: 20799, psnr: 19.720379, ssim: 0.621713, l1: 0.069389 2023-02-05 04:47:40,277 - INFO - [Train] step: 20899, loss_advD: 0.000229, lr_D: 0.000200 2023-02-05 04:47:40,349 - INFO - [Train] step: 20899, loss_advG: 9.799937, loss_rec: 0.003000, lr_G: 0.002000 2023-02-05 04:48:15,054 - INFO - [Train] step: 20999, loss_advD: 0.000189, lr_D: 0.000200 2023-02-05 04:48:15,126 - INFO - [Train] step: 20999, loss_advG: 10.298822, loss_rec: 0.002873, lr_G: 0.002000 2023-02-05 04:48:50,370 - INFO - [Train] step: 21099, loss_advD: 0.000010, lr_D: 0.000200 2023-02-05 04:48:50,441 - INFO - [Train] step: 21099, loss_advG: 13.396894, loss_rec: 0.002982, lr_G: 0.002000 2023-02-05 04:49:25,015 - INFO - [Train] step: 21199, loss_advD: 0.000088, lr_D: 0.000200 2023-02-05 04:49:25,087 - INFO - [Train] step: 21199, loss_advG: 10.603827, loss_rec: 0.003390, 
lr_G: 0.002000 2023-02-05 04:49:34,158 - INFO - [Eval] step: 21199, psnr: 19.666033, ssim: 0.618219, l1: 0.070154 2023-02-05 04:50:01,485 - INFO - [Train] step: 21299, loss_advD: 0.000009, lr_D: 0.000200 2023-02-05 04:50:01,556 - INFO - [Train] step: 21299, loss_advG: 14.590105, loss_rec: 0.002765, lr_G: 0.002000 2023-02-05 04:50:36,155 - INFO - [Train] step: 21399, loss_advD: 0.193975, lr_D: 0.000200 2023-02-05 04:50:36,227 - INFO - [Train] step: 21399, loss_advG: 5.424939, loss_rec: 0.002989, lr_G: 0.002000 2023-02-05 04:51:11,360 - INFO - [Train] step: 21499, loss_advD: 0.125662, lr_D: 0.000200 2023-02-05 04:51:11,431 - INFO - [Train] step: 21499, loss_advG: 3.030506, loss_rec: 0.002853, lr_G: 0.002000 2023-02-05 04:51:46,182 - INFO - [Train] step: 21599, loss_advD: 0.002449, lr_D: 0.000200 2023-02-05 04:51:46,254 - INFO - [Train] step: 21599, loss_advG: 8.248684, loss_rec: 0.003080, lr_G: 0.002000 2023-02-05 04:51:53,801 - INFO - [Eval] step: 21599, psnr: 19.714691, ssim: 0.619028, l1: 0.069443 2023-02-05 04:52:23,709 - INFO - [Train] step: 21699, loss_advD: 0.003726, lr_D: 0.000200 2023-02-05 04:52:23,780 - INFO - [Train] step: 21699, loss_advG: 7.185606, loss_rec: 0.003320, lr_G: 0.002000 2023-02-05 04:52:57,995 - INFO - [Train] step: 21799, loss_advD: 0.001346, lr_D: 0.000200 2023-02-05 04:52:58,066 - INFO - [Train] step: 21799, loss_advG: 7.833138, loss_rec: 0.003840, lr_G: 0.002000 2023-02-05 04:53:32,267 - INFO - [Train] step: 21899, loss_advD: 0.002244, lr_D: 0.000200 2023-02-05 04:53:32,338 - INFO - [Train] step: 21899, loss_advG: 10.174490, loss_rec: 0.002902, lr_G: 0.002000 2023-02-05 04:54:07,847 - INFO - [Train] step: 21999, loss_advD: 0.043407, lr_D: 0.000200 2023-02-05 04:54:07,919 - INFO - [Train] step: 21999, loss_advG: 6.321966, loss_rec: 0.003148, lr_G: 0.002000 2023-02-05 04:54:16,779 - INFO - [Eval] step: 21999, psnr: 19.697231, ssim: 0.620791, l1: 0.069289 2023-02-05 04:54:45,332 - INFO - [Train] step: 22099, loss_advD: 0.027558, lr_D: 
0.000200 2023-02-05 04:54:45,403 - INFO - [Train] step: 22099, loss_advG: 5.520649, loss_rec: 0.002805, lr_G: 0.002000 2023-02-05 04:55:18,122 - INFO - [Train] step: 22199, loss_advD: 0.002029, lr_D: 0.000200 2023-02-05 04:55:18,193 - INFO - [Train] step: 22199, loss_advG: 7.592710, loss_rec: 0.002785, lr_G: 0.002000 2023-02-05 04:55:53,090 - INFO - [Train] step: 22299, loss_advD: 0.001711, lr_D: 0.000200 2023-02-05 04:55:53,162 - INFO - [Train] step: 22299, loss_advG: 8.462732, loss_rec: 0.003451, lr_G: 0.002000 2023-02-05 04:56:27,936 - INFO - [Train] step: 22399, loss_advD: 0.002520, lr_D: 0.000200 2023-02-05 04:56:28,008 - INFO - [Train] step: 22399, loss_advG: 7.392344, loss_rec: 0.002737, lr_G: 0.002000 2023-02-05 04:56:36,846 - INFO - [Eval] step: 22399, psnr: 19.714695, ssim: 0.620610, l1: 0.069461 2023-02-05 04:57:05,225 - INFO - [Train] step: 22499, loss_advD: 0.001452, lr_D: 0.000200 2023-02-05 04:57:05,296 - INFO - [Train] step: 22499, loss_advG: 7.815187, loss_rec: 0.002786, lr_G: 0.002000 2023-02-05 04:57:39,273 - INFO - [Train] step: 22599, loss_advD: 0.000631, lr_D: 0.000200 2023-02-05 04:57:39,344 - INFO - [Train] step: 22599, loss_advG: 13.353419, loss_rec: 0.002716, lr_G: 0.002000 2023-02-05 04:58:13,467 - INFO - [Train] step: 22699, loss_advD: 0.000465, lr_D: 0.000200 2023-02-05 04:58:13,539 - INFO - [Train] step: 22699, loss_advG: 9.834945, loss_rec: 0.002833, lr_G: 0.002000 2023-02-05 04:58:47,603 - INFO - [Train] step: 22799, loss_advD: 0.000745, lr_D: 0.000200 2023-02-05 04:58:47,675 - INFO - [Train] step: 22799, loss_advG: 8.349890, loss_rec: 0.002797, lr_G: 0.002000 2023-02-05 04:58:56,662 - INFO - [Eval] step: 22799, psnr: 19.713228, ssim: 0.618653, l1: 0.069479 2023-02-05 04:59:25,086 - INFO - [Train] step: 22899, loss_advD: 0.001008, lr_D: 0.000200 2023-02-05 04:59:25,158 - INFO - [Train] step: 22899, loss_advG: 8.501411, loss_rec: 0.002803, lr_G: 0.002000 2023-02-05 04:59:58,581 - INFO - [Train] step: 22999, loss_advD: 0.001437, lr_D: 
0.000200 2023-02-05 04:59:58,653 - INFO - [Train] step: 22999, loss_advG: 7.921672, loss_rec: 0.002880, lr_G: 0.002000 2023-02-05 05:00:33,322 - INFO - [Train] step: 23099, loss_advD: 0.000146, lr_D: 0.000200 2023-02-05 05:00:33,394 - INFO - [Train] step: 23099, loss_advG: 11.671935, loss_rec: 0.002937, lr_G: 0.002000 2023-02-05 05:01:08,345 - INFO - [Train] step: 23199, loss_advD: 0.000408, lr_D: 0.000200 2023-02-05 05:01:08,417 - INFO - [Train] step: 23199, loss_advG: 11.517298, loss_rec: 0.002880, lr_G: 0.002000 2023-02-05 05:01:15,749 - INFO - [Eval] step: 23199, psnr: 19.703621, ssim: 0.620972, l1: 0.069489 2023-02-05 05:01:47,079 - INFO - [Train] step: 23299, loss_advD: 0.000332, lr_D: 0.000200 2023-02-05 05:01:47,150 - INFO - [Train] step: 23299, loss_advG: 11.520428, loss_rec: 0.002734, lr_G: 0.002000 2023-02-05 05:02:20,057 - INFO - [Train] step: 23399, loss_advD: 0.000251, lr_D: 0.000200 2023-02-05 05:02:20,128 - INFO - [Train] step: 23399, loss_advG: 10.666103, loss_rec: 0.002682, lr_G: 0.002000 2023-02-05 05:02:54,939 - INFO - [Train] step: 23499, loss_advD: 0.000267, lr_D: 0.000200 2023-02-05 05:02:55,010 - INFO - [Train] step: 23499, loss_advG: 13.145149, loss_rec: 0.002711, lr_G: 0.002000 2023-02-05 05:03:30,972 - INFO - [Train] step: 23599, loss_advD: 0.000403, lr_D: 0.000200 2023-02-05 05:03:31,043 - INFO - [Train] step: 23599, loss_advG: 13.416267, loss_rec: 0.002882, lr_G: 0.002000 2023-02-05 05:03:40,065 - INFO - [Eval] step: 23599, psnr: 19.703691, ssim: 0.618050, l1: 0.069831 2023-02-05 05:04:08,522 - INFO - [Train] step: 23699, loss_advD: 0.000333, lr_D: 0.000200 2023-02-05 05:04:08,593 - INFO - [Train] step: 23699, loss_advG: 11.273625, loss_rec: 0.002780, lr_G: 0.002000 2023-02-05 05:04:43,253 - INFO - [Train] step: 23799, loss_advD: 0.000471, lr_D: 0.000200 2023-02-05 05:04:43,325 - INFO - [Train] step: 23799, loss_advG: 9.109072, loss_rec: 0.002729, lr_G: 0.002000 2023-02-05 05:05:16,605 - INFO - [Train] step: 23899, loss_advD: 0.000635, 
lr_D: 0.000200 2023-02-05 05:05:16,677 - INFO - [Train] step: 23899, loss_advG: 11.657007, loss_rec: 0.003539, lr_G: 0.002000 2023-02-05 05:05:52,032 - INFO - [Train] step: 23999, loss_advD: 0.000586, lr_D: 0.000200 2023-02-05 05:05:52,104 - INFO - [Train] step: 23999, loss_advG: 8.888629, loss_rec: 0.002579, lr_G: 0.002000 2023-02-05 05:06:00,938 - INFO - [Eval] step: 23999, psnr: 19.671732, ssim: 0.618595, l1: 0.069626 2023-02-05 05:06:30,009 - INFO - [Train] step: 24099, loss_advD: 0.000099, lr_D: 0.000200 2023-02-05 05:06:30,081 - INFO - [Train] step: 24099, loss_advG: 12.046221, loss_rec: 0.002778, lr_G: 0.002000 2023-02-05 05:07:04,803 - INFO - [Train] step: 24199, loss_advD: 0.000156, lr_D: 0.000200 2023-02-05 05:07:04,874 - INFO - [Train] step: 24199, loss_advG: 13.224462, loss_rec: 0.002775, lr_G: 0.002000 2023-02-05 05:07:37,878 - INFO - [Train] step: 24299, loss_advD: 0.000101, lr_D: 0.000200 2023-02-05 05:07:37,950 - INFO - [Train] step: 24299, loss_advG: 12.849891, loss_rec: 0.002833, lr_G: 0.002000 2023-02-05 05:08:12,256 - INFO - [Train] step: 24399, loss_advD: 0.000366, lr_D: 0.000200 2023-02-05 05:08:12,327 - INFO - [Train] step: 24399, loss_advG: 9.323100, loss_rec: 0.002812, lr_G: 0.002000 2023-02-05 05:08:21,438 - INFO - [Eval] step: 24399, psnr: 19.722471, ssim: 0.619823, l1: 0.069234 2023-02-05 05:08:50,903 - INFO - [Train] step: 24499, loss_advD: 0.000051, lr_D: 0.000200 2023-02-05 05:08:50,975 - INFO - [Train] step: 24499, loss_advG: 12.747413, loss_rec: 0.002633, lr_G: 0.002000 2023-02-05 05:09:26,025 - INFO - [Train] step: 24599, loss_advD: 0.000067, lr_D: 0.000200 2023-02-05 05:09:26,096 - INFO - [Train] step: 24599, loss_advG: 13.141697, loss_rec: 0.002921, lr_G: 0.002000 2023-02-05 05:09:58,787 - INFO - [Train] step: 24699, loss_advD: 0.000070, lr_D: 0.000200 2023-02-05 05:09:58,858 - INFO - [Train] step: 24699, loss_advG: 12.421226, loss_rec: 0.002783, lr_G: 0.002000 2023-02-05 05:10:33,907 - INFO - [Train] step: 24799, loss_advD: 
0.000159, lr_D: 0.000200 2023-02-05 05:10:33,979 - INFO - [Train] step: 24799, loss_advG: 10.437347, loss_rec: 0.002737, lr_G: 0.002000 2023-02-05 05:10:41,457 - INFO - [Eval] step: 24799, psnr: 19.669067, ssim: 0.616710, l1: 0.069760 2023-02-05 05:11:13,184 - INFO - [Train] step: 24899, loss_advD: 0.000154, lr_D: 0.000200 2023-02-05 05:11:13,255 - INFO - [Train] step: 24899, loss_advG: 10.202768, loss_rec: 0.002727, lr_G: 0.002000 2023-02-05 05:11:47,708 - INFO - [Train] step: 24999, loss_advD: 0.000211, lr_D: 0.000200 2023-02-05 05:11:47,779 - INFO - [Train] step: 24999, loss_advG: 11.451674, loss_rec: 0.002453, lr_G: 0.002000 2023-02-05 05:11:48,634 - INFO - Best valid psnr: 19.8019962310791 2023-02-05 05:11:48,634 - INFO - End of training