2023-07-01 15:00:09,799 - INFO - Experiment directory: ./runs/lsgan_cifar10/
2023-07-01 15:00:09,800 - INFO - Number of processes: 1
2023-07-01 15:00:09,800 - INFO - Distributed type: DistributedType.NO
2023-07-01 15:00:09,800 - INFO - Mixed precision: no
2023-07-01 15:00:09,800 - INFO - ==============================
2023-07-01 15:00:10,089 - INFO - Size of training set: 50000
2023-07-01 15:00:10,089 - INFO - Batch size per process: 512
2023-07-01 15:00:10,089 - INFO - Total batch size: 512
2023-07-01 15:00:10,090 - INFO - ==============================
2023-07-01 15:00:10,633 - INFO - Start training...
2023-07-01 15:01:07,058 - INFO - [Train] step: 99, loss_D: 0.040776, lr_D: 0.000800
2023-07-01 15:01:07,139 - INFO - [Train] step: 99, loss_G: 0.467325, lr_G: 0.000800
2023-07-01 15:02:02,998 - INFO - [Train] step: 199, loss_D: 0.040615, lr_D: 0.000800
2023-07-01 15:02:03,080 - INFO - [Train] step: 199, loss_G: 0.467046, lr_G: 0.000800
2023-07-01 15:02:59,064 - INFO - [Train] step: 299, loss_D: 0.765039, lr_D: 0.000800
2023-07-01 15:02:59,145 - INFO - [Train] step: 299, loss_G: 1.567808, lr_G: 0.000800
2023-07-01 15:03:55,318 - INFO - [Train] step: 399, loss_D: 0.437440, lr_D: 0.000800
2023-07-01 15:03:55,399 - INFO - [Train] step: 399, loss_G: 1.201215, lr_G: 0.000800
2023-07-01 15:04:51,573 - INFO - [Train] step: 499, loss_D: 0.724107, lr_D: 0.000800
2023-07-01 15:04:51,654 - INFO - [Train] step: 499, loss_G: 1.377260, lr_G: 0.000800
2023-07-01 15:05:47,858 - INFO - [Train] step: 599, loss_D: 0.804712, lr_D: 0.000800
2023-07-01 15:05:47,940 - INFO - [Train] step: 599, loss_G: 1.749799, lr_G: 0.000800
2023-07-01 15:06:44,297 - INFO - [Train] step: 699, loss_D: 0.098723, lr_D: 0.000800
2023-07-01 15:06:44,379 - INFO - [Train] step: 699, loss_G: 0.767797, lr_G: 0.000800
2023-07-01 15:07:40,618 - INFO - [Train] step: 799, loss_D: 0.077412, lr_D: 0.000800
2023-07-01 15:07:40,700 - INFO - [Train] step: 799, loss_G: 0.650187, lr_G: 0.000800
2023-07-01 15:08:36,893 - INFO - [Train] step: 899, loss_D: 0.232728, lr_D: 0.000800
2023-07-01 15:08:36,974 - INFO - [Train] step: 899, loss_G: 0.196605, lr_G: 0.000800
2023-07-01 15:09:33,168 - INFO - [Train] step: 999, loss_D: 0.379194, lr_D: 0.000800
2023-07-01 15:09:33,250 - INFO - [Train] step: 999, loss_G: 0.122714, lr_G: 0.000800
2023-07-01 15:10:37,746 - INFO - [Eval] step: 999, fid: 204.688629
2023-07-01 15:11:33,900 - INFO - [Train] step: 1099, loss_D: 1.031090, lr_D: 0.000800
2023-07-01 15:11:33,982 - INFO - [Train] step: 1099, loss_G: 0.149993, lr_G: 0.000800
2023-07-01 15:12:30,236 - INFO - [Train] step: 1199, loss_D: 0.073962, lr_D: 0.000800
2023-07-01 15:12:30,317 - INFO - [Train] step: 1199, loss_G: 0.366868, lr_G: 0.000800
2023-07-01 15:13:26,801 - INFO - [Train] step: 1299, loss_D: 0.066455, lr_D: 0.000800
2023-07-01 15:13:26,883 - INFO - [Train] step: 1299, loss_G: 0.631458, lr_G: 0.000800
2023-07-01 15:14:23,158 - INFO - [Train] step: 1399, loss_D: 1.648886, lr_D: 0.000800
2023-07-01 15:14:23,240 - INFO - [Train] step: 1399, loss_G: 2.199118, lr_G: 0.000800
2023-07-01 15:15:19,538 - INFO - [Train] step: 1499, loss_D: 0.059617, lr_D: 0.000800
2023-07-01 15:15:19,620 - INFO - [Train] step: 1499, loss_G: 0.638867, lr_G: 0.000800
2023-07-01 15:16:15,921 - INFO - [Train] step: 1599, loss_D: 0.019324, lr_D: 0.000800
2023-07-01 15:16:16,002 - INFO - [Train] step: 1599, loss_G: 0.514553, lr_G: 0.000800
2023-07-01 15:17:12,276 - INFO - [Train] step: 1699, loss_D: 0.239341, lr_D: 0.000800
2023-07-01 15:17:12,357 - INFO - [Train] step: 1699, loss_G: 0.284433, lr_G: 0.000800
2023-07-01 15:18:08,625 - INFO - [Train] step: 1799, loss_D: 0.244401, lr_D: 0.000800
2023-07-01 15:18:08,707 - INFO - [Train] step: 1799, loss_G: 0.255333, lr_G: 0.000800
2023-07-01 15:19:05,019 - INFO - [Train] step: 1899, loss_D: 0.113152, lr_D: 0.000800
2023-07-01 15:19:05,100 - INFO - [Train] step: 1899, loss_G: 0.335814, lr_G: 0.000800
2023-07-01 15:20:01,560 - INFO - [Train] step: 1999, loss_D: 0.082486, lr_D: 0.000800
2023-07-01 15:20:01,641 - INFO - [Train] step: 1999, loss_G: 0.431042, lr_G: 0.000800
2023-07-01 15:21:06,142 - INFO - [Eval] step: 1999, fid: 171.175541
2023-07-01 15:22:02,570 - INFO - [Train] step: 2099, loss_D: 0.091768, lr_D: 0.000800
2023-07-01 15:22:02,652 - INFO - [Train] step: 2099, loss_G: 0.503902, lr_G: 0.000800
2023-07-01 15:22:58,937 - INFO - [Train] step: 2199, loss_D: 0.017427, lr_D: 0.000800
2023-07-01 15:22:59,019 - INFO - [Train] step: 2199, loss_G: 0.486828, lr_G: 0.000800
2023-07-01 15:23:55,303 - INFO - [Train] step: 2299, loss_D: 0.022969, lr_D: 0.000800
2023-07-01 15:23:55,384 - INFO - [Train] step: 2299, loss_G: 0.518168, lr_G: 0.000800
2023-07-01 15:24:51,668 - INFO - [Train] step: 2399, loss_D: 0.014084, lr_D: 0.000800
2023-07-01 15:24:51,749 - INFO - [Train] step: 2399, loss_G: 0.438347, lr_G: 0.000800
2023-07-01 15:25:48,011 - INFO - [Train] step: 2499, loss_D: 0.045845, lr_D: 0.000800
2023-07-01 15:25:48,092 - INFO - [Train] step: 2499, loss_G: 0.370210, lr_G: 0.000800
2023-07-01 15:26:44,519 - INFO - [Train] step: 2599, loss_D: 0.022699, lr_D: 0.000800
2023-07-01 15:26:44,600 - INFO - [Train] step: 2599, loss_G: 0.495508, lr_G: 0.000800
2023-07-01 15:27:40,898 - INFO - [Train] step: 2699, loss_D: 0.082826, lr_D: 0.000800
2023-07-01 15:27:40,980 - INFO - [Train] step: 2699, loss_G: 0.400814, lr_G: 0.000800
2023-07-01 15:28:37,275 - INFO - [Train] step: 2799, loss_D: 0.160345, lr_D: 0.000800
2023-07-01 15:28:37,356 - INFO - [Train] step: 2799, loss_G: 0.302427, lr_G: 0.000800
2023-07-01 15:29:33,632 - INFO - [Train] step: 2899, loss_D: 0.066747, lr_D: 0.000800
2023-07-01 15:29:33,714 - INFO - [Train] step: 2899, loss_G: 0.363609, lr_G: 0.000800
2023-07-01 15:30:29,989 - INFO - [Train] step: 2999, loss_D: 0.048972, lr_D: 0.000800
2023-07-01 15:30:30,071 - INFO - [Train] step: 2999, loss_G: 0.446368, lr_G: 0.000800
2023-07-01 15:31:34,357 - INFO - [Eval] step: 2999, fid: 176.967672
2023-07-01 15:32:30,442 - INFO - [Train] step: 3099, loss_D: 0.024519, lr_D: 0.000800
2023-07-01 15:32:30,524 - INFO - [Train] step: 3099, loss_G: 0.513590, lr_G: 0.000800
2023-07-01 15:33:26,846 - INFO - [Train] step: 3199, loss_D: 0.040996, lr_D: 0.000800
2023-07-01 15:33:26,927 - INFO - [Train] step: 3199, loss_G: 0.401883, lr_G: 0.000800
2023-07-01 15:34:23,361 - INFO - [Train] step: 3299, loss_D: 0.023857, lr_D: 0.000800
2023-07-01 15:34:23,442 - INFO - [Train] step: 3299, loss_G: 0.565304, lr_G: 0.000800
2023-07-01 15:35:19,724 - INFO - [Train] step: 3399, loss_D: 0.111633, lr_D: 0.000800
2023-07-01 15:35:19,805 - INFO - [Train] step: 3399, loss_G: 0.776083, lr_G: 0.000800
2023-07-01 15:36:16,086 - INFO - [Train] step: 3499, loss_D: 0.034294, lr_D: 0.000800
2023-07-01 15:36:16,167 - INFO - [Train] step: 3499, loss_G: 0.402164, lr_G: 0.000800
2023-07-01 15:37:12,439 - INFO - [Train] step: 3599, loss_D: 0.022949, lr_D: 0.000800
2023-07-01 15:37:12,521 - INFO - [Train] step: 3599, loss_G: 0.441130, lr_G: 0.000800
2023-07-01 15:38:08,811 - INFO - [Train] step: 3699, loss_D: 0.012377, lr_D: 0.000800
2023-07-01 15:38:08,892 - INFO - [Train] step: 3699, loss_G: 0.462345, lr_G: 0.000800
2023-07-01 15:39:05,195 - INFO - [Train] step: 3799, loss_D: 0.023728, lr_D: 0.000800
2023-07-01 15:39:05,277 - INFO - [Train] step: 3799, loss_G: 0.450673, lr_G: 0.000800
2023-07-01 15:40:01,722 - INFO - [Train] step: 3899, loss_D: 0.019619, lr_D: 0.000800
2023-07-01 15:40:01,804 - INFO - [Train] step: 3899, loss_G: 0.572972, lr_G: 0.000800
2023-07-01 15:40:58,117 - INFO - [Train] step: 3999, loss_D: 0.046678, lr_D: 0.000800
2023-07-01 15:40:58,198 - INFO - [Train] step: 3999, loss_G: 0.483426, lr_G: 0.000800
2023-07-01 15:42:02,133 - INFO - [Eval] step: 3999, fid: 110.422782
2023-07-01 15:42:58,550 - INFO - [Train] step: 4099, loss_D: 0.126218, lr_D: 0.000800
2023-07-01 15:42:58,632 - INFO - [Train] step: 4099, loss_G: 0.291913, lr_G: 0.000800
2023-07-01 15:43:54,935 - INFO - [Train] step: 4199, loss_D: 0.035849, lr_D: 0.000800
2023-07-01 15:43:55,017 - INFO - [Train] step: 4199, loss_G: 0.523266, lr_G: 0.000800
2023-07-01 15:44:51,309 - INFO - [Train] step: 4299, loss_D: 0.037320, lr_D: 0.000800
2023-07-01 15:44:51,391 - INFO - [Train] step: 4299, loss_G: 0.458659, lr_G: 0.000800
2023-07-01 15:45:47,676 - INFO - [Train] step: 4399, loss_D: 0.011471, lr_D: 0.000800
2023-07-01 15:45:47,758 - INFO - [Train] step: 4399, loss_G: 0.508021, lr_G: 0.000800
2023-07-01 15:46:44,053 - INFO - [Train] step: 4499, loss_D: 0.089249, lr_D: 0.000800
2023-07-01 15:46:44,134 - INFO - [Train] step: 4499, loss_G: 0.694754, lr_G: 0.000800
2023-07-01 15:47:40,575 - INFO - [Train] step: 4599, loss_D: 0.114922, lr_D: 0.000800
2023-07-01 15:47:40,657 - INFO - [Train] step: 4599, loss_G: 0.099662, lr_G: 0.000800
2023-07-01 15:48:36,973 - INFO - [Train] step: 4699, loss_D: 0.075923, lr_D: 0.000800
2023-07-01 15:48:37,055 - INFO - [Train] step: 4699, loss_G: 0.317860, lr_G: 0.000800
2023-07-01 15:49:33,327 - INFO - [Train] step: 4799, loss_D: 0.025180, lr_D: 0.000800
2023-07-01 15:49:33,409 - INFO - [Train] step: 4799, loss_G: 0.552436, lr_G: 0.000800
2023-07-01 15:50:29,667 - INFO - [Train] step: 4899, loss_D: 0.020879, lr_D: 0.000800
2023-07-01 15:50:29,749 - INFO - [Train] step: 4899, loss_G: 0.466652, lr_G: 0.000800
2023-07-01 15:51:26,027 - INFO - [Train] step: 4999, loss_D: 0.025203, lr_D: 0.000800
2023-07-01 15:51:26,109 - INFO - [Train] step: 4999, loss_G: 0.398335, lr_G: 0.000800
2023-07-01 15:52:30,087 - INFO - [Eval] step: 4999, fid: 118.414793
2023-07-01 15:53:26,158 - INFO - [Train] step: 5099, loss_D: 0.048037, lr_D: 0.000800
2023-07-01 15:53:26,240 - INFO - [Train] step: 5099, loss_G: 0.530245, lr_G: 0.000800
2023-07-01 15:54:22,652 - INFO - [Train] step: 5199, loss_D: 0.021130, lr_D: 0.000800
2023-07-01 15:54:22,733 - INFO - [Train] step: 5199, loss_G: 0.541339, lr_G: 0.000800
2023-07-01 15:55:19,015 - INFO - [Train] step: 5299, loss_D: 0.041751, lr_D: 0.000800
2023-07-01 15:55:19,097 - INFO - [Train] step: 5299, loss_G: 0.454968, lr_G: 0.000800
2023-07-01 15:56:15,395 - INFO - [Train] step: 5399, loss_D: 0.023214, lr_D: 0.000800
2023-07-01 15:56:15,477 - INFO - [Train] step: 5399, loss_G: 0.482839, lr_G: 0.000800
2023-07-01 15:57:11,742 - INFO - [Train] step: 5499, loss_D: 0.019600, lr_D: 0.000800
2023-07-01 15:57:11,823 - INFO - [Train] step: 5499, loss_G: 0.506283, lr_G: 0.000800
2023-07-01 15:58:08,120 - INFO - [Train] step: 5599, loss_D: 0.014186, lr_D: 0.000800
2023-07-01 15:58:08,201 - INFO - [Train] step: 5599, loss_G: 0.557838, lr_G: 0.000800
2023-07-01 15:59:04,493 - INFO - [Train] step: 5699, loss_D: 0.221925, lr_D: 0.000800
2023-07-01 15:59:04,574 - INFO - [Train] step: 5699, loss_G: 0.242728, lr_G: 0.000800
2023-07-01 16:00:00,834 - INFO - [Train] step: 5799, loss_D: 0.165530, lr_D: 0.000800
2023-07-01 16:00:00,915 - INFO - [Train] step: 5799, loss_G: 0.227829, lr_G: 0.000800
2023-07-01 16:00:57,358 - INFO - [Train] step: 5899, loss_D: 0.115643, lr_D: 0.000800
2023-07-01 16:00:57,440 - INFO - [Train] step: 5899, loss_G: 0.317124, lr_G: 0.000800
2023-07-01 16:01:53,721 - INFO - [Train] step: 5999, loss_D: 0.103261, lr_D: 0.000800
2023-07-01 16:01:53,803 - INFO - [Train] step: 5999, loss_G: 0.264031, lr_G: 0.000800
2023-07-01 16:02:57,825 - INFO - [Eval] step: 5999, fid: 80.259056
2023-07-01 16:03:54,238 - INFO - [Train] step: 6099, loss_D: 0.051908, lr_D: 0.000800
2023-07-01 16:03:54,319 - INFO - [Train] step: 6099, loss_G: 0.461725, lr_G: 0.000800
2023-07-01 16:04:50,585 - INFO - [Train] step: 6199, loss_D: 0.077694, lr_D: 0.000800
2023-07-01 16:04:50,667 - INFO - [Train] step: 6199, loss_G: 0.340203, lr_G: 0.000800
2023-07-01 16:05:46,954 - INFO - [Train] step: 6299, loss_D: 0.042244, lr_D: 0.000800
2023-07-01 16:05:47,035 - INFO - [Train] step: 6299, loss_G: 0.457580, lr_G: 0.000800
2023-07-01 16:06:43,308 - INFO - [Train] step: 6399, loss_D: 0.212540, lr_D: 0.000800
2023-07-01 16:06:43,390 - INFO - [Train] step: 6399, loss_G: 0.311328, lr_G: 0.000800
2023-07-01 16:07:39,849 - INFO - [Train] step: 6499, loss_D: 0.024245, lr_D: 0.000800
2023-07-01 16:07:39,931 - INFO - [Train] step: 6499, loss_G: 0.470824, lr_G: 0.000800
2023-07-01 16:08:36,216 - INFO - [Train] step: 6599, loss_D: 0.047844, lr_D: 0.000800
2023-07-01 16:08:36,298 - INFO - [Train] step: 6599, loss_G: 0.415272, lr_G: 0.000800
2023-07-01 16:09:32,579 - INFO - [Train] step: 6699, loss_D: 0.033318, lr_D: 0.000800
2023-07-01 16:09:32,661 - INFO - [Train] step: 6699, loss_G: 0.465461, lr_G: 0.000800
2023-07-01 16:10:28,943 - INFO - [Train] step: 6799, loss_D: 0.148448, lr_D: 0.000800
2023-07-01 16:10:29,025 - INFO - [Train] step: 6799, loss_G: 0.209431, lr_G: 0.000800
2023-07-01 16:11:25,292 - INFO - [Train] step: 6899, loss_D: 0.105897, lr_D: 0.000800
2023-07-01 16:11:25,373 - INFO - [Train] step: 6899, loss_G: 0.281653, lr_G: 0.000800
2023-07-01 16:12:21,643 - INFO - [Train] step: 6999, loss_D: 0.062976, lr_D: 0.000800
2023-07-01 16:12:21,725 - INFO - [Train] step: 6999, loss_G: 0.289516, lr_G: 0.000800
2023-07-01 16:13:25,882 - INFO - [Eval] step: 6999, fid: 83.707575
2023-07-01 16:14:21,910 - INFO - [Train] step: 7099, loss_D: 0.150858, lr_D: 0.000800
2023-07-01 16:14:21,991 - INFO - [Train] step: 7099, loss_G: 0.699395, lr_G: 0.000800
2023-07-01 16:15:18,400 - INFO - [Train] step: 7199, loss_D: 0.020312, lr_D: 0.000800
2023-07-01 16:15:18,481 - INFO - [Train] step: 7199, loss_G: 0.449259, lr_G: 0.000800
2023-07-01 16:16:14,742 - INFO - [Train] step: 7299, loss_D: 0.031448, lr_D: 0.000800
2023-07-01 16:16:14,824 - INFO - [Train] step: 7299, loss_G: 0.395463, lr_G: 0.000800
2023-07-01 16:17:11,071 - INFO - [Train] step: 7399, loss_D: 0.352383, lr_D: 0.000800
2023-07-01 16:17:11,153 - INFO - [Train] step: 7399, loss_G: 0.527658, lr_G: 0.000800
2023-07-01 16:18:07,400 - INFO - [Train] step: 7499, loss_D: 0.019191, lr_D: 0.000800
2023-07-01 16:18:07,481 - INFO - [Train] step: 7499, loss_G: 0.497877, lr_G: 0.000800
2023-07-01 16:19:03,731 - INFO - [Train] step: 7599, loss_D: 0.079302, lr_D: 0.000800
2023-07-01 16:19:03,812 - INFO - [Train] step: 7599, loss_G: 0.396555, lr_G: 0.000800
2023-07-01 16:20:00,088 - INFO - [Train] step: 7699, loss_D: 0.118297, lr_D: 0.000800
2023-07-01 16:20:00,170 - INFO - [Train] step: 7699, loss_G: 0.232735, lr_G: 0.000800
2023-07-01 16:20:56,583 - INFO - [Train] step: 7799, loss_D: 0.080086, lr_D: 0.000800
2023-07-01 16:20:56,664 - INFO - [Train] step: 7799, loss_G: 0.405607, lr_G: 0.000800
2023-07-01 16:21:52,884 - INFO - [Train] step: 7899, loss_D: 0.035206, lr_D: 0.000800
2023-07-01 16:21:52,966 - INFO - [Train] step: 7899, loss_G: 0.521183, lr_G: 0.000800
2023-07-01 16:22:49,232 - INFO - [Train] step: 7999, loss_D: 0.042690, lr_D: 0.000800
2023-07-01 16:22:49,314 - INFO - [Train] step: 7999, loss_G: 0.628162, lr_G: 0.000800
2023-07-01 16:23:53,332 - INFO - [Eval] step: 7999, fid: 63.824590
2023-07-01 16:24:49,626 - INFO - [Train] step: 8099, loss_D: 0.045911, lr_D: 0.000800
2023-07-01 16:24:49,707 - INFO - [Train] step: 8099, loss_G: 0.295043, lr_G: 0.000800
2023-07-01 16:25:45,949 - INFO - [Train] step: 8199, loss_D: 0.033123, lr_D: 0.000800
2023-07-01 16:25:46,031 - INFO - [Train] step: 8199, loss_G: 0.386580, lr_G: 0.000800
2023-07-01 16:26:42,281 - INFO - [Train] step: 8299, loss_D: 0.025258, lr_D: 0.000800
2023-07-01 16:26:42,363 - INFO - [Train] step: 8299, loss_G: 0.459850, lr_G: 0.000800
2023-07-01 16:27:38,620 - INFO - [Train] step: 8399, loss_D: 0.015947, lr_D: 0.000800
2023-07-01 16:27:38,701 - INFO - [Train] step: 8399, loss_G: 0.478456, lr_G: 0.000800
2023-07-01 16:28:35,096 - INFO - [Train] step: 8499, loss_D: 0.041746, lr_D: 0.000800
2023-07-01 16:28:35,177 - INFO - [Train] step: 8499, loss_G: 0.391286, lr_G: 0.000800
2023-07-01 16:29:31,402 - INFO - [Train] step: 8599, loss_D: 0.041732, lr_D: 0.000800
2023-07-01 16:29:31,483 - INFO - [Train] step: 8599, loss_G: 0.573958, lr_G: 0.000800
2023-07-01 16:30:27,710 - INFO - [Train] step: 8699, loss_D: 0.024475, lr_D: 0.000800
2023-07-01 16:30:27,792 - INFO - [Train] step: 8699, loss_G: 0.445166, lr_G: 0.000800
2023-07-01 16:31:24,034 - INFO - [Train] step: 8799, loss_D: 0.069513, lr_D: 0.000800
2023-07-01 16:31:24,116 - INFO - [Train] step: 8799, loss_G: 0.564701, lr_G: 0.000800
2023-07-01 16:32:20,329 - INFO - [Train] step: 8899, loss_D: 0.030355, lr_D: 0.000800
2023-07-01 16:32:20,410 - INFO - [Train] step: 8899, loss_G: 0.398739, lr_G: 0.000800
2023-07-01 16:33:16,666 - INFO - [Train] step: 8999, loss_D: 0.054898, lr_D: 0.000800
2023-07-01 16:33:16,747 - INFO - [Train] step: 8999, loss_G: 0.355975, lr_G: 0.000800
2023-07-01 16:34:21,042 - INFO - [Eval] step: 8999, fid: 51.856425
2023-07-01 16:35:17,557 - INFO - [Train] step: 9099, loss_D: 0.345732, lr_D: 0.000800
2023-07-01 16:35:17,639 - INFO - [Train] step: 9099, loss_G: 1.304249, lr_G: 0.000800
2023-07-01 16:36:13,902 - INFO - [Train] step: 9199, loss_D: 0.061331, lr_D: 0.000800
2023-07-01 16:36:13,983 - INFO - [Train] step: 9199, loss_G: 0.478601, lr_G: 0.000800
2023-07-01 16:37:10,210 - INFO - [Train] step: 9299, loss_D: 0.017195, lr_D: 0.000800
2023-07-01 16:37:10,292 - INFO - [Train] step: 9299, loss_G: 0.445164, lr_G: 0.000800
2023-07-01 16:38:06,527 - INFO - [Train] step: 9399, loss_D: 0.028449, lr_D: 0.000800
2023-07-01 16:38:06,609 - INFO - [Train] step: 9399, loss_G: 0.422484, lr_G: 0.000800
2023-07-01 16:39:02,889 - INFO - [Train] step: 9499, loss_D: 0.122286, lr_D: 0.000800
2023-07-01 16:39:02,970 - INFO - [Train] step: 9499, loss_G: 0.339955, lr_G: 0.000800
2023-07-01 16:39:59,192 - INFO - [Train] step: 9599, loss_D: 0.031580, lr_D: 0.000800
2023-07-01 16:39:59,273 - INFO - [Train] step: 9599, loss_G: 0.414887, lr_G: 0.000800
2023-07-01 16:40:55,534 - INFO - [Train] step: 9699, loss_D: 0.055920, lr_D: 0.000800
2023-07-01 16:40:55,615 - INFO - [Train] step: 9699, loss_G: 0.380711, lr_G: 0.000800
2023-07-01 16:41:51,983 - INFO - [Train] step: 9799, loss_D: 0.042303, lr_D: 0.000800
2023-07-01 16:41:52,065 - INFO - [Train] step: 9799, loss_G: 0.411310, lr_G: 0.000800
2023-07-01 16:42:48,305 - INFO - [Train] step: 9899, loss_D: 0.015790, lr_D: 0.000800
2023-07-01 16:42:48,387 - INFO - [Train] step: 9899, loss_G: 0.513790, lr_G: 0.000800
2023-07-01 16:43:44,627 - INFO - [Train] step: 9999, loss_D: 0.031156, lr_D: 0.000800
2023-07-01 16:43:44,709 - INFO - [Train] step: 9999, loss_G: 0.435210, lr_G: 0.000800
2023-07-01 16:44:48,944 - INFO - [Eval] step: 9999, fid: 44.175232
2023-07-01 16:45:45,437 - INFO - [Train] step: 10099, loss_D: 0.048197, lr_D: 0.000800
2023-07-01 16:45:45,519 - INFO - [Train] step: 10099, loss_G: 0.563195, lr_G: 0.000800
2023-07-01 16:46:41,735 - INFO - [Train] step: 10199, loss_D: 0.015835, lr_D: 0.000800
2023-07-01 16:46:41,816 - INFO - [Train] step: 10199, loss_G: 0.494992, lr_G: 0.000800
2023-07-01 16:47:38,028 - INFO - [Train] step: 10299, loss_D: 0.020491, lr_D: 0.000800
2023-07-01 16:47:38,109 - INFO - [Train] step: 10299, loss_G: 0.500630, lr_G: 0.000800
2023-07-01 16:48:34,465 - INFO - [Train] step: 10399, loss_D: 0.031950, lr_D: 0.000800
2023-07-01 16:48:34,547 - INFO - [Train] step: 10399, loss_G: 0.362065, lr_G: 0.000800
2023-07-01 16:49:30,770 - INFO - [Train] step: 10499, loss_D: 0.012799, lr_D: 0.000800
2023-07-01 16:49:30,852 - INFO - [Train] step: 10499, loss_G: 0.464469, lr_G: 0.000800
2023-07-01 16:50:27,066 - INFO - [Train] step: 10599, loss_D: 0.015617, lr_D: 0.000800
2023-07-01 16:50:27,148 - INFO - [Train] step: 10599, loss_G: 0.473956, lr_G: 0.000800
2023-07-01 16:51:23,381 - INFO - [Train] step: 10699, loss_D: 0.026475, lr_D: 0.000800
2023-07-01 16:51:23,463 - INFO - [Train] step: 10699, loss_G: 0.485552, lr_G: 0.000800
2023-07-01 16:52:19,674 - INFO - [Train] step: 10799, loss_D: 0.014157, lr_D: 0.000800
2023-07-01 16:52:19,755 - INFO - [Train] step: 10799, loss_G: 0.476442, lr_G: 0.000800
2023-07-01 16:53:15,958 - INFO - [Train] step: 10899, loss_D: 0.028922, lr_D: 0.000800
2023-07-01 16:53:16,040 - INFO - [Train] step: 10899, loss_G: 0.452689, lr_G: 0.000800
2023-07-01 16:54:12,409 - INFO - [Train] step: 10999, loss_D: 0.019099, lr_D: 0.000800
2023-07-01 16:54:12,491 - INFO - [Train] step: 10999, loss_G: 0.541219, lr_G: 0.000800
2023-07-01 16:55:16,605 - INFO - [Eval] step: 10999, fid: 41.510723
2023-07-01 16:56:12,939 - INFO - [Train] step: 11099, loss_D: 0.426106, lr_D: 0.000800
2023-07-01 16:56:13,020 - INFO - [Train] step: 11099, loss_G: 0.712436, lr_G: 0.000800
2023-07-01 16:57:09,286 - INFO - [Train] step: 11199, loss_D: 0.013547, lr_D: 0.000800
2023-07-01 16:57:09,368 - INFO - [Train] step: 11199, loss_G: 0.533168, lr_G: 0.000800
2023-07-01 16:58:05,595 - INFO - [Train] step: 11299, loss_D: 0.025288, lr_D: 0.000800
2023-07-01 16:58:05,676 - INFO - [Train] step: 11299, loss_G: 0.395842, lr_G: 0.000800
2023-07-01 16:59:01,888 - INFO - [Train] step: 11399, loss_D: 0.046030, lr_D: 0.000800
2023-07-01 16:59:01,969 - INFO - [Train] step: 11399, loss_G: 0.272756, lr_G: 0.000800
2023-07-01 16:59:58,217 - INFO - [Train] step: 11499, loss_D: 0.021802, lr_D: 0.000800
2023-07-01 16:59:58,299 - INFO - [Train] step: 11499, loss_G: 0.524531, lr_G: 0.000800
2023-07-01 17:00:54,516 - INFO - [Train] step: 11599, loss_D: 0.102124, lr_D: 0.000800
2023-07-01 17:00:54,597 - INFO - [Train] step: 11599, loss_G: 0.250981, lr_G: 0.000800
2023-07-01 17:01:50,982 - INFO - [Train] step: 11699, loss_D: 0.016419, lr_D: 0.000800
2023-07-01 17:01:51,064 - INFO - [Train] step: 11699, loss_G: 0.416956, lr_G: 0.000800
2023-07-01 17:02:47,278 - INFO - [Train] step: 11799, loss_D: 0.022939, lr_D: 0.000800
2023-07-01 17:02:47,360 - INFO - [Train] step: 11799, loss_G: 0.444405, lr_G: 0.000800
2023-07-01 17:03:43,572 - INFO - [Train] step: 11899, loss_D: 0.028677, lr_D: 0.000800
2023-07-01 17:03:43,654 - INFO - [Train] step: 11899, loss_G: 0.505162, lr_G: 0.000800
2023-07-01 17:04:39,891 - INFO - [Train] step: 11999, loss_D: 0.193232, lr_D: 0.000800
2023-07-01 17:04:39,973 - INFO - [Train] step: 11999, loss_G: 0.713976, lr_G: 0.000800
2023-07-01 17:05:44,158 - INFO - [Eval] step: 11999, fid: 37.335719
2023-07-01 17:06:40,488 - INFO - [Train] step: 12099, loss_D: 0.041622, lr_D: 0.000800
2023-07-01 17:06:40,569 - INFO - [Train] step: 12099, loss_G: 0.354398, lr_G: 0.000800
2023-07-01 17:07:36,806 - INFO - [Train] step: 12199, loss_D: 0.018744, lr_D: 0.000800
2023-07-01 17:07:36,887 - INFO - [Train] step: 12199, loss_G: 0.524607, lr_G: 0.000800
2023-07-01 17:08:33,267 - INFO - [Train] step: 12299, loss_D: 0.040381, lr_D: 0.000800
2023-07-01 17:08:33,349 - INFO - [Train] step: 12299, loss_G: 0.378312, lr_G: 0.000800
2023-07-01 17:09:29,602 - INFO - [Train] step: 12399, loss_D: 0.036392, lr_D: 0.000800
2023-07-01 17:09:29,683 - INFO - [Train] step: 12399, loss_G: 0.327528, lr_G: 0.000800
2023-07-01 17:10:25,908 - INFO - [Train] step: 12499, loss_D: 0.022661, lr_D: 0.000800
2023-07-01 17:10:25,989 - INFO - [Train] step: 12499, loss_G: 0.503605, lr_G: 0.000800
2023-07-01 17:11:22,227 - INFO - [Train] step: 12599, loss_D: 0.160112, lr_D: 0.000800
2023-07-01 17:11:22,309 - INFO - [Train] step: 12599, loss_G: 0.704796, lr_G: 0.000800
2023-07-01 17:12:18,564 - INFO - [Train] step: 12699, loss_D: 0.042283, lr_D: 0.000800
2023-07-01 17:12:18,646 - INFO - [Train] step: 12699, loss_G: 0.419944, lr_G: 0.000800
2023-07-01 17:13:14,892 - INFO - [Train] step: 12799, loss_D: 0.015350, lr_D: 0.000800
2023-07-01 17:13:14,974 - INFO - [Train] step: 12799, loss_G: 0.590415, lr_G: 0.000800
2023-07-01 17:14:11,201 - INFO - [Train] step: 12899, loss_D: 0.028026, lr_D: 0.000800
2023-07-01 17:14:11,282 - INFO - [Train] step: 12899, loss_G: 0.602759, lr_G: 0.000800
2023-07-01 17:15:07,686 - INFO - [Train] step: 12999, loss_D: 0.031418, lr_D: 0.000800
2023-07-01 17:15:07,767 - INFO - [Train] step: 12999, loss_G: 0.387043, lr_G: 0.000800
2023-07-01 17:16:12,106 - INFO - [Eval] step: 12999, fid: 36.465563
2023-07-01 17:17:08,377 - INFO - [Train] step: 13099, loss_D: 0.019731, lr_D: 0.000800
2023-07-01 17:17:08,458 - INFO - [Train] step: 13099, loss_G: 0.472084, lr_G: 0.000800
2023-07-01 17:18:04,480 - INFO - [Train] step: 13199, loss_D: 0.012040, lr_D: 0.000800
2023-07-01 17:18:04,561 - INFO - [Train] step: 13199, loss_G: 0.529425, lr_G: 0.000800
2023-07-01 17:19:00,770 - INFO - [Train] step: 13299, loss_D: 0.015204, lr_D: 0.000800
2023-07-01 17:19:00,852 - INFO - [Train] step: 13299, loss_G: 0.592233, lr_G: 0.000800
2023-07-01 17:19:57,077 - INFO - [Train] step: 13399, loss_D: 0.035052, lr_D: 0.000800
2023-07-01 17:19:57,159 - INFO - [Train] step: 13399, loss_G: 0.287030, lr_G: 0.000800
2023-07-01 17:20:53,407 - INFO - [Train] step: 13499, loss_D: 0.031691, lr_D: 0.000800
2023-07-01 17:20:53,488 - INFO - [Train] step: 13499, loss_G: 0.541036, lr_G: 0.000800
2023-07-01 17:21:49,845 - INFO - [Train] step: 13599, loss_D: 0.015474, lr_D: 0.000800
2023-07-01 17:21:49,927 - INFO - [Train] step: 13599, loss_G: 0.515333, lr_G: 0.000800
2023-07-01 17:22:46,157 - INFO - [Train] step: 13699, loss_D: 0.024264, lr_D: 0.000800
2023-07-01 17:22:46,239 - INFO - [Train] step: 13699, loss_G: 0.358656, lr_G: 0.000800
2023-07-01 17:23:42,463 - INFO - [Train] step: 13799, loss_D: 0.018509, lr_D: 0.000800
2023-07-01 17:23:42,544 - INFO - [Train] step: 13799, loss_G: 0.556743, lr_G: 0.000800
2023-07-01 17:24:38,784 - INFO - [Train] step: 13899, loss_D: 0.016108, lr_D: 0.000800
2023-07-01 17:24:38,866 - INFO - [Train] step: 13899, loss_G: 0.540080, lr_G: 0.000800
2023-07-01 17:25:35,095 - INFO - [Train] step: 13999, loss_D: 0.185542, lr_D: 0.000800
2023-07-01 17:25:35,176 - INFO - [Train] step: 13999, loss_G: 0.206313, lr_G: 0.000800
2023-07-01 17:26:39,437 - INFO - [Eval] step: 13999, fid: 34.038329
2023-07-01 17:27:35,694 - INFO - [Train] step: 14099, loss_D: 0.025367, lr_D: 0.000800
2023-07-01 17:27:35,775 - INFO - [Train] step: 14099, loss_G: 0.423853, lr_G: 0.000800
2023-07-01 17:28:31,852 - INFO - [Train] step: 14199, loss_D: 0.042458, lr_D: 0.000800
2023-07-01 17:28:31,934 - INFO - [Train] step: 14199, loss_G: 0.339424, lr_G: 0.000800
2023-07-01 17:29:28,330 - INFO - [Train] step: 14299, loss_D: 0.022700, lr_D: 0.000800
2023-07-01 17:29:28,411 - INFO - [Train] step: 14299, loss_G: 0.386404, lr_G: 0.000800
2023-07-01 17:30:24,670 - INFO - [Train] step: 14399, loss_D: 0.016297, lr_D: 0.000800
2023-07-01 17:30:24,752 - INFO - [Train] step: 14399, loss_G: 0.453159, lr_G: 0.000800
2023-07-01 17:31:20,985 - INFO - [Train] step: 14499, loss_D: 0.030546, lr_D: 0.000800
2023-07-01 17:31:21,066 - INFO - [Train] step: 14499, loss_G: 0.428695, lr_G: 0.000800
2023-07-01 17:32:17,281 - INFO - [Train] step: 14599, loss_D: 0.013512, lr_D: 0.000800
2023-07-01 17:32:17,362 - INFO - [Train] step: 14599, loss_G: 0.463028, lr_G: 0.000800
2023-07-01 17:33:13,608 - INFO - [Train] step: 14699, loss_D: 0.019674, lr_D: 0.000800
2023-07-01 17:33:13,690 - INFO - [Train] step: 14699, loss_G: 0.452179, lr_G: 0.000800
2023-07-01 17:34:09,910 - INFO - [Train] step: 14799, loss_D: 0.025598, lr_D: 0.000800
2023-07-01 17:34:09,992 - INFO - [Train] step: 14799, loss_G: 0.347240, lr_G: 0.000800
2023-07-01 17:35:06,380 - INFO - [Train] step: 14899, loss_D: 0.014152, lr_D: 0.000800
2023-07-01 17:35:06,461 - INFO - [Train] step: 14899, loss_G: 0.567665, lr_G: 0.000800
2023-07-01 17:36:02,698 - INFO - [Train] step: 14999, loss_D: 0.010963, lr_D: 0.000800
2023-07-01 17:36:02,780 - INFO - [Train] step: 14999, loss_G: 0.465192, lr_G: 0.000800
2023-07-01 17:37:06,934 - INFO - [Eval] step: 14999, fid: 32.735350
2023-07-01 17:38:03,239 - INFO - [Train] step: 15099, loss_D: 0.020491, lr_D: 0.000800
2023-07-01 17:38:03,320 - INFO - [Train] step: 15099, loss_G: 0.384963, lr_G: 0.000800
2023-07-01 17:38:59,470 - INFO - [Train] step: 15199, loss_D: 0.018280, lr_D: 0.000800
2023-07-01 17:38:59,552 - INFO - [Train] step: 15199, loss_G: 0.589998, lr_G: 0.000800
2023-07-01 17:39:55,773 - INFO - [Train] step: 15299, loss_D: 0.015946, lr_D: 0.000800
2023-07-01 17:39:55,854 - INFO - [Train] step: 15299, loss_G: 0.492813, lr_G: 0.000800
2023-07-01 17:40:52,077 - INFO - [Train] step: 15399, loss_D: 0.026006, lr_D: 0.000800
2023-07-01 17:40:52,159 - INFO - [Train] step: 15399, loss_G: 0.457885, lr_G: 0.000800
2023-07-01 17:41:48,413 - INFO - [Train] step: 15499, loss_D: 0.047703, lr_D: 0.000800
2023-07-01 17:41:48,494 - INFO - [Train] step: 15499, loss_G: 0.299725, lr_G: 0.000800
2023-07-01 17:42:44,883 - INFO - [Train] step: 15599, loss_D: 0.014168, lr_D: 0.000800
2023-07-01 17:42:44,965 - INFO - [Train] step: 15599, loss_G: 0.514779, lr_G: 0.000800
2023-07-01 17:43:41,181 - INFO - [Train] step: 15699, loss_D: 0.021297, lr_D: 0.000800
2023-07-01 17:43:41,263 - INFO - [Train] step: 15699, loss_G: 0.432311, lr_G: 0.000800
2023-07-01 17:44:37,493 - INFO - [Train] step: 15799, loss_D: 0.029429, lr_D: 0.000800
2023-07-01 17:44:37,575 - INFO - [Train] step: 15799, loss_G: 0.407868, lr_G: 0.000800
2023-07-01 17:45:33,799 - INFO - [Train] step: 15899, loss_D: 0.041180, lr_D: 0.000800
2023-07-01 17:45:33,881 - INFO - [Train] step: 15899, loss_G: 0.669948, lr_G: 0.000800
2023-07-01 17:46:30,117 - INFO - [Train] step: 15999, loss_D: 0.055078, lr_D: 0.000800
2023-07-01 17:46:30,199 - INFO - [Train] step: 15999, loss_G: 0.672064, lr_G: 0.000800
2023-07-01 17:47:34,364 - INFO - [Eval] step: 15999, fid: 32.581077
2023-07-01 17:48:30,631 - INFO - [Train] step: 16099, loss_D: 0.014624, lr_D: 0.000800
2023-07-01 17:48:30,712 - INFO - [Train] step: 16099, loss_G: 0.541963, lr_G: 0.000800
2023-07-01 17:49:26,885 - INFO - [Train] step: 16199, loss_D: 0.015709, lr_D: 0.000800
2023-07-01 17:49:26,966 - INFO - [Train] step: 16199, loss_G: 0.470793, lr_G: 0.000800
2023-07-01 17:50:23,135 - INFO - [Train] step: 16299, loss_D: 0.015616, lr_D: 0.000800
2023-07-01 17:50:23,216 - INFO - [Train] step: 16299, loss_G: 0.435368, lr_G: 0.000800
2023-07-01 17:51:19,433 - INFO - [Train] step: 16399, loss_D: 0.016343, lr_D: 0.000800
2023-07-01 17:51:19,514 - INFO - [Train] step: 16399, loss_G: 0.415708, lr_G: 0.000800
2023-07-01 17:52:15,754 - INFO - [Train] step: 16499, loss_D: 0.011294, lr_D: 0.000800
2023-07-01 17:52:15,836 - INFO - [Train] step: 16499, loss_G: 0.482849, lr_G: 0.000800
2023-07-01 17:53:12,051 - INFO - [Train] step: 16599, loss_D: 0.019565, lr_D: 0.000800
2023-07-01 17:53:12,133 - INFO - [Train] step: 16599, loss_G: 0.615770, lr_G: 0.000800
2023-07-01 17:54:08,413 - INFO - [Train] step: 16699, loss_D: 0.014137, lr_D: 0.000800
2023-07-01 17:54:08,495 - INFO - [Train] step: 16699, loss_G: 0.564877, lr_G: 0.000800
2023-07-01 17:55:04,714 - INFO - [Train] step: 16799, loss_D: 0.057961, lr_D: 0.000800
2023-07-01 17:55:04,795 - INFO - [Train] step: 16799, loss_G: 0.453062, lr_G: 0.000800
2023-07-01 17:56:01,182 - INFO - [Train] step: 16899, loss_D: 0.020187, lr_D: 0.000800
2023-07-01 17:56:01,263 - INFO - [Train] step: 16899, loss_G: 0.353994, lr_G: 0.000800
2023-07-01 17:56:57,505 - INFO - [Train] step: 16999, loss_D: 0.012622, lr_D: 0.000800
2023-07-01 17:56:57,587 - INFO - [Train] step: 16999, loss_G: 0.529359, lr_G: 0.000800
2023-07-01 17:58:01,832 - INFO - [Eval] step: 16999, fid: 31.229778
2023-07-01 17:58:58,123 - INFO - [Train] step: 17099, loss_D: 0.030559, lr_D: 0.000800
2023-07-01 17:58:58,204 - INFO - [Train] step: 17099, loss_G: 0.422545, lr_G: 0.000800
2023-07-01 17:59:54,344 - INFO - [Train] step: 17199, loss_D: 0.043773, lr_D: 0.000800
2023-07-01 17:59:54,425 - INFO - [Train] step: 17199, loss_G: 0.540815, lr_G: 0.000800
2023-07-01 18:00:50,692 - INFO - [Train] step: 17299, loss_D: 0.011587, lr_D: 0.000800
2023-07-01 18:00:50,773 - INFO - [Train] step: 17299, loss_G: 0.519015, lr_G: 0.000800
2023-07-01 18:01:47,007 - INFO - [Train] step: 17399, loss_D: 0.015168, lr_D: 0.000800
2023-07-01 18:01:47,089 - INFO - [Train] step: 17399, loss_G: 0.517359, lr_G: 0.000800
2023-07-01 18:02:43,479 - INFO - [Train] step: 17499, loss_D: 0.009712, lr_D: 0.000800
2023-07-01 18:02:43,561 - INFO - [Train] step: 17499, loss_G: 0.571477, lr_G: 0.000800
2023-07-01 18:03:39,809 - INFO - [Train] step: 17599, loss_D: 0.029833, lr_D: 0.000800
2023-07-01 18:03:39,891 - INFO - [Train] step: 17599, loss_G: 0.441681, lr_G: 0.000800
2023-07-01 18:04:36,148 - INFO - [Train] step: 17699, loss_D: 0.023777, lr_D: 0.000800
2023-07-01 18:04:36,229 - INFO - [Train] step: 17699, loss_G: 0.495252, lr_G: 0.000800
2023-07-01 18:05:32,458 - INFO - [Train] step: 17799, loss_D: 0.011950, lr_D: 0.000800
2023-07-01 18:05:32,539 - INFO - [Train] step: 17799, loss_G: 0.433190, lr_G: 0.000800
2023-07-01 18:06:28,769 - INFO - [Train] step: 17899, loss_D: 0.011834, lr_D: 0.000800
2023-07-01 18:06:28,850 - INFO - [Train] step: 17899, loss_G: 0.614211, lr_G: 0.000800
2023-07-01 18:07:25,079 - INFO - [Train] step: 17999, loss_D: 0.011349, lr_D: 0.000800
2023-07-01 18:07:25,161 - INFO - [Train] step: 17999, loss_G: 0.452987, lr_G: 0.000800
2023-07-01 18:08:29,368 - INFO - [Eval] step: 17999, fid: 30.362633
2023-07-01 18:09:25,638 - INFO - [Train] step: 18099, loss_D: 0.009980, lr_D: 0.000800
2023-07-01 18:09:25,719 - INFO - [Train] step: 18099, loss_G: 0.427755, lr_G: 0.000800
2023-07-01 18:10:22,059 - INFO - [Train] step: 18199, loss_D: 0.024319, lr_D: 0.000800
2023-07-01 18:10:22,141 - INFO - [Train] step: 18199, loss_G: 0.401313, lr_G: 0.000800
2023-07-01 18:11:18,394 - INFO - [Train] step: 18299, loss_D: 0.015745, lr_D: 0.000800
2023-07-01 18:11:18,476 - INFO - [Train] step: 18299, loss_G: 0.375731, lr_G: 0.000800
2023-07-01 18:12:14,735 - INFO - [Train] step: 18399, loss_D: 0.011599, lr_D: 0.000800
2023-07-01 18:12:14,816 - INFO - [Train] step: 18399, loss_G: 0.519474, lr_G: 0.000800
2023-07-01 18:13:11,096 - INFO - [Train] step: 18499, loss_D: 0.021131, lr_D: 0.000800
2023-07-01 18:13:11,177 - INFO - [Train] step: 18499, loss_G: 0.422406, lr_G: 0.000800
2023-07-01 18:14:07,434 - INFO - [Train] step: 18599, loss_D: 0.034770, lr_D: 0.000800
2023-07-01 18:14:07,516 - INFO - [Train] step: 18599, loss_G: 0.503398, lr_G: 0.000800
2023-07-01 18:15:03,752 - INFO - [Train] step: 18699, loss_D: 0.010583, lr_D: 0.000800
2023-07-01 18:15:03,833 - INFO - [Train] step: 18699, loss_G: 0.469879, lr_G: 0.000800
2023-07-01 18:16:00,236 - INFO - [Train] step: 18799, loss_D: 0.011668, lr_D: 0.000800
2023-07-01 18:16:00,318 - INFO - [Train] step: 18799, loss_G: 0.603311, lr_G: 0.000800
2023-07-01 18:16:56,560 - INFO - [Train] step: 18899, loss_D: 0.030451, lr_D: 0.000800
2023-07-01 18:16:56,641 - INFO - [Train] step: 18899, loss_G: 0.477881, lr_G: 0.000800
2023-07-01 18:17:52,894 - INFO - [Train] step: 18999, loss_D: 0.037809, lr_D: 0.000800
2023-07-01 18:17:52,975 - INFO - [Train] step: 18999, loss_G: 0.316230, lr_G: 0.000800
2023-07-01 18:18:57,240 - INFO - [Eval] step: 18999, fid: 29.796473
2023-07-01 18:19:53,528 - INFO - [Train] step: 19099, loss_D: 0.007894, lr_D: 0.000800
2023-07-01 18:19:53,609 - INFO - [Train] step: 19099, loss_G: 0.468824, lr_G: 0.000800
2023-07-01 18:20:49,872 - INFO - [Train] step: 19199, loss_D: 0.046049, lr_D: 0.000800
2023-07-01 18:20:49,953 - INFO - [Train] step: 19199, loss_G: 0.385633, lr_G: 0.000800
2023-07-01 18:21:46,205 - INFO - [Train] step: 19299, loss_D: 0.014116, lr_D: 0.000800
2023-07-01 18:21:46,286 - INFO - [Train] step: 19299, loss_G: 0.507936, lr_G: 0.000800
2023-07-01 18:22:42,578 - INFO - [Train] step: 19399, loss_D: 0.014901, lr_D: 0.000800
2023-07-01 18:22:42,660 - INFO - [Train] step: 19399, loss_G: 0.515274, lr_G: 0.000800
2023-07-01 18:23:39,066 - INFO - [Train] step: 19499, loss_D: 0.021365, lr_D: 0.000800
2023-07-01 18:23:39,147 - INFO - [Train] step: 19499, loss_G: 0.399002, lr_G: 0.000800
2023-07-01 18:24:35,373 - INFO - [Train] step: 19599, loss_D: 0.011417, lr_D: 0.000800
2023-07-01 18:24:35,455 - INFO - [Train] step: 19599, loss_G: 0.471099, lr_G: 0.000800
2023-07-01 18:25:31,709 - INFO - [Train] step: 19699, loss_D: 0.007876, lr_D: 0.000800
2023-07-01 18:25:31,791 - INFO - [Train] step: 19699, loss_G: 0.545653,
lr_G: 0.000800 2023-07-01 18:26:28,052 - INFO - [Train] step: 19799, loss_D: 0.009936, lr_D: 0.000800 2023-07-01 18:26:28,133 - INFO - [Train] step: 19799, loss_G: 0.574504, lr_G: 0.000800 2023-07-01 18:27:24,355 - INFO - [Train] step: 19899, loss_D: 0.013347, lr_D: 0.000800 2023-07-01 18:27:24,436 - INFO - [Train] step: 19899, loss_G: 0.600867, lr_G: 0.000800 2023-07-01 18:28:20,702 - INFO - [Train] step: 19999, loss_D: 0.025694, lr_D: 0.000800 2023-07-01 18:28:20,783 - INFO - [Train] step: 19999, loss_G: 0.346467, lr_G: 0.000800 2023-07-01 18:29:24,970 - INFO - [Eval] step: 19999, fid: 30.423091 2023-07-01 18:30:21,286 - INFO - [Train] step: 20099, loss_D: 0.022139, lr_D: 0.000800 2023-07-01 18:30:21,367 - INFO - [Train] step: 20099, loss_G: 0.393186, lr_G: 0.000800 2023-07-01 18:31:17,554 - INFO - [Train] step: 20199, loss_D: 0.012953, lr_D: 0.000800 2023-07-01 18:31:17,636 - INFO - [Train] step: 20199, loss_G: 0.466040, lr_G: 0.000800 2023-07-01 18:32:13,856 - INFO - [Train] step: 20299, loss_D: 0.006963, lr_D: 0.000800 2023-07-01 18:32:13,938 - INFO - [Train] step: 20299, loss_G: 0.542140, lr_G: 0.000800 2023-07-01 18:33:10,183 - INFO - [Train] step: 20399, loss_D: 0.056166, lr_D: 0.000800 2023-07-01 18:33:10,264 - INFO - [Train] step: 20399, loss_G: 0.334981, lr_G: 0.000800 2023-07-01 18:34:06,496 - INFO - [Train] step: 20499, loss_D: 0.009354, lr_D: 0.000800 2023-07-01 18:34:06,578 - INFO - [Train] step: 20499, loss_G: 0.494651, lr_G: 0.000800 2023-07-01 18:35:02,841 - INFO - [Train] step: 20599, loss_D: 0.024258, lr_D: 0.000800 2023-07-01 18:35:02,922 - INFO - [Train] step: 20599, loss_G: 0.423513, lr_G: 0.000800 2023-07-01 18:35:59,343 - INFO - [Train] step: 20699, loss_D: 0.009255, lr_D: 0.000800 2023-07-01 18:35:59,425 - INFO - [Train] step: 20699, loss_G: 0.485088, lr_G: 0.000800 2023-07-01 18:36:55,665 - INFO - [Train] step: 20799, loss_D: 0.025261, lr_D: 0.000800 2023-07-01 18:36:55,746 - INFO - [Train] step: 20799, loss_G: 0.543806, lr_G: 0.000800 
2023-07-01 18:37:51,995 - INFO - [Train] step: 20899, loss_D: 0.007658, lr_D: 0.000800 2023-07-01 18:37:52,076 - INFO - [Train] step: 20899, loss_G: 0.520707, lr_G: 0.000800 2023-07-01 18:38:48,335 - INFO - [Train] step: 20999, loss_D: 0.153177, lr_D: 0.000800 2023-07-01 18:38:48,417 - INFO - [Train] step: 20999, loss_G: 0.215366, lr_G: 0.000800 2023-07-01 18:39:52,815 - INFO - [Eval] step: 20999, fid: 29.908982 2023-07-01 18:40:48,807 - INFO - [Train] step: 21099, loss_D: 0.007064, lr_D: 0.000800 2023-07-01 18:40:48,889 - INFO - [Train] step: 21099, loss_G: 0.555538, lr_G: 0.000800 2023-07-01 18:41:45,083 - INFO - [Train] step: 21199, loss_D: 0.011253, lr_D: 0.000800 2023-07-01 18:41:45,164 - INFO - [Train] step: 21199, loss_G: 0.575167, lr_G: 0.000800 2023-07-01 18:42:41,395 - INFO - [Train] step: 21299, loss_D: 0.008744, lr_D: 0.000800 2023-07-01 18:42:41,476 - INFO - [Train] step: 21299, loss_G: 0.515165, lr_G: 0.000800 2023-07-01 18:43:37,874 - INFO - [Train] step: 21399, loss_D: 0.008760, lr_D: 0.000800 2023-07-01 18:43:37,955 - INFO - [Train] step: 21399, loss_G: 0.422557, lr_G: 0.000800 2023-07-01 18:44:34,223 - INFO - [Train] step: 21499, loss_D: 0.007670, lr_D: 0.000800 2023-07-01 18:44:34,305 - INFO - [Train] step: 21499, loss_G: 0.504641, lr_G: 0.000800 2023-07-01 18:45:30,533 - INFO - [Train] step: 21599, loss_D: 0.009109, lr_D: 0.000800 2023-07-01 18:45:30,615 - INFO - [Train] step: 21599, loss_G: 0.550982, lr_G: 0.000800 2023-07-01 18:46:26,841 - INFO - [Train] step: 21699, loss_D: 0.014999, lr_D: 0.000800 2023-07-01 18:46:26,923 - INFO - [Train] step: 21699, loss_G: 0.476287, lr_G: 0.000800 2023-07-01 18:47:23,172 - INFO - [Train] step: 21799, loss_D: 0.011643, lr_D: 0.000800 2023-07-01 18:47:23,254 - INFO - [Train] step: 21799, loss_G: 0.562363, lr_G: 0.000800 2023-07-01 18:48:19,471 - INFO - [Train] step: 21899, loss_D: 0.007825, lr_D: 0.000800 2023-07-01 18:48:19,553 - INFO - [Train] step: 21899, loss_G: 0.461671, lr_G: 0.000800 2023-07-01 
18:49:15,956 - INFO - [Train] step: 21999, loss_D: 0.008241, lr_D: 0.000800 2023-07-01 18:49:16,038 - INFO - [Train] step: 21999, loss_G: 0.489334, lr_G: 0.000800 2023-07-01 18:50:20,209 - INFO - [Eval] step: 21999, fid: 29.695085 2023-07-01 18:51:16,513 - INFO - [Train] step: 22099, loss_D: 0.027289, lr_D: 0.000800 2023-07-01 18:51:16,595 - INFO - [Train] step: 22099, loss_G: 0.421344, lr_G: 0.000800 2023-07-01 18:52:12,821 - INFO - [Train] step: 22199, loss_D: 0.011550, lr_D: 0.000800 2023-07-01 18:52:12,903 - INFO - [Train] step: 22199, loss_G: 0.429288, lr_G: 0.000800 2023-07-01 18:53:09,153 - INFO - [Train] step: 22299, loss_D: 0.024233, lr_D: 0.000800 2023-07-01 18:53:09,235 - INFO - [Train] step: 22299, loss_G: 0.286523, lr_G: 0.000800 2023-07-01 18:54:05,488 - INFO - [Train] step: 22399, loss_D: 0.034346, lr_D: 0.000800 2023-07-01 18:54:05,569 - INFO - [Train] step: 22399, loss_G: 0.445654, lr_G: 0.000800 2023-07-01 18:55:01,793 - INFO - [Train] step: 22499, loss_D: 0.012056, lr_D: 0.000800 2023-07-01 18:55:01,875 - INFO - [Train] step: 22499, loss_G: 0.430716, lr_G: 0.000800 2023-07-01 18:55:58,117 - INFO - [Train] step: 22599, loss_D: 0.017314, lr_D: 0.000800 2023-07-01 18:55:58,197 - INFO - [Train] step: 22599, loss_G: 0.401118, lr_G: 0.000800 2023-07-01 18:56:54,573 - INFO - [Train] step: 22699, loss_D: 0.006963, lr_D: 0.000800 2023-07-01 18:56:54,655 - INFO - [Train] step: 22699, loss_G: 0.511922, lr_G: 0.000800 2023-07-01 18:57:50,896 - INFO - [Train] step: 22799, loss_D: 0.007882, lr_D: 0.000800 2023-07-01 18:57:50,978 - INFO - [Train] step: 22799, loss_G: 0.539776, lr_G: 0.000800 2023-07-01 18:58:47,253 - INFO - [Train] step: 22899, loss_D: 0.007652, lr_D: 0.000800 2023-07-01 18:58:47,334 - INFO - [Train] step: 22899, loss_G: 0.553684, lr_G: 0.000800 2023-07-01 18:59:43,570 - INFO - [Train] step: 22999, loss_D: 0.011942, lr_D: 0.000800 2023-07-01 18:59:43,651 - INFO - [Train] step: 22999, loss_G: 0.381835, lr_G: 0.000800 2023-07-01 19:00:47,913 - 
INFO - [Eval] step: 22999, fid: 29.447679 2023-07-01 19:01:44,203 - INFO - [Train] step: 23099, loss_D: 0.008218, lr_D: 0.000800 2023-07-01 19:01:44,285 - INFO - [Train] step: 23099, loss_G: 0.537277, lr_G: 0.000800 2023-07-01 19:02:40,471 - INFO - [Train] step: 23199, loss_D: 0.012256, lr_D: 0.000800 2023-07-01 19:02:40,552 - INFO - [Train] step: 23199, loss_G: 0.600389, lr_G: 0.000800 2023-07-01 19:03:36,958 - INFO - [Train] step: 23299, loss_D: 0.027411, lr_D: 0.000800 2023-07-01 19:03:37,040 - INFO - [Train] step: 23299, loss_G: 0.384273, lr_G: 0.000800 2023-07-01 19:04:33,265 - INFO - [Train] step: 23399, loss_D: 0.008528, lr_D: 0.000800 2023-07-01 19:04:33,347 - INFO - [Train] step: 23399, loss_G: 0.534337, lr_G: 0.000800 2023-07-01 19:05:29,582 - INFO - [Train] step: 23499, loss_D: 0.014829, lr_D: 0.000800 2023-07-01 19:05:29,664 - INFO - [Train] step: 23499, loss_G: 0.630182, lr_G: 0.000800 2023-07-01 19:06:25,897 - INFO - [Train] step: 23599, loss_D: 0.008179, lr_D: 0.000800 2023-07-01 19:06:25,978 - INFO - [Train] step: 23599, loss_G: 0.508672, lr_G: 0.000800 2023-07-01 19:07:22,206 - INFO - [Train] step: 23699, loss_D: 0.006598, lr_D: 0.000800 2023-07-01 19:07:22,288 - INFO - [Train] step: 23699, loss_G: 0.390657, lr_G: 0.000800 2023-07-01 19:08:18,522 - INFO - [Train] step: 23799, loss_D: 0.006603, lr_D: 0.000800 2023-07-01 19:08:18,604 - INFO - [Train] step: 23799, loss_G: 0.485955, lr_G: 0.000800 2023-07-01 19:09:14,834 - INFO - [Train] step: 23899, loss_D: 0.021518, lr_D: 0.000800 2023-07-01 19:09:14,915 - INFO - [Train] step: 23899, loss_G: 0.436500, lr_G: 0.000800 2023-07-01 19:10:11,323 - INFO - [Train] step: 23999, loss_D: 0.021096, lr_D: 0.000800 2023-07-01 19:10:11,405 - INFO - [Train] step: 23999, loss_G: 0.515012, lr_G: 0.000800 2023-07-01 19:11:15,668 - INFO - [Eval] step: 23999, fid: 29.460222 2023-07-01 19:12:11,728 - INFO - [Train] step: 24099, loss_D: 0.011518, lr_D: 0.000800 2023-07-01 19:12:11,810 - INFO - [Train] step: 24099, loss_G: 
0.429125, lr_G: 0.000800 2023-07-01 19:13:08,011 - INFO - [Train] step: 24199, loss_D: 0.005955, lr_D: 0.000800 2023-07-01 19:13:08,093 - INFO - [Train] step: 24199, loss_G: 0.494475, lr_G: 0.000800 2023-07-01 19:14:04,308 - INFO - [Train] step: 24299, loss_D: 0.019255, lr_D: 0.000800 2023-07-01 19:14:04,390 - INFO - [Train] step: 24299, loss_G: 0.434792, lr_G: 0.000800 2023-07-01 19:15:00,634 - INFO - [Train] step: 24399, loss_D: 0.017019, lr_D: 0.000800 2023-07-01 19:15:00,715 - INFO - [Train] step: 24399, loss_G: 0.598647, lr_G: 0.000800 2023-07-01 19:15:56,962 - INFO - [Train] step: 24499, loss_D: 0.011056, lr_D: 0.000800 2023-07-01 19:15:57,044 - INFO - [Train] step: 24499, loss_G: 0.476328, lr_G: 0.000800 2023-07-01 19:16:53,412 - INFO - [Train] step: 24599, loss_D: 0.007062, lr_D: 0.000800 2023-07-01 19:16:53,493 - INFO - [Train] step: 24599, loss_G: 0.562907, lr_G: 0.000800 2023-07-01 19:17:49,703 - INFO - [Train] step: 24699, loss_D: 0.007729, lr_D: 0.000800 2023-07-01 19:17:49,785 - INFO - [Train] step: 24699, loss_G: 0.429338, lr_G: 0.000800 2023-07-01 19:18:45,995 - INFO - [Train] step: 24799, loss_D: 0.006395, lr_D: 0.000800 2023-07-01 19:18:46,076 - INFO - [Train] step: 24799, loss_G: 0.507643, lr_G: 0.000800 2023-07-01 19:19:42,271 - INFO - [Train] step: 24899, loss_D: 0.007473, lr_D: 0.000800 2023-07-01 19:19:42,352 - INFO - [Train] step: 24899, loss_G: 0.470858, lr_G: 0.000800 2023-07-01 19:20:38,582 - INFO - [Train] step: 24999, loss_D: 0.026236, lr_D: 0.000800 2023-07-01 19:20:38,664 - INFO - [Train] step: 24999, loss_G: 0.478788, lr_G: 0.000800 2023-07-01 19:21:42,893 - INFO - [Eval] step: 24999, fid: 29.629865 2023-07-01 19:22:38,940 - INFO - [Train] step: 25099, loss_D: 0.013720, lr_D: 0.000800 2023-07-01 19:22:39,022 - INFO - [Train] step: 25099, loss_G: 0.491596, lr_G: 0.000800 2023-07-01 19:23:35,210 - INFO - [Train] step: 25199, loss_D: 0.007071, lr_D: 0.000800 2023-07-01 19:23:35,291 - INFO - [Train] step: 25199, loss_G: 0.494100, lr_G: 
0.000800 2023-07-01 19:24:31,672 - INFO - [Train] step: 25299, loss_D: 0.015746, lr_D: 0.000800 2023-07-01 19:24:31,753 - INFO - [Train] step: 25299, loss_G: 0.441014, lr_G: 0.000800 2023-07-01 19:25:27,969 - INFO - [Train] step: 25399, loss_D: 0.007127, lr_D: 0.000800 2023-07-01 19:25:28,050 - INFO - [Train] step: 25399, loss_G: 0.528152, lr_G: 0.000800 2023-07-01 19:26:24,257 - INFO - [Train] step: 25499, loss_D: 0.010578, lr_D: 0.000800 2023-07-01 19:26:24,339 - INFO - [Train] step: 25499, loss_G: 0.574309, lr_G: 0.000800 2023-07-01 19:27:20,552 - INFO - [Train] step: 25599, loss_D: 0.009386, lr_D: 0.000800 2023-07-01 19:27:20,633 - INFO - [Train] step: 25599, loss_G: 0.428291, lr_G: 0.000800 2023-07-01 19:28:16,858 - INFO - [Train] step: 25699, loss_D: 0.013623, lr_D: 0.000800 2023-07-01 19:28:16,939 - INFO - [Train] step: 25699, loss_G: 0.512218, lr_G: 0.000800 2023-07-01 19:29:13,161 - INFO - [Train] step: 25799, loss_D: 0.009861, lr_D: 0.000800 2023-07-01 19:29:13,242 - INFO - [Train] step: 25799, loss_G: 0.500618, lr_G: 0.000800 2023-07-01 19:30:09,619 - INFO - [Train] step: 25899, loss_D: 0.019092, lr_D: 0.000800 2023-07-01 19:30:09,701 - INFO - [Train] step: 25899, loss_G: 0.402772, lr_G: 0.000800 2023-07-01 19:31:05,920 - INFO - [Train] step: 25999, loss_D: 0.005325, lr_D: 0.000800 2023-07-01 19:31:06,002 - INFO - [Train] step: 25999, loss_G: 0.579325, lr_G: 0.000800 2023-07-01 19:32:10,282 - INFO - [Eval] step: 25999, fid: 29.136528 2023-07-01 19:33:06,600 - INFO - [Train] step: 26099, loss_D: 0.011602, lr_D: 0.000800 2023-07-01 19:33:06,681 - INFO - [Train] step: 26099, loss_G: 0.547868, lr_G: 0.000800 2023-07-01 19:34:02,897 - INFO - [Train] step: 26199, loss_D: 0.029338, lr_D: 0.000800 2023-07-01 19:34:02,979 - INFO - [Train] step: 26199, loss_G: 0.247423, lr_G: 0.000800 2023-07-01 19:34:59,195 - INFO - [Train] step: 26299, loss_D: 0.012751, lr_D: 0.000800 2023-07-01 19:34:59,276 - INFO - [Train] step: 26299, loss_G: 0.473507, lr_G: 0.000800 
2023-07-01 19:35:55,494 - INFO - [Train] step: 26399, loss_D: 0.007472, lr_D: 0.000800 2023-07-01 19:35:55,576 - INFO - [Train] step: 26399, loss_G: 0.391524, lr_G: 0.000800 2023-07-01 19:36:51,770 - INFO - [Train] step: 26499, loss_D: 0.007753, lr_D: 0.000800 2023-07-01 19:36:51,851 - INFO - [Train] step: 26499, loss_G: 0.507710, lr_G: 0.000800 2023-07-01 19:37:48,217 - INFO - [Train] step: 26599, loss_D: 0.008571, lr_D: 0.000800 2023-07-01 19:37:48,299 - INFO - [Train] step: 26599, loss_G: 0.578436, lr_G: 0.000800 2023-07-01 19:38:44,497 - INFO - [Train] step: 26699, loss_D: 0.011326, lr_D: 0.000800 2023-07-01 19:38:44,579 - INFO - [Train] step: 26699, loss_G: 0.419997, lr_G: 0.000800 2023-07-01 19:39:40,778 - INFO - [Train] step: 26799, loss_D: 0.017305, lr_D: 0.000800 2023-07-01 19:39:40,860 - INFO - [Train] step: 26799, loss_G: 0.346003, lr_G: 0.000800 2023-07-01 19:40:37,063 - INFO - [Train] step: 26899, loss_D: 0.008035, lr_D: 0.000800 2023-07-01 19:40:37,144 - INFO - [Train] step: 26899, loss_G: 0.499777, lr_G: 0.000800 2023-07-01 19:41:33,336 - INFO - [Train] step: 26999, loss_D: 0.011537, lr_D: 0.000800 2023-07-01 19:41:33,417 - INFO - [Train] step: 26999, loss_G: 0.458730, lr_G: 0.000800 2023-07-01 19:42:37,585 - INFO - [Eval] step: 26999, fid: 28.942655 2023-07-01 19:43:33,919 - INFO - [Train] step: 27099, loss_D: 0.006187, lr_D: 0.000800 2023-07-01 19:43:34,000 - INFO - [Train] step: 27099, loss_G: 0.490163, lr_G: 0.000800 2023-07-01 19:44:30,366 - INFO - [Train] step: 27199, loss_D: 0.026210, lr_D: 0.000800 2023-07-01 19:44:30,447 - INFO - [Train] step: 27199, loss_G: 0.390455, lr_G: 0.000800 2023-07-01 19:45:26,665 - INFO - [Train] step: 27299, loss_D: 0.011224, lr_D: 0.000800 2023-07-01 19:45:26,746 - INFO - [Train] step: 27299, loss_G: 0.457716, lr_G: 0.000800 2023-07-01 19:46:22,972 - INFO - [Train] step: 27399, loss_D: 0.006952, lr_D: 0.000800 2023-07-01 19:46:23,053 - INFO - [Train] step: 27399, loss_G: 0.552276, lr_G: 0.000800 2023-07-01 
19:47:19,258 - INFO - [Train] step: 27499, loss_D: 0.014164, lr_D: 0.000800 2023-07-01 19:47:19,339 - INFO - [Train] step: 27499, loss_G: 0.496477, lr_G: 0.000800 2023-07-01 19:48:15,563 - INFO - [Train] step: 27599, loss_D: 0.015689, lr_D: 0.000800 2023-07-01 19:48:15,644 - INFO - [Train] step: 27599, loss_G: 0.387336, lr_G: 0.000800 2023-07-01 19:49:11,838 - INFO - [Train] step: 27699, loss_D: 0.011430, lr_D: 0.000800 2023-07-01 19:49:11,919 - INFO - [Train] step: 27699, loss_G: 0.475477, lr_G: 0.000800 2023-07-01 19:50:08,134 - INFO - [Train] step: 27799, loss_D: 0.008779, lr_D: 0.000800 2023-07-01 19:50:08,216 - INFO - [Train] step: 27799, loss_G: 0.550584, lr_G: 0.000800 2023-07-01 19:51:04,583 - INFO - [Train] step: 27899, loss_D: 0.007873, lr_D: 0.000800 2023-07-01 19:51:04,664 - INFO - [Train] step: 27899, loss_G: 0.609806, lr_G: 0.000800 2023-07-01 19:52:00,886 - INFO - [Train] step: 27999, loss_D: 0.006971, lr_D: 0.000800 2023-07-01 19:52:00,967 - INFO - [Train] step: 27999, loss_G: 0.538687, lr_G: 0.000800 2023-07-01 19:53:05,171 - INFO - [Eval] step: 27999, fid: 29.974261 2023-07-01 19:54:01,186 - INFO - [Train] step: 28099, loss_D: 0.008715, lr_D: 0.000800 2023-07-01 19:54:01,267 - INFO - [Train] step: 28099, loss_G: 0.536489, lr_G: 0.000800 2023-07-01 19:54:57,458 - INFO - [Train] step: 28199, loss_D: 0.008299, lr_D: 0.000800 2023-07-01 19:54:57,540 - INFO - [Train] step: 28199, loss_G: 0.577131, lr_G: 0.000800 2023-07-01 19:55:53,751 - INFO - [Train] step: 28299, loss_D: 0.017663, lr_D: 0.000800 2023-07-01 19:55:53,833 - INFO - [Train] step: 28299, loss_G: 0.365824, lr_G: 0.000800 2023-07-01 19:56:50,058 - INFO - [Train] step: 28399, loss_D: 0.006118, lr_D: 0.000800 2023-07-01 19:56:50,138 - INFO - [Train] step: 28399, loss_G: 0.454250, lr_G: 0.000800 2023-07-01 19:57:46,505 - INFO - [Train] step: 28499, loss_D: 0.094912, lr_D: 0.000800 2023-07-01 19:57:46,586 - INFO - [Train] step: 28499, loss_G: 0.314954, lr_G: 0.000800 2023-07-01 19:58:42,788 - 
INFO - [Train] step: 28599, loss_D: 0.007507, lr_D: 0.000800 2023-07-01 19:58:42,870 - INFO - [Train] step: 28599, loss_G: 0.507665, lr_G: 0.000800 2023-07-01 19:59:39,065 - INFO - [Train] step: 28699, loss_D: 0.009753, lr_D: 0.000800 2023-07-01 19:59:39,146 - INFO - [Train] step: 28699, loss_G: 0.405413, lr_G: 0.000800 2023-07-01 20:00:35,349 - INFO - [Train] step: 28799, loss_D: 0.013898, lr_D: 0.000800 2023-07-01 20:00:35,430 - INFO - [Train] step: 28799, loss_G: 0.588386, lr_G: 0.000800 2023-07-01 20:01:31,640 - INFO - [Train] step: 28899, loss_D: 0.004988, lr_D: 0.000800 2023-07-01 20:01:31,721 - INFO - [Train] step: 28899, loss_G: 0.530986, lr_G: 0.000800 2023-07-01 20:02:27,912 - INFO - [Train] step: 28999, loss_D: 0.005864, lr_D: 0.000800 2023-07-01 20:02:27,993 - INFO - [Train] step: 28999, loss_G: 0.531886, lr_G: 0.000800 2023-07-01 20:03:32,247 - INFO - [Eval] step: 28999, fid: 29.637611 2023-07-01 20:04:28,335 - INFO - [Train] step: 29099, loss_D: 0.006993, lr_D: 0.000800 2023-07-01 20:04:28,416 - INFO - [Train] step: 29099, loss_G: 0.542859, lr_G: 0.000800 2023-07-01 20:05:24,753 - INFO - [Train] step: 29199, loss_D: 0.008590, lr_D: 0.000800 2023-07-01 20:05:24,835 - INFO - [Train] step: 29199, loss_G: 0.436323, lr_G: 0.000800 2023-07-01 20:06:21,039 - INFO - [Train] step: 29299, loss_D: 0.012584, lr_D: 0.000800 2023-07-01 20:06:21,121 - INFO - [Train] step: 29299, loss_G: 0.512135, lr_G: 0.000800 2023-07-01 20:07:17,337 - INFO - [Train] step: 29399, loss_D: 0.015635, lr_D: 0.000800 2023-07-01 20:07:17,419 - INFO - [Train] step: 29399, loss_G: 0.409561, lr_G: 0.000800 2023-07-01 20:08:13,631 - INFO - [Train] step: 29499, loss_D: 0.007856, lr_D: 0.000800 2023-07-01 20:08:13,713 - INFO - [Train] step: 29499, loss_G: 0.509219, lr_G: 0.000800 2023-07-01 20:09:09,941 - INFO - [Train] step: 29599, loss_D: 0.006401, lr_D: 0.000800 2023-07-01 20:09:10,022 - INFO - [Train] step: 29599, loss_G: 0.540470, lr_G: 0.000800 2023-07-01 20:10:06,214 - INFO - [Train] 
step: 29699, loss_D: 0.006845, lr_D: 0.000800 2023-07-01 20:10:06,295 - INFO - [Train] step: 29699, loss_G: 0.538688, lr_G: 0.000800 2023-07-01 20:11:02,640 - INFO - [Train] step: 29799, loss_D: 0.008562, lr_D: 0.000800 2023-07-01 20:11:02,722 - INFO - [Train] step: 29799, loss_G: 0.471855, lr_G: 0.000800 2023-07-01 20:11:58,939 - INFO - [Train] step: 29899, loss_D: 0.010060, lr_D: 0.000800 2023-07-01 20:11:59,020 - INFO - [Train] step: 29899, loss_G: 0.419619, lr_G: 0.000800 2023-07-01 20:12:55,241 - INFO - [Train] step: 29999, loss_D: 0.009867, lr_D: 0.000800 2023-07-01 20:12:55,322 - INFO - [Train] step: 29999, loss_G: 0.467563, lr_G: 0.000800 2023-07-01 20:13:59,545 - INFO - [Eval] step: 29999, fid: 28.438987 2023-07-01 20:14:56,071 - INFO - [Train] step: 30099, loss_D: 0.006585, lr_D: 0.000800 2023-07-01 20:14:56,153 - INFO - [Train] step: 30099, loss_G: 0.525810, lr_G: 0.000800 2023-07-01 20:15:52,365 - INFO - [Train] step: 30199, loss_D: 0.006839, lr_D: 0.000800 2023-07-01 20:15:52,446 - INFO - [Train] step: 30199, loss_G: 0.585401, lr_G: 0.000800 2023-07-01 20:16:48,648 - INFO - [Train] step: 30299, loss_D: 0.005126, lr_D: 0.000800 2023-07-01 20:16:48,729 - INFO - [Train] step: 30299, loss_G: 0.509553, lr_G: 0.000800 2023-07-01 20:17:45,091 - INFO - [Train] step: 30399, loss_D: 0.015950, lr_D: 0.000800 2023-07-01 20:17:45,172 - INFO - [Train] step: 30399, loss_G: 0.473224, lr_G: 0.000800 2023-07-01 20:18:41,373 - INFO - [Train] step: 30499, loss_D: 0.014676, lr_D: 0.000800 2023-07-01 20:18:41,455 - INFO - [Train] step: 30499, loss_G: 0.588311, lr_G: 0.000800 2023-07-01 20:19:37,673 - INFO - [Train] step: 30599, loss_D: 0.006327, lr_D: 0.000800 2023-07-01 20:19:37,754 - INFO - [Train] step: 30599, loss_G: 0.516392, lr_G: 0.000800 2023-07-01 20:20:33,965 - INFO - [Train] step: 30699, loss_D: 0.013700, lr_D: 0.000800 2023-07-01 20:20:34,046 - INFO - [Train] step: 30699, loss_G: 0.444895, lr_G: 0.000800 2023-07-01 20:21:30,232 - INFO - [Train] step: 30799, 
loss_D: 0.021089, lr_D: 0.000800 2023-07-01 20:21:30,313 - INFO - [Train] step: 30799, loss_G: 0.410392, lr_G: 0.000800 2023-07-01 20:22:26,513 - INFO - [Train] step: 30899, loss_D: 0.005164, lr_D: 0.000800 2023-07-01 20:22:26,595 - INFO - [Train] step: 30899, loss_G: 0.486042, lr_G: 0.000800 2023-07-01 20:23:22,778 - INFO - [Train] step: 30999, loss_D: 0.006826, lr_D: 0.000800 2023-07-01 20:23:22,859 - INFO - [Train] step: 30999, loss_G: 0.473277, lr_G: 0.000800 2023-07-01 20:24:27,079 - INFO - [Eval] step: 30999, fid: 29.204131 2023-07-01 20:25:23,280 - INFO - [Train] step: 31099, loss_D: 0.005944, lr_D: 0.000800 2023-07-01 20:25:23,362 - INFO - [Train] step: 31099, loss_G: 0.498785, lr_G: 0.000800 2023-07-01 20:26:19,543 - INFO - [Train] step: 31199, loss_D: 0.006456, lr_D: 0.000800 2023-07-01 20:26:19,625 - INFO - [Train] step: 31199, loss_G: 0.518566, lr_G: 0.000800 2023-07-01 20:27:15,821 - INFO - [Train] step: 31299, loss_D: 0.009163, lr_D: 0.000800 2023-07-01 20:27:15,903 - INFO - [Train] step: 31299, loss_G: 0.493884, lr_G: 0.000800 2023-07-01 20:28:12,117 - INFO - [Train] step: 31399, loss_D: 0.005674, lr_D: 0.000800 2023-07-01 20:28:12,198 - INFO - [Train] step: 31399, loss_G: 0.486798, lr_G: 0.000800 2023-07-01 20:29:08,439 - INFO - [Train] step: 31499, loss_D: 0.016643, lr_D: 0.000800 2023-07-01 20:29:08,520 - INFO - [Train] step: 31499, loss_G: 0.306968, lr_G: 0.000800 2023-07-01 20:30:04,710 - INFO - [Train] step: 31599, loss_D: 0.005448, lr_D: 0.000800 2023-07-01 20:30:04,792 - INFO - [Train] step: 31599, loss_G: 0.532557, lr_G: 0.000800 2023-07-01 20:31:01,162 - INFO - [Train] step: 31699, loss_D: 0.008905, lr_D: 0.000800 2023-07-01 20:31:01,244 - INFO - [Train] step: 31699, loss_G: 0.529348, lr_G: 0.000800 2023-07-01 20:31:57,439 - INFO - [Train] step: 31799, loss_D: 0.008976, lr_D: 0.000800 2023-07-01 20:31:57,520 - INFO - [Train] step: 31799, loss_G: 0.471150, lr_G: 0.000800 2023-07-01 20:32:53,726 - INFO - [Train] step: 31899, loss_D: 0.005551, 
lr_D: 0.000800 2023-07-01 20:32:53,808 - INFO - [Train] step: 31899, loss_G: 0.525096, lr_G: 0.000800 2023-07-01 20:33:49,986 - INFO - [Train] step: 31999, loss_D: 0.021764, lr_D: 0.000800 2023-07-01 20:33:50,068 - INFO - [Train] step: 31999, loss_G: 0.399989, lr_G: 0.000800 2023-07-01 20:34:54,299 - INFO - [Eval] step: 31999, fid: 28.903355 2023-07-01 20:35:50,358 - INFO - [Train] step: 32099, loss_D: 0.007599, lr_D: 0.000800 2023-07-01 20:35:50,440 - INFO - [Train] step: 32099, loss_G: 0.505525, lr_G: 0.000800 2023-07-01 20:36:46,632 - INFO - [Train] step: 32199, loss_D: 0.005625, lr_D: 0.000800 2023-07-01 20:36:46,713 - INFO - [Train] step: 32199, loss_G: 0.535835, lr_G: 0.000800 2023-07-01 20:37:42,934 - INFO - [Train] step: 32299, loss_D: 0.014649, lr_D: 0.000800 2023-07-01 20:37:43,015 - INFO - [Train] step: 32299, loss_G: 0.492300, lr_G: 0.000800 2023-07-01 20:38:39,368 - INFO - [Train] step: 32399, loss_D: 0.006051, lr_D: 0.000800 2023-07-01 20:38:39,449 - INFO - [Train] step: 32399, loss_G: 0.485706, lr_G: 0.000800 2023-07-01 20:39:35,661 - INFO - [Train] step: 32499, loss_D: 0.009181, lr_D: 0.000800 2023-07-01 20:39:35,742 - INFO - [Train] step: 32499, loss_G: 0.412587, lr_G: 0.000800 2023-07-01 20:40:31,954 - INFO - [Train] step: 32599, loss_D: 0.008016, lr_D: 0.000800 2023-07-01 20:40:32,036 - INFO - [Train] step: 32599, loss_G: 0.477234, lr_G: 0.000800 2023-07-01 20:41:28,242 - INFO - [Train] step: 32699, loss_D: 0.014378, lr_D: 0.000800 2023-07-01 20:41:28,323 - INFO - [Train] step: 32699, loss_G: 0.445840, lr_G: 0.000800 2023-07-01 20:42:24,555 - INFO - [Train] step: 32799, loss_D: 0.007808, lr_D: 0.000800 2023-07-01 20:42:24,636 - INFO - [Train] step: 32799, loss_G: 0.578918, lr_G: 0.000800 2023-07-01 20:43:20,839 - INFO - [Train] step: 32899, loss_D: 0.009094, lr_D: 0.000800 2023-07-01 20:43:20,921 - INFO - [Train] step: 32899, loss_G: 0.496615, lr_G: 0.000800 2023-07-01 20:44:17,294 - INFO - [Train] step: 32999, loss_D: 0.006956, lr_D: 0.000800 
2023-07-01 20:44:17,376 - INFO - [Train] step: 32999, loss_G: 0.498804, lr_G: 0.000800 2023-07-01 20:45:21,581 - INFO - [Eval] step: 32999, fid: 29.391535 2023-07-01 20:46:17,643 - INFO - [Train] step: 33099, loss_D: 0.007565, lr_D: 0.000800 2023-07-01 20:46:17,724 - INFO - [Train] step: 33099, loss_G: 0.506936, lr_G: 0.000800 2023-07-01 20:47:13,942 - INFO - [Train] step: 33199, loss_D: 0.009538, lr_D: 0.000800 2023-07-01 20:47:14,024 - INFO - [Train] step: 33199, loss_G: 0.497423, lr_G: 0.000800 2023-07-01 20:48:10,246 - INFO - [Train] step: 33299, loss_D: 0.004492, lr_D: 0.000800 2023-07-01 20:48:10,327 - INFO - [Train] step: 33299, loss_G: 0.453382, lr_G: 0.000800 2023-07-01 20:49:06,542 - INFO - [Train] step: 33399, loss_D: 0.018868, lr_D: 0.000800 2023-07-01 20:49:06,623 - INFO - [Train] step: 33399, loss_G: 0.485676, lr_G: 0.000800 2023-07-01 20:50:02,827 - INFO - [Train] step: 33499, loss_D: 0.005363, lr_D: 0.000800 2023-07-01 20:50:02,909 - INFO - [Train] step: 33499, loss_G: 0.544964, lr_G: 0.000800 2023-07-01 20:50:59,097 - INFO - [Train] step: 33599, loss_D: 0.009065, lr_D: 0.000800 2023-07-01 20:50:59,178 - INFO - [Train] step: 33599, loss_G: 0.461471, lr_G: 0.000800 2023-07-01 20:51:55,554 - INFO - [Train] step: 33699, loss_D: 0.012356, lr_D: 0.000800 2023-07-01 20:51:55,636 - INFO - [Train] step: 33699, loss_G: 0.485596, lr_G: 0.000800 2023-07-01 20:52:51,848 - INFO - [Train] step: 33799, loss_D: 0.006972, lr_D: 0.000800 2023-07-01 20:52:51,930 - INFO - [Train] step: 33799, loss_G: 0.453154, lr_G: 0.000800 2023-07-01 20:53:48,142 - INFO - [Train] step: 33899, loss_D: 0.005063, lr_D: 0.000800 2023-07-01 20:53:48,223 - INFO - [Train] step: 33899, loss_G: 0.498531, lr_G: 0.000800 2023-07-01 20:54:44,424 - INFO - [Train] step: 33999, loss_D: 0.011498, lr_D: 0.000800 2023-07-01 20:54:44,506 - INFO - [Train] step: 33999, loss_G: 0.470103, lr_G: 0.000800 2023-07-01 20:55:48,720 - INFO - [Eval] step: 33999, fid: 28.693078 2023-07-01 20:56:44,782 - INFO - 
[Train] step: 34099, loss_D: 0.006173, lr_D: 0.000800 2023-07-01 20:56:44,864 - INFO - [Train] step: 34099, loss_G: 0.462020, lr_G: 0.000800 2023-07-01 20:57:41,075 - INFO - [Train] step: 34199, loss_D: 0.042533, lr_D: 0.000800 2023-07-01 20:57:41,157 - INFO - [Train] step: 34199, loss_G: 0.347429, lr_G: 0.000800 2023-07-01 20:58:37,542 - INFO - [Train] step: 34299, loss_D: 0.008412, lr_D: 0.000800 2023-07-01 20:58:37,623 - INFO - [Train] step: 34299, loss_G: 0.427577, lr_G: 0.000800 2023-07-01 20:59:33,861 - INFO - [Train] step: 34399, loss_D: 0.007444, lr_D: 0.000800 2023-07-01 20:59:33,943 - INFO - [Train] step: 34399, loss_G: 0.491827, lr_G: 0.000800 2023-07-01 21:00:30,162 - INFO - [Train] step: 34499, loss_D: 0.006357, lr_D: 0.000800 2023-07-01 21:00:30,244 - INFO - [Train] step: 34499, loss_G: 0.475162, lr_G: 0.000800 2023-07-01 21:01:26,490 - INFO - [Train] step: 34599, loss_D: 0.005610, lr_D: 0.000800 2023-07-01 21:01:26,571 - INFO - [Train] step: 34599, loss_G: 0.442595, lr_G: 0.000800 2023-07-01 21:02:22,798 - INFO - [Train] step: 34699, loss_D: 0.007770, lr_D: 0.000800 2023-07-01 21:02:22,879 - INFO - [Train] step: 34699, loss_G: 0.499187, lr_G: 0.000800 2023-07-01 21:03:19,093 - INFO - [Train] step: 34799, loss_D: 0.009583, lr_D: 0.000800 2023-07-01 21:03:19,174 - INFO - [Train] step: 34799, loss_G: 0.466947, lr_G: 0.000800 2023-07-01 21:04:15,743 - INFO - [Train] step: 34899, loss_D: 0.018555, lr_D: 0.000800 2023-07-01 21:04:15,824 - INFO - [Train] step: 34899, loss_G: 0.455649, lr_G: 0.000800 2023-07-01 21:05:12,632 - INFO - [Train] step: 34999, loss_D: 0.017899, lr_D: 0.000800 2023-07-01 21:05:12,713 - INFO - [Train] step: 34999, loss_G: 0.392863, lr_G: 0.000800 2023-07-01 21:06:16,973 - INFO - [Eval] step: 34999, fid: 29.209337 2023-07-01 21:07:13,058 - INFO - [Train] step: 35099, loss_D: 0.014280, lr_D: 0.000800 2023-07-01 21:07:13,140 - INFO - [Train] step: 35099, loss_G: 0.413777, lr_G: 0.000800 2023-07-01 21:08:09,362 - INFO - [Train] step: 
35199, loss_D: 0.005024, lr_D: 0.000800
2023-07-01 21:08:09,444 - INFO - [Train] step: 35199, loss_G: 0.485391, lr_G: 0.000800
2023-07-01 21:09:05,671 - INFO - [Train] step: 35299, loss_D: 0.006565, lr_D: 0.000800
2023-07-01 21:09:05,752 - INFO - [Train] step: 35299, loss_G: 0.528535, lr_G: 0.000800
2023-07-01 21:10:01,961 - INFO - [Train] step: 35399, loss_D: 0.004606, lr_D: 0.000800
2023-07-01 21:10:02,042 - INFO - [Train] step: 35399, loss_G: 0.421638, lr_G: 0.000800
2023-07-01 21:10:58,247 - INFO - [Train] step: 35499, loss_D: 0.006037, lr_D: 0.000800
2023-07-01 21:10:58,328 - INFO - [Train] step: 35499, loss_G: 0.519085, lr_G: 0.000800
2023-07-01 21:11:54,683 - INFO - [Train] step: 35599, loss_D: 0.006184, lr_D: 0.000800
2023-07-01 21:11:54,765 - INFO - [Train] step: 35599, loss_G: 0.482145, lr_G: 0.000800
2023-07-01 21:12:50,972 - INFO - [Train] step: 35699, loss_D: 0.009600, lr_D: 0.000800
2023-07-01 21:12:51,054 - INFO - [Train] step: 35699, loss_G: 0.522247, lr_G: 0.000800
2023-07-01 21:13:47,263 - INFO - [Train] step: 35799, loss_D: 0.009911, lr_D: 0.000800
2023-07-01 21:13:47,344 - INFO - [Train] step: 35799, loss_G: 0.536509, lr_G: 0.000800
2023-07-01 21:14:43,542 - INFO - [Train] step: 35899, loss_D: 0.006834, lr_D: 0.000800
2023-07-01 21:14:43,624 - INFO - [Train] step: 35899, loss_G: 0.499934, lr_G: 0.000800
2023-07-01 21:15:39,818 - INFO - [Train] step: 35999, loss_D: 0.007034, lr_D: 0.000800
2023-07-01 21:15:39,900 - INFO - [Train] step: 35999, loss_G: 0.427329, lr_G: 0.000800
2023-07-01 21:16:44,109 - INFO - [Eval] step: 35999, fid: 29.273343
2023-07-01 21:17:40,130 - INFO - [Train] step: 36099, loss_D: 0.011253, lr_D: 0.000800
2023-07-01 21:17:40,212 - INFO - [Train] step: 36099, loss_G: 0.464884, lr_G: 0.000800
2023-07-01 21:18:36,402 - INFO - [Train] step: 36199, loss_D: 0.015299, lr_D: 0.000800
2023-07-01 21:18:36,483 - INFO - [Train] step: 36199, loss_G: 0.444111, lr_G: 0.000800
2023-07-01 21:19:32,855 - INFO - [Train] step: 36299, loss_D: 0.004250, lr_D: 0.000800
2023-07-01 21:19:32,937 - INFO - [Train] step: 36299, loss_G: 0.540237, lr_G: 0.000800
2023-07-01 21:20:29,150 - INFO - [Train] step: 36399, loss_D: 0.004625, lr_D: 0.000800
2023-07-01 21:20:29,231 - INFO - [Train] step: 36399, loss_G: 0.500283, lr_G: 0.000800
2023-07-01 21:21:25,454 - INFO - [Train] step: 36499, loss_D: 0.004593, lr_D: 0.000800
2023-07-01 21:21:25,536 - INFO - [Train] step: 36499, loss_G: 0.512707, lr_G: 0.000800
2023-07-01 21:22:21,738 - INFO - [Train] step: 36599, loss_D: 0.006763, lr_D: 0.000800
2023-07-01 21:22:21,820 - INFO - [Train] step: 36599, loss_G: 0.462912, lr_G: 0.000800
2023-07-01 21:23:18,030 - INFO - [Train] step: 36699, loss_D: 0.006047, lr_D: 0.000800
2023-07-01 21:23:18,111 - INFO - [Train] step: 36699, loss_G: 0.542303, lr_G: 0.000800
2023-07-01 21:24:14,291 - INFO - [Train] step: 36799, loss_D: 0.009533, lr_D: 0.000800
2023-07-01 21:24:14,372 - INFO - [Train] step: 36799, loss_G: 0.454194, lr_G: 0.000800
2023-07-01 21:25:10,748 - INFO - [Train] step: 36899, loss_D: 0.009707, lr_D: 0.000800
2023-07-01 21:25:10,830 - INFO - [Train] step: 36899, loss_G: 0.526882, lr_G: 0.000800
2023-07-01 21:26:07,043 - INFO - [Train] step: 36999, loss_D: 0.006530, lr_D: 0.000800
2023-07-01 21:26:07,124 - INFO - [Train] step: 36999, loss_G: 0.490091, lr_G: 0.000800
2023-07-01 21:27:11,304 - INFO - [Eval] step: 36999, fid: 29.200974
2023-07-01 21:28:07,295 - INFO - [Train] step: 37099, loss_D: 0.010660, lr_D: 0.000800
2023-07-01 21:28:07,377 - INFO - [Train] step: 37099, loss_G: 0.527466, lr_G: 0.000800
2023-07-01 21:29:03,595 - INFO - [Train] step: 37199, loss_D: 0.008211, lr_D: 0.000800
2023-07-01 21:29:03,676 - INFO - [Train] step: 37199, loss_G: 0.557356, lr_G: 0.000800
2023-07-01 21:29:59,909 - INFO - [Train] step: 37299, loss_D: 0.004410, lr_D: 0.000800
2023-07-01 21:29:59,998 - INFO - [Train] step: 37299, loss_G: 0.478788, lr_G: 0.000800
2023-07-01 21:30:56,223 - INFO - [Train] step: 37399, loss_D: 0.021454, lr_D: 0.000800
2023-07-01 21:30:56,305 - INFO - [Train] step: 37399, loss_G: 0.386928, lr_G: 0.000800
2023-07-01 21:31:52,538 - INFO - [Train] step: 37499, loss_D: 0.004321, lr_D: 0.000800
2023-07-01 21:31:52,619 - INFO - [Train] step: 37499, loss_G: 0.470156, lr_G: 0.000800
2023-07-01 21:32:49,001 - INFO - [Train] step: 37599, loss_D: 0.008819, lr_D: 0.000800
2023-07-01 21:32:49,083 - INFO - [Train] step: 37599, loss_G: 0.453485, lr_G: 0.000800
2023-07-01 21:33:45,278 - INFO - [Train] step: 37699, loss_D: 0.062676, lr_D: 0.000800
2023-07-01 21:33:45,360 - INFO - [Train] step: 37699, loss_G: 0.274167, lr_G: 0.000800
2023-07-01 21:34:41,560 - INFO - [Train] step: 37799, loss_D: 0.029416, lr_D: 0.000800
2023-07-01 21:34:41,641 - INFO - [Train] step: 37799, loss_G: 0.397284, lr_G: 0.000800
2023-07-01 21:35:37,876 - INFO - [Train] step: 37899, loss_D: 0.003817, lr_D: 0.000800
2023-07-01 21:35:37,957 - INFO - [Train] step: 37899, loss_G: 0.515849, lr_G: 0.000800
2023-07-01 21:36:34,175 - INFO - [Train] step: 37999, loss_D: 0.023767, lr_D: 0.000800
2023-07-01 21:36:34,257 - INFO - [Train] step: 37999, loss_G: 0.394301, lr_G: 0.000800
2023-07-01 21:37:38,499 - INFO - [Eval] step: 37999, fid: 30.114388
2023-07-01 21:38:34,514 - INFO - [Train] step: 38099, loss_D: 0.004758, lr_D: 0.000800
2023-07-01 21:38:34,595 - INFO - [Train] step: 38099, loss_G: 0.452437, lr_G: 0.000800
2023-07-01 21:39:30,978 - INFO - [Train] step: 38199, loss_D: 0.005971, lr_D: 0.000800
2023-07-01 21:39:31,060 - INFO - [Train] step: 38199, loss_G: 0.523271, lr_G: 0.000800
2023-07-01 21:40:27,269 - INFO - [Train] step: 38299, loss_D: 0.009606, lr_D: 0.000800
2023-07-01 21:40:27,351 - INFO - [Train] step: 38299, loss_G: 0.629234, lr_G: 0.000800
2023-07-01 21:41:23,558 - INFO - [Train] step: 38399, loss_D: 0.017154, lr_D: 0.000800
2023-07-01 21:41:23,639 - INFO - [Train] step: 38399, loss_G: 0.646959, lr_G: 0.000800
2023-07-01 21:42:19,845 - INFO - [Train] step: 38499, loss_D: 0.010298, lr_D: 0.000800
2023-07-01 21:42:19,926 - INFO - [Train] step: 38499, loss_G: 0.485315, lr_G: 0.000800
2023-07-01 21:43:16,136 - INFO - [Train] step: 38599, loss_D: 0.012583, lr_D: 0.000800
2023-07-01 21:43:16,218 - INFO - [Train] step: 38599, loss_G: 0.430766, lr_G: 0.000800
2023-07-01 21:44:12,437 - INFO - [Train] step: 38699, loss_D: 0.006646, lr_D: 0.000800
2023-07-01 21:44:12,518 - INFO - [Train] step: 38699, loss_G: 0.462981, lr_G: 0.000800
2023-07-01 21:45:08,736 - INFO - [Train] step: 38799, loss_D: 0.005173, lr_D: 0.000800
2023-07-01 21:45:08,817 - INFO - [Train] step: 38799, loss_G: 0.547105, lr_G: 0.000800
2023-07-01 21:46:05,160 - INFO - [Train] step: 38899, loss_D: 1.011272, lr_D: 0.000800
2023-07-01 21:46:05,242 - INFO - [Train] step: 38899, loss_G: 0.359154, lr_G: 0.000800
2023-07-01 21:47:01,454 - INFO - [Train] step: 38999, loss_D: 0.010683, lr_D: 0.000800
2023-07-01 21:47:01,536 - INFO - [Train] step: 38999, loss_G: 0.491258, lr_G: 0.000800
2023-07-01 21:48:05,715 - INFO - [Eval] step: 38999, fid: 28.675869
2023-07-01 21:49:01,750 - INFO - [Train] step: 39099, loss_D: 0.009661, lr_D: 0.000800
2023-07-01 21:49:01,832 - INFO - [Train] step: 39099, loss_G: 0.450137, lr_G: 0.000800
2023-07-01 21:49:58,041 - INFO - [Train] step: 39199, loss_D: 0.008570, lr_D: 0.000800
2023-07-01 21:49:58,122 - INFO - [Train] step: 39199, loss_G: 0.497657, lr_G: 0.000800
2023-07-01 21:50:54,331 - INFO - [Train] step: 39299, loss_D: 0.006551, lr_D: 0.000800
2023-07-01 21:50:54,412 - INFO - [Train] step: 39299, loss_G: 0.544580, lr_G: 0.000800
2023-07-01 21:51:50,639 - INFO - [Train] step: 39399, loss_D: 0.006657, lr_D: 0.000800
2023-07-01 21:51:50,720 - INFO - [Train] step: 39399, loss_G: 0.631172, lr_G: 0.000800
2023-07-01 21:52:47,106 - INFO - [Train] step: 39499, loss_D: 0.005097, lr_D: 0.000800
2023-07-01 21:52:47,187 - INFO - [Train] step: 39499, loss_G: 0.485837, lr_G: 0.000800
2023-07-01 21:53:43,409 - INFO - [Train] step: 39599, loss_D: 0.005782, lr_D: 0.000800
2023-07-01 21:53:43,491 - INFO - [Train] step: 39599, loss_G: 0.513047, lr_G: 0.000800
2023-07-01 21:54:39,730 - INFO - [Train] step: 39699, loss_D: 0.008062, lr_D: 0.000800
2023-07-01 21:54:39,812 - INFO - [Train] step: 39699, loss_G: 0.589866, lr_G: 0.000800
2023-07-01 21:55:36,021 - INFO - [Train] step: 39799, loss_D: 0.007721, lr_D: 0.000800
2023-07-01 21:55:36,103 - INFO - [Train] step: 39799, loss_G: 0.510992, lr_G: 0.000800
2023-07-01 21:56:32,323 - INFO - [Train] step: 39899, loss_D: 0.006672, lr_D: 0.000800
2023-07-01 21:56:32,404 - INFO - [Train] step: 39899, loss_G: 0.589983, lr_G: 0.000800
2023-07-01 21:57:28,604 - INFO - [Train] step: 39999, loss_D: 0.010124, lr_D: 0.000800
2023-07-01 21:57:28,685 - INFO - [Train] step: 39999, loss_G: 0.580782, lr_G: 0.000800
2023-07-01 21:58:32,933 - INFO - [Eval] step: 39999, fid: 28.713172
2023-07-01 21:58:33,102 - INFO - Best FID score: 28.43898727768476
2023-07-01 21:58:33,103 - INFO - End of training