2023-06-30 13:53:27,638 - INFO - Experiment directory: ./runs/wgan_gp_cifar10/
2023-06-30 13:53:27,638 - INFO - Number of processes: 1
2023-06-30 13:53:27,638 - INFO - Distributed type: DistributedType.NO
2023-06-30 13:53:27,638 - INFO - Mixed precision: no
2023-06-30 13:53:27,639 - INFO - ==============================
2023-06-30 13:53:27,922 - INFO - Size of training set: 50000
2023-06-30 13:53:27,922 - INFO - Batch size per process: 512
2023-06-30 13:53:27,922 - INFO - Total batch size: 512
2023-06-30 13:53:27,922 - INFO - ==============================
2023-06-30 13:53:28,472 - INFO - Start training...
2023-06-30 13:55:10,101 - INFO - [Train] step: 99, loss_D: -11.312748, lr_D: 0.000800
2023-06-30 13:55:10,182 - INFO - [Train] step: 99, loss_G: 35.602295, lr_G: 0.000800
2023-06-30 13:56:51,632 - INFO - [Train] step: 199, loss_D: -8.858016, lr_D: 0.000800
2023-06-30 13:56:51,714 - INFO - [Train] step: 199, loss_G: 42.709717, lr_G: 0.000800
2023-06-30 13:58:33,297 - INFO - [Train] step: 299, loss_D: -6.607683, lr_D: 0.000800
2023-06-30 13:58:33,379 - INFO - [Train] step: 299, loss_G: 39.683426, lr_G: 0.000800
2023-06-30 14:00:14,967 - INFO - [Train] step: 399, loss_D: -2.085331, lr_D: 0.000800
2023-06-30 14:00:15,048 - INFO - [Train] step: 399, loss_G: 48.280815, lr_G: 0.000800
2023-06-30 14:01:56,643 - INFO - [Train] step: 499, loss_D: -4.094331, lr_D: 0.000800
2023-06-30 14:01:56,724 - INFO - [Train] step: 499, loss_G: 55.625061, lr_G: 0.000800
2023-06-30 14:03:38,302 - INFO - [Train] step: 599, loss_D: -3.399889, lr_D: 0.000800
2023-06-30 14:03:38,383 - INFO - [Train] step: 599, loss_G: 57.864449, lr_G: 0.000800
2023-06-30 14:05:20,087 - INFO - [Train] step: 699, loss_D: -2.310385, lr_D: 0.000800
2023-06-30 14:05:20,169 - INFO - [Train] step: 699, loss_G: 70.274734, lr_G: 0.000800
2023-06-30 14:07:01,745 - INFO - [Train] step: 799, loss_D: -4.000719, lr_D: 0.000800
2023-06-30 14:07:01,826 - INFO - [Train] step: 799, loss_G: 69.733292, lr_G: 0.000800
2023-06-30 14:08:43,384 - INFO - [Train] step: 899, loss_D: -3.408787, lr_D: 0.000800
2023-06-30 14:08:43,465 - INFO - [Train] step: 899, loss_G: 85.543015, lr_G: 0.000800
2023-06-30 14:10:25,014 - INFO - [Train] step: 999, loss_D: -3.995867, lr_D: 0.000800
2023-06-30 14:10:25,096 - INFO - [Train] step: 999, loss_G: 95.984772, lr_G: 0.000800
2023-06-30 14:11:29,245 - INFO - [Eval] step: 999, fid: 92.117445
2023-06-30 14:13:10,927 - INFO - [Train] step: 1099, loss_D: -2.782205, lr_D: 0.000800
2023-06-30 14:13:11,009 - INFO - [Train] step: 1099, loss_G: 99.882828, lr_G: 0.000800
2023-06-30 14:14:52,918 - INFO - [Train] step: 1199, loss_D: -3.569655, lr_D: 0.000800
2023-06-30 14:14:53,000 - INFO - [Train] step: 1199, loss_G: 104.119385, lr_G: 0.000800
2023-06-30 14:16:34,876 - INFO - [Train] step: 1299, loss_D: -2.393898, lr_D: 0.000800
2023-06-30 14:16:34,957 - INFO - [Train] step: 1299, loss_G: 112.550911, lr_G: 0.000800
2023-06-30 14:18:16,698 - INFO - [Train] step: 1399, loss_D: -2.921513, lr_D: 0.000800
2023-06-30 14:18:16,780 - INFO - [Train] step: 1399, loss_G: 117.918495, lr_G: 0.000800
2023-06-30 14:19:58,422 - INFO - [Train] step: 1499, loss_D: -2.485749, lr_D: 0.000800
2023-06-30 14:19:58,504 - INFO - [Train] step: 1499, loss_G: 125.343231, lr_G: 0.000800
2023-06-30 14:21:40,166 - INFO - [Train] step: 1599, loss_D: -2.631428, lr_D: 0.000800
2023-06-30 14:21:40,248 - INFO - [Train] step: 1599, loss_G: 123.018448, lr_G: 0.000800
2023-06-30 14:23:21,894 - INFO - [Train] step: 1699, loss_D: -2.227435, lr_D: 0.000800 2023-06-30
14:23:21,975 - INFO - [Train] step: 1699, loss_G: 122.487442, lr_G: 0.000800 2023-06-30 14:25:03,669 - INFO - [Train] step: 1799, loss_D: -2.569058, lr_D: 0.000800 2023-06-30 14:25:03,750 - INFO - [Train] step: 1799, loss_G: 124.959656, lr_G: 0.000800 2023-06-30 14:26:45,417 - INFO - [Train] step: 1899, loss_D: -2.023973, lr_D: 0.000800 2023-06-30 14:26:45,499 - INFO - [Train] step: 1899, loss_G: 123.167023, lr_G: 0.000800 2023-06-30 14:28:27,294 - INFO - [Train] step: 1999, loss_D: -1.840930, lr_D: 0.000800 2023-06-30 14:28:27,376 - INFO - [Train] step: 1999, loss_G: 125.335052, lr_G: 0.000800 2023-06-30 14:29:31,410 - INFO - [Eval] step: 1999, fid: 69.843517 2023-06-30 14:31:13,279 - INFO - [Train] step: 2099, loss_D: -2.567406, lr_D: 0.000800 2023-06-30 14:31:13,361 - INFO - [Train] step: 2099, loss_G: 126.175156, lr_G: 0.000800 2023-06-30 14:32:55,062 - INFO - [Train] step: 2199, loss_D: -2.195648, lr_D: 0.000800 2023-06-30 14:32:55,144 - INFO - [Train] step: 2199, loss_G: 124.703735, lr_G: 0.000800 2023-06-30 14:34:36,837 - INFO - [Train] step: 2299, loss_D: -1.873435, lr_D: 0.000800 2023-06-30 14:34:36,919 - INFO - [Train] step: 2299, loss_G: 125.181572, lr_G: 0.000800 2023-06-30 14:36:18,622 - INFO - [Train] step: 2399, loss_D: -2.627619, lr_D: 0.000800 2023-06-30 14:36:18,704 - INFO - [Train] step: 2399, loss_G: 127.954941, lr_G: 0.000800 2023-06-30 14:38:00,369 - INFO - [Train] step: 2499, loss_D: -1.911760, lr_D: 0.000800 2023-06-30 14:38:00,451 - INFO - [Train] step: 2499, loss_G: 126.415726, lr_G: 0.000800 2023-06-30 14:39:42,236 - INFO - [Train] step: 2599, loss_D: -2.195098, lr_D: 0.000800 2023-06-30 14:39:42,318 - INFO - [Train] step: 2599, loss_G: 125.598663, lr_G: 0.000800 2023-06-30 14:41:23,968 - INFO - [Train] step: 2699, loss_D: -2.405486, lr_D: 0.000800 2023-06-30 14:41:24,058 - INFO - [Train] step: 2699, loss_G: 126.935150, lr_G: 0.000800 2023-06-30 14:43:05,684 - INFO - [Train] step: 2799, loss_D: -1.795690, lr_D: 0.000800 2023-06-30 14:43:05,765 - INFO - [Train] step: 2799, loss_G: 123.326523, lr_G: 0.000800 2023-06-30 14:44:47,435 - INFO - [Train] step: 2899, loss_D: -2.262973, lr_D: 0.000800 2023-06-30 14:44:47,516 - INFO - [Train] step: 2899, loss_G: 127.968414, lr_G: 0.000800 2023-06-30 14:46:29,186 - INFO - [Train] step: 2999, loss_D: -2.696603, lr_D: 0.000800 2023-06-30 14:46:29,268 - INFO - [Train] step: 2999, loss_G: 130.061325, lr_G: 0.000800 2023-06-30 14:47:33,401 - INFO - [Eval] step: 2999, fid: 54.430509 2023-06-30 14:49:15,281 - INFO - [Train] step: 3099, loss_D: -1.849721, lr_D: 0.000800 2023-06-30 14:49:15,363 - INFO - [Train] step: 3099, loss_G: 129.286392, lr_G: 0.000800 2023-06-30 14:50:57,037 - INFO - [Train] step: 3199, loss_D: -2.551917, lr_D: 0.000800 2023-06-30 14:50:57,119 - INFO - [Train] step: 3199, loss_G: 129.376495, lr_G: 0.000800 2023-06-30 14:52:38,957 - INFO - [Train] step: 3299, loss_D: -1.060911, lr_D: 0.000800 2023-06-30 14:52:39,039 - INFO - [Train] step: 3299, loss_G: 131.401825, lr_G: 0.000800 2023-06-30 14:54:20,733 - INFO - [Train] step: 3399, loss_D: -2.088665, lr_D: 0.000800 2023-06-30 14:54:20,815 - INFO - [Train] step: 3399, loss_G: 131.632660, lr_G: 0.000800 2023-06-30 14:56:02,477 - INFO - [Train] step: 3499, loss_D: -2.746735, lr_D: 0.000800 2023-06-30 14:56:02,558 - INFO - [Train] step: 3499, loss_G: 132.529175, lr_G: 0.000800 2023-06-30 14:57:44,219 - INFO - [Train] step: 3599, loss_D: -2.215415, lr_D: 0.000800 2023-06-30 14:57:44,301 - INFO - [Train] step: 3599, loss_G: 129.934525, lr_G: 0.000800 2023-06-30 
14:59:25,968 - INFO - [Train] step: 3699, loss_D: -1.772032, lr_D: 0.000800 2023-06-30 14:59:26,050 - INFO - [Train] step: 3699, loss_G: 132.969757, lr_G: 0.000800 2023-06-30 15:01:07,737 - INFO - [Train] step: 3799, loss_D: -2.189769, lr_D: 0.000800 2023-06-30 15:01:07,819 - INFO - [Train] step: 3799, loss_G: 132.451645, lr_G: 0.000800 2023-06-30 15:02:49,616 - INFO - [Train] step: 3899, loss_D: -1.944162, lr_D: 0.000800 2023-06-30 15:02:49,697 - INFO - [Train] step: 3899, loss_G: 132.560364, lr_G: 0.000800 2023-06-30 15:04:31,357 - INFO - [Train] step: 3999, loss_D: -1.844589, lr_D: 0.000800 2023-06-30 15:04:31,439 - INFO - [Train] step: 3999, loss_G: 135.847824, lr_G: 0.000800 2023-06-30 15:05:35,613 - INFO - [Eval] step: 3999, fid: 46.255898 2023-06-30 15:07:17,438 - INFO - [Train] step: 4099, loss_D: -2.297100, lr_D: 0.000800 2023-06-30 15:07:17,520 - INFO - [Train] step: 4099, loss_G: 134.990692, lr_G: 0.000800 2023-06-30 15:08:59,210 - INFO - [Train] step: 4199, loss_D: -2.638199, lr_D: 0.000800 2023-06-30 15:08:59,292 - INFO - [Train] step: 4199, loss_G: 134.279373, lr_G: 0.000800 2023-06-30 15:10:40,981 - INFO - [Train] step: 4299, loss_D: -2.220670, lr_D: 0.000800 2023-06-30 15:10:41,063 - INFO - [Train] step: 4299, loss_G: 132.362976, lr_G: 0.000800 2023-06-30 15:12:22,709 - INFO - [Train] step: 4399, loss_D: -1.121068, lr_D: 0.000800 2023-06-30 15:12:22,791 - INFO - [Train] step: 4399, loss_G: 128.345612, lr_G: 0.000800 2023-06-30 15:14:04,440 - INFO - [Train] step: 4499, loss_D: -1.856126, lr_D: 0.000800 2023-06-30 15:14:04,521 - INFO - [Train] step: 4499, loss_G: 136.985260, lr_G: 0.000800 2023-06-30 15:15:46,416 - INFO - [Train] step: 4599, loss_D: -1.471236, lr_D: 0.000800 2023-06-30 15:15:46,498 - INFO - [Train] step: 4599, loss_G: 129.484833, lr_G: 0.000800 2023-06-30 15:17:28,226 - INFO - [Train] step: 4699, loss_D: -1.452663, lr_D: 0.000800 2023-06-30 15:17:28,307 - INFO - [Train] step: 4699, loss_G: 130.953949, lr_G: 0.000800 2023-06-30 15:19:09,973 - INFO - [Train] step: 4799, loss_D: -0.915825, lr_D: 0.000800 2023-06-30 15:19:10,054 - INFO - [Train] step: 4799, loss_G: 127.419266, lr_G: 0.000800 2023-06-30 15:20:51,655 - INFO - [Train] step: 4899, loss_D: -2.853243, lr_D: 0.000800 2023-06-30 15:20:51,736 - INFO - [Train] step: 4899, loss_G: 129.186508, lr_G: 0.000800 2023-06-30 15:22:33,356 - INFO - [Train] step: 4999, loss_D: -1.863373, lr_D: 0.000800 2023-06-30 15:22:33,438 - INFO - [Train] step: 4999, loss_G: 128.600983, lr_G: 0.000800 2023-06-30 15:23:37,529 - INFO - [Eval] step: 4999, fid: 43.559219 2023-06-30 15:25:19,324 - INFO - [Train] step: 5099, loss_D: -1.961694, lr_D: 0.000800 2023-06-30 15:25:19,414 - INFO - [Train] step: 5099, loss_G: 124.999557, lr_G: 0.000800 2023-06-30 15:27:01,170 - INFO - [Train] step: 5199, loss_D: -1.782862, lr_D: 0.000800 2023-06-30 15:27:01,251 - INFO - [Train] step: 5199, loss_G: 129.107925, lr_G: 0.000800 2023-06-30 15:28:42,875 - INFO - [Train] step: 5299, loss_D: -1.710044, lr_D: 0.000800 2023-06-30 15:28:42,957 - INFO - [Train] step: 5299, loss_G: 128.532288, lr_G: 0.000800 2023-06-30 15:30:24,635 - INFO - [Train] step: 5399, loss_D: -1.912928, lr_D: 0.000800 2023-06-30 15:30:24,717 - INFO - [Train] step: 5399, loss_G: 129.576492, lr_G: 0.000800 2023-06-30 15:32:06,345 - INFO - [Train] step: 5499, loss_D: -2.102364, lr_D: 0.000800 2023-06-30 15:32:06,426 - INFO - [Train] step: 5499, loss_G: 127.418373, lr_G: 0.000800 2023-06-30 15:33:48,057 - INFO - [Train] step: 5599, loss_D: -1.541395, lr_D: 0.000800 2023-06-30 
15:33:48,139 - INFO - [Train] step: 5599, loss_G: 126.218567, lr_G: 0.000800 2023-06-30 15:35:29,787 - INFO - [Train] step: 5699, loss_D: -1.920484, lr_D: 0.000800 2023-06-30 15:35:29,869 - INFO - [Train] step: 5699, loss_G: 128.363037, lr_G: 0.000800 2023-06-30 15:37:11,500 - INFO - [Train] step: 5799, loss_D: -1.691981, lr_D: 0.000800 2023-06-30 15:37:11,581 - INFO - [Train] step: 5799, loss_G: 129.383301, lr_G: 0.000800 2023-06-30 15:38:53,373 - INFO - [Train] step: 5899, loss_D: -2.144343, lr_D: 0.000800 2023-06-30 15:38:53,454 - INFO - [Train] step: 5899, loss_G: 124.283127, lr_G: 0.000800 2023-06-30 15:40:35,102 - INFO - [Train] step: 5999, loss_D: -1.986274, lr_D: 0.000800 2023-06-30 15:40:35,183 - INFO - [Train] step: 5999, loss_G: 127.358818, lr_G: 0.000800 2023-06-30 15:41:39,307 - INFO - [Eval] step: 5999, fid: 40.532469 2023-06-30 15:43:21,130 - INFO - [Train] step: 6099, loss_D: -1.794473, lr_D: 0.000800 2023-06-30 15:43:21,212 - INFO - [Train] step: 6099, loss_G: 124.095512, lr_G: 0.000800 2023-06-30 15:45:02,884 - INFO - [Train] step: 6199, loss_D: -2.586183, lr_D: 0.000800 2023-06-30 15:45:02,965 - INFO - [Train] step: 6199, loss_G: 122.430763, lr_G: 0.000800 2023-06-30 15:46:44,621 - INFO - [Train] step: 6299, loss_D: -1.227314, lr_D: 0.000800 2023-06-30 15:46:44,710 - INFO - [Train] step: 6299, loss_G: 125.728073, lr_G: 0.000800 2023-06-30 15:48:26,350 - INFO - [Train] step: 6399, loss_D: -2.013944, lr_D: 0.000800 2023-06-30 15:48:26,432 - INFO - [Train] step: 6399, loss_G: 122.519875, lr_G: 0.000800 2023-06-30 15:50:08,252 - INFO - [Train] step: 6499, loss_D: -1.748764, lr_D: 0.000800 2023-06-30 15:50:08,334 - INFO - [Train] step: 6499, loss_G: 123.795464, lr_G: 0.000800 2023-06-30 15:51:49,990 - INFO - [Train] step: 6599, loss_D: -1.011449, lr_D: 0.000800 2023-06-30 15:51:50,072 - INFO - [Train] step: 6599, loss_G: 123.367134, lr_G: 0.000800 2023-06-30 15:53:31,707 - INFO - [Train] step: 6699, loss_D: -1.686450, lr_D: 0.000800 2023-06-30 15:53:31,789 - INFO - [Train] step: 6699, loss_G: 123.273911, lr_G: 0.000800 2023-06-30 15:55:13,461 - INFO - [Train] step: 6799, loss_D: -1.454825, lr_D: 0.000800 2023-06-30 15:55:13,543 - INFO - [Train] step: 6799, loss_G: 124.074944, lr_G: 0.000800 2023-06-30 15:56:55,268 - INFO - [Train] step: 6899, loss_D: -1.422927, lr_D: 0.000800 2023-06-30 15:56:55,350 - INFO - [Train] step: 6899, loss_G: 122.827522, lr_G: 0.000800 2023-06-30 15:58:37,001 - INFO - [Train] step: 6999, loss_D: -1.281263, lr_D: 0.000800 2023-06-30 15:58:37,083 - INFO - [Train] step: 6999, loss_G: 122.462067, lr_G: 0.000800 2023-06-30 15:59:41,158 - INFO - [Eval] step: 6999, fid: 40.678369 2023-06-30 16:01:22,660 - INFO - [Train] step: 7099, loss_D: -1.909589, lr_D: 0.000800 2023-06-30 16:01:22,741 - INFO - [Train] step: 7099, loss_G: 121.502892, lr_G: 0.000800 2023-06-30 16:03:04,553 - INFO - [Train] step: 7199, loss_D: -1.608817, lr_D: 0.000800 2023-06-30 16:03:04,635 - INFO - [Train] step: 7199, loss_G: 117.671356, lr_G: 0.000800 2023-06-30 16:04:46,284 - INFO - [Train] step: 7299, loss_D: -1.539927, lr_D: 0.000800 2023-06-30 16:04:46,374 - INFO - [Train] step: 7299, loss_G: 119.550011, lr_G: 0.000800 2023-06-30 16:06:27,996 - INFO - [Train] step: 7399, loss_D: -1.586816, lr_D: 0.000800 2023-06-30 16:06:28,078 - INFO - [Train] step: 7399, loss_G: 121.976181, lr_G: 0.000800 2023-06-30 16:08:09,783 - INFO - [Train] step: 7499, loss_D: -1.769343, lr_D: 0.000800 2023-06-30 16:08:09,865 - INFO - [Train] step: 7499, loss_G: 122.146538, lr_G: 0.000800 2023-06-30 
16:09:51,482 - INFO - [Train] step: 7599, loss_D: -1.476466, lr_D: 0.000800 2023-06-30 16:09:51,563 - INFO - [Train] step: 7599, loss_G: 121.153030, lr_G: 0.000800 2023-06-30 16:11:33,175 - INFO - [Train] step: 7699, loss_D: -1.653832, lr_D: 0.000800 2023-06-30 16:11:33,257 - INFO - [Train] step: 7699, loss_G: 122.500397, lr_G: 0.000800 2023-06-30 16:13:15,012 - INFO - [Train] step: 7799, loss_D: -0.853846, lr_D: 0.000800 2023-06-30 16:13:15,094 - INFO - [Train] step: 7799, loss_G: 121.113586, lr_G: 0.000800 2023-06-30 16:14:56,683 - INFO - [Train] step: 7899, loss_D: -1.419387, lr_D: 0.000800 2023-06-30 16:14:56,764 - INFO - [Train] step: 7899, loss_G: 120.883308, lr_G: 0.000800 2023-06-30 16:16:38,384 - INFO - [Train] step: 7999, loss_D: -1.458437, lr_D: 0.000800 2023-06-30 16:16:38,466 - INFO - [Train] step: 7999, loss_G: 118.520050, lr_G: 0.000800 2023-06-30 16:17:42,628 - INFO - [Eval] step: 7999, fid: 37.123883 2023-06-30 16:19:24,490 - INFO - [Train] step: 8099, loss_D: -1.621110, lr_D: 0.000800 2023-06-30 16:19:24,572 - INFO - [Train] step: 8099, loss_G: 119.782066, lr_G: 0.000800 2023-06-30 16:21:06,223 - INFO - [Train] step: 8199, loss_D: -1.536119, lr_D: 0.000800 2023-06-30 16:21:06,305 - INFO - [Train] step: 8199, loss_G: 120.118103, lr_G: 0.000800 2023-06-30 16:22:47,936 - INFO - [Train] step: 8299, loss_D: -2.273143, lr_D: 0.000800 2023-06-30 16:22:48,017 - INFO - [Train] step: 8299, loss_G: 121.002991, lr_G: 0.000800 2023-06-30 16:24:29,603 - INFO - [Train] step: 8399, loss_D: -1.816567, lr_D: 0.000800 2023-06-30 16:24:29,685 - INFO - [Train] step: 8399, loss_G: 120.733269, lr_G: 0.000800 2023-06-30 16:26:11,453 - INFO - [Train] step: 8499, loss_D: -2.159787, lr_D: 0.000800 2023-06-30 16:26:11,535 - INFO - [Train] step: 8499, loss_G: 120.717163, lr_G: 0.000800 2023-06-30 16:27:53,178 - INFO - [Train] step: 8599, loss_D: -1.914967, lr_D: 0.000800 2023-06-30 16:27:53,260 - INFO - [Train] step: 8599, loss_G: 122.037209, lr_G: 0.000800 2023-06-30 16:29:34,900 - INFO - [Train] step: 8699, loss_D: -0.970962, lr_D: 0.000800 2023-06-30 16:29:34,981 - INFO - [Train] step: 8699, loss_G: 120.927795, lr_G: 0.000800 2023-06-30 16:31:16,635 - INFO - [Train] step: 8799, loss_D: -1.724363, lr_D: 0.000800 2023-06-30 16:31:16,717 - INFO - [Train] step: 8799, loss_G: 123.646011, lr_G: 0.000800 2023-06-30 16:32:58,446 - INFO - [Train] step: 8899, loss_D: -1.878051, lr_D: 0.000800 2023-06-30 16:32:58,527 - INFO - [Train] step: 8899, loss_G: 121.482964, lr_G: 0.000800 2023-06-30 16:34:40,165 - INFO - [Train] step: 8999, loss_D: -1.807158, lr_D: 0.000800 2023-06-30 16:34:40,246 - INFO - [Train] step: 8999, loss_G: 118.100883, lr_G: 0.000800 2023-06-30 16:35:44,346 - INFO - [Eval] step: 8999, fid: 35.879919 2023-06-30 16:37:26,311 - INFO - [Train] step: 9099, loss_D: -2.189571, lr_D: 0.000800 2023-06-30 16:37:26,393 - INFO - [Train] step: 9099, loss_G: 118.239990, lr_G: 0.000800 2023-06-30 16:39:08,010 - INFO - [Train] step: 9199, loss_D: -1.809322, lr_D: 0.000800 2023-06-30 16:39:08,092 - INFO - [Train] step: 9199, loss_G: 121.108879, lr_G: 0.000800 2023-06-30 16:40:49,742 - INFO - [Train] step: 9299, loss_D: -1.865253, lr_D: 0.000800 2023-06-30 16:40:49,824 - INFO - [Train] step: 9299, loss_G: 121.302376, lr_G: 0.000800 2023-06-30 16:42:31,473 - INFO - [Train] step: 9399, loss_D: -1.959725, lr_D: 0.000800 2023-06-30 16:42:31,554 - INFO - [Train] step: 9399, loss_G: 119.994080, lr_G: 0.000800 2023-06-30 16:44:13,221 - INFO - [Train] step: 9499, loss_D: -1.835892, lr_D: 0.000800 2023-06-30 
16:44:13,303 - INFO - [Train] step: 9499, loss_G: 118.571793, lr_G: 0.000800 2023-06-30 16:45:54,967 - INFO - [Train] step: 9599, loss_D: -1.560739, lr_D: 0.000800 2023-06-30 16:45:55,049 - INFO - [Train] step: 9599, loss_G: 118.286049, lr_G: 0.000800 2023-06-30 16:47:36,758 - INFO - [Train] step: 9699, loss_D: -1.470145, lr_D: 0.000800 2023-06-30 16:47:36,840 - INFO - [Train] step: 9699, loss_G: 119.416199, lr_G: 0.000800 2023-06-30 16:49:19,029 - INFO - [Train] step: 9799, loss_D: -1.899243, lr_D: 0.000800 2023-06-30 16:49:19,110 - INFO - [Train] step: 9799, loss_G: 120.244514, lr_G: 0.000800 2023-06-30 16:51:00,962 - INFO - [Train] step: 9899, loss_D: -1.957967, lr_D: 0.000800 2023-06-30 16:51:01,043 - INFO - [Train] step: 9899, loss_G: 120.143356, lr_G: 0.000800 2023-06-30 16:52:42,676 - INFO - [Train] step: 9999, loss_D: -1.723293, lr_D: 0.000800 2023-06-30 16:52:42,758 - INFO - [Train] step: 9999, loss_G: 125.207207, lr_G: 0.000800 2023-06-30 16:53:46,900 - INFO - [Eval] step: 9999, fid: 35.025195 2023-06-30 16:55:28,840 - INFO - [Train] step: 10099, loss_D: -1.874750, lr_D: 0.000800 2023-06-30 16:55:28,922 - INFO - [Train] step: 10099, loss_G: 122.445992, lr_G: 0.000800 2023-06-30 16:57:10,551 - INFO - [Train] step: 10199, loss_D: -1.529478, lr_D: 0.000800 2023-06-30 16:57:10,633 - INFO - [Train] step: 10199, loss_G: 119.559776, lr_G: 0.000800 2023-06-30 16:58:52,306 - INFO - [Train] step: 10299, loss_D: -1.106743, lr_D: 0.000800 2023-06-30 16:58:52,387 - INFO - [Train] step: 10299, loss_G: 121.200172, lr_G: 0.000800 2023-06-30 17:00:34,207 - INFO - [Train] step: 10399, loss_D: -1.796572, lr_D: 0.000800 2023-06-30 17:00:34,289 - INFO - [Train] step: 10399, loss_G: 122.266602, lr_G: 0.000800 2023-06-30 17:02:15,875 - INFO - [Train] step: 10499, loss_D: -2.162703, lr_D: 0.000800 2023-06-30 17:02:15,957 - INFO - [Train] step: 10499, loss_G: 122.024261, lr_G: 0.000800 2023-06-30 17:03:57,528 - INFO - [Train] step: 10599, loss_D: -2.214431, lr_D: 0.000800 2023-06-30 17:03:57,610 - INFO - [Train] step: 10599, loss_G: 124.498901, lr_G: 0.000800 2023-06-30 17:05:39,184 - INFO - [Train] step: 10699, loss_D: -1.689871, lr_D: 0.000800 2023-06-30 17:05:39,266 - INFO - [Train] step: 10699, loss_G: 118.671494, lr_G: 0.000800 2023-06-30 17:07:20,842 - INFO - [Train] step: 10799, loss_D: -1.484121, lr_D: 0.000800 2023-06-30 17:07:20,924 - INFO - [Train] step: 10799, loss_G: 121.493317, lr_G: 0.000800 2023-06-30 17:09:02,527 - INFO - [Train] step: 10899, loss_D: -2.050078, lr_D: 0.000800 2023-06-30 17:09:02,608 - INFO - [Train] step: 10899, loss_G: 121.145226, lr_G: 0.000800 2023-06-30 17:10:44,379 - INFO - [Train] step: 10999, loss_D: -1.499255, lr_D: 0.000800 2023-06-30 17:10:44,461 - INFO - [Train] step: 10999, loss_G: 122.515488, lr_G: 0.000800 2023-06-30 17:11:48,624 - INFO - [Eval] step: 10999, fid: 34.445985 2023-06-30 17:13:30,348 - INFO - [Train] step: 11099, loss_D: -1.783552, lr_D: 0.000800 2023-06-30 17:13:30,430 - INFO - [Train] step: 11099, loss_G: 124.988068, lr_G: 0.000800 2023-06-30 17:15:12,038 - INFO - [Train] step: 11199, loss_D: -1.871187, lr_D: 0.000800 2023-06-30 17:15:12,120 - INFO - [Train] step: 11199, loss_G: 126.567146, lr_G: 0.000800 2023-06-30 17:16:53,681 - INFO - [Train] step: 11299, loss_D: -1.811520, lr_D: 0.000800 2023-06-30 17:16:53,763 - INFO - [Train] step: 11299, loss_G: 123.148788, lr_G: 0.000800 2023-06-30 17:18:35,362 - INFO - [Train] step: 11399, loss_D: -1.705046, lr_D: 0.000800 2023-06-30 17:18:35,443 - INFO - [Train] step: 11399, loss_G: 125.039566, 
lr_G: 0.000800 2023-06-30 17:20:17,076 - INFO - [Train] step: 11499, loss_D: -1.897895, lr_D: 0.000800 2023-06-30 17:20:17,158 - INFO - [Train] step: 11499, loss_G: 126.817459, lr_G: 0.000800 2023-06-30 17:21:58,770 - INFO - [Train] step: 11599, loss_D: -1.770157, lr_D: 0.000800 2023-06-30 17:21:58,851 - INFO - [Train] step: 11599, loss_G: 125.694809, lr_G: 0.000800 2023-06-30 17:23:40,624 - INFO - [Train] step: 11699, loss_D: -1.316467, lr_D: 0.000800 2023-06-30 17:23:40,706 - INFO - [Train] step: 11699, loss_G: 130.067627, lr_G: 0.000800 2023-06-30 17:25:22,315 - INFO - [Train] step: 11799, loss_D: -2.301578, lr_D: 0.000800 2023-06-30 17:25:22,396 - INFO - [Train] step: 11799, loss_G: 128.625977, lr_G: 0.000800 2023-06-30 17:27:04,006 - INFO - [Train] step: 11899, loss_D: -1.965297, lr_D: 0.000800 2023-06-30 17:27:04,087 - INFO - [Train] step: 11899, loss_G: 129.813080, lr_G: 0.000800 2023-06-30 17:28:45,730 - INFO - [Train] step: 11999, loss_D: -1.210586, lr_D: 0.000800 2023-06-30 17:28:45,811 - INFO - [Train] step: 11999, loss_G: 129.736038, lr_G: 0.000800 2023-06-30 17:29:49,963 - INFO - [Eval] step: 11999, fid: 33.080401 2023-06-30 17:31:31,670 - INFO - [Train] step: 12099, loss_D: -1.452431, lr_D: 0.000800 2023-06-30 17:31:31,751 - INFO - [Train] step: 12099, loss_G: 130.960495, lr_G: 0.000800 2023-06-30 17:33:13,410 - INFO - [Train] step: 12199, loss_D: -1.422784, lr_D: 0.000800 2023-06-30 17:33:13,492 - INFO - [Train] step: 12199, loss_G: 130.746643, lr_G: 0.000800 2023-06-30 17:34:55,301 - INFO - [Train] step: 12299, loss_D: -2.259657, lr_D: 0.000800 2023-06-30 17:34:55,383 - INFO - [Train] step: 12299, loss_G: 132.368347, lr_G: 0.000800 2023-06-30 17:36:37,023 - INFO - [Train] step: 12399, loss_D: -2.098342, lr_D: 0.000800 2023-06-30 17:36:37,105 - INFO - [Train] step: 12399, loss_G: 132.797363, lr_G: 0.000800 2023-06-30 17:38:18,741 - INFO - [Train] step: 12499, loss_D: -2.041819, lr_D: 0.000800 2023-06-30 17:38:18,822 - INFO - [Train] step: 12499, loss_G: 133.367096, lr_G: 0.000800 2023-06-30 17:40:00,424 - INFO - [Train] step: 12599, loss_D: -1.469098, lr_D: 0.000800 2023-06-30 17:40:00,505 - INFO - [Train] step: 12599, loss_G: 135.717621, lr_G: 0.000800 2023-06-30 17:41:42,169 - INFO - [Train] step: 12699, loss_D: -1.705028, lr_D: 0.000800 2023-06-30 17:41:42,251 - INFO - [Train] step: 12699, loss_G: 135.656647, lr_G: 0.000800 2023-06-30 17:43:23,976 - INFO - [Train] step: 12799, loss_D: -2.113808, lr_D: 0.000800 2023-06-30 17:43:24,057 - INFO - [Train] step: 12799, loss_G: 136.990295, lr_G: 0.000800 2023-06-30 17:45:05,750 - INFO - [Train] step: 12899, loss_D: -1.107802, lr_D: 0.000800 2023-06-30 17:45:05,831 - INFO - [Train] step: 12899, loss_G: 135.886627, lr_G: 0.000800 2023-06-30 17:46:47,642 - INFO - [Train] step: 12999, loss_D: -0.977931, lr_D: 0.000800 2023-06-30 17:46:47,723 - INFO - [Train] step: 12999, loss_G: 136.133286, lr_G: 0.000800 2023-06-30 17:47:51,929 - INFO - [Eval] step: 12999, fid: 32.331055 2023-06-30 17:49:33,691 - INFO - [Train] step: 13099, loss_D: -1.992167, lr_D: 0.000800 2023-06-30 17:49:33,773 - INFO - [Train] step: 13099, loss_G: 134.305771, lr_G: 0.000800 2023-06-30 17:51:15,403 - INFO - [Train] step: 13199, loss_D: -2.055268, lr_D: 0.000800 2023-06-30 17:51:15,484 - INFO - [Train] step: 13199, loss_G: 134.934204, lr_G: 0.000800 2023-06-30 17:52:57,076 - INFO - [Train] step: 13299, loss_D: -1.747457, lr_D: 0.000800 2023-06-30 17:52:57,157 - INFO - [Train] step: 13299, loss_G: 135.841400, lr_G: 0.000800 2023-06-30 17:54:38,766 - INFO - [Train] 
step: 13399, loss_D: -1.744236, lr_D: 0.000800 2023-06-30 17:54:38,847 - INFO - [Train] step: 13399, loss_G: 135.121490, lr_G: 0.000800 2023-06-30 17:56:20,510 - INFO - [Train] step: 13499, loss_D: -1.829350, lr_D: 0.000800 2023-06-30 17:56:20,592 - INFO - [Train] step: 13499, loss_G: 135.772293, lr_G: 0.000800 2023-06-30 17:58:02,402 - INFO - [Train] step: 13599, loss_D: -1.903694, lr_D: 0.000800 2023-06-30 17:58:02,483 - INFO - [Train] step: 13599, loss_G: 137.412598, lr_G: 0.000800 2023-06-30 17:59:44,095 - INFO - [Train] step: 13699, loss_D: -1.673252, lr_D: 0.000800 2023-06-30 17:59:44,176 - INFO - [Train] step: 13699, loss_G: 138.247803, lr_G: 0.000800 2023-06-30 18:01:25,801 - INFO - [Train] step: 13799, loss_D: -1.932615, lr_D: 0.000800 2023-06-30 18:01:25,883 - INFO - [Train] step: 13799, loss_G: 138.080566, lr_G: 0.000800 2023-06-30 18:03:07,527 - INFO - [Train] step: 13899, loss_D: -1.430263, lr_D: 0.000800 2023-06-30 18:03:07,609 - INFO - [Train] step: 13899, loss_G: 136.702698, lr_G: 0.000800 2023-06-30 18:04:49,245 - INFO - [Train] step: 13999, loss_D: -2.242876, lr_D: 0.000800 2023-06-30 18:04:49,327 - INFO - [Train] step: 13999, loss_G: 139.644897, lr_G: 0.000800 2023-06-30 18:05:53,528 - INFO - [Eval] step: 13999, fid: 31.974302 2023-06-30 18:07:35,304 - INFO - [Train] step: 14099, loss_D: -1.663563, lr_D: 0.000800 2023-06-30 18:07:35,386 - INFO - [Train] step: 14099, loss_G: 138.061691, lr_G: 0.000800 2023-06-30 18:09:17,033 - INFO - [Train] step: 14199, loss_D: -1.170629, lr_D: 0.000800 2023-06-30 18:09:17,114 - INFO - [Train] step: 14199, loss_G: 140.698212, lr_G: 0.000800 2023-06-30 18:10:58,935 - INFO - [Train] step: 14299, loss_D: -1.807862, lr_D: 0.000800 2023-06-30 18:10:59,017 - INFO - [Train] step: 14299, loss_G: 139.929016, lr_G: 0.000800 2023-06-30 18:12:40,701 - INFO - [Train] step: 14399, loss_D: -1.605962, lr_D: 0.000800 2023-06-30 18:12:40,783 - INFO - [Train] step: 14399, loss_G: 141.499008, lr_G: 0.000800 2023-06-30 18:14:22,441 - INFO - [Train] step: 14499, loss_D: -1.976781, lr_D: 0.000800 2023-06-30 18:14:22,523 - INFO - [Train] step: 14499, loss_G: 142.716339, lr_G: 0.000800 2023-06-30 18:16:04,225 - INFO - [Train] step: 14599, loss_D: -2.140872, lr_D: 0.000800 2023-06-30 18:16:04,307 - INFO - [Train] step: 14599, loss_G: 142.381683, lr_G: 0.000800 2023-06-30 18:17:45,965 - INFO - [Train] step: 14699, loss_D: -1.667673, lr_D: 0.000800 2023-06-30 18:17:46,047 - INFO - [Train] step: 14699, loss_G: 142.678070, lr_G: 0.000800 2023-06-30 18:19:27,731 - INFO - [Train] step: 14799, loss_D: -1.967873, lr_D: 0.000800 2023-06-30 18:19:27,813 - INFO - [Train] step: 14799, loss_G: 145.064240, lr_G: 0.000800 2023-06-30 18:21:09,640 - INFO - [Train] step: 14899, loss_D: -2.016306, lr_D: 0.000800 2023-06-30 18:21:09,722 - INFO - [Train] step: 14899, loss_G: 144.310638, lr_G: 0.000800 2023-06-30 18:22:51,384 - INFO - [Train] step: 14999, loss_D: -0.530840, lr_D: 0.000800 2023-06-30 18:22:51,465 - INFO - [Train] step: 14999, loss_G: 146.087479, lr_G: 0.000800 2023-06-30 18:23:55,691 - INFO - [Eval] step: 14999, fid: 31.432697 2023-06-30 18:25:37,470 - INFO - [Train] step: 15099, loss_D: -2.807539, lr_D: 0.000800 2023-06-30 18:25:37,551 - INFO - [Train] step: 15099, loss_G: 146.658752, lr_G: 0.000800 2023-06-30 18:27:19,305 - INFO - [Train] step: 15199, loss_D: -1.300594, lr_D: 0.000800 2023-06-30 18:27:19,394 - INFO - [Train] step: 15199, loss_G: 148.159454, lr_G: 0.000800 2023-06-30 18:29:01,067 - INFO - [Train] step: 15299, loss_D: -1.555680, lr_D: 0.000800 
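For reference, the loss_D and loss_G values in this log follow the pattern of the standard WGAN-GP objectives: a critic loss of roughly E[D(fake)] - E[D(real)] plus a gradient penalty, and a generator loss of -E[D(fake)]. The sketch below illustrates those objectives in PyTorch; the critic D, the generator output fake, and the gradient-penalty weight of 10 are assumptions for illustration, since the actual training code is not part of this log.

import torch

def gradient_penalty(D, real, fake, gp_weight=10.0):
    # Penalize the critic's gradient norm on random real/fake interpolations (WGAN-GP).
    alpha = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    interp = (alpha * real + (1 - alpha) * fake).requires_grad_(True)
    grads = torch.autograd.grad(D(interp).sum(), interp, create_graph=True)[0]
    return gp_weight * ((grads.flatten(1).norm(2, dim=1) - 1) ** 2).mean()

def critic_loss(D, real, fake):
    # loss_D-style quantity: E[D(fake)] - E[D(real)] + gradient penalty.
    fake = fake.detach()
    return D(fake).mean() - D(real).mean() + gradient_penalty(D, real, fake)

def generator_loss(D, fake):
    # loss_G-style quantity: -E[D(fake)], i.e. the generator maximizes the critic score.
    return -D(fake).mean()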
2023-06-30 18:29:01,149 - INFO - [Train] step: 15299, loss_G: 148.357574, lr_G: 0.000800 2023-06-30 18:30:42,792 - INFO - [Train] step: 15399, loss_D: -1.889162, lr_D: 0.000800 2023-06-30 18:30:42,873 - INFO - [Train] step: 15399, loss_G: 147.846405, lr_G: 0.000800 2023-06-30 18:32:24,554 - INFO - [Train] step: 15499, loss_D: -1.801906, lr_D: 0.000800 2023-06-30 18:32:24,635 - INFO - [Train] step: 15499, loss_G: 150.054260, lr_G: 0.000800 2023-06-30 18:34:06,424 - INFO - [Train] step: 15599, loss_D: -2.033732, lr_D: 0.000800 2023-06-30 18:34:06,505 - INFO - [Train] step: 15599, loss_G: 151.037369, lr_G: 0.000800 2023-06-30 18:35:48,237 - INFO - [Train] step: 15699, loss_D: -0.887448, lr_D: 0.000800 2023-06-30 18:35:48,318 - INFO - [Train] step: 15699, loss_G: 150.547562, lr_G: 0.000800 2023-06-30 18:37:30,042 - INFO - [Train] step: 15799, loss_D: -1.871249, lr_D: 0.000800 2023-06-30 18:37:30,124 - INFO - [Train] step: 15799, loss_G: 151.017609, lr_G: 0.000800 2023-06-30 18:39:11,782 - INFO - [Train] step: 15899, loss_D: -1.805780, lr_D: 0.000800 2023-06-30 18:39:11,864 - INFO - [Train] step: 15899, loss_G: 152.032043, lr_G: 0.000800 2023-06-30 18:40:53,511 - INFO - [Train] step: 15999, loss_D: -1.866432, lr_D: 0.000800 2023-06-30 18:40:53,592 - INFO - [Train] step: 15999, loss_G: 152.085632, lr_G: 0.000800 2023-06-30 18:41:57,901 - INFO - [Eval] step: 15999, fid: 31.217378 2023-06-30 18:43:39,638 - INFO - [Train] step: 16099, loss_D: -2.040899, lr_D: 0.000800 2023-06-30 18:43:39,727 - INFO - [Train] step: 16099, loss_G: 152.784195, lr_G: 0.000800 2023-06-30 18:45:21,651 - INFO - [Train] step: 16199, loss_D: -1.050922, lr_D: 0.000800 2023-06-30 18:45:21,732 - INFO - [Train] step: 16199, loss_G: 154.812134, lr_G: 0.000800 2023-06-30 18:47:03,356 - INFO - [Train] step: 16299, loss_D: -1.953090, lr_D: 0.000800 2023-06-30 18:47:03,437 - INFO - [Train] step: 16299, loss_G: 155.661591, lr_G: 0.000800 2023-06-30 18:48:45,060 - INFO - [Train] step: 16399, loss_D: -2.566016, lr_D: 0.000800 2023-06-30 18:48:45,141 - INFO - [Train] step: 16399, loss_G: 155.269623, lr_G: 0.000800 2023-06-30 18:50:26,841 - INFO - [Train] step: 16499, loss_D: -1.651610, lr_D: 0.000800 2023-06-30 18:50:26,923 - INFO - [Train] step: 16499, loss_G: 159.161270, lr_G: 0.000800 2023-06-30 18:52:08,599 - INFO - [Train] step: 16599, loss_D: -2.591121, lr_D: 0.000800 2023-06-30 18:52:08,681 - INFO - [Train] step: 16599, loss_G: 161.822418, lr_G: 0.000800 2023-06-30 18:53:50,309 - INFO - [Train] step: 16699, loss_D: -1.809893, lr_D: 0.000800 2023-06-30 18:53:50,391 - INFO - [Train] step: 16699, loss_G: 162.239166, lr_G: 0.000800 2023-06-30 18:55:32,024 - INFO - [Train] step: 16799, loss_D: -2.223201, lr_D: 0.000800 2023-06-30 18:55:32,105 - INFO - [Train] step: 16799, loss_G: 164.367432, lr_G: 0.000800 2023-06-30 18:57:13,881 - INFO - [Train] step: 16899, loss_D: -2.297200, lr_D: 0.000800 2023-06-30 18:57:13,962 - INFO - [Train] step: 16899, loss_G: 165.013092, lr_G: 0.000800 2023-06-30 18:58:55,620 - INFO - [Train] step: 16999, loss_D: -1.864687, lr_D: 0.000800 2023-06-30 18:58:55,701 - INFO - [Train] step: 16999, loss_G: 166.018860, lr_G: 0.000800 2023-06-30 18:59:59,902 - INFO - [Eval] step: 16999, fid: 30.910166 2023-06-30 19:01:41,719 - INFO - [Train] step: 17099, loss_D: -1.953503, lr_D: 0.000800 2023-06-30 19:01:41,801 - INFO - [Train] step: 17099, loss_G: 164.526428, lr_G: 0.000800 2023-06-30 19:03:23,453 - INFO - [Train] step: 17199, loss_D: -1.670968, lr_D: 0.000800 2023-06-30 19:03:23,535 - INFO - [Train] step: 17199, 
loss_G: 167.690948, lr_G: 0.000800 2023-06-30 19:05:05,152 - INFO - [Train] step: 17299, loss_D: -3.177252, lr_D: 0.000800 2023-06-30 19:05:05,233 - INFO - [Train] step: 17299, loss_G: 163.963623, lr_G: 0.000800 2023-06-30 19:06:46,966 - INFO - [Train] step: 17399, loss_D: -2.375909, lr_D: 0.000800 2023-06-30 19:06:47,047 - INFO - [Train] step: 17399, loss_G: 167.939392, lr_G: 0.000800 2023-06-30 19:08:28,879 - INFO - [Train] step: 17499, loss_D: 0.785781, lr_D: 0.000800 2023-06-30 19:08:28,961 - INFO - [Train] step: 17499, loss_G: 168.338013, lr_G: 0.000800 2023-06-30 19:10:10,590 - INFO - [Train] step: 17599, loss_D: -1.854200, lr_D: 0.000800 2023-06-30 19:10:10,672 - INFO - [Train] step: 17599, loss_G: 169.813522, lr_G: 0.000800 2023-06-30 19:11:52,263 - INFO - [Train] step: 17699, loss_D: -1.961651, lr_D: 0.000800 2023-06-30 19:11:52,344 - INFO - [Train] step: 17699, loss_G: 171.961441, lr_G: 0.000800 2023-06-30 19:13:33,944 - INFO - [Train] step: 17799, loss_D: -1.860318, lr_D: 0.000800 2023-06-30 19:13:34,025 - INFO - [Train] step: 17799, loss_G: 171.863770, lr_G: 0.000800 2023-06-30 19:15:15,655 - INFO - [Train] step: 17899, loss_D: -1.613761, lr_D: 0.000800 2023-06-30 19:15:15,737 - INFO - [Train] step: 17899, loss_G: 172.240738, lr_G: 0.000800 2023-06-30 19:16:57,408 - INFO - [Train] step: 17999, loss_D: -1.814374, lr_D: 0.000800 2023-06-30 19:16:57,489 - INFO - [Train] step: 17999, loss_G: 173.582703, lr_G: 0.000800 2023-06-30 19:18:01,682 - INFO - [Eval] step: 17999, fid: 30.876666 2023-06-30 19:19:43,278 - INFO - [Train] step: 18099, loss_D: -1.941814, lr_D: 0.000800 2023-06-30 19:19:43,359 - INFO - [Train] step: 18099, loss_G: 175.812729, lr_G: 0.000800 2023-06-30 19:21:25,173 - INFO - [Train] step: 18199, loss_D: -2.271147, lr_D: 0.000800 2023-06-30 19:21:25,255 - INFO - [Train] step: 18199, loss_G: 175.829483, lr_G: 0.000800 2023-06-30 19:23:06,921 - INFO - [Train] step: 18299, loss_D: -2.898890, lr_D: 0.000800 2023-06-30 19:23:07,003 - INFO - [Train] step: 18299, loss_G: 176.860092, lr_G: 0.000800 2023-06-30 19:24:48,642 - INFO - [Train] step: 18399, loss_D: -2.164594, lr_D: 0.000800 2023-06-30 19:24:48,723 - INFO - [Train] step: 18399, loss_G: 178.864380, lr_G: 0.000800 2023-06-30 19:26:30,395 - INFO - [Train] step: 18499, loss_D: -2.459539, lr_D: 0.000800 2023-06-30 19:26:30,477 - INFO - [Train] step: 18499, loss_G: 178.802689, lr_G: 0.000800 2023-06-30 19:28:12,167 - INFO - [Train] step: 18599, loss_D: -2.196259, lr_D: 0.000800 2023-06-30 19:28:12,249 - INFO - [Train] step: 18599, loss_G: 181.023804, lr_G: 0.000800 2023-06-30 19:29:53,858 - INFO - [Train] step: 18699, loss_D: -2.881361, lr_D: 0.000800 2023-06-30 19:29:53,940 - INFO - [Train] step: 18699, loss_G: 179.746170, lr_G: 0.000800 2023-06-30 19:31:35,889 - INFO - [Train] step: 18799, loss_D: -2.417088, lr_D: 0.000800 2023-06-30 19:31:35,971 - INFO - [Train] step: 18799, loss_G: 181.307129, lr_G: 0.000800 2023-06-30 19:33:17,742 - INFO - [Train] step: 18899, loss_D: -1.666083, lr_D: 0.000800 2023-06-30 19:33:17,824 - INFO - [Train] step: 18899, loss_G: 183.417328, lr_G: 0.000800 2023-06-30 19:34:59,582 - INFO - [Train] step: 18999, loss_D: -3.156470, lr_D: 0.000800 2023-06-30 19:34:59,663 - INFO - [Train] step: 18999, loss_G: 185.821259, lr_G: 0.000800 2023-06-30 19:36:03,902 - INFO - [Eval] step: 18999, fid: 30.558246 2023-06-30 19:37:45,681 - INFO - [Train] step: 19099, loss_D: -0.824734, lr_D: 0.000800 2023-06-30 19:37:45,763 - INFO - [Train] step: 19099, loss_G: 184.431671, lr_G: 0.000800 2023-06-30 
19:39:27,382 - INFO - [Train] step: 19199, loss_D: -2.544101, lr_D: 0.000800 2023-06-30 19:39:27,463 - INFO - [Train] step: 19199, loss_G: 185.316879, lr_G: 0.000800 2023-06-30 19:41:09,128 - INFO - [Train] step: 19299, loss_D: -2.147238, lr_D: 0.000800 2023-06-30 19:41:09,209 - INFO - [Train] step: 19299, loss_G: 184.963806, lr_G: 0.000800 2023-06-30 19:42:50,860 - INFO - [Train] step: 19399, loss_D: -2.254052, lr_D: 0.000800 2023-06-30 19:42:50,942 - INFO - [Train] step: 19399, loss_G: 187.541473, lr_G: 0.000800 2023-06-30 19:44:32,744 - INFO - [Train] step: 19499, loss_D: 1.514937, lr_D: 0.000800 2023-06-30 19:44:32,826 - INFO - [Train] step: 19499, loss_G: 187.895752, lr_G: 0.000800 2023-06-30 19:46:14,481 - INFO - [Train] step: 19599, loss_D: -2.131149, lr_D: 0.000800 2023-06-30 19:46:14,563 - INFO - [Train] step: 19599, loss_G: 187.605133, lr_G: 0.000800 2023-06-30 19:47:56,207 - INFO - [Train] step: 19699, loss_D: -2.728745, lr_D: 0.000800 2023-06-30 19:47:56,289 - INFO - [Train] step: 19699, loss_G: 186.620255, lr_G: 0.000800 2023-06-30 19:49:37,938 - INFO - [Train] step: 19799, loss_D: -2.257333, lr_D: 0.000800 2023-06-30 19:49:38,019 - INFO - [Train] step: 19799, loss_G: 189.922745, lr_G: 0.000800 2023-06-30 19:51:19,641 - INFO - [Train] step: 19899, loss_D: -1.513238, lr_D: 0.000800 2023-06-30 19:51:19,723 - INFO - [Train] step: 19899, loss_G: 192.797424, lr_G: 0.000800 2023-06-30 19:53:01,328 - INFO - [Train] step: 19999, loss_D: -2.481059, lr_D: 0.000800 2023-06-30 19:53:01,409 - INFO - [Train] step: 19999, loss_G: 192.262695, lr_G: 0.000800 2023-06-30 19:54:05,642 - INFO - [Eval] step: 19999, fid: 30.241830 2023-06-30 19:55:47,739 - INFO - [Train] step: 20099, loss_D: -2.582145, lr_D: 0.000800 2023-06-30 19:55:47,821 - INFO - [Train] step: 20099, loss_G: 192.402603, lr_G: 0.000800 2023-06-30 19:57:29,474 - INFO - [Train] step: 20199, loss_D: -2.716302, lr_D: 0.000800 2023-06-30 19:57:29,556 - INFO - [Train] step: 20199, loss_G: 194.821655, lr_G: 0.000800 2023-06-30 19:59:11,218 - INFO - [Train] step: 20299, loss_D: -3.669365, lr_D: 0.000800 2023-06-30 19:59:11,300 - INFO - [Train] step: 20299, loss_G: 197.042206, lr_G: 0.000800 2023-06-30 20:00:53,015 - INFO - [Train] step: 20399, loss_D: -1.663941, lr_D: 0.000800 2023-06-30 20:00:53,096 - INFO - [Train] step: 20399, loss_G: 198.626526, lr_G: 0.000800 2023-06-30 20:02:34,745 - INFO - [Train] step: 20499, loss_D: -2.218254, lr_D: 0.000800 2023-06-30 20:02:34,827 - INFO - [Train] step: 20499, loss_G: 200.691727, lr_G: 0.000800 2023-06-30 20:04:16,473 - INFO - [Train] step: 20599, loss_D: -2.385633, lr_D: 0.000800 2023-06-30 20:04:16,555 - INFO - [Train] step: 20599, loss_G: 198.676956, lr_G: 0.000800 2023-06-30 20:05:58,331 - INFO - [Train] step: 20699, loss_D: -1.642828, lr_D: 0.000800 2023-06-30 20:05:58,413 - INFO - [Train] step: 20699, loss_G: 205.006836, lr_G: 0.000800 2023-06-30 20:07:39,994 - INFO - [Train] step: 20799, loss_D: -2.294549, lr_D: 0.000800 2023-06-30 20:07:40,075 - INFO - [Train] step: 20799, loss_G: 207.098938, lr_G: 0.000800 2023-06-30 20:09:21,712 - INFO - [Train] step: 20899, loss_D: -2.255158, lr_D: 0.000800 2023-06-30 20:09:21,794 - INFO - [Train] step: 20899, loss_G: 209.350723, lr_G: 0.000800 2023-06-30 20:11:03,440 - INFO - [Train] step: 20999, loss_D: -2.331969, lr_D: 0.000800 2023-06-30 20:11:03,521 - INFO - [Train] step: 20999, loss_G: 212.277817, lr_G: 0.000800 2023-06-30 20:12:07,686 - INFO - [Eval] step: 20999, fid: 29.852672 2023-06-30 20:13:49,435 - INFO - [Train] step: 21099, loss_D: 
-2.138738, lr_D: 0.000800 2023-06-30 20:13:49,516 - INFO - [Train] step: 21099, loss_G: 211.179321, lr_G: 0.000800 2023-06-30 20:15:31,218 - INFO - [Train] step: 21199, loss_D: -2.364866, lr_D: 0.000800 2023-06-30 20:15:31,300 - INFO - [Train] step: 21199, loss_G: 210.639679, lr_G: 0.000800 2023-06-30 20:17:13,037 - INFO - [Train] step: 21299, loss_D: -1.861578, lr_D: 0.000800 2023-06-30 20:17:13,118 - INFO - [Train] step: 21299, loss_G: 213.138474, lr_G: 0.000800 2023-06-30 20:18:54,901 - INFO - [Train] step: 21399, loss_D: -2.295768, lr_D: 0.000800 2023-06-30 20:18:54,983 - INFO - [Train] step: 21399, loss_G: 216.073074, lr_G: 0.000800 2023-06-30 20:20:36,610 - INFO - [Train] step: 21499, loss_D: -2.034185, lr_D: 0.000800 2023-06-30 20:20:36,692 - INFO - [Train] step: 21499, loss_G: 216.505066, lr_G: 0.000800 2023-06-30 20:22:18,355 - INFO - [Train] step: 21599, loss_D: -2.213518, lr_D: 0.000800 2023-06-30 20:22:18,437 - INFO - [Train] step: 21599, loss_G: 217.484955, lr_G: 0.000800 2023-06-30 20:24:00,051 - INFO - [Train] step: 21699, loss_D: -2.218930, lr_D: 0.000800 2023-06-30 20:24:00,132 - INFO - [Train] step: 21699, loss_G: 218.327332, lr_G: 0.000800 2023-06-30 20:25:41,756 - INFO - [Train] step: 21799, loss_D: -2.459942, lr_D: 0.000800 2023-06-30 20:25:41,838 - INFO - [Train] step: 21799, loss_G: 224.517258, lr_G: 0.000800 2023-06-30 20:27:23,574 - INFO - [Train] step: 21899, loss_D: -1.838030, lr_D: 0.000800 2023-06-30 20:27:23,655 - INFO - [Train] step: 21899, loss_G: 225.069916, lr_G: 0.000800 2023-06-30 20:29:05,511 - INFO - [Train] step: 21999, loss_D: -3.107598, lr_D: 0.000800 2023-06-30 20:29:05,593 - INFO - [Train] step: 21999, loss_G: 227.362564, lr_G: 0.000800 2023-06-30 20:30:09,855 - INFO - [Eval] step: 21999, fid: 29.670812 2023-06-30 20:31:51,626 - INFO - [Train] step: 22099, loss_D: -3.201806, lr_D: 0.000800 2023-06-30 20:31:51,708 - INFO - [Train] step: 22099, loss_G: 226.773468, lr_G: 0.000800 2023-06-30 20:33:33,336 - INFO - [Train] step: 22199, loss_D: -2.264891, lr_D: 0.000800 2023-06-30 20:33:33,417 - INFO - [Train] step: 22199, loss_G: 223.881454, lr_G: 0.000800 2023-06-30 20:35:15,093 - INFO - [Train] step: 22299, loss_D: -2.738436, lr_D: 0.000800 2023-06-30 20:35:15,174 - INFO - [Train] step: 22299, loss_G: 229.099350, lr_G: 0.000800 2023-06-30 20:36:56,804 - INFO - [Train] step: 22399, loss_D: -1.075695, lr_D: 0.000800 2023-06-30 20:36:56,885 - INFO - [Train] step: 22399, loss_G: 229.666580, lr_G: 0.000800 2023-06-30 20:38:38,531 - INFO - [Train] step: 22499, loss_D: -2.047518, lr_D: 0.000800 2023-06-30 20:38:38,612 - INFO - [Train] step: 22499, loss_G: 231.443283, lr_G: 0.000800 2023-06-30 20:40:20,236 - INFO - [Train] step: 22599, loss_D: -2.843856, lr_D: 0.000800 2023-06-30 20:40:20,317 - INFO - [Train] step: 22599, loss_G: 231.160248, lr_G: 0.000800 2023-06-30 20:42:02,111 - INFO - [Train] step: 22699, loss_D: -2.224667, lr_D: 0.000800 2023-06-30 20:42:02,192 - INFO - [Train] step: 22699, loss_G: 233.724030, lr_G: 0.000800 2023-06-30 20:43:43,817 - INFO - [Train] step: 22799, loss_D: -3.161266, lr_D: 0.000800 2023-06-30 20:43:43,898 - INFO - [Train] step: 22799, loss_G: 233.719025, lr_G: 0.000800 2023-06-30 20:45:25,523 - INFO - [Train] step: 22899, loss_D: -2.240168, lr_D: 0.000800 2023-06-30 20:45:25,605 - INFO - [Train] step: 22899, loss_G: 236.488876, lr_G: 0.000800 2023-06-30 20:47:07,207 - INFO - [Train] step: 22999, loss_D: -2.435300, lr_D: 0.000800 2023-06-30 20:47:07,288 - INFO - [Train] step: 22999, loss_G: 240.612747, lr_G: 0.000800 
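The [Eval] lines above report an FID score every 1,000 training steps. As a rough illustration of how such a number can be produced, the sketch below computes FID between real CIFAR-10 batches and generated samples using torchmetrics; the library choice, sample count n_fake, and latent dimension z_dim are assumptions for illustration and are not taken from this log.

import torch
from torchmetrics.image.fid import FrechetInceptionDistance

@torch.no_grad()
def evaluate_fid(G, real_loader, n_fake=10000, z_dim=128, batch=256, device="cuda"):
    # normalize=True lets torchmetrics accept float images scaled to [0, 1].
    fid = FrechetInceptionDistance(feature=2048, normalize=True).to(device)
    for real, _ in real_loader:                      # real CIFAR-10 images in [0, 1]
        fid.update(real.to(device), real=True)
    for _ in range(0, n_fake, batch):
        z = torch.randn(batch, z_dim, device=device)
        fake = (G(z).clamp(-1, 1) + 1) / 2           # map tanh outputs to [0, 1]
        fid.update(fake, real=False)
    return fid.compute().item()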
2023-06-30 20:48:11,540 - INFO - [Eval] step: 22999, fid: 29.990827 2023-06-30 20:49:52,812 - INFO - [Train] step: 23099, loss_D: -1.503462, lr_D: 0.000800 2023-06-30 20:49:52,894 - INFO - [Train] step: 23099, loss_G: 240.135681, lr_G: 0.000800 2023-06-30 20:51:34,488 - INFO - [Train] step: 23199, loss_D: -2.414200, lr_D: 0.000800 2023-06-30 20:51:34,569 - INFO - [Train] step: 23199, loss_G: 240.464417, lr_G: 0.000800 2023-06-30 20:53:16,413 - INFO - [Train] step: 23299, loss_D: 6.198669, lr_D: 0.000800 2023-06-30 20:53:16,495 - INFO - [Train] step: 23299, loss_G: 243.788177, lr_G: 0.000800 2023-06-30 20:54:58,208 - INFO - [Train] step: 23399, loss_D: -2.083214, lr_D: 0.000800 2023-06-30 20:54:58,289 - INFO - [Train] step: 23399, loss_G: 246.391968, lr_G: 0.000800 2023-06-30 20:56:40,006 - INFO - [Train] step: 23499, loss_D: -2.777602, lr_D: 0.000800 2023-06-30 20:56:40,088 - INFO - [Train] step: 23499, loss_G: 247.986725, lr_G: 0.000800 2023-06-30 20:58:21,765 - INFO - [Train] step: 23599, loss_D: 0.886512, lr_D: 0.000800 2023-06-30 20:58:21,846 - INFO - [Train] step: 23599, loss_G: 246.024933, lr_G: 0.000800 2023-06-30 21:00:03,577 - INFO - [Train] step: 23699, loss_D: -1.909957, lr_D: 0.000800 2023-06-30 21:00:03,658 - INFO - [Train] step: 23699, loss_G: 249.900742, lr_G: 0.000800 2023-06-30 21:01:45,370 - INFO - [Train] step: 23799, loss_D: -2.220460, lr_D: 0.000800 2023-06-30 21:01:45,452 - INFO - [Train] step: 23799, loss_G: 252.714661, lr_G: 0.000800 2023-06-30 21:03:27,159 - INFO - [Train] step: 23899, loss_D: -1.215677, lr_D: 0.000800 2023-06-30 21:03:27,240 - INFO - [Train] step: 23899, loss_G: 252.338135, lr_G: 0.000800 2023-06-30 21:05:09,094 - INFO - [Train] step: 23999, loss_D: -3.013986, lr_D: 0.000800 2023-06-30 21:05:09,175 - INFO - [Train] step: 23999, loss_G: 253.344177, lr_G: 0.000800 2023-06-30 21:06:13,378 - INFO - [Eval] step: 23999, fid: 29.827638 2023-06-30 21:07:54,664 - INFO - [Train] step: 24099, loss_D: -2.109229, lr_D: 0.000800 2023-06-30 21:07:54,745 - INFO - [Train] step: 24099, loss_G: 257.752441, lr_G: 0.000800 2023-06-30 21:09:36,104 - INFO - [Train] step: 24199, loss_D: -2.350919, lr_D: 0.000800 2023-06-30 21:09:36,185 - INFO - [Train] step: 24199, loss_G: 258.692322, lr_G: 0.000800 2023-06-30 21:11:17,473 - INFO - [Train] step: 24299, loss_D: -2.248062, lr_D: 0.000800 2023-06-30 21:11:17,555 - INFO - [Train] step: 24299, loss_G: 257.707031, lr_G: 0.000800 2023-06-30 21:12:58,916 - INFO - [Train] step: 24399, loss_D: -2.838322, lr_D: 0.000800 2023-06-30 21:12:58,998 - INFO - [Train] step: 24399, loss_G: 259.908203, lr_G: 0.000800 2023-06-30 21:14:40,312 - INFO - [Train] step: 24499, loss_D: -2.496197, lr_D: 0.000800 2023-06-30 21:14:40,393 - INFO - [Train] step: 24499, loss_G: 261.031921, lr_G: 0.000800 2023-06-30 21:16:21,866 - INFO - [Train] step: 24599, loss_D: -2.562991, lr_D: 0.000800 2023-06-30 21:16:21,948 - INFO - [Train] step: 24599, loss_G: 261.096558, lr_G: 0.000800 2023-06-30 21:18:03,263 - INFO - [Train] step: 24699, loss_D: -1.651629, lr_D: 0.000800 2023-06-30 21:18:03,344 - INFO - [Train] step: 24699, loss_G: 265.283295, lr_G: 0.000800 2023-06-30 21:19:44,640 - INFO - [Train] step: 24799, loss_D: -2.288223, lr_D: 0.000800 2023-06-30 21:19:44,721 - INFO - [Train] step: 24799, loss_G: 266.250000, lr_G: 0.000800 2023-06-30 21:21:25,992 - INFO - [Train] step: 24899, loss_D: -1.971659, lr_D: 0.000800 2023-06-30 21:21:26,073 - INFO - [Train] step: 24899, loss_G: 270.497864, lr_G: 0.000800 2023-06-30 21:23:07,383 - INFO - [Train] step: 24999, 
loss_D: -2.037721, lr_D: 0.000800 2023-06-30 21:23:07,464 - INFO - [Train] step: 24999, loss_G: 270.945557, lr_G: 0.000800 2023-06-30 21:24:11,698 - INFO - [Eval] step: 24999, fid: 29.715487 2023-06-30 21:25:52,996 - INFO - [Train] step: 25099, loss_D: -2.312455, lr_D: 0.000800 2023-06-30 21:25:53,077 - INFO - [Train] step: 25099, loss_G: 275.845764, lr_G: 0.000800 2023-06-30 21:27:34,382 - INFO - [Train] step: 25199, loss_D: -2.267100, lr_D: 0.000800 2023-06-30 21:27:34,463 - INFO - [Train] step: 25199, loss_G: 274.108459, lr_G: 0.000800 2023-06-30 21:29:15,961 - INFO - [Train] step: 25299, loss_D: -2.783442, lr_D: 0.000800 2023-06-30 21:29:16,042 - INFO - [Train] step: 25299, loss_G: 272.848755, lr_G: 0.000800 2023-06-30 21:30:57,378 - INFO - [Train] step: 25399, loss_D: -2.267543, lr_D: 0.000800 2023-06-30 21:30:57,460 - INFO - [Train] step: 25399, loss_G: 275.490112, lr_G: 0.000800 2023-06-30 21:32:38,781 - INFO - [Train] step: 25499, loss_D: -2.604930, lr_D: 0.000800 2023-06-30 21:32:38,862 - INFO - [Train] step: 25499, loss_G: 279.230225, lr_G: 0.000800 2023-06-30 21:34:20,169 - INFO - [Train] step: 25599, loss_D: -2.521077, lr_D: 0.000800 2023-06-30 21:34:20,250 - INFO - [Train] step: 25599, loss_G: 281.559784, lr_G: 0.000800 2023-06-30 21:36:01,560 - INFO - [Train] step: 25699, loss_D: -1.952807, lr_D: 0.000800 2023-06-30 21:36:01,641 - INFO - [Train] step: 25699, loss_G: 282.933899, lr_G: 0.000800 2023-06-30 21:37:42,962 - INFO - [Train] step: 25799, loss_D: 7.762302, lr_D: 0.000800 2023-06-30 21:37:43,043 - INFO - [Train] step: 25799, loss_G: 286.669067, lr_G: 0.000800 2023-06-30 21:39:24,494 - INFO - [Train] step: 25899, loss_D: -2.252496, lr_D: 0.000800 2023-06-30 21:39:24,576 - INFO - [Train] step: 25899, loss_G: 288.240204, lr_G: 0.000800 2023-06-30 21:41:05,921 - INFO - [Train] step: 25999, loss_D: -3.135201, lr_D: 0.000800 2023-06-30 21:41:06,002 - INFO - [Train] step: 25999, loss_G: 288.846558, lr_G: 0.000800 2023-06-30 21:42:10,224 - INFO - [Eval] step: 25999, fid: 29.644681 2023-06-30 21:43:51,779 - INFO - [Train] step: 26099, loss_D: -2.199665, lr_D: 0.000800 2023-06-30 21:43:51,860 - INFO - [Train] step: 26099, loss_G: 289.676147, lr_G: 0.000800 2023-06-30 21:45:33,194 - INFO - [Train] step: 26199, loss_D: -2.061146, lr_D: 0.000800 2023-06-30 21:45:33,275 - INFO - [Train] step: 26199, loss_G: 290.700043, lr_G: 0.000800 2023-06-30 21:47:14,629 - INFO - [Train] step: 26299, loss_D: -2.298442, lr_D: 0.000800 2023-06-30 21:47:14,710 - INFO - [Train] step: 26299, loss_G: 297.807526, lr_G: 0.000800 2023-06-30 21:48:56,040 - INFO - [Train] step: 26399, loss_D: -2.848533, lr_D: 0.000800 2023-06-30 21:48:56,121 - INFO - [Train] step: 26399, loss_G: 295.691528, lr_G: 0.000800 2023-06-30 21:50:37,422 - INFO - [Train] step: 26499, loss_D: -2.396522, lr_D: 0.000800 2023-06-30 21:50:37,503 - INFO - [Train] step: 26499, loss_G: 297.134155, lr_G: 0.000800 2023-06-30 21:52:18,987 - INFO - [Train] step: 26599, loss_D: -2.358667, lr_D: 0.000800 2023-06-30 21:52:19,068 - INFO - [Train] step: 26599, loss_G: 300.250366, lr_G: 0.000800 2023-06-30 21:54:00,425 - INFO - [Train] step: 26699, loss_D: -2.343750, lr_D: 0.000800 2023-06-30 21:54:00,507 - INFO - [Train] step: 26699, loss_G: 303.103729, lr_G: 0.000800 2023-06-30 21:55:41,915 - INFO - [Train] step: 26799, loss_D: -2.279582, lr_D: 0.000800 2023-06-30 21:55:41,997 - INFO - [Train] step: 26799, loss_G: 301.897095, lr_G: 0.000800 2023-06-30 21:57:23,284 - INFO - [Train] step: 26899, loss_D: -2.497117, lr_D: 0.000800 2023-06-30 
21:57:23,366 - INFO - [Train] step: 26899, loss_G: 306.548798, lr_G: 0.000800 2023-06-30 21:59:04,659 - INFO - [Train] step: 26999, loss_D: -2.424914, lr_D: 0.000800 2023-06-30 21:59:04,741 - INFO - [Train] step: 26999, loss_G: 313.660126, lr_G: 0.000800 2023-06-30 22:00:08,989 - INFO - [Eval] step: 26999, fid: 29.903329 2023-06-30 22:01:50,244 - INFO - [Train] step: 27099, loss_D: -2.573433, lr_D: 0.000800 2023-06-30 22:01:50,325 - INFO - [Train] step: 27099, loss_G: 311.459167, lr_G: 0.000800 2023-06-30 22:03:31,801 - INFO - [Train] step: 27199, loss_D: -1.841810, lr_D: 0.000800 2023-06-30 22:03:31,882 - INFO - [Train] step: 27199, loss_G: 311.668793, lr_G: 0.000800 2023-06-30 22:05:13,197 - INFO - [Train] step: 27299, loss_D: -2.623956, lr_D: 0.000800 2023-06-30 22:05:13,282 - INFO - [Train] step: 27299, loss_G: 314.068146, lr_G: 0.000800 2023-06-30 22:06:54,617 - INFO - [Train] step: 27399, loss_D: -2.550889, lr_D: 0.000800 2023-06-30 22:06:54,698 - INFO - [Train] step: 27399, loss_G: 317.185364, lr_G: 0.000800 2023-06-30 22:08:35,976 - INFO - [Train] step: 27499, loss_D: -2.102031, lr_D: 0.000800 2023-06-30 22:08:36,058 - INFO - [Train] step: 27499, loss_G: 319.160187, lr_G: 0.000800 2023-06-30 22:10:17,351 - INFO - [Train] step: 27599, loss_D: -2.562485, lr_D: 0.000800 2023-06-30 22:10:17,432 - INFO - [Train] step: 27599, loss_G: 321.081299, lr_G: 0.000800 2023-06-30 22:11:58,726 - INFO - [Train] step: 27699, loss_D: -2.647586, lr_D: 0.000800 2023-06-30 22:11:58,807 - INFO - [Train] step: 27699, loss_G: 321.172974, lr_G: 0.000800 2023-06-30 22:13:40,107 - INFO - [Train] step: 27799, loss_D: -2.542240, lr_D: 0.000800 2023-06-30 22:13:40,188 - INFO - [Train] step: 27799, loss_G: 326.429932, lr_G: 0.000800 2023-06-30 22:15:21,606 - INFO - [Train] step: 27899, loss_D: -2.255647, lr_D: 0.000800 2023-06-30 22:15:21,687 - INFO - [Train] step: 27899, loss_G: 324.881622, lr_G: 0.000800 2023-06-30 22:17:02,982 - INFO - [Train] step: 27999, loss_D: -1.976703, lr_D: 0.000800 2023-06-30 22:17:03,063 - INFO - [Train] step: 27999, loss_G: 327.483398, lr_G: 0.000800 2023-06-30 22:18:07,322 - INFO - [Eval] step: 27999, fid: 29.412858 2023-06-30 22:19:48,956 - INFO - [Train] step: 28099, loss_D: -2.617584, lr_D: 0.000800 2023-06-30 22:19:49,038 - INFO - [Train] step: 28099, loss_G: 325.819153, lr_G: 0.000800 2023-06-30 22:21:30,426 - INFO - [Train] step: 28199, loss_D: -2.771311, lr_D: 0.000800 2023-06-30 22:21:30,507 - INFO - [Train] step: 28199, loss_G: 326.512390, lr_G: 0.000800 2023-06-30 22:23:11,885 - INFO - [Train] step: 28299, loss_D: -2.177960, lr_D: 0.000800 2023-06-30 22:23:11,966 - INFO - [Train] step: 28299, loss_G: 327.168152, lr_G: 0.000800 2023-06-30 22:24:53,343 - INFO - [Train] step: 28399, loss_D: -2.477755, lr_D: 0.000800 2023-06-30 22:24:53,423 - INFO - [Train] step: 28399, loss_G: 328.847290, lr_G: 0.000800 2023-06-30 22:26:34,938 - INFO - [Train] step: 28499, loss_D: -1.313727, lr_D: 0.000800 2023-06-30 22:26:35,019 - INFO - [Train] step: 28499, loss_G: 328.050262, lr_G: 0.000800 2023-06-30 22:28:16,351 - INFO - [Train] step: 28599, loss_D: -2.181830, lr_D: 0.000800 2023-06-30 22:28:16,433 - INFO - [Train] step: 28599, loss_G: 333.957062, lr_G: 0.000800 2023-06-30 22:29:57,765 - INFO - [Train] step: 28699, loss_D: -1.928108, lr_D: 0.000800 2023-06-30 22:29:57,846 - INFO - [Train] step: 28699, loss_G: 337.401794, lr_G: 0.000800 2023-06-30 22:31:39,233 - INFO - [Train] step: 28799, loss_D: -1.367324, lr_D: 0.000800 2023-06-30 22:31:39,314 - INFO - [Train] step: 28799, loss_G: 
339.819397, lr_G: 0.000800
2023-06-30 22:33:20,700 - INFO - [Train] step: 28899, loss_D: -2.479833, lr_D: 0.000800
2023-06-30 22:33:20,781 - INFO - [Train] step: 28899, loss_G: 339.079529, lr_G: 0.000800
2023-06-30 22:35:02,153 - INFO - [Train] step: 28999, loss_D: -2.283656, lr_D: 0.000800
2023-06-30 22:35:02,234 - INFO - [Train] step: 28999, loss_G: 341.678833, lr_G: 0.000800
2023-06-30 22:36:06,439 - INFO - [Eval] step: 28999, fid: 29.816814
2023-06-30 22:37:47,815 - INFO - [Train] step: 29099, loss_D: -2.984109, lr_D: 0.000800
2023-06-30 22:37:47,896 - INFO - [Train] step: 29099, loss_G: 343.722870, lr_G: 0.000800
2023-06-30 22:39:29,425 - INFO - [Train] step: 29199, loss_D: -2.023509, lr_D: 0.000800
2023-06-30 22:39:29,507 - INFO - [Train] step: 29199, loss_G: 348.134308, lr_G: 0.000800
2023-06-30 22:41:10,908 - INFO - [Train] step: 29299, loss_D: -2.347546, lr_D: 0.000800
2023-06-30 22:41:10,990 - INFO - [Train] step: 29299, loss_G: 350.042419, lr_G: 0.000800
2023-06-30 22:42:52,350 - INFO - [Train] step: 29399, loss_D: -2.735665, lr_D: 0.000800
2023-06-30 22:42:52,431 - INFO - [Train] step: 29399, loss_G: 349.983337, lr_G: 0.000800
2023-06-30 22:44:33,797 - INFO - [Train] step: 29499, loss_D: -2.450019, lr_D: 0.000800
2023-06-30 22:44:33,878 - INFO - [Train] step: 29499, loss_G: 347.629822, lr_G: 0.000800
2023-06-30 22:46:15,273 - INFO - [Train] step: 29599, loss_D: -2.320687, lr_D: 0.000800
2023-06-30 22:46:15,355 - INFO - [Train] step: 29599, loss_G: 347.213806, lr_G: 0.000800
2023-06-30 22:47:56,718 - INFO - [Train] step: 29699, loss_D: -2.389506, lr_D: 0.000800
2023-06-30 22:47:56,799 - INFO - [Train] step: 29699, loss_G: 350.454407, lr_G: 0.000800
2023-06-30 22:49:38,347 - INFO - [Train] step: 29799, loss_D: -3.027055, lr_D: 0.000800
2023-06-30 22:49:38,429 - INFO - [Train] step: 29799, loss_G: 354.073212, lr_G: 0.000800
2023-06-30 22:51:19,783 - INFO - [Train] step: 29899, loss_D: -2.695169, lr_D: 0.000800
2023-06-30 22:51:19,864 - INFO - [Train] step: 29899, loss_G: 356.786102, lr_G: 0.000800
2023-06-30 22:53:01,226 - INFO - [Train] step: 29999, loss_D: -2.730213, lr_D: 0.000800
2023-06-30 22:53:01,307 - INFO - [Train] step: 29999, loss_G: 357.023621, lr_G: 0.000800
2023-06-30 22:54:05,505 - INFO - [Eval] step: 29999, fid: 29.761918
2023-06-30 22:55:46,920 - INFO - [Train] step: 30099, loss_D: -2.071956, lr_D: 0.000800
2023-06-30 22:55:47,001 - INFO - [Train] step: 30099, loss_G: 358.159058, lr_G: 0.000800
2023-06-30 22:57:28,736 - INFO - [Train] step: 30199, loss_D: -2.497499, lr_D: 0.000800
2023-06-30 22:57:28,817 - INFO - [Train] step: 30199, loss_G: 359.605988, lr_G: 0.000800
2023-06-30 22:59:10,592 - INFO - [Train] step: 30299, loss_D: -1.291651, lr_D: 0.000800
2023-06-30 22:59:10,674 - INFO - [Train] step: 30299, loss_G: 361.755066, lr_G: 0.000800
2023-06-30 23:00:52,575 - INFO - [Train] step: 30399, loss_D: 0.295814, lr_D: 0.000800
2023-06-30 23:00:52,657 - INFO - [Train] step: 30399, loss_G: 365.825134, lr_G: 0.000800
2023-06-30 23:02:34,352 - INFO - [Train] step: 30499, loss_D: -2.848036, lr_D: 0.000800
2023-06-30 23:02:34,434 - INFO - [Train] step: 30499, loss_G: 365.356628, lr_G: 0.000800
2023-06-30 23:04:16,185 - INFO - [Train] step: 30599, loss_D: -1.784161, lr_D: 0.000800
2023-06-30 23:04:16,267 - INFO - [Train] step: 30599, loss_G: 366.816345, lr_G: 0.000800
2023-06-30 23:05:58,014 - INFO - [Train] step: 30699, loss_D: -0.716805, lr_D: 0.000800
2023-06-30 23:05:58,095 - INFO - [Train] step: 30699, loss_G: 368.590515, lr_G: 0.000800
2023-06-30 23:07:39,780 - INFO - [Train] step: 30799, loss_D: -3.089762, lr_D: 0.000800
2023-06-30 23:07:39,862 - INFO - [Train] step: 30799, loss_G: 369.016754, lr_G: 0.000800
2023-06-30 23:09:21,563 - INFO - [Train] step: 30899, loss_D: -2.574355, lr_D: 0.000800
2023-06-30 23:09:21,644 - INFO - [Train] step: 30899, loss_G: 372.218689, lr_G: 0.000800
2023-06-30 23:11:03,418 - INFO - [Train] step: 30999, loss_D: -3.345348, lr_D: 0.000800
2023-06-30 23:11:03,499 - INFO - [Train] step: 30999, loss_G: 372.726074, lr_G: 0.000800
2023-06-30 23:12:07,739 - INFO - [Eval] step: 30999, fid: 29.644446
2023-06-30 23:13:49,219 - INFO - [Train] step: 31099, loss_D: -2.471330, lr_D: 0.000800
2023-06-30 23:13:49,300 - INFO - [Train] step: 31099, loss_G: 371.383118, lr_G: 0.000800
2023-06-30 23:15:30,683 - INFO - [Train] step: 31199, loss_D: -1.356232, lr_D: 0.000800
2023-06-30 23:15:30,764 - INFO - [Train] step: 31199, loss_G: 374.848328, lr_G: 0.000800
2023-06-30 23:17:12,122 - INFO - [Train] step: 31299, loss_D: -2.403287, lr_D: 0.000800
2023-06-30 23:17:12,203 - INFO - [Train] step: 31299, loss_G: 373.229889, lr_G: 0.000800
2023-06-30 23:18:53,564 - INFO - [Train] step: 31399, loss_D: -1.360738, lr_D: 0.000800
2023-06-30 23:18:53,653 - INFO - [Train] step: 31399, loss_G: 376.332550, lr_G: 0.000800
2023-06-30 23:20:35,008 - INFO - [Train] step: 31499, loss_D: -2.582425, lr_D: 0.000800
2023-06-30 23:20:35,089 - INFO - [Train] step: 31499, loss_G: 375.209229, lr_G: 0.000800
2023-06-30 23:22:16,432 - INFO - [Train] step: 31599, loss_D: -2.407710, lr_D: 0.000800
2023-06-30 23:22:16,513 - INFO - [Train] step: 31599, loss_G: 376.163818, lr_G: 0.000800
2023-06-30 23:23:57,991 - INFO - [Train] step: 31699, loss_D: -2.660625, lr_D: 0.000800
2023-06-30 23:23:58,072 - INFO - [Train] step: 31699, loss_G: 375.089478, lr_G: 0.000800
2023-06-30 23:25:39,410 - INFO - [Train] step: 31799, loss_D: -2.484296, lr_D: 0.000800
2023-06-30 23:25:39,491 - INFO - [Train] step: 31799, loss_G: 375.905823, lr_G: 0.000800
2023-06-30 23:27:20,815 - INFO - [Train] step: 31899, loss_D: -2.969232, lr_D: 0.000800
2023-06-30 23:27:20,896 - INFO - [Train] step: 31899, loss_G: 376.660950, lr_G: 0.000800
2023-06-30 23:29:02,230 - INFO - [Train] step: 31999, loss_D: -2.878012, lr_D: 0.000800
2023-06-30 23:29:02,312 - INFO - [Train] step: 31999, loss_G: 378.589172, lr_G: 0.000800
2023-06-30 23:30:06,653 - INFO - [Eval] step: 31999, fid: 29.155440
2023-06-30 23:31:48,270 - INFO - [Train] step: 32099, loss_D: -2.701690, lr_D: 0.000800
2023-06-30 23:31:48,351 - INFO - [Train] step: 32099, loss_G: 379.471985, lr_G: 0.000800
2023-06-30 23:33:29,782 - INFO - [Train] step: 32199, loss_D: -1.023760, lr_D: 0.000800
2023-06-30 23:33:29,863 - INFO - [Train] step: 32199, loss_G: 383.152802, lr_G: 0.000800
2023-06-30 23:35:11,262 - INFO - [Train] step: 32299, loss_D: -2.285961, lr_D: 0.000800
2023-06-30 23:35:11,343 - INFO - [Train] step: 32299, loss_G: 379.264038, lr_G: 0.000800
2023-06-30 23:36:52,944 - INFO - [Train] step: 32399, loss_D: -2.573316, lr_D: 0.000800
2023-06-30 23:36:53,025 - INFO - [Train] step: 32399, loss_G: 380.037964, lr_G: 0.000800
2023-06-30 23:38:34,387 - INFO - [Train] step: 32499, loss_D: -2.506263, lr_D: 0.000800
2023-06-30 23:38:34,468 - INFO - [Train] step: 32499, loss_G: 381.465210, lr_G: 0.000800
2023-06-30 23:40:15,838 - INFO - [Train] step: 32599, loss_D: -0.611246, lr_D: 0.000800
2023-06-30 23:40:15,920 - INFO - [Train] step: 32599, loss_G: 386.681854, lr_G: 0.000800
2023-06-30 23:41:57,365 - INFO - [Train] step: 32699, loss_D: -2.427005, lr_D: 0.000800
2023-06-30 23:41:57,446 - INFO - [Train] step: 32699, loss_G: 387.738037, lr_G: 0.000800
2023-06-30 23:43:38,810 - INFO - [Train] step: 32799, loss_D: -2.737130, lr_D: 0.000800
2023-06-30 23:43:38,892 - INFO - [Train] step: 32799, loss_G: 386.465820, lr_G: 0.000800
2023-06-30 23:45:20,276 - INFO - [Train] step: 32899, loss_D: -1.997162, lr_D: 0.000800
2023-06-30 23:45:20,357 - INFO - [Train] step: 32899, loss_G: 390.333923, lr_G: 0.000800
2023-06-30 23:47:01,873 - INFO - [Train] step: 32999, loss_D: -2.580127, lr_D: 0.000800
2023-06-30 23:47:01,955 - INFO - [Train] step: 32999, loss_G: 391.326019, lr_G: 0.000800
2023-06-30 23:48:06,240 - INFO - [Eval] step: 32999, fid: 29.292619
2023-06-30 23:49:47,581 - INFO - [Train] step: 33099, loss_D: -2.263976, lr_D: 0.000800
2023-06-30 23:49:47,662 - INFO - [Train] step: 33099, loss_G: 388.182495, lr_G: 0.000800
2023-06-30 23:51:29,044 - INFO - [Train] step: 33199, loss_D: -2.391199, lr_D: 0.000800
2023-06-30 23:51:29,125 - INFO - [Train] step: 33199, loss_G: 391.492065, lr_G: 0.000800
2023-06-30 23:53:10,480 - INFO - [Train] step: 33299, loss_D: 0.601247, lr_D: 0.000800
2023-06-30 23:53:10,561 - INFO - [Train] step: 33299, loss_G: 391.666626, lr_G: 0.000800
2023-06-30 23:54:51,944 - INFO - [Train] step: 33399, loss_D: -2.521722, lr_D: 0.000800
2023-06-30 23:54:52,025 - INFO - [Train] step: 33399, loss_G: 391.218658, lr_G: 0.000800
2023-06-30 23:56:33,460 - INFO - [Train] step: 33499, loss_D: -2.545768, lr_D: 0.000800
2023-06-30 23:56:33,541 - INFO - [Train] step: 33499, loss_G: 392.872620, lr_G: 0.000800
2023-06-30 23:58:14,912 - INFO - [Train] step: 33599, loss_D: -3.128125, lr_D: 0.000800
2023-06-30 23:58:14,993 - INFO - [Train] step: 33599, loss_G: 394.236938, lr_G: 0.000800
2023-06-30 23:59:56,468 - INFO - [Train] step: 33699, loss_D: -1.851955, lr_D: 0.000800
2023-06-30 23:59:56,549 - INFO - [Train] step: 33699, loss_G: 396.546570, lr_G: 0.000800
2023-07-01 00:01:37,864 - INFO - [Train] step: 33799, loss_D: -3.594588, lr_D: 0.000800
2023-07-01 00:01:37,945 - INFO - [Train] step: 33799, loss_G: 400.591278, lr_G: 0.000800
2023-07-01 00:03:19,291 - INFO - [Train] step: 33899, loss_D: 0.897312, lr_D: 0.000800
2023-07-01 00:03:19,373 - INFO - [Train] step: 33899, loss_G: 401.186737, lr_G: 0.000800
2023-07-01 00:05:00,757 - INFO - [Train] step: 33999, loss_D: -2.829364, lr_D: 0.000800
2023-07-01 00:05:00,838 - INFO - [Train] step: 33999, loss_G: 403.535706, lr_G: 0.000800
2023-07-01 00:06:05,029 - INFO - [Eval] step: 33999, fid: 29.315158
2023-07-01 00:07:46,348 - INFO - [Train] step: 34099, loss_D: 0.481018, lr_D: 0.000800
2023-07-01 00:07:46,430 - INFO - [Train] step: 34099, loss_G: 407.253723, lr_G: 0.000800
2023-07-01 00:09:27,817 - INFO - [Train] step: 34199, loss_D: -2.767841, lr_D: 0.000800
2023-07-01 00:09:27,898 - INFO - [Train] step: 34199, loss_G: 411.204712, lr_G: 0.000800
2023-07-01 00:11:09,441 - INFO - [Train] step: 34299, loss_D: -0.613835, lr_D: 0.000800
2023-07-01 00:11:09,522 - INFO - [Train] step: 34299, loss_G: 411.659210, lr_G: 0.000800
2023-07-01 00:12:50,902 - INFO - [Train] step: 34399, loss_D: -3.116790, lr_D: 0.000800
2023-07-01 00:12:50,984 - INFO - [Train] step: 34399, loss_G: 416.016663, lr_G: 0.000800
2023-07-01 00:14:32,326 - INFO - [Train] step: 34499, loss_D: -2.776420, lr_D: 0.000800
2023-07-01 00:14:32,407 - INFO - [Train] step: 34499, loss_G: 420.120453, lr_G: 0.000800
2023-07-01 00:16:13,761 - INFO - [Train] step: 34599, loss_D: 0.989557, lr_D: 0.000800
2023-07-01 00:16:13,842 - INFO - [Train] step: 34599, loss_G: 420.401001, lr_G: 0.000800
2023-07-01 00:17:55,185 - INFO - [Train] step: 34699, loss_D: -2.256785, lr_D: 0.000800
2023-07-01 00:17:55,266 - INFO - [Train] step: 34699, loss_G: 416.899597, lr_G: 0.000800
2023-07-01 00:19:36,611 - INFO - [Train] step: 34799, loss_D: -2.196185, lr_D: 0.000800
2023-07-01 00:19:36,693 - INFO - [Train] step: 34799, loss_G: 423.381165, lr_G: 0.000800
2023-07-01 00:21:18,068 - INFO - [Train] step: 34899, loss_D: -3.141039, lr_D: 0.000800
2023-07-01 00:21:18,149 - INFO - [Train] step: 34899, loss_G: 428.075348, lr_G: 0.000800
2023-07-01 00:22:59,723 - INFO - [Train] step: 34999, loss_D: -2.935205, lr_D: 0.000800
2023-07-01 00:22:59,804 - INFO - [Train] step: 34999, loss_G: 424.486328, lr_G: 0.000800
2023-07-01 00:24:03,991 - INFO - [Eval] step: 34999, fid: 29.535667
2023-07-01 00:25:45,362 - INFO - [Train] step: 35099, loss_D: -2.748412, lr_D: 0.000800
2023-07-01 00:25:45,443 - INFO - [Train] step: 35099, loss_G: 424.003662, lr_G: 0.000800
2023-07-01 00:27:26,819 - INFO - [Train] step: 35199, loss_D: -2.839583, lr_D: 0.000800
2023-07-01 00:27:26,900 - INFO - [Train] step: 35199, loss_G: 428.727173, lr_G: 0.000800
2023-07-01 00:29:08,284 - INFO - [Train] step: 35299, loss_D: -2.972176, lr_D: 0.000800
2023-07-01 00:29:08,365 - INFO - [Train] step: 35299, loss_G: 429.516113, lr_G: 0.000800
2023-07-01 00:30:49,728 - INFO - [Train] step: 35399, loss_D: -2.443041, lr_D: 0.000800
2023-07-01 00:30:49,809 - INFO - [Train] step: 35399, loss_G: 429.157349, lr_G: 0.000800
2023-07-01 00:32:31,207 - INFO - [Train] step: 35499, loss_D: -2.949633, lr_D: 0.000800
2023-07-01 00:32:31,288 - INFO - [Train] step: 35499, loss_G: 429.553711, lr_G: 0.000800
2023-07-01 00:34:12,776 - INFO - [Train] step: 35599, loss_D: -2.697312, lr_D: 0.000800
2023-07-01 00:34:12,857 - INFO - [Train] step: 35599, loss_G: 431.439392, lr_G: 0.000800
2023-07-01 00:35:54,226 - INFO - [Train] step: 35699, loss_D: -0.927166, lr_D: 0.000800
2023-07-01 00:35:54,307 - INFO - [Train] step: 35699, loss_G: 434.296326, lr_G: 0.000800
2023-07-01 00:37:35,661 - INFO - [Train] step: 35799, loss_D: -3.029692, lr_D: 0.000800
2023-07-01 00:37:35,743 - INFO - [Train] step: 35799, loss_G: 436.045349, lr_G: 0.000800
2023-07-01 00:39:17,155 - INFO - [Train] step: 35899, loss_D: -2.785388, lr_D: 0.000800
2023-07-01 00:39:17,237 - INFO - [Train] step: 35899, loss_G: 438.327148, lr_G: 0.000800
2023-07-01 00:40:58,584 - INFO - [Train] step: 35999, loss_D: -1.492091, lr_D: 0.000800
2023-07-01 00:40:58,665 - INFO - [Train] step: 35999, loss_G: 441.354370, lr_G: 0.000800
2023-07-01 00:42:02,882 - INFO - [Eval] step: 35999, fid: 29.599504
2023-07-01 00:43:44,148 - INFO - [Train] step: 36099, loss_D: -2.497011, lr_D: 0.000800
2023-07-01 00:43:44,230 - INFO - [Train] step: 36099, loss_G: 442.932434, lr_G: 0.000800
2023-07-01 00:45:25,536 - INFO - [Train] step: 36199, loss_D: -1.837285, lr_D: 0.000800
2023-07-01 00:45:25,617 - INFO - [Train] step: 36199, loss_G: 445.760620, lr_G: 0.000800
2023-07-01 00:47:07,099 - INFO - [Train] step: 36299, loss_D: -2.929988, lr_D: 0.000800
2023-07-01 00:47:07,180 - INFO - [Train] step: 36299, loss_G: 447.919189, lr_G: 0.000800
2023-07-01 00:48:48,504 - INFO - [Train] step: 36399, loss_D: -4.339283, lr_D: 0.000800
2023-07-01 00:48:48,585 - INFO - [Train] step: 36399, loss_G: 449.712097, lr_G: 0.000800
2023-07-01 00:50:29,889 - INFO - [Train] step: 36499, loss_D: -2.993167, lr_D: 0.000800
2023-07-01 00:50:29,970 - INFO - [Train] step: 36499, loss_G: 451.343201, lr_G: 0.000800
2023-07-01 00:52:11,260 - INFO - [Train] step: 36599, loss_D: -3.054156, lr_D: 0.000800
2023-07-01 00:52:11,341 - INFO - [Train] step: 36599, loss_G: 451.466248, lr_G: 0.000800
2023-07-01 00:53:52,609 - INFO - [Train] step: 36699, loss_D: -2.877662, lr_D: 0.000800
2023-07-01 00:53:52,690 - INFO - [Train] step: 36699, loss_G: 456.095520, lr_G: 0.000800
2023-07-01 00:55:33,995 - INFO - [Train] step: 36799, loss_D: -3.209817, lr_D: 0.000800
2023-07-01 00:55:34,077 - INFO - [Train] step: 36799, loss_G: 454.479492, lr_G: 0.000800
2023-07-01 00:57:15,517 - INFO - [Train] step: 36899, loss_D: -2.630890, lr_D: 0.000800
2023-07-01 00:57:15,598 - INFO - [Train] step: 36899, loss_G: 453.285675, lr_G: 0.000800
2023-07-01 00:58:56,937 - INFO - [Train] step: 36999, loss_D: -2.578079, lr_D: 0.000800
2023-07-01 00:58:57,019 - INFO - [Train] step: 36999, loss_G: 457.827515, lr_G: 0.000800
2023-07-01 01:00:01,194 - INFO - [Eval] step: 36999, fid: 29.266871
2023-07-01 01:01:42,486 - INFO - [Train] step: 37099, loss_D: -3.983014, lr_D: 0.000800
2023-07-01 01:01:42,568 - INFO - [Train] step: 37099, loss_G: 463.444916, lr_G: 0.000800
2023-07-01 01:03:23,917 - INFO - [Train] step: 37199, loss_D: -3.300295, lr_D: 0.000800
2023-07-01 01:03:23,999 - INFO - [Train] step: 37199, loss_G: 465.628174, lr_G: 0.000800
2023-07-01 01:05:05,382 - INFO - [Train] step: 37299, loss_D: -2.806351, lr_D: 0.000800
2023-07-01 01:05:05,464 - INFO - [Train] step: 37299, loss_G: 467.073822, lr_G: 0.000800
2023-07-01 01:06:46,843 - INFO - [Train] step: 37399, loss_D: -2.604542, lr_D: 0.000800
2023-07-01 01:06:46,924 - INFO - [Train] step: 37399, loss_G: 464.637726, lr_G: 0.000800
2023-07-01 01:08:28,295 - INFO - [Train] step: 37499, loss_D: -2.967821, lr_D: 0.000800
2023-07-01 01:08:28,376 - INFO - [Train] step: 37499, loss_G: 467.642609, lr_G: 0.000800
2023-07-01 01:10:09,903 - INFO - [Train] step: 37599, loss_D: -3.276244, lr_D: 0.000800
2023-07-01 01:10:09,985 - INFO - [Train] step: 37599, loss_G: 471.536316, lr_G: 0.000800
2023-07-01 01:11:51,372 - INFO - [Train] step: 37699, loss_D: -3.017306, lr_D: 0.000800
2023-07-01 01:11:51,453 - INFO - [Train] step: 37699, loss_G: 472.022980, lr_G: 0.000800
2023-07-01 01:13:32,830 - INFO - [Train] step: 37799, loss_D: -0.885573, lr_D: 0.000800
2023-07-01 01:13:32,912 - INFO - [Train] step: 37799, loss_G: 472.615051, lr_G: 0.000800
2023-07-01 01:15:14,315 - INFO - [Train] step: 37899, loss_D: -1.469989, lr_D: 0.000800
2023-07-01 01:15:14,397 - INFO - [Train] step: 37899, loss_G: 476.197815, lr_G: 0.000800
2023-07-01 01:16:55,780 - INFO - [Train] step: 37999, loss_D: -2.606544, lr_D: 0.000800
2023-07-01 01:16:55,861 - INFO - [Train] step: 37999, loss_G: 475.610413, lr_G: 0.000800
2023-07-01 01:18:00,111 - INFO - [Eval] step: 37999, fid: 29.863294
2023-07-01 01:19:41,449 - INFO - [Train] step: 38099, loss_D: -2.417771, lr_D: 0.000800
2023-07-01 01:19:41,530 - INFO - [Train] step: 38099, loss_G: 480.615845, lr_G: 0.000800
2023-07-01 01:21:23,049 - INFO - [Train] step: 38199, loss_D: -2.757498, lr_D: 0.000800
2023-07-01 01:21:23,131 - INFO - [Train] step: 38199, loss_G: 480.763916, lr_G: 0.000800
2023-07-01 01:23:04,532 - INFO - [Train] step: 38299, loss_D: -1.626531, lr_D: 0.000800
2023-07-01 01:23:04,613 - INFO - [Train] step: 38299, loss_G: 484.418030, lr_G: 0.000800
2023-07-01 01:24:46,002 - INFO - [Train] step: 38399, loss_D: -2.568066, lr_D: 0.000800
2023-07-01 01:24:46,083 - INFO - [Train] step: 38399, loss_G: 486.394226, lr_G: 0.000800
2023-07-01 01:26:27,430 - INFO - [Train] step: 38499, loss_D: -2.203668, lr_D: 0.000800
2023-07-01 01:26:27,512 - INFO - [Train] step: 38499, loss_G: 487.039154, lr_G: 0.000800
2023-07-01 01:28:08,846 - INFO - [Train] step: 38599, loss_D: -3.121551, lr_D: 0.000800
2023-07-01 01:28:08,927 - INFO - [Train] step: 38599, loss_G: 485.324005, lr_G: 0.000800
2023-07-01 01:29:50,301 - INFO - [Train] step: 38699, loss_D: -1.955580, lr_D: 0.000800
2023-07-01 01:29:50,382 - INFO - [Train] step: 38699, loss_G: 490.187683, lr_G: 0.000800
2023-07-01 01:31:31,750 - INFO - [Train] step: 38799, loss_D: -1.425967, lr_D: 0.000800
2023-07-01 01:31:31,831 - INFO - [Train] step: 38799, loss_G: 498.296875, lr_G: 0.000800
2023-07-01 01:33:13,331 - INFO - [Train] step: 38899, loss_D: -2.722102, lr_D: 0.000800
2023-07-01 01:33:13,413 - INFO - [Train] step: 38899, loss_G: 494.252991, lr_G: 0.000800
2023-07-01 01:34:54,792 - INFO - [Train] step: 38999, loss_D: -2.565505, lr_D: 0.000800
2023-07-01 01:34:54,873 - INFO - [Train] step: 38999, loss_G: 497.036133, lr_G: 0.000800
2023-07-01 01:35:59,037 - INFO - [Eval] step: 38999, fid: 28.997234
2023-07-01 01:37:40,634 - INFO - [Train] step: 39099, loss_D: -3.082582, lr_D: 0.000800
2023-07-01 01:37:40,715 - INFO - [Train] step: 39099, loss_G: 502.368835, lr_G: 0.000800
2023-07-01 01:39:22,042 - INFO - [Train] step: 39199, loss_D: -0.691785, lr_D: 0.000800
2023-07-01 01:39:22,123 - INFO - [Train] step: 39199, loss_G: 502.713470, lr_G: 0.000800
2023-07-01 01:41:03,460 - INFO - [Train] step: 39299, loss_D: -3.035886, lr_D: 0.000800
2023-07-01 01:41:03,541 - INFO - [Train] step: 39299, loss_G: 501.627502, lr_G: 0.000800
2023-07-01 01:42:44,869 - INFO - [Train] step: 39399, loss_D: -2.330028, lr_D: 0.000800
2023-07-01 01:42:44,950 - INFO - [Train] step: 39399, loss_G: 503.046112, lr_G: 0.000800
2023-07-01 01:44:26,424 - INFO - [Train] step: 39499, loss_D: 0.350765, lr_D: 0.000800
2023-07-01 01:44:26,505 - INFO - [Train] step: 39499, loss_G: 507.593872, lr_G: 0.000800
2023-07-01 01:46:07,866 - INFO - [Train] step: 39599, loss_D: -2.659655, lr_D: 0.000800
2023-07-01 01:46:07,947 - INFO - [Train] step: 39599, loss_G: 508.421631, lr_G: 0.000800
2023-07-01 01:47:49,243 - INFO - [Train] step: 39699, loss_D: -3.457156, lr_D: 0.000800
2023-07-01 01:47:49,324 - INFO - [Train] step: 39699, loss_G: 508.093262, lr_G: 0.000800
2023-07-01 01:49:30,717 - INFO - [Train] step: 39799, loss_D: -2.838082, lr_D: 0.000800
2023-07-01 01:49:30,799 - INFO - [Train] step: 39799, loss_G: 509.928345, lr_G: 0.000800
2023-07-01 01:51:12,116 - INFO - [Train] step: 39899, loss_D: -3.344132, lr_D: 0.000800
2023-07-01 01:51:12,197 - INFO - [Train] step: 39899, loss_G: 514.257446, lr_G: 0.000800
2023-07-01 01:52:53,509 - INFO - [Train] step: 39999, loss_D: -1.528742, lr_D: 0.000800
2023-07-01 01:52:53,591 - INFO - [Train] step: 39999, loss_G: 511.488281, lr_G: 0.000800
2023-07-01 01:53:57,766 - INFO - [Eval] step: 39999, fid: 28.738332
2023-07-01 01:53:58,251 - INFO - Best FID score: 28.738331834471126
2023-07-01 01:53:58,252 - INFO - End of training
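
Note: the FID curve summarized by the "Best FID score" line above can be recovered directly from the [Eval] records. The following is a minimal, illustrative sketch (not part of the training code) that scans a log in this format and reports the lowest FID; the file path is a placeholder for wherever this log is stored.

    import re

    # Placeholder path -- point this at the actual log file for the run.
    LOG_PATH = "./runs/wgan_gp_cifar10/train.log"

    # Matches eval records of the form:
    #   2023-07-01 01:53:57,766 - INFO - [Eval] step: 39999, fid: 28.738332
    EVAL_RE = re.compile(r"\[Eval\] step: (\d+), fid: ([0-9.]+)")

    def read_fid_curve(path):
        """Return (step, fid) pairs in the order they appear in the log."""
        with open(path, "r") as f:
            text = f.read()
        return [(int(step), float(fid)) for step, fid in EVAL_RE.findall(text)]

    if __name__ == "__main__":
        curve = read_fid_curve(LOG_PATH)
        if curve:
            best_step, best_fid = min(curve, key=lambda p: p[1])
            print(f"{len(curve)} evaluations; best FID {best_fid:.6f} at step {best_step}")

Run against the log above, this should report the lowest logged FID (28.738332 at step 39999), consistent with the "Best FID score" summary line at the end of training.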