2023-06-19 16:18:20,043 - INFO - Experiment directory: ./runs/gan_r1reg_cifar10/
2023-06-19 16:18:20,043 - INFO - Number of processes: 1
2023-06-19 16:18:20,043 - INFO - Distributed type: DistributedType.NO
2023-06-19 16:18:20,043 - INFO - Mixed precision: no
2023-06-19 16:18:20,043 - INFO - ==============================
2023-06-19 16:18:20,327 - INFO - Size of training set: 50000
2023-06-19 16:18:20,327 - INFO - Batch size per process: 512
2023-06-19 16:18:20,327 - INFO - Total batch size: 512
2023-06-19 16:18:20,327 - INFO - ==============================
2023-06-19 16:18:20,879 - INFO - Start training...
2023-06-19 16:20:02,403 - INFO - [Train] step: 99, loss_D: 0.630572, lr_D: 0.000800
2023-06-19 16:20:02,484 - INFO - [Train] step: 99, loss_G: 1.592984, lr_G: 0.000800
2023-06-19 16:21:44,011 - INFO - [Train] step: 199, loss_D: 0.631153, lr_D: 0.000800
2023-06-19 16:21:44,093 - INFO - [Train] step: 199, loss_G: 1.318275, lr_G: 0.000800
2023-06-19 16:23:25,742 - INFO - [Train] step: 299, loss_D: 0.679171, lr_D: 0.000800
2023-06-19 16:23:25,824 - INFO - [Train] step: 299, loss_G: 1.425788, lr_G: 0.000800
2023-06-19 16:25:07,472 - INFO - [Train] step: 399, loss_D: 0.627571, lr_D: 0.000800
2023-06-19 16:25:07,561 - INFO - [Train] step: 399, loss_G: 0.911464, lr_G: 0.000800
2023-06-19 16:26:49,206 - INFO - [Train] step: 499, loss_D: 0.734117, lr_D: 0.000800
2023-06-19 16:26:49,288 - INFO - [Train] step: 499, loss_G: 1.798515, lr_G: 0.000800
2023-06-19 16:28:30,885 - INFO - [Train] step: 599, loss_D: 0.676259, lr_D: 0.000800
2023-06-19 16:28:30,967 - INFO - [Train] step: 599, loss_G: 0.764190, lr_G: 0.000800
2023-06-19 16:30:12,769 - INFO - [Train] step: 699, loss_D: 0.662621, lr_D: 0.000800
2023-06-19 16:30:12,851 - INFO - [Train] step: 699, loss_G: 0.969821, lr_G: 0.000800
2023-06-19 16:31:54,521 - INFO - [Train] step: 799, loss_D: 0.667231, lr_D: 0.000800
2023-06-19 16:31:54,602 - INFO - [Train] step: 799, loss_G: 1.157355, lr_G: 0.000800
2023-06-19 16:33:36,244 - INFO - [Train] step: 899, loss_D: 0.640085, lr_D: 0.000800
2023-06-19 16:33:36,326 - INFO - [Train] step: 899, loss_G: 0.900922, lr_G: 0.000800
2023-06-19 16:35:17,998 - INFO - [Train] step: 999, loss_D: 0.653502, lr_D: 0.000800
2023-06-19 16:35:18,080 - INFO - [Train] step: 999, loss_G: 0.826323, lr_G: 0.000800
2023-06-19 16:36:22,002 - INFO - [Eval] step: 999, fid: 121.418935
2023-06-19 16:38:03,667 - INFO - [Train] step: 1099, loss_D: 0.689157, lr_D: 0.000800
2023-06-19 16:38:03,749 - INFO - [Train] step: 1099, loss_G: 0.547471, lr_G: 0.000800
2023-06-19 16:39:45,448 - INFO - [Train] step: 1199, loss_D: 0.637257, lr_D: 0.000800
2023-06-19 16:39:45,529 - INFO - [Train] step: 1199, loss_G: 0.792830, lr_G: 0.000800
2023-06-19 16:41:27,350 - INFO - [Train] step: 1299, loss_D: 0.621763, lr_D: 0.000800
2023-06-19 16:41:27,432 - INFO - [Train] step: 1299, loss_G: 0.884128, lr_G: 0.000800
2023-06-19 16:43:09,090 - INFO - [Train] step: 1399, loss_D: 0.681109, lr_D: 0.000800
2023-06-19 16:43:09,171 - INFO - [Train] step: 1399, loss_G: 0.744074, lr_G: 0.000800
2023-06-19 16:44:50,889 - INFO - [Train] step: 1499, loss_D: 0.610626, lr_D: 0.000800
2023-06-19 16:44:50,970 - INFO - [Train] step: 1499, loss_G: 1.058784, lr_G: 0.000800
2023-06-19 16:46:32,664 - INFO - [Train] step: 1599, loss_D: 0.644191, lr_D: 0.000800
2023-06-19 16:46:32,746 - INFO - [Train] step: 1599, loss_G: 0.779935, lr_G: 0.000800
2023-06-19 16:48:14,426 - INFO - [Train] step: 1699, loss_D: 0.653259, lr_D: 0.000800
2023-06-19 16:48:14,508 - INFO - [Train] step: 1699, loss_G: 0.822730, lr_G: 0.000800
2023-06-19 16:49:56,172 - INFO - [Train] step: 1799, loss_D: 0.656331, lr_D: 0.000800
2023-06-19 16:49:56,253 - INFO - [Train] step: 1799, loss_G: 0.936053, lr_G: 0.000800
2023-06-19 16:51:37,873 - INFO - [Train] step: 1899, loss_D: 0.652987, lr_D: 0.000800
2023-06-19 16:51:37,955 - INFO - [Train] step: 1899, loss_G: 0.805230, lr_G: 0.000800
2023-06-19 16:53:19,772 - INFO - [Train] step: 1999, loss_D: 0.654512, lr_D: 0.000800
2023-06-19 16:53:19,854 - INFO - [Train] step: 1999, loss_G: 0.798084, lr_G: 0.000800
2023-06-19 16:54:23,735 - INFO - [Eval] step: 1999, fid: 65.381408
2023-06-19 16:56:05,557 - INFO - [Train] step: 2099, loss_D: 0.692581, lr_D: 0.000800
2023-06-19 16:56:05,639 - INFO - [Train] step: 2099, loss_G: 0.700422, lr_G: 0.000800
2023-06-19 16:57:47,344 - INFO - [Train] step: 2199, loss_D: 0.670529, lr_D: 0.000800
2023-06-19 16:57:47,426 - INFO - [Train] step: 2199, loss_G: 0.763524, lr_G: 0.000800
2023-06-19 16:59:29,142 - INFO - [Train] step: 2299, loss_D: 0.658742, lr_D: 0.000800
2023-06-19 16:59:29,223 - INFO - [Train] step: 2299, loss_G: 0.829960, lr_G: 0.000800
2023-06-19 17:01:10,918 - INFO - [Train] step: 2399, loss_D: 0.679070, lr_D: 0.000800
2023-06-19 17:01:10,999 - INFO - [Train] step: 2399, loss_G: 0.872157, lr_G: 0.000800
2023-06-19 17:02:52,710 - INFO - [Train] step: 2499, loss_D: 0.659195, lr_D: 0.000800
2023-06-19 17:02:52,792 - INFO - [Train] step: 2499, loss_G: 0.812520, lr_G: 0.000800
2023-06-19 17:04:34,651 - INFO - [Train] step: 2599, loss_D: 0.655010, lr_D: 0.000800
2023-06-19 17:04:34,732 - INFO - [Train] step: 2599, loss_G: 0.776013, lr_G: 0.000800
2023-06-19 17:06:16,420 - INFO - [Train] step: 2699, loss_D: 0.661451, lr_D: 0.000800
2023-06-19 17:06:16,502 - INFO - [Train] step: 2699, loss_G: 0.750409, lr_G: 0.000800
2023-06-19 17:07:58,174 - INFO - [Train] step: 2799, loss_D: 0.674862, lr_D: 0.000800
2023-06-19 17:07:58,256 - INFO - [Train] step: 2799, loss_G: 0.879265, lr_G: 0.000800
2023-06-19 17:09:39,921 - INFO - [Train] step: 2899, loss_D: 0.669296, lr_D: 0.000800
2023-06-19 17:09:40,002 - INFO - [Train] step: 2899, loss_G: 0.779422, lr_G: 0.000800
2023-06-19 17:11:21,655 - INFO - [Train] step: 2999, loss_D: 0.666309, lr_D: 0.000800
2023-06-19 17:11:21,736 - INFO - [Train] step: 2999, loss_G: 0.691414, lr_G: 0.000800
2023-06-19 17:12:25,710 - INFO - [Eval] step: 2999, fid: 60.139611
2023-06-19 17:14:07,325 - INFO - [Train] step: 3099, loss_D: 0.668169, lr_D: 0.000800
2023-06-19 17:14:07,407 - INFO - [Train] step: 3099, loss_G: 0.730235, lr_G: 0.000800
2023-06-19 17:15:48,961 - INFO - [Train] step: 3199, loss_D: 0.669945, lr_D: 0.000800
2023-06-19 17:15:49,042 - INFO - [Train] step: 3199, loss_G: 0.658206, lr_G: 0.000800
2023-06-19 17:17:30,774 - INFO - [Train] step: 3299, loss_D: 0.668016, lr_D: 0.000800
2023-06-19 17:17:30,856 - INFO - [Train] step: 3299, loss_G: 0.777101, lr_G: 0.000800
2023-06-19 17:19:12,406 - INFO - [Train] step: 3399, loss_D: 0.670470, lr_D: 0.000800
2023-06-19 17:19:12,487 - INFO - [Train] step: 3399, loss_G: 0.771688, lr_G: 0.000800
2023-06-19 17:20:54,079 - INFO - [Train] step: 3499, loss_D: 0.668364, lr_D: 0.000800
2023-06-19 17:20:54,161 - INFO - [Train] step: 3499, loss_G: 0.770079, lr_G: 0.000800
2023-06-19 17:22:35,716 - INFO - [Train] step: 3599, loss_D: 0.667798, lr_D: 0.000800
2023-06-19 17:22:35,798 - INFO - [Train] step: 3599, loss_G: 0.749209, lr_G: 0.000800
2023-06-19 17:24:17,377 - INFO - [Train] step: 3699, loss_D: 0.629803, lr_D: 0.000800
2023-06-19 17:24:17,459 - INFO - [Train] step: 3699, loss_G: 0.873855, lr_G: 0.000800
2023-06-19 17:25:59,016 - INFO - [Train] step: 3799, loss_D: 0.674705, lr_D: 0.000800
2023-06-19 17:25:59,098 - INFO - [Train] step: 3799, loss_G: 0.724041, lr_G: 0.000800
2023-06-19 17:27:40,810 - INFO - [Train] step: 3899, loss_D: 0.680453, lr_D: 0.000800
2023-06-19 17:27:40,891 - INFO - [Train] step: 3899, loss_G: 0.708225, lr_G: 0.000800
2023-06-19 17:29:22,415 - INFO - [Train] step: 3999, loss_D: 0.689825, lr_D: 0.000800
2023-06-19 17:29:22,497 - INFO - [Train] step: 3999, loss_G: 0.704541, lr_G: 0.000800
2023-06-19 17:30:26,512 - INFO - [Eval] step: 3999, fid: 47.273956
2023-06-19 17:32:08,242 - INFO - [Train] step: 4099, loss_D: 0.685012, lr_D: 0.000800
2023-06-19 17:32:08,324 - INFO - [Train] step: 4099, loss_G: 0.683307, lr_G: 0.000800
2023-06-19 17:33:50,050 - INFO - [Train] step: 4199, loss_D: 0.677231, lr_D: 0.000800
2023-06-19 17:33:50,132 - INFO - [Train] step: 4199, loss_G: 0.745005, lr_G: 0.000800
2023-06-19 17:35:31,874 - INFO - [Train] step: 4299, loss_D: 0.677819, lr_D: 0.000800
2023-06-19 17:35:31,956 - INFO - [Train] step: 4299, loss_G: 0.672230, lr_G: 0.000800
2023-06-19 17:37:13,717 - INFO - [Train] step: 4399, loss_D: 0.675707, lr_D: 0.000800
2023-06-19 17:37:13,798 - INFO - [Train] step: 4399, loss_G: 0.725379, lr_G: 0.000800
2023-06-19 17:38:55,513 - INFO - [Train] step: 4499, loss_D: 0.675966, lr_D: 0.000800
2023-06-19 17:38:55,594 - INFO - [Train] step: 4499, loss_G: 0.733016, lr_G: 0.000800
2023-06-19 17:40:37,468 - INFO - [Train] step: 4599, loss_D: 0.677952, lr_D: 0.000800
2023-06-19 17:40:37,550 - INFO - [Train] step: 4599, loss_G: 0.783288, lr_G: 0.000800
2023-06-19 17:42:19,316 - INFO - [Train] step: 4699, loss_D: 0.669699, lr_D: 0.000800
2023-06-19 17:42:19,398 - INFO - [Train] step: 4699, loss_G: 0.681148, lr_G: 0.000800
2023-06-19 17:44:01,096 - INFO - [Train] step: 4799, loss_D: 0.672987, lr_D: 0.000800
2023-06-19 17:44:01,177 - INFO - [Train] step: 4799, loss_G: 0.797471, lr_G: 0.000800
2023-06-19 17:45:42,917 - INFO - [Train] step: 4899, loss_D: 0.677501, lr_D: 0.000800
2023-06-19 17:45:42,998 - INFO - [Train] step: 4899, loss_G: 0.727849, lr_G: 0.000800
2023-06-19 17:47:24,785 - INFO - [Train] step: 4999, loss_D: 0.676733, lr_D: 0.000800
2023-06-19 17:47:24,867 - INFO - [Train] step: 4999, loss_G: 0.753475, lr_G: 0.000800
2023-06-19 17:48:28,906 - INFO - [Eval] step: 4999, fid: 39.778054
2023-06-19 17:50:10,375 - INFO - [Train] step: 5099, loss_D: 0.677156, lr_D: 0.000800
2023-06-19 17:50:10,456 - INFO - [Train] step: 5099, loss_G: 0.732821, lr_G: 0.000800
2023-06-19 17:51:51,891 - INFO - [Train] step: 5199, loss_D: 0.672936, lr_D: 0.000800
2023-06-19 17:51:51,972 - INFO - [Train] step: 5199, loss_G: 0.730051, lr_G: 0.000800
2023-06-19 17:53:33,201 - INFO - [Train] step: 5299, loss_D: 0.675087, lr_D: 0.000800
2023-06-19 17:53:33,283 - INFO - [Train] step: 5299, loss_G: 0.726818, lr_G: 0.000800
2023-06-19 17:55:14,477 - INFO - [Train] step: 5399, loss_D: 0.678691, lr_D: 0.000800
2023-06-19 17:55:14,558 - INFO - [Train] step: 5399, loss_G: 0.745096, lr_G: 0.000800
2023-06-19 17:56:55,772 - INFO - [Train] step: 5499, loss_D: 0.678806, lr_D: 0.000800
2023-06-19 17:56:55,853 - INFO - [Train] step: 5499, loss_G: 0.731462, lr_G: 0.000800
2023-06-19 17:58:37,067 - INFO - [Train] step: 5599, loss_D: 0.677838, lr_D: 0.000800
2023-06-19 17:58:37,148 - INFO - [Train] step: 5599, loss_G: 0.657271, lr_G: 0.000800
2023-06-19 18:00:18,346 - INFO - [Train] step: 5699, loss_D: 0.674916, lr_D: 0.000800
2023-06-19 18:00:18,427 - INFO - [Train] step: 5699, loss_G: 0.759198, lr_G: 0.000800
2023-06-19 18:01:59,624 - INFO - [Train] step: 5799, loss_D: 0.681174, lr_D: 0.000800
2023-06-19 18:01:59,705 - INFO - [Train] step: 5799, loss_G: 0.714235, lr_G: 0.000800
2023-06-19 18:03:41,037 - INFO - [Train] step: 5899, loss_D: 0.680392, lr_D: 0.000800
2023-06-19 18:03:41,119 - INFO - [Train] step: 5899, loss_G: 0.730233, lr_G: 0.000800
2023-06-19 18:05:22,302 - INFO - [Train] step: 5999, loss_D: 0.681072, lr_D: 0.000800
2023-06-19 18:05:22,383 - INFO - [Train] step: 5999, loss_G: 0.724977, lr_G: 0.000800
2023-06-19 18:06:26,415 - INFO - [Eval] step: 5999, fid: 35.012374
2023-06-19 18:08:07,949 - INFO - [Train] step: 6099, loss_D: 0.677228, lr_D: 0.000800
2023-06-19 18:08:08,030 - INFO - [Train] step: 6099, loss_G: 0.722313, lr_G: 0.000800
2023-06-19 18:09:49,306 - INFO - [Train] step: 6199, loss_D: 0.679063, lr_D: 0.000800
2023-06-19 18:09:49,388 - INFO - [Train] step: 6199, loss_G: 0.706979, lr_G: 0.000800
2023-06-19 18:11:30,649 - INFO - [Train] step: 6299, loss_D: 0.679207, lr_D: 0.000800
2023-06-19 18:11:30,731 - INFO - [Train] step: 6299, loss_G: 0.752991, lr_G: 0.000800
2023-06-19 18:13:11,927 - INFO - [Train] step: 6399, loss_D: 0.677348, lr_D: 0.000800
2023-06-19 18:13:12,008 - INFO - [Train] step: 6399, loss_G: 0.727207, lr_G: 0.000800
2023-06-19 18:14:53,419 - INFO - [Train] step: 6499, loss_D: 0.678382, lr_D: 0.000800
2023-06-19 18:14:53,500 - INFO - [Train] step: 6499, loss_G: 0.697461, lr_G: 0.000800
2023-06-19 18:16:34,733 - INFO - [Train] step: 6599, loss_D: 0.678211, lr_D: 0.000800
2023-06-19 18:16:34,814 - INFO - [Train] step: 6599, loss_G: 0.767757, lr_G: 0.000800
2023-06-19 18:18:16,088 - INFO - [Train] step: 6699, loss_D: 0.676264, lr_D: 0.000800
2023-06-19 18:18:16,169 - INFO - [Train] step: 6699, loss_G: 0.704337, lr_G: 0.000800
2023-06-19 18:19:57,459 - INFO - [Train] step: 6799, loss_D: 0.681342, lr_D: 0.000800
2023-06-19 18:19:57,540 - INFO - [Train] step: 6799, loss_G: 0.765746, lr_G: 0.000800
2023-06-19 18:21:38,764 - INFO - [Train] step: 6899, loss_D: 0.680248, lr_D: 0.000800
2023-06-19 18:21:38,845 - INFO - [Train] step: 6899, loss_G: 0.690578, lr_G: 0.000800
2023-06-19 18:23:20,265 - INFO - [Train] step: 6999, loss_D: 0.679006, lr_D: 0.000800
2023-06-19 18:23:20,346 - INFO - [Train] step: 6999, loss_G: 0.745722, lr_G: 0.000800
2023-06-19 18:24:24,475 - INFO - [Eval] step: 6999, fid: 31.936117
2023-06-19 18:26:06,019 - INFO - [Train] step: 7099, loss_D: 0.678658, lr_D: 0.000800
2023-06-19 18:26:06,100 - INFO - [Train] step: 7099, loss_G: 0.725606, lr_G: 0.000800
2023-06-19 18:27:47,561 - INFO - [Train] step: 7199, loss_D: 0.677341, lr_D: 0.000800
2023-06-19 18:27:47,642 - INFO - [Train] step: 7199, loss_G: 0.732703, lr_G: 0.000800
2023-06-19 18:29:29,010 - INFO - [Train] step: 7299, loss_D: 0.679921, lr_D: 0.000800
2023-06-19 18:29:29,092 - INFO - [Train] step: 7299, loss_G: 0.724179, lr_G: 0.000800
2023-06-19 18:31:10,410 - INFO - [Train] step: 7399, loss_D: 0.683386, lr_D: 0.000800
2023-06-19 18:31:10,492 - INFO - [Train] step: 7399, loss_G: 0.679819, lr_G: 0.000800
2023-06-19 18:32:51,814 - INFO - [Train] step: 7499, loss_D: 0.681078, lr_D: 0.000800
2023-06-19 18:32:51,896 - INFO - [Train] step: 7499, loss_G: 0.734118, lr_G: 0.000800
2023-06-19 18:34:33,204 - INFO - [Train] step: 7599, loss_D: 0.684464, lr_D: 0.000800
2023-06-19 18:34:33,285 - INFO - [Train] step: 7599, loss_G: 0.718786, lr_G: 0.000800
2023-06-19 18:36:14,711 - INFO - [Train] step: 7699, loss_D: 0.684127, lr_D: 0.000800
2023-06-19 18:36:14,792 - INFO - [Train] step: 7699, loss_G: 0.682303, lr_G: 0.000800
2023-06-19 18:37:56,187 - INFO - [Train] step: 7799, loss_D: 0.682236, lr_D: 0.000800
2023-06-19 18:37:56,268 - INFO - [Train] step: 7799, loss_G: 0.719574, lr_G: 0.000800
2023-06-19 18:39:37,546 - INFO - [Train] step: 7899, loss_D: 0.682577, lr_D: 0.000800
2023-06-19 18:39:37,627 - INFO - [Train] step: 7899, loss_G: 0.715023, lr_G: 0.000800
2023-06-19 18:41:18,931 - INFO - [Train] step: 7999, loss_D: 0.680906, lr_D: 0.000800
2023-06-19 18:41:19,013 - INFO - [Train] step: 7999, loss_G: 0.718277, lr_G: 0.000800
2023-06-19 18:42:23,186 - INFO - [Eval] step: 7999, fid: 29.591039
2023-06-19 18:44:04,740 - INFO - [Train] step: 8099, loss_D: 0.681273, lr_D: 0.000800
2023-06-19 18:44:04,821 - INFO - [Train] step: 8099, loss_G: 0.722181, lr_G: 0.000800
2023-06-19 18:45:46,216 - INFO - [Train] step: 8199, loss_D: 0.682773, lr_D: 0.000800
2023-06-19 18:45:46,297 - INFO - [Train] step: 8199, loss_G: 0.717473, lr_G: 0.000800
2023-06-19 18:47:27,581 - INFO - [Train] step: 8299, loss_D: 0.682070, lr_D: 0.000800
2023-06-19 18:47:27,662 - INFO - [Train] step: 8299, loss_G: 0.737657, lr_G: 0.000800
2023-06-19 18:49:08,943 - INFO - [Train] step: 8399, loss_D: 0.681997, lr_D: 0.000800
2023-06-19 18:49:09,024 - INFO - [Train] step: 8399, loss_G: 0.726991, lr_G: 0.000800
2023-06-19 18:50:50,469 - INFO - [Train] step: 8499, loss_D: 0.680500, lr_D: 0.000800
2023-06-19 18:50:50,550 - INFO - [Train] step: 8499, loss_G: 0.722429, lr_G: 0.000800
2023-06-19 18:52:31,819 - INFO - [Train] step: 8599, loss_D: 0.682515, lr_D: 0.000800
2023-06-19 18:52:31,900 - INFO - [Train] step: 8599, loss_G: 0.712274, lr_G: 0.000800
2023-06-19 18:54:13,143 - INFO - [Train] step: 8699, loss_D: 0.682045, lr_D: 0.000800
2023-06-19 18:54:13,225 - INFO - [Train] step: 8699, loss_G: 0.715126, lr_G: 0.000800
2023-06-19 18:55:54,613 - INFO - [Train] step: 8799, loss_D: 0.682826, lr_D: 0.000800
2023-06-19 18:55:54,694 - INFO - [Train] step: 8799, loss_G: 0.710536, lr_G: 0.000800
2023-06-19 18:57:35,971 - INFO - [Train] step: 8899, loss_D: 0.682718, lr_D: 0.000800
2023-06-19 18:57:36,052 - INFO - [Train] step: 8899, loss_G: 0.707596, lr_G: 0.000800
2023-06-19 18:59:17,285 - INFO - [Train] step: 8999, loss_D: 0.684008, lr_D: 0.000800
2023-06-19 18:59:17,366 - INFO - [Train] step: 8999, loss_G: 0.703820, lr_G: 0.000800
2023-06-19 19:00:21,502 - INFO - [Eval] step: 8999, fid: 29.677048
2023-06-19 19:02:02,931 - INFO - [Train] step: 9099, loss_D: 0.680977, lr_D: 0.000800
2023-06-19 19:02:03,012 - INFO - [Train] step: 9099, loss_G: 0.716335, lr_G: 0.000800
2023-06-19 19:03:44,289 - INFO - [Train] step: 9199, loss_D: 0.684396, lr_D: 0.000800
2023-06-19 19:03:44,370 - INFO - [Train] step: 9199, loss_G: 0.688010, lr_G: 0.000800
2023-06-19 19:05:25,675 - INFO - [Train] step: 9299, loss_D: 0.681495, lr_D: 0.000800
2023-06-19 19:05:25,756 - INFO - [Train] step: 9299, loss_G: 0.727215, lr_G: 0.000800
2023-06-19 19:07:07,081 - INFO - [Train] step: 9399, loss_D: 0.679605, lr_D: 0.000800
2023-06-19 19:07:07,163 - INFO - [Train] step: 9399, loss_G: 0.719803, lr_G: 0.000800
2023-06-19 19:08:48,439 - INFO - [Train] step: 9499, loss_D: 0.681447, lr_D: 0.000800
2023-06-19 19:08:48,520 - INFO - [Train] step: 9499, loss_G: 0.741672, lr_G: 0.000800
2023-06-19 19:10:29,810 - INFO - [Train] step: 9599, loss_D: 0.683440, lr_D: 0.000800
2023-06-19 19:10:29,891 - INFO - [Train] step: 9599, loss_G: 0.700764, lr_G: 0.000800
2023-06-19 19:12:11,252 - INFO - [Train] step: 9699, loss_D: 0.680953, lr_D: 0.000800
2023-06-19 19:12:11,333 - INFO - [Train] step: 9699, loss_G: 0.714566, lr_G: 0.000800
2023-06-19 19:13:53,192 - INFO - [Train] step: 9799, loss_D: 0.681105, lr_D: 0.000800
2023-06-19 19:13:53,274 - INFO - [Train] step: 9799, loss_G: 0.733830, lr_G: 0.000800
2023-06-19 19:15:34,737 - INFO - [Train] step: 9899, loss_D: 0.680626, lr_D: 0.000800
2023-06-19 19:15:34,819 - INFO - [Train] step: 9899, loss_G: 0.675343, lr_G: 0.000800
2023-06-19 19:17:16,152 - INFO - [Train] step: 9999, loss_D: 0.681770, lr_D: 0.000800
2023-06-19 19:17:16,234 - INFO - [Train] step: 9999, loss_G: 0.738462, lr_G: 0.000800
2023-06-19 19:18:20,432 - INFO - [Eval] step: 9999, fid: 27.189832
2023-06-19 19:20:02,155 - INFO - [Train] step: 10099, loss_D: 0.684162, lr_D: 0.000800
2023-06-19 19:20:02,237 - INFO - [Train] step: 10099, loss_G: 0.702962, lr_G: 0.000800
2023-06-19 19:21:43,519 - INFO - [Train] step: 10199, loss_D: 0.682704, lr_D: 0.000800
2023-06-19 19:21:43,600 - INFO - [Train] step: 10199, loss_G: 0.719688, lr_G: 0.000800
2023-06-19 19:23:24,882 - INFO - [Train] step: 10299, loss_D: 0.680741, lr_D: 0.000800
2023-06-19 19:23:24,963 - INFO - [Train] step: 10299, loss_G: 0.715438, lr_G: 0.000800
2023-06-19 19:25:06,461 - INFO - [Train] step: 10399, loss_D: 0.681199, lr_D: 0.000800
2023-06-19 19:25:06,542 - INFO - [Train] step: 10399, loss_G: 0.730319, lr_G: 0.000800
2023-06-19 19:26:47,843 - INFO - [Train] step: 10499, loss_D: 0.689495, lr_D: 0.000800
2023-06-19 19:26:47,924 - INFO - [Train] step: 10499, loss_G: 0.882725, lr_G: 0.000800
2023-06-19 19:28:29,222 - INFO - [Train] step: 10599, loss_D: 0.684456, lr_D: 0.000800
2023-06-19 19:28:29,304 - INFO - [Train] step: 10599, loss_G: 0.695934, lr_G: 0.000800
2023-06-19 19:30:10,629 - INFO - [Train] step: 10699, loss_D: 0.681960, lr_D: 0.000800
2023-06-19 19:30:10,710 - INFO - [Train] step: 10699, loss_G: 0.714138, lr_G: 0.000800
2023-06-19 19:31:52,010 - INFO - [Train] step: 10799, loss_D: 0.681526, lr_D: 0.000800
2023-06-19 19:31:52,091 - INFO - [Train] step: 10799, loss_G: 0.713037, lr_G: 0.000800
2023-06-19 19:33:33,332 - INFO - [Train] step: 10899, loss_D: 0.681710, lr_D: 0.000800
2023-06-19 19:33:33,414 - INFO - [Train] step: 10899, loss_G: 0.761762, lr_G: 0.000800
2023-06-19 19:35:14,877 - INFO - [Train] step: 10999, loss_D: 0.682471, lr_D: 0.000800
2023-06-19 19:35:14,961 - INFO - [Train] step: 10999, loss_G: 0.709668, lr_G: 0.000800
2023-06-19 19:36:19,165 - INFO - [Eval] step: 10999, fid: 26.901409
2023-06-19 19:38:00,708 - INFO - [Train] step: 11099, loss_D: 0.681955, lr_D: 0.000800
2023-06-19 19:38:00,789 - INFO - [Train] step: 11099, loss_G: 0.707945, lr_G: 0.000800
2023-06-19 19:39:42,069 - INFO - [Train] step: 11199, loss_D: 0.680271, lr_D: 0.000800
2023-06-19 19:39:42,150 - INFO - [Train] step: 11199, loss_G: 0.719432, lr_G: 0.000800
2023-06-19 19:41:23,430 - INFO - [Train] step: 11299, loss_D: 0.682420, lr_D: 0.000800
2023-06-19 19:41:23,511 - INFO - [Train] step: 11299, loss_G: 0.706786, lr_G: 0.000800
2023-06-19 19:43:04,839 - INFO - [Train] step: 11399, loss_D: 0.680561, lr_D: 0.000800
2023-06-19 19:43:04,928 - INFO - [Train] step: 11399, loss_G: 0.709855, lr_G: 0.000800
2023-06-19 19:44:46,272 - INFO - [Train] step: 11499, loss_D: 0.681561, lr_D: 0.000800
2023-06-19 19:44:46,353 - INFO - [Train] step: 11499, loss_G: 0.739285, lr_G: 0.000800
2023-06-19 19:46:27,629 - INFO - [Train] step: 11599, loss_D: 0.681758, lr_D: 0.000800
2023-06-19 19:46:27,710 - INFO - [Train] step: 11599, loss_G: 0.707144, lr_G: 0.000800
2023-06-19 19:48:09,140 - INFO - [Train] step: 11699, loss_D: 0.679912, lr_D: 0.000800
2023-06-19 19:48:09,221 - INFO - [Train] step: 11699, loss_G: 0.737145, lr_G: 0.000800
2023-06-19 19:49:50,441 - INFO - [Train] step: 11799, loss_D: 0.682164, lr_D: 0.000800
2023-06-19 19:49:50,522 - INFO - [Train] step: 11799, loss_G: 0.743891, lr_G: 0.000800
2023-06-19 19:51:31,817 - INFO - [Train] step: 11899, loss_D: 0.681108, lr_D: 0.000800
2023-06-19 19:51:31,898 - INFO - [Train] step: 11899, loss_G: 0.722816, lr_G: 0.000800
2023-06-19 19:53:13,139 - INFO - [Train] step: 11999, loss_D: 0.682869, lr_D: 0.000800
2023-06-19 19:53:13,220 - INFO - [Train] step: 11999, loss_G: 0.716350, lr_G: 0.000800
2023-06-19 19:54:17,447 - INFO - [Eval] step: 11999, fid: 26.349956
2023-06-19 19:55:59,000 - INFO - [Train] step: 12099, loss_D: 0.681681, lr_D: 0.000800
2023-06-19 19:55:59,081 - INFO - [Train] step: 12099, loss_G: 0.700207, lr_G: 0.000800
2023-06-19 19:57:40,375 - INFO - [Train] step: 12199, loss_D: 0.682758, lr_D: 0.000800
2023-06-19 19:57:40,456 - INFO - [Train] step: 12199, loss_G: 0.744789, lr_G: 0.000800
2023-06-19 19:59:21,922 - INFO - [Train] step: 12299, loss_D: 0.682400, lr_D: 0.000800
2023-06-19 19:59:22,004 - INFO - [Train] step: 12299, loss_G: 0.726246, lr_G: 0.000800
2023-06-19 20:01:03,295 - INFO - [Train] step: 12399, loss_D: 0.681180, lr_D: 0.000800
2023-06-19 20:01:03,376 - INFO - [Train] step: 12399, loss_G: 0.724668, lr_G: 0.000800
2023-06-19 20:02:44,634 - INFO - [Train] step: 12499, loss_D: 0.671713, lr_D: 0.000800
2023-06-19 20:02:44,715 - INFO - [Train] step: 12499, loss_G: 0.737876, lr_G: 0.000800
2023-06-19 20:04:25,959 - INFO - [Train] step: 12599, loss_D: 0.692707, lr_D: 0.000800
2023-06-19 20:04:26,040 - INFO - [Train] step: 12599, loss_G: 0.695448, lr_G: 0.000800
2023-06-19 20:06:07,256 - INFO - [Train] step: 12699, loss_D: 0.687864, lr_D: 0.000800
2023-06-19 20:06:07,337 - INFO - [Train] step: 12699, loss_G: 0.694780, lr_G: 0.000800
2023-06-19 20:07:48,653 - INFO - [Train] step: 12799, loss_D: 0.685421, lr_D: 0.000800
2023-06-19 20:07:48,734 - INFO - [Train] step: 12799, loss_G: 0.699317, lr_G: 0.000800
2023-06-19 20:09:29,986 - INFO - [Train] step: 12899, loss_D: 0.682226, lr_D: 0.000800
2023-06-19 20:09:30,067 - INFO - [Train] step: 12899, loss_G: 0.732791, lr_G: 0.000800
2023-06-19 20:11:11,499 - INFO - [Train] step: 12999, loss_D: 0.683133, lr_D: 0.000800
2023-06-19 20:11:11,581 - INFO - [Train] step: 12999, loss_G: 0.704684, lr_G: 0.000800
2023-06-19 20:12:15,803 - INFO - [Eval] step: 12999, fid: 25.731887
2023-06-19 20:13:57,343 - INFO - [Train] step: 13099, loss_D: 0.681846, lr_D: 0.000800
2023-06-19 20:13:57,425 - INFO - [Train] step: 13099, loss_G: 0.731126, lr_G: 0.000800
2023-06-19 20:15:38,724 - INFO - [Train] step: 13199, loss_D: 0.681822, lr_D: 0.000800
2023-06-19 20:15:38,805 - INFO - [Train] step: 13199, loss_G: 0.739586, lr_G: 0.000800
2023-06-19 20:17:20,072 - INFO - [Train] step: 13299, loss_D: 0.681887, lr_D: 0.000800
2023-06-19 20:17:20,153 - INFO - [Train] step: 13299, loss_G: 0.747734, lr_G: 0.000800
2023-06-19 20:19:01,414 - INFO - [Train] step: 13399, loss_D: 0.682501, lr_D: 0.000800
2023-06-19 20:19:01,495 - INFO - [Train] step: 13399, loss_G: 0.693516, lr_G: 0.000800
2023-06-19 20:20:42,768 - INFO - [Train] step: 13499, loss_D: 0.681313, lr_D: 0.000800
2023-06-19 20:20:42,850 - INFO - [Train] step: 13499, loss_G: 0.729428, lr_G: 0.000800
2023-06-19 20:22:24,237 - INFO - [Train] step: 13599, loss_D: 0.681387, lr_D: 0.000800
2023-06-19 20:22:24,318 - INFO - [Train] step: 13599, loss_G: 0.729709, lr_G: 0.000800
2023-06-19 20:24:05,621 - INFO - [Train] step: 13699, loss_D: 0.682070, lr_D: 0.000800
2023-06-19 20:24:05,703 - INFO - [Train] step: 13699, loss_G: 0.675197, lr_G: 0.000800
2023-06-19 20:25:46,950 - INFO - [Train] step: 13799, loss_D: 0.681369, lr_D: 0.000800
2023-06-19 20:25:47,032 - INFO - [Train] step: 13799, loss_G: 0.739750, lr_G: 0.000800
2023-06-19 20:27:28,330 - INFO - [Train] step: 13899, loss_D: 0.683302, lr_D: 0.000800
2023-06-19 20:27:28,411 - INFO - [Train] step: 13899, loss_G: 0.755292, lr_G: 0.000800
2023-06-19 20:29:09,689 - INFO - [Train] step: 13999, loss_D: 0.682560, lr_D: 0.000800
2023-06-19 20:29:09,771 - INFO - [Train] step: 13999, loss_G: 0.715406, lr_G: 0.000800
2023-06-19 20:30:13,978 - INFO - [Eval] step: 13999, fid: 26.041149
2023-06-19 20:31:55,209 - INFO - [Train] step: 14099, loss_D: 0.679100, lr_D: 0.000800
2023-06-19 20:31:55,290 - INFO - [Train] step: 14099, loss_G: 0.710925, lr_G: 0.000800
2023-06-19 20:33:36,550 - INFO - [Train] step: 14199, loss_D: 0.682770, lr_D: 0.000800
2023-06-19 20:33:36,631 - INFO - [Train] step: 14199, loss_G: 0.708035, lr_G: 0.000800
2023-06-19 20:35:18,080 - INFO - [Train] step: 14299, loss_D: 0.681926, lr_D: 0.000800
2023-06-19 20:35:18,161 - INFO - [Train] step: 14299, loss_G: 0.718095, lr_G: 0.000800
2023-06-19 20:36:59,460 - INFO - [Train] step: 14399, loss_D: 0.680114, lr_D: 0.000800
2023-06-19 20:36:59,541 - INFO - [Train] step: 14399, loss_G: 0.708670, lr_G: 0.000800
2023-06-19 20:38:40,797 - INFO - [Train] step: 14499, loss_D: 0.680881, lr_D: 0.000800
2023-06-19 20:38:40,878 - INFO - [Train] step: 14499, loss_G: 0.724800, lr_G: 0.000800
2023-06-19 20:40:22,146 - INFO - [Train] step: 14599, loss_D: 0.680497, lr_D: 0.000800
2023-06-19 20:40:22,227 - INFO - [Train] step: 14599, loss_G: 0.691703, lr_G: 0.000800
2023-06-19 20:42:03,510 - INFO - [Train] step: 14699, loss_D: 0.674124, lr_D: 0.000800
2023-06-19 20:42:03,591 - INFO - [Train] step: 14699, loss_G: 0.717347, lr_G: 0.000800
2023-06-19 20:43:44,876 - INFO - [Train] step: 14799, loss_D: 0.680445, lr_D: 0.000800
2023-06-19 20:43:44,957 - INFO - [Train] step: 14799, loss_G: 0.726065, lr_G: 0.000800
2023-06-19 20:45:26,353 - INFO - [Train] step: 14899, loss_D: 0.677859, lr_D: 0.000800
2023-06-19 20:45:26,434 - INFO - [Train] step: 14899, loss_G: 0.774889, lr_G: 0.000800
2023-06-19 20:47:07,721 - INFO - [Train] step: 14999, loss_D: 0.681573, lr_D: 0.000800
2023-06-19 20:47:07,802 - INFO - [Train] step: 14999, loss_G: 0.736477, lr_G: 0.000800
2023-06-19 20:48:12,013 - INFO - [Eval] step: 14999, fid: 25.332386
2023-06-19 20:49:53,562 - INFO - [Train] step: 15099, loss_D: 0.683294, lr_D: 0.000800
2023-06-19 20:49:53,643 - INFO - [Train] step: 15099, loss_G: 0.740997, lr_G: 0.000800
2023-06-19 20:51:34,973 - INFO - [Train] step: 15199, loss_D: 0.678847, lr_D: 0.000800
2023-06-19 20:51:35,054 - INFO - [Train] step: 15199, loss_G: 0.726357, lr_G: 0.000800
2023-06-19 20:53:16,306 - INFO - [Train] step: 15299, loss_D: 0.682364, lr_D: 0.000800
2023-06-19 20:53:16,387 - INFO - [Train] step: 15299, loss_G: 0.687479, lr_G: 0.000800
2023-06-19 20:54:57,686 - INFO - [Train] step: 15399, loss_D: 0.681831, lr_D: 0.000800
2023-06-19 20:54:57,767 - INFO - [Train] step: 15399, loss_G: 0.744586, lr_G: 0.000800
2023-06-19 20:56:39,045 - INFO - [Train] step: 15499, loss_D: 0.682556, lr_D: 0.000800
2023-06-19 20:56:39,126 - INFO - [Train] step: 15499, loss_G: 0.746926, lr_G: 0.000800
2023-06-19 20:58:20,520 - INFO - [Train] step: 15599, loss_D: 0.682100, lr_D: 0.000800
2023-06-19 20:58:20,601 - INFO - [Train] step: 15599, loss_G: 0.701137, lr_G: 0.000800
2023-06-19 21:00:01,828 - INFO - [Train] step: 15699, loss_D: 0.681173, lr_D: 0.000800
2023-06-19 21:00:01,910 - INFO - [Train] step: 15699, loss_G: 0.740509, lr_G: 0.000800
2023-06-19 21:01:43,122 - INFO - [Train] step: 15799, loss_D: 0.681483, lr_D: 0.000800
2023-06-19 21:01:43,203 - INFO - [Train] step: 15799, loss_G: 0.717031, lr_G: 0.000800
2023-06-19 21:03:24,465 - INFO - [Train] step: 15899, loss_D: 0.681608, lr_D: 0.000800
2023-06-19 21:03:24,547 - INFO - [Train] step: 15899, loss_G: 0.736133, lr_G: 0.000800
2023-06-19 21:05:05,795 - INFO - [Train] step: 15999, loss_D: 0.681724, lr_D: 0.000800
2023-06-19 21:05:05,876 - INFO - [Train] step: 15999, loss_G: 0.728032, lr_G: 0.000800
2023-06-19 21:06:10,038 - INFO - [Eval] step: 15999, fid: 24.794455
2023-06-19 21:07:51,543 - INFO - [Train] step: 16099, loss_D: 0.681174, lr_D: 0.000800
2023-06-19 21:07:51,625 - INFO - [Train] step: 16099, loss_G: 0.703563, lr_G: 0.000800
2023-06-19 21:09:33,055 - INFO - [Train] step: 16199, loss_D: 0.680780, lr_D: 0.000800
2023-06-19 21:09:33,136 - INFO - [Train] step: 16199, loss_G: 0.710886, lr_G: 0.000800
2023-06-19 21:11:14,430 - INFO - [Train] step: 16299, loss_D: 0.680944, lr_D: 0.000800
2023-06-19 21:11:14,511 - INFO - [Train] step: 16299, loss_G: 0.730250, lr_G: 0.000800
2023-06-19 21:12:55,881 - INFO - [Train] step: 16399, loss_D: 0.680273, lr_D: 0.000800
2023-06-19 21:12:55,962 - INFO - [Train] step: 16399, loss_G: 0.695924, lr_G: 0.000800
2023-06-19 21:14:37,257 - INFO - [Train] step: 16499, loss_D: 0.680312, lr_D: 0.000800
2023-06-19 21:14:37,339 - INFO - [Train] step: 16499, loss_G: 0.730420, lr_G: 0.000800
2023-06-19 21:16:18,630 - INFO - [Train] step: 16599, loss_D: 0.679731, lr_D: 0.000800
2023-06-19 21:16:18,712 - INFO - [Train] step: 16599, loss_G: 0.689682, lr_G: 0.000800
2023-06-19 21:18:00,023 - INFO - [Train] step: 16699, loss_D: 0.678760, lr_D: 0.000800
2023-06-19 21:18:00,104 - INFO - [Train] step: 16699, loss_G: 0.724631, lr_G: 0.000800
2023-06-19 21:19:41,329 - INFO - [Train] step: 16799, loss_D: 0.681130, lr_D: 0.000800
2023-06-19 21:19:41,410 - INFO - [Train] step: 16799, loss_G: 0.722973, lr_G: 0.000800
2023-06-19 21:21:22,950 - INFO - [Train] step: 16899, loss_D: 0.682868, lr_D: 0.000800
2023-06-19 21:21:23,031 - INFO - [Train] step: 16899, loss_G: 0.755437, lr_G: 0.000800
2023-06-19 21:23:04,300 - INFO - [Train] step: 16999, loss_D: 0.679043, lr_D: 0.000800
2023-06-19 21:23:04,382 - INFO - [Train] step: 16999, loss_G: 0.699665, lr_G: 0.000800
2023-06-19 21:24:08,622 - INFO - [Eval] step: 16999, fid: 25.374591
2023-06-19 21:25:49,856 - INFO - [Train] step: 17099, loss_D: 0.680775, lr_D: 0.000800
2023-06-19 21:25:49,938 - INFO - [Train] step: 17099, loss_G: 0.756376, lr_G: 0.000800
2023-06-19 21:27:31,281 - INFO - [Train] step: 17199, loss_D: 0.679466, lr_D: 0.000800
2023-06-19 21:27:31,362 - INFO - [Train] step: 17199, loss_G: 0.707922, lr_G: 0.000800
2023-06-19 21:29:12,630 - INFO - [Train] step: 17299, loss_D: 0.678762, lr_D: 0.000800
2023-06-19 21:29:12,711 - INFO - [Train] step: 17299, loss_G: 0.746499, lr_G: 0.000800
2023-06-19 21:30:53,989 - INFO - [Train] step: 17399, loss_D: 0.677459, lr_D: 0.000800
2023-06-19 21:30:54,070 - INFO - [Train] step: 17399, loss_G: 0.726382, lr_G: 0.000800
2023-06-19 21:32:35,535 - INFO - [Train] step: 17499, loss_D: 0.679260, lr_D: 0.000800
2023-06-19 21:32:35,616 - INFO - [Train] step: 17499, loss_G: 0.702974, lr_G: 0.000800
2023-06-19 21:34:16,860 - INFO - [Train] step: 17599, loss_D: 0.678397, lr_D: 0.000800
2023-06-19 21:34:16,942 - INFO - [Train] step: 17599, loss_G: 0.738549, lr_G: 0.000800
2023-06-19 21:35:58,198 - INFO - [Train] step: 17699, loss_D: 0.679531, lr_D: 0.000800
2023-06-19 21:35:58,279 - INFO - [Train] step: 17699, loss_G: 0.712665, lr_G: 0.000800
2023-06-19 21:37:39,537 - INFO - [Train] step: 17799, loss_D: 0.679979, lr_D: 0.000800
2023-06-19 21:37:39,618 - INFO - [Train] step: 17799, loss_G: 0.716120, lr_G: 0.000800
2023-06-19 21:39:20,890 - INFO - [Train] step: 17899, loss_D: 0.677791, lr_D: 0.000800
2023-06-19 21:39:20,971 - INFO - [Train] step: 17899, loss_G: 0.732071, lr_G: 0.000800
2023-06-19 21:41:02,252 - INFO - [Train] step: 17999, loss_D: 0.680767, lr_D: 0.000800
2023-06-19 21:41:02,333 - INFO - [Train] step: 17999, loss_G: 0.764228, lr_G: 0.000800
2023-06-19 21:42:06,512 - INFO - [Eval] step: 17999, fid: 25.604573
2023-06-19 21:43:47,774 - INFO - [Train] step: 18099, loss_D: 0.678860, lr_D: 0.000800
2023-06-19 21:43:47,855 - INFO - [Train] step: 18099, loss_G: 0.698044, lr_G: 0.000800
2023-06-19 21:45:29,383 - INFO - [Train] step: 18199, loss_D: 0.675944, lr_D: 0.000800
2023-06-19 21:45:29,464 - INFO - [Train] step: 18199, loss_G: 0.748295, lr_G: 0.000800
2023-06-19 21:47:10,734 - INFO - [Train] step: 18299, loss_D: 0.678193, lr_D: 0.000800
2023-06-19 21:47:10,815 - INFO - [Train] step: 18299, loss_G: 0.682828, lr_G: 0.000800
2023-06-19 21:48:52,097 - INFO - [Train] step: 18399, loss_D: 0.677220, lr_D: 0.000800
2023-06-19 21:48:52,178 - INFO - [Train] step: 18399, loss_G: 0.745996, lr_G: 0.000800
2023-06-19 21:50:33,535 - INFO - [Train] step: 18499, loss_D: 0.678414, lr_D: 0.000800
2023-06-19 21:50:33,616 - INFO - [Train] step: 18499, loss_G: 0.727977, lr_G: 0.000800
2023-06-19 21:52:14,917 - INFO - [Train] step: 18599, loss_D: 0.676950, lr_D: 0.000800
2023-06-19 21:52:14,998 - INFO - [Train] step: 18599, loss_G: 0.728954, lr_G: 0.000800
2023-06-19 21:53:56,306 - INFO - [Train] step: 18699, loss_D: 0.680175, lr_D: 0.000800
2023-06-19 21:53:56,387 - INFO - [Train] step: 18699, loss_G: 0.688069, lr_G: 0.000800
2023-06-19 21:55:37,849 - INFO - [Train] step: 18799, loss_D: 0.679417, lr_D: 0.000800
2023-06-19 21:55:37,930 - INFO - [Train] step: 18799, loss_G: 0.765094, lr_G: 0.000800
2023-06-19 21:57:19,188 - INFO - [Train] step: 18899, loss_D: 0.677925, lr_D: 0.000800
2023-06-19 21:57:19,269 - INFO - [Train] step: 18899, loss_G: 0.746799, lr_G: 0.000800
2023-06-19 21:59:00,645 - INFO - [Train] step: 18999, loss_D: 0.678760, lr_D: 0.000800
2023-06-19 21:59:00,727 - INFO - [Train] step: 18999, loss_G: 0.746204, lr_G: 0.000800
2023-06-19 22:00:04,950 - INFO - [Eval] step: 18999, fid: 24.351778
2023-06-19 22:01:46,430 - INFO - [Train] step: 19099, loss_D: 0.676755, lr_D: 0.000800
2023-06-19 22:01:46,511 - INFO - [Train] step: 19099, loss_G: 0.720988, lr_G: 0.000800
2023-06-19 22:03:27,792 - INFO - [Train] step: 19199, loss_D: 0.680108, lr_D: 0.000800
2023-06-19 22:03:27,873 - INFO - [Train] step: 19199, loss_G: 0.694511, lr_G: 0.000800
2023-06-19 22:05:09,195 - INFO - [Train] step: 19299, loss_D: 0.678900, lr_D: 0.000800
2023-06-19 22:05:09,276 - INFO - [Train] step: 19299, loss_G: 0.764115, lr_G: 0.000800
2023-06-19 22:06:50,568 - INFO - [Train] step: 19399, loss_D: 0.678999, lr_D: 0.000800
2023-06-19 22:06:50,648 - INFO - [Train] step: 19399, loss_G: 0.716134, lr_G: 0.000800
2023-06-19 22:08:32,080 - INFO - [Train] step: 19499, loss_D: 0.677015, lr_D: 0.000800
2023-06-19 22:08:32,161 - INFO - [Train] step: 19499, loss_G: 0.687793, lr_G: 0.000800
2023-06-19 22:10:13,456 - INFO - [Train] step: 19599, loss_D: 0.678604, lr_D: 0.000800
2023-06-19 22:10:13,538 - INFO - [Train] step: 19599, loss_G: 0.723553, lr_G: 0.000800
2023-06-19 22:11:54,778 - INFO - [Train] step: 19699, loss_D: 0.678053, lr_D: 0.000800
2023-06-19 22:11:54,859 - INFO - [Train] step: 19699, loss_G: 0.722202,
lr_G: 0.000800 2023-06-19 22:13:36,154 - INFO - [Train] step: 19799, loss_D: 0.675444, lr_D: 0.000800 2023-06-19 22:13:36,236 - INFO - [Train] step: 19799, loss_G: 0.772920, lr_G: 0.000800 2023-06-19 22:15:17,576 - INFO - [Train] step: 19899, loss_D: 0.679839, lr_D: 0.000800 2023-06-19 22:15:17,658 - INFO - [Train] step: 19899, loss_G: 0.709853, lr_G: 0.000800 2023-06-19 22:16:58,886 - INFO - [Train] step: 19999, loss_D: 0.677405, lr_D: 0.000800 2023-06-19 22:16:58,967 - INFO - [Train] step: 19999, loss_G: 0.693534, lr_G: 0.000800 2023-06-19 22:18:03,238 - INFO - [Eval] step: 19999, fid: 25.290529 2023-06-19 22:19:44,815 - INFO - [Train] step: 20099, loss_D: 0.676764, lr_D: 0.000800 2023-06-19 22:19:44,895 - INFO - [Train] step: 20099, loss_G: 0.716703, lr_G: 0.000800 2023-06-19 22:21:26,251 - INFO - [Train] step: 20199, loss_D: 0.676467, lr_D: 0.000800 2023-06-19 22:21:26,333 - INFO - [Train] step: 20199, loss_G: 0.723691, lr_G: 0.000800 2023-06-19 22:23:07,599 - INFO - [Train] step: 20299, loss_D: 0.674968, lr_D: 0.000800 2023-06-19 22:23:07,680 - INFO - [Train] step: 20299, loss_G: 0.737255, lr_G: 0.000800 2023-06-19 22:24:49,007 - INFO - [Train] step: 20399, loss_D: 0.675806, lr_D: 0.000800 2023-06-19 22:24:49,089 - INFO - [Train] step: 20399, loss_G: 0.745536, lr_G: 0.000800 2023-06-19 22:26:30,527 - INFO - [Train] step: 20499, loss_D: 0.679000, lr_D: 0.000800 2023-06-19 22:26:30,608 - INFO - [Train] step: 20499, loss_G: 0.695889, lr_G: 0.000800 2023-06-19 22:28:11,915 - INFO - [Train] step: 20599, loss_D: 0.678305, lr_D: 0.000800 2023-06-19 22:28:11,996 - INFO - [Train] step: 20599, loss_G: 0.745835, lr_G: 0.000800 2023-06-19 22:29:53,434 - INFO - [Train] step: 20699, loss_D: 0.676022, lr_D: 0.000800 2023-06-19 22:29:53,516 - INFO - [Train] step: 20699, loss_G: 0.763448, lr_G: 0.000800 2023-06-19 22:31:34,753 - INFO - [Train] step: 20799, loss_D: 0.677272, lr_D: 0.000800 2023-06-19 22:31:34,835 - INFO - [Train] step: 20799, loss_G: 0.762300, lr_G: 0.000800 
2023-06-19 22:33:16,082 - INFO - [Train] step: 20899, loss_D: 0.675419, lr_D: 0.000800 2023-06-19 22:33:16,163 - INFO - [Train] step: 20899, loss_G: 0.713084, lr_G: 0.000800 2023-06-19 22:34:57,464 - INFO - [Train] step: 20999, loss_D: 0.674957, lr_D: 0.000800 2023-06-19 22:34:57,545 - INFO - [Train] step: 20999, loss_G: 0.698100, lr_G: 0.000800 2023-06-19 22:36:01,814 - INFO - [Eval] step: 20999, fid: 24.171885 2023-06-19 22:37:43,404 - INFO - [Train] step: 21099, loss_D: 0.675909, lr_D: 0.000800 2023-06-19 22:37:43,485 - INFO - [Train] step: 21099, loss_G: 0.725069, lr_G: 0.000800 2023-06-19 22:39:24,783 - INFO - [Train] step: 21199, loss_D: 0.673739, lr_D: 0.000800 2023-06-19 22:39:24,864 - INFO - [Train] step: 21199, loss_G: 0.723317, lr_G: 0.000800 2023-06-19 22:41:06,177 - INFO - [Train] step: 21299, loss_D: 0.677835, lr_D: 0.000800 2023-06-19 22:41:06,258 - INFO - [Train] step: 21299, loss_G: 0.748385, lr_G: 0.000800 2023-06-19 22:42:47,785 - INFO - [Train] step: 21399, loss_D: 0.681206, lr_D: 0.000800 2023-06-19 22:42:47,866 - INFO - [Train] step: 21399, loss_G: 0.710777, lr_G: 0.000800 2023-06-19 22:44:29,191 - INFO - [Train] step: 21499, loss_D: 0.676376, lr_D: 0.000800 2023-06-19 22:44:29,272 - INFO - [Train] step: 21499, loss_G: 0.767353, lr_G: 0.000800 2023-06-19 22:46:10,613 - INFO - [Train] step: 21599, loss_D: 0.675443, lr_D: 0.000800 2023-06-19 22:46:10,694 - INFO - [Train] step: 21599, loss_G: 0.752930, lr_G: 0.000800 2023-06-19 22:47:52,048 - INFO - [Train] step: 21699, loss_D: 0.676417, lr_D: 0.000800 2023-06-19 22:47:52,129 - INFO - [Train] step: 21699, loss_G: 0.740389, lr_G: 0.000800 2023-06-19 22:49:33,406 - INFO - [Train] step: 21799, loss_D: 0.677224, lr_D: 0.000800 2023-06-19 22:49:33,487 - INFO - [Train] step: 21799, loss_G: 0.739367, lr_G: 0.000800 2023-06-19 22:51:14,785 - INFO - [Train] step: 21899, loss_D: 0.681592, lr_D: 0.000800 2023-06-19 22:51:14,866 - INFO - [Train] step: 21899, loss_G: 0.727914, lr_G: 0.000800 2023-06-19 
22:52:56,300 - INFO - [Train] step: 21999, loss_D: 0.673805, lr_D: 0.000800 2023-06-19 22:52:56,381 - INFO - [Train] step: 21999, loss_G: 0.745597, lr_G: 0.000800 2023-06-19 22:54:00,642 - INFO - [Eval] step: 21999, fid: 24.782455 2023-06-19 22:55:41,954 - INFO - [Train] step: 22099, loss_D: 0.676463, lr_D: 0.000800 2023-06-19 22:55:42,035 - INFO - [Train] step: 22099, loss_G: 0.704630, lr_G: 0.000800 2023-06-19 22:57:23,276 - INFO - [Train] step: 22199, loss_D: 0.676031, lr_D: 0.000800 2023-06-19 22:57:23,357 - INFO - [Train] step: 22199, loss_G: 0.737282, lr_G: 0.000800 2023-06-19 22:59:04,651 - INFO - [Train] step: 22299, loss_D: 0.674430, lr_D: 0.000800 2023-06-19 22:59:04,732 - INFO - [Train] step: 22299, loss_G: 0.744823, lr_G: 0.000800 2023-06-19 23:00:46,017 - INFO - [Train] step: 22399, loss_D: 0.676513, lr_D: 0.000800 2023-06-19 23:00:46,098 - INFO - [Train] step: 22399, loss_G: 0.780160, lr_G: 0.000800 2023-06-19 23:02:27,338 - INFO - [Train] step: 22499, loss_D: 0.677050, lr_D: 0.000800 2023-06-19 23:02:27,419 - INFO - [Train] step: 22499, loss_G: 0.753245, lr_G: 0.000800 2023-06-19 23:04:08,740 - INFO - [Train] step: 22599, loss_D: 0.675261, lr_D: 0.000800 2023-06-19 23:04:08,821 - INFO - [Train] step: 22599, loss_G: 0.769119, lr_G: 0.000800 2023-06-19 23:05:50,301 - INFO - [Train] step: 22699, loss_D: 0.676946, lr_D: 0.000800 2023-06-19 23:05:50,383 - INFO - [Train] step: 22699, loss_G: 0.686987, lr_G: 0.000800 2023-06-19 23:07:31,672 - INFO - [Train] step: 22799, loss_D: 0.677104, lr_D: 0.000800 2023-06-19 23:07:31,753 - INFO - [Train] step: 22799, loss_G: 0.682500, lr_G: 0.000800 2023-06-19 23:09:13,008 - INFO - [Train] step: 22899, loss_D: 0.677162, lr_D: 0.000800 2023-06-19 23:09:13,089 - INFO - [Train] step: 22899, loss_G: 0.761831, lr_G: 0.000800 2023-06-19 23:10:54,326 - INFO - [Train] step: 22999, loss_D: 0.676459, lr_D: 0.000800 2023-06-19 23:10:54,407 - INFO - [Train] step: 22999, loss_G: 0.753022, lr_G: 0.000800 2023-06-19 23:11:58,631 - 
INFO - [Eval] step: 22999, fid: 24.361789 2023-06-19 23:13:39,883 - INFO - [Train] step: 23099, loss_D: 0.675182, lr_D: 0.000800 2023-06-19 23:13:39,964 - INFO - [Train] step: 23099, loss_G: 0.745264, lr_G: 0.000800 2023-06-19 23:15:21,251 - INFO - [Train] step: 23199, loss_D: 0.675330, lr_D: 0.000800 2023-06-19 23:15:21,332 - INFO - [Train] step: 23199, loss_G: 0.750486, lr_G: 0.000800 2023-06-19 23:17:02,770 - INFO - [Train] step: 23299, loss_D: 0.675472, lr_D: 0.000800 2023-06-19 23:17:02,851 - INFO - [Train] step: 23299, loss_G: 0.713752, lr_G: 0.000800 2023-06-19 23:18:44,122 - INFO - [Train] step: 23399, loss_D: 0.675198, lr_D: 0.000800 2023-06-19 23:18:44,203 - INFO - [Train] step: 23399, loss_G: 0.765258, lr_G: 0.000800 2023-06-19 23:20:25,508 - INFO - [Train] step: 23499, loss_D: 0.674258, lr_D: 0.000800 2023-06-19 23:20:25,589 - INFO - [Train] step: 23499, loss_G: 0.726232, lr_G: 0.000800 2023-06-19 23:22:06,896 - INFO - [Train] step: 23599, loss_D: 0.675291, lr_D: 0.000800 2023-06-19 23:22:06,977 - INFO - [Train] step: 23599, loss_G: 0.719550, lr_G: 0.000800 2023-06-19 23:23:48,237 - INFO - [Train] step: 23699, loss_D: 0.678913, lr_D: 0.000800 2023-06-19 23:23:48,318 - INFO - [Train] step: 23699, loss_G: 0.732696, lr_G: 0.000800 2023-06-19 23:25:29,598 - INFO - [Train] step: 23799, loss_D: 0.676426, lr_D: 0.000800 2023-06-19 23:25:29,680 - INFO - [Train] step: 23799, loss_G: 0.690429, lr_G: 0.000800 2023-06-19 23:27:10,980 - INFO - [Train] step: 23899, loss_D: 0.677086, lr_D: 0.000800 2023-06-19 23:27:11,061 - INFO - [Train] step: 23899, loss_G: 0.704296, lr_G: 0.000800 2023-06-19 23:28:52,490 - INFO - [Train] step: 23999, loss_D: 0.674416, lr_D: 0.000800 2023-06-19 23:28:52,571 - INFO - [Train] step: 23999, loss_G: 0.709617, lr_G: 0.000800 2023-06-19 23:29:56,817 - INFO - [Eval] step: 23999, fid: 24.907756 2023-06-19 23:31:38,127 - INFO - [Train] step: 24099, loss_D: 0.677988, lr_D: 0.000800 2023-06-19 23:31:38,208 - INFO - [Train] step: 24099, loss_G: 
0.741103, lr_G: 0.000800 2023-06-19 23:33:19,533 - INFO - [Train] step: 24199, loss_D: 0.673511, lr_D: 0.000800 2023-06-19 23:33:19,614 - INFO - [Train] step: 24199, loss_G: 0.748730, lr_G: 0.000800 2023-06-19 23:35:00,996 - INFO - [Train] step: 24299, loss_D: 0.675937, lr_D: 0.000800 2023-06-19 23:35:01,077 - INFO - [Train] step: 24299, loss_G: 0.695799, lr_G: 0.000800 2023-06-19 23:36:42,369 - INFO - [Train] step: 24399, loss_D: 0.677910, lr_D: 0.000800 2023-06-19 23:36:42,450 - INFO - [Train] step: 24399, loss_G: 0.757274, lr_G: 0.000800 2023-06-19 23:38:23,774 - INFO - [Train] step: 24499, loss_D: 0.673696, lr_D: 0.000800 2023-06-19 23:38:23,855 - INFO - [Train] step: 24499, loss_G: 0.736051, lr_G: 0.000800 2023-06-19 23:40:05,324 - INFO - [Train] step: 24599, loss_D: 0.675150, lr_D: 0.000800 2023-06-19 23:40:05,406 - INFO - [Train] step: 24599, loss_G: 0.745591, lr_G: 0.000800 2023-06-19 23:41:46,682 - INFO - [Train] step: 24699, loss_D: 0.675624, lr_D: 0.000800 2023-06-19 23:41:46,763 - INFO - [Train] step: 24699, loss_G: 0.690291, lr_G: 0.000800 2023-06-19 23:43:28,123 - INFO - [Train] step: 24799, loss_D: 0.674619, lr_D: 0.000800 2023-06-19 23:43:28,204 - INFO - [Train] step: 24799, loss_G: 0.686289, lr_G: 0.000800 2023-06-19 23:45:09,514 - INFO - [Train] step: 24899, loss_D: 0.677432, lr_D: 0.000800 2023-06-19 23:45:09,595 - INFO - [Train] step: 24899, loss_G: 0.727654, lr_G: 0.000800 2023-06-19 23:46:50,858 - INFO - [Train] step: 24999, loss_D: 0.674421, lr_D: 0.000800 2023-06-19 23:46:50,939 - INFO - [Train] step: 24999, loss_G: 0.744037, lr_G: 0.000800 2023-06-19 23:47:55,240 - INFO - [Eval] step: 24999, fid: 24.628863 2023-06-19 23:49:36,469 - INFO - [Train] step: 25099, loss_D: 0.673538, lr_D: 0.000800 2023-06-19 23:49:36,551 - INFO - [Train] step: 25099, loss_G: 0.725053, lr_G: 0.000800 2023-06-19 23:51:17,836 - INFO - [Train] step: 25199, loss_D: 0.675832, lr_D: 0.000800 2023-06-19 23:51:17,917 - INFO - [Train] step: 25199, loss_G: 0.695768, lr_G: 
0.000800 2023-06-19 23:52:59,343 - INFO - [Train] step: 25299, loss_D: 0.672961, lr_D: 0.000800 2023-06-19 23:52:59,424 - INFO - [Train] step: 25299, loss_G: 0.759596, lr_G: 0.000800 2023-06-19 23:54:40,750 - INFO - [Train] step: 25399, loss_D: 0.675961, lr_D: 0.000800 2023-06-19 23:54:40,831 - INFO - [Train] step: 25399, loss_G: 0.716428, lr_G: 0.000800 2023-06-19 23:56:22,168 - INFO - [Train] step: 25499, loss_D: 0.674125, lr_D: 0.000800 2023-06-19 23:56:22,249 - INFO - [Train] step: 25499, loss_G: 0.739538, lr_G: 0.000800 2023-06-19 23:58:03,526 - INFO - [Train] step: 25599, loss_D: 0.673580, lr_D: 0.000800 2023-06-19 23:58:03,607 - INFO - [Train] step: 25599, loss_G: 0.771830, lr_G: 0.000800 2023-06-19 23:59:44,888 - INFO - [Train] step: 25699, loss_D: 0.674766, lr_D: 0.000800 2023-06-19 23:59:44,970 - INFO - [Train] step: 25699, loss_G: 0.680390, lr_G: 0.000800 2023-06-20 00:01:26,291 - INFO - [Train] step: 25799, loss_D: 0.676705, lr_D: 0.000800 2023-06-20 00:01:26,373 - INFO - [Train] step: 25799, loss_G: 0.788337, lr_G: 0.000800 2023-06-20 00:03:07,860 - INFO - [Train] step: 25899, loss_D: 0.674269, lr_D: 0.000800 2023-06-20 00:03:07,941 - INFO - [Train] step: 25899, loss_G: 0.724979, lr_G: 0.000800 2023-06-20 00:04:49,246 - INFO - [Train] step: 25999, loss_D: 0.671982, lr_D: 0.000800 2023-06-20 00:04:49,327 - INFO - [Train] step: 25999, loss_G: 0.764959, lr_G: 0.000800 2023-06-20 00:05:53,509 - INFO - [Eval] step: 25999, fid: 24.954016 2023-06-20 00:07:34,809 - INFO - [Train] step: 26099, loss_D: 0.674897, lr_D: 0.000800 2023-06-20 00:07:34,890 - INFO - [Train] step: 26099, loss_G: 0.723136, lr_G: 0.000800 2023-06-20 00:09:16,186 - INFO - [Train] step: 26199, loss_D: 0.675096, lr_D: 0.000800 2023-06-20 00:09:16,268 - INFO - [Train] step: 26199, loss_G: 0.707746, lr_G: 0.000800 2023-06-20 00:10:57,566 - INFO - [Train] step: 26299, loss_D: 0.676930, lr_D: 0.000800 2023-06-20 00:10:57,648 - INFO - [Train] step: 26299, loss_G: 0.747193, lr_G: 0.000800 
2023-06-20 00:12:38,943 - INFO - [Train] step: 26399, loss_D: 0.674523, lr_D: 0.000800 2023-06-20 00:12:39,024 - INFO - [Train] step: 26399, loss_G: 0.720383, lr_G: 0.000800 2023-06-20 00:14:20,308 - INFO - [Train] step: 26499, loss_D: 0.673296, lr_D: 0.000800 2023-06-20 00:14:20,389 - INFO - [Train] step: 26499, loss_G: 0.729672, lr_G: 0.000800 2023-06-20 00:16:01,876 - INFO - [Train] step: 26599, loss_D: 0.672388, lr_D: 0.000800 2023-06-20 00:16:01,957 - INFO - [Train] step: 26599, loss_G: 0.709369, lr_G: 0.000800 2023-06-20 00:17:43,239 - INFO - [Train] step: 26699, loss_D: 0.674178, lr_D: 0.000800 2023-06-20 00:17:43,320 - INFO - [Train] step: 26699, loss_G: 0.754364, lr_G: 0.000800 2023-06-20 00:19:24,592 - INFO - [Train] step: 26799, loss_D: 0.675879, lr_D: 0.000800 2023-06-20 00:19:24,673 - INFO - [Train] step: 26799, loss_G: 0.751368, lr_G: 0.000800 2023-06-20 00:21:05,938 - INFO - [Train] step: 26899, loss_D: 0.674168, lr_D: 0.000800 2023-06-20 00:21:06,019 - INFO - [Train] step: 26899, loss_G: 0.734729, lr_G: 0.000800 2023-06-20 00:22:47,320 - INFO - [Train] step: 26999, loss_D: 0.672848, lr_D: 0.000800 2023-06-20 00:22:47,401 - INFO - [Train] step: 26999, loss_G: 0.736789, lr_G: 0.000800 2023-06-20 00:23:51,644 - INFO - [Eval] step: 26999, fid: 24.490947 2023-06-20 00:25:32,935 - INFO - [Train] step: 27099, loss_D: 0.672534, lr_D: 0.000800 2023-06-20 00:25:33,016 - INFO - [Train] step: 27099, loss_G: 0.755353, lr_G: 0.000800 2023-06-20 00:27:14,523 - INFO - [Train] step: 27199, loss_D: 0.674146, lr_D: 0.000800 2023-06-20 00:27:14,604 - INFO - [Train] step: 27199, loss_G: 0.789335, lr_G: 0.000800 2023-06-20 00:28:56,067 - INFO - [Train] step: 27299, loss_D: 0.672505, lr_D: 0.000800 2023-06-20 00:28:56,149 - INFO - [Train] step: 27299, loss_G: 0.750616, lr_G: 0.000800 2023-06-20 00:30:37,501 - INFO - [Train] step: 27399, loss_D: 0.675147, lr_D: 0.000800 2023-06-20 00:30:37,583 - INFO - [Train] step: 27399, loss_G: 0.785593, lr_G: 0.000800 2023-06-20 
00:32:18,933 - INFO - [Train] step: 27499, loss_D: 0.674763, lr_D: 0.000800 2023-06-20 00:32:19,014 - INFO - [Train] step: 27499, loss_G: 0.762010, lr_G: 0.000800 2023-06-20 00:34:00,345 - INFO - [Train] step: 27599, loss_D: 0.675347, lr_D: 0.000800 2023-06-20 00:34:00,426 - INFO - [Train] step: 27599, loss_G: 0.719164, lr_G: 0.000800 2023-06-20 00:35:41,729 - INFO - [Train] step: 27699, loss_D: 0.672082, lr_D: 0.000800 2023-06-20 00:35:41,810 - INFO - [Train] step: 27699, loss_G: 0.757133, lr_G: 0.000800 2023-06-20 00:37:23,094 - INFO - [Train] step: 27799, loss_D: 0.674505, lr_D: 0.000800 2023-06-20 00:37:23,175 - INFO - [Train] step: 27799, loss_G: 0.753784, lr_G: 0.000800 2023-06-20 00:39:04,631 - INFO - [Train] step: 27899, loss_D: 0.675731, lr_D: 0.000800 2023-06-20 00:39:04,712 - INFO - [Train] step: 27899, loss_G: 0.708116, lr_G: 0.000800 2023-06-20 00:40:46,042 - INFO - [Train] step: 27999, loss_D: 0.674236, lr_D: 0.000800 2023-06-20 00:40:46,123 - INFO - [Train] step: 27999, loss_G: 0.777282, lr_G: 0.000800 2023-06-20 00:41:50,387 - INFO - [Eval] step: 27999, fid: 24.367236 2023-06-20 00:43:31,747 - INFO - [Train] step: 28099, loss_D: 0.671172, lr_D: 0.000800 2023-06-20 00:43:31,829 - INFO - [Train] step: 28099, loss_G: 0.743852, lr_G: 0.000800 2023-06-20 00:45:13,221 - INFO - [Train] step: 28199, loss_D: 0.672376, lr_D: 0.000800 2023-06-20 00:45:13,302 - INFO - [Train] step: 28199, loss_G: 0.766908, lr_G: 0.000800 2023-06-20 00:46:54,729 - INFO - [Train] step: 28299, loss_D: 0.672879, lr_D: 0.000800 2023-06-20 00:46:54,810 - INFO - [Train] step: 28299, loss_G: 0.739280, lr_G: 0.000800 2023-06-20 00:48:36,196 - INFO - [Train] step: 28399, loss_D: 0.674783, lr_D: 0.000800 2023-06-20 00:48:36,277 - INFO - [Train] step: 28399, loss_G: 0.746991, lr_G: 0.000800 2023-06-20 00:50:17,833 - INFO - [Train] step: 28499, loss_D: 0.672181, lr_D: 0.000800 2023-06-20 00:50:17,915 - INFO - [Train] step: 28499, loss_G: 0.756310, lr_G: 0.000800 2023-06-20 00:51:59,283 - 
INFO - [Train] step: 28599, loss_D: 0.672565, lr_D: 0.000800 2023-06-20 00:51:59,365 - INFO - [Train] step: 28599, loss_G: 0.745985, lr_G: 0.000800 2023-06-20 00:53:40,713 - INFO - [Train] step: 28699, loss_D: 0.671827, lr_D: 0.000800 2023-06-20 00:53:40,794 - INFO - [Train] step: 28699, loss_G: 0.755588, lr_G: 0.000800 2023-06-20 00:55:22,212 - INFO - [Train] step: 28799, loss_D: 0.671730, lr_D: 0.000800 2023-06-20 00:55:22,301 - INFO - [Train] step: 28799, loss_G: 0.770437, lr_G: 0.000800 2023-06-20 00:57:03,651 - INFO - [Train] step: 28899, loss_D: 0.674745, lr_D: 0.000800 2023-06-20 00:57:03,740 - INFO - [Train] step: 28899, loss_G: 0.744377, lr_G: 0.000800 2023-06-20 00:58:45,126 - INFO - [Train] step: 28999, loss_D: 0.673335, lr_D: 0.000800 2023-06-20 00:58:45,207 - INFO - [Train] step: 28999, loss_G: 0.767063, lr_G: 0.000800 2023-06-20 00:59:49,493 - INFO - [Eval] step: 28999, fid: 24.160227 2023-06-20 01:01:31,173 - INFO - [Train] step: 29099, loss_D: 0.670870, lr_D: 0.000800 2023-06-20 01:01:31,254 - INFO - [Train] step: 29099, loss_G: 0.750762, lr_G: 0.000800 2023-06-20 01:03:12,770 - INFO - [Train] step: 29199, loss_D: 0.672368, lr_D: 0.000800 2023-06-20 01:03:12,851 - INFO - [Train] step: 29199, loss_G: 0.708838, lr_G: 0.000800 2023-06-20 01:04:54,208 - INFO - [Train] step: 29299, loss_D: 0.673178, lr_D: 0.000800 2023-06-20 01:04:54,289 - INFO - [Train] step: 29299, loss_G: 0.727626, lr_G: 0.000800 2023-06-20 01:06:35,627 - INFO - [Train] step: 29399, loss_D: 0.674636, lr_D: 0.000800 2023-06-20 01:06:35,708 - INFO - [Train] step: 29399, loss_G: 0.780086, lr_G: 0.000800 2023-06-20 01:08:17,043 - INFO - [Train] step: 29499, loss_D: 0.675668, lr_D: 0.000800 2023-06-20 01:08:17,125 - INFO - [Train] step: 29499, loss_G: 0.690134, lr_G: 0.000800 2023-06-20 01:09:58,496 - INFO - [Train] step: 29599, loss_D: 0.674136, lr_D: 0.000800 2023-06-20 01:09:58,577 - INFO - [Train] step: 29599, loss_G: 0.711216, lr_G: 0.000800 2023-06-20 01:11:39,905 - INFO - [Train] 
step: 29699, loss_D: 0.672570, lr_D: 0.000800 2023-06-20 01:11:39,986 - INFO - [Train] step: 29699, loss_G: 0.706087, lr_G: 0.000800 2023-06-20 01:13:21,506 - INFO - [Train] step: 29799, loss_D: 0.670599, lr_D: 0.000800 2023-06-20 01:13:21,587 - INFO - [Train] step: 29799, loss_G: 0.716939, lr_G: 0.000800 2023-06-20 01:15:02,960 - INFO - [Train] step: 29899, loss_D: 0.673237, lr_D: 0.000800 2023-06-20 01:15:03,041 - INFO - [Train] step: 29899, loss_G: 0.781553, lr_G: 0.000800 2023-06-20 01:16:44,383 - INFO - [Train] step: 29999, loss_D: 0.671911, lr_D: 0.000800 2023-06-20 01:16:44,464 - INFO - [Train] step: 29999, loss_G: 0.753488, lr_G: 0.000800 2023-06-20 01:17:48,708 - INFO - [Eval] step: 29999, fid: 24.910837 2023-06-20 01:19:30,189 - INFO - [Train] step: 30099, loss_D: 0.674729, lr_D: 0.000800 2023-06-20 01:19:30,270 - INFO - [Train] step: 30099, loss_G: 0.692801, lr_G: 0.000800 2023-06-20 01:21:11,648 - INFO - [Train] step: 30199, loss_D: 0.672365, lr_D: 0.000800 2023-06-20 01:21:11,730 - INFO - [Train] step: 30199, loss_G: 0.699253, lr_G: 0.000800 2023-06-20 01:22:53,104 - INFO - [Train] step: 30299, loss_D: 0.672212, lr_D: 0.000800 2023-06-20 01:22:53,186 - INFO - [Train] step: 30299, loss_G: 0.725065, lr_G: 0.000800 2023-06-20 01:24:34,693 - INFO - [Train] step: 30399, loss_D: 0.670411, lr_D: 0.000800 2023-06-20 01:24:34,775 - INFO - [Train] step: 30399, loss_G: 0.775483, lr_G: 0.000800 2023-06-20 01:26:16,171 - INFO - [Train] step: 30499, loss_D: 0.673425, lr_D: 0.000800 2023-06-20 01:26:16,252 - INFO - [Train] step: 30499, loss_G: 0.777113, lr_G: 0.000800 2023-06-20 01:27:57,659 - INFO - [Train] step: 30599, loss_D: 0.671078, lr_D: 0.000800 2023-06-20 01:27:57,740 - INFO - [Train] step: 30599, loss_G: 0.699937, lr_G: 0.000800 2023-06-20 01:29:39,112 - INFO - [Train] step: 30699, loss_D: 0.672223, lr_D: 0.000800 2023-06-20 01:29:39,194 - INFO - [Train] step: 30699, loss_G: 0.705887, lr_G: 0.000800 2023-06-20 01:31:20,601 - INFO - [Train] step: 30799, 
loss_D: 0.670819, lr_D: 0.000800 2023-06-20 01:31:20,682 - INFO - [Train] step: 30799, loss_G: 0.789645, lr_G: 0.000800 2023-06-20 01:33:02,039 - INFO - [Train] step: 30899, loss_D: 0.673621, lr_D: 0.000800 2023-06-20 01:33:02,120 - INFO - [Train] step: 30899, loss_G: 0.721487, lr_G: 0.000800 2023-06-20 01:34:43,461 - INFO - [Train] step: 30999, loss_D: 0.669808, lr_D: 0.000800 2023-06-20 01:34:43,542 - INFO - [Train] step: 30999, loss_G: 0.721350, lr_G: 0.000800 2023-06-20 01:35:47,831 - INFO - [Eval] step: 30999, fid: 24.499276 2023-06-20 01:37:29,287 - INFO - [Train] step: 31099, loss_D: 0.672840, lr_D: 0.000800 2023-06-20 01:37:29,368 - INFO - [Train] step: 31099, loss_G: 0.721218, lr_G: 0.000800 2023-06-20 01:39:10,698 - INFO - [Train] step: 31199, loss_D: 0.672114, lr_D: 0.000800 2023-06-20 01:39:10,779 - INFO - [Train] step: 31199, loss_G: 0.701553, lr_G: 0.000800 2023-06-20 01:40:52,083 - INFO - [Train] step: 31299, loss_D: 0.669459, lr_D: 0.000800 2023-06-20 01:40:52,164 - INFO - [Train] step: 31299, loss_G: 0.785985, lr_G: 0.000800 2023-06-20 01:42:33,516 - INFO - [Train] step: 31399, loss_D: 0.672156, lr_D: 0.000800 2023-06-20 01:42:33,598 - INFO - [Train] step: 31399, loss_G: 0.741914, lr_G: 0.000800 2023-06-20 01:44:14,953 - INFO - [Train] step: 31499, loss_D: 0.670720, lr_D: 0.000800 2023-06-20 01:44:15,035 - INFO - [Train] step: 31499, loss_G: 0.750063, lr_G: 0.000800 2023-06-20 01:45:56,327 - INFO - [Train] step: 31599, loss_D: 0.671670, lr_D: 0.000800 2023-06-20 01:45:56,408 - INFO - [Train] step: 31599, loss_G: 0.750834, lr_G: 0.000800 2023-06-20 01:47:37,867 - INFO - [Train] step: 31699, loss_D: 0.674952, lr_D: 0.000800 2023-06-20 01:47:37,949 - INFO - [Train] step: 31699, loss_G: 0.770488, lr_G: 0.000800 2023-06-20 01:49:19,333 - INFO - [Train] step: 31799, loss_D: 0.669750, lr_D: 0.000800 2023-06-20 01:49:19,414 - INFO - [Train] step: 31799, loss_G: 0.739648, lr_G: 0.000800 2023-06-20 01:51:00,748 - INFO - [Train] step: 31899, loss_D: 0.668928, 
lr_D: 0.000800 2023-06-20 01:51:00,829 - INFO - [Train] step: 31899, loss_G: 0.762817, lr_G: 0.000800 2023-06-20 01:52:42,279 - INFO - [Train] step: 31999, loss_D: 0.672781, lr_D: 0.000800 2023-06-20 01:52:42,360 - INFO - [Train] step: 31999, loss_G: 0.770255, lr_G: 0.000800 2023-06-20 01:53:46,605 - INFO - [Eval] step: 31999, fid: 24.572670 2023-06-20 01:55:27,905 - INFO - [Train] step: 32099, loss_D: 0.672552, lr_D: 0.000800 2023-06-20 01:55:27,987 - INFO - [Train] step: 32099, loss_G: 0.738290, lr_G: 0.000800 2023-06-20 01:57:09,292 - INFO - [Train] step: 32199, loss_D: 0.671080, lr_D: 0.000800 2023-06-20 01:57:09,374 - INFO - [Train] step: 32199, loss_G: 0.718358, lr_G: 0.000800 2023-06-20 01:58:50,680 - INFO - [Train] step: 32299, loss_D: 0.670107, lr_D: 0.000800 2023-06-20 01:58:50,761 - INFO - [Train] step: 32299, loss_G: 0.747656, lr_G: 0.000800 2023-06-20 02:00:32,224 - INFO - [Train] step: 32399, loss_D: 0.670615, lr_D: 0.000800 2023-06-20 02:00:32,305 - INFO - [Train] step: 32399, loss_G: 0.736805, lr_G: 0.000800 2023-06-20 02:02:13,684 - INFO - [Train] step: 32499, loss_D: 0.672436, lr_D: 0.000800 2023-06-20 02:02:13,765 - INFO - [Train] step: 32499, loss_G: 0.724554, lr_G: 0.000800 2023-06-20 02:03:55,052 - INFO - [Train] step: 32599, loss_D: 0.670649, lr_D: 0.000800 2023-06-20 02:03:55,134 - INFO - [Train] step: 32599, loss_G: 0.797424, lr_G: 0.000800 2023-06-20 02:05:36,480 - INFO - [Train] step: 32699, loss_D: 0.677474, lr_D: 0.000800 2023-06-20 02:05:36,562 - INFO - [Train] step: 32699, loss_G: 0.664356, lr_G: 0.000800 2023-06-20 02:07:17,921 - INFO - [Train] step: 32799, loss_D: 0.671693, lr_D: 0.000800 2023-06-20 02:07:18,003 - INFO - [Train] step: 32799, loss_G: 0.768161, lr_G: 0.000800 2023-06-20 02:08:59,313 - INFO - [Train] step: 32899, loss_D: 0.671039, lr_D: 0.000800 2023-06-20 02:08:59,395 - INFO - [Train] step: 32899, loss_G: 0.726742, lr_G: 0.000800 2023-06-20 02:10:41,161 - INFO - [Train] step: 32999, loss_D: 0.672781, lr_D: 0.000800 
2023-06-20 02:10:41,243 - INFO - [Train] step: 32999, loss_G: 0.790262, lr_G: 0.000800 2023-06-20 02:11:45,516 - INFO - [Eval] step: 32999, fid: 24.532255 2023-06-20 02:13:26,856 - INFO - [Train] step: 33099, loss_D: 0.667663, lr_D: 0.000800 2023-06-20 02:13:26,937 - INFO - [Train] step: 33099, loss_G: 0.746489, lr_G: 0.000800 2023-06-20 02:15:08,306 - INFO - [Train] step: 33199, loss_D: 0.665950, lr_D: 0.000800 2023-06-20 02:15:08,387 - INFO - [Train] step: 33199, loss_G: 0.758286, lr_G: 0.000800 2023-06-20 02:16:49,747 - INFO - [Train] step: 33299, loss_D: 0.674640, lr_D: 0.000800 2023-06-20 02:16:49,829 - INFO - [Train] step: 33299, loss_G: 0.782620, lr_G: 0.000800 2023-06-20 02:18:31,189 - INFO - [Train] step: 33399, loss_D: 0.671403, lr_D: 0.000800 2023-06-20 02:18:31,270 - INFO - [Train] step: 33399, loss_G: 0.766279, lr_G: 0.000800 2023-06-20 02:20:12,620 - INFO - [Train] step: 33499, loss_D: 0.670808, lr_D: 0.000800 2023-06-20 02:20:12,701 - INFO - [Train] step: 33499, loss_G: 0.743180, lr_G: 0.000800 2023-06-20 02:21:54,013 - INFO - [Train] step: 33599, loss_D: 0.674111, lr_D: 0.000800 2023-06-20 02:21:54,094 - INFO - [Train] step: 33599, loss_G: 0.704006, lr_G: 0.000800 2023-06-20 02:23:35,602 - INFO - [Train] step: 33699, loss_D: 0.671187, lr_D: 0.000800 2023-06-20 02:23:35,683 - INFO - [Train] step: 33699, loss_G: 0.745462, lr_G: 0.000800 2023-06-20 02:25:17,034 - INFO - [Train] step: 33799, loss_D: 0.669650, lr_D: 0.000800 2023-06-20 02:25:17,115 - INFO - [Train] step: 33799, loss_G: 0.747670, lr_G: 0.000800 2023-06-20 02:26:58,471 - INFO - [Train] step: 33899, loss_D: 0.673629, lr_D: 0.000800 2023-06-20 02:26:58,552 - INFO - [Train] step: 33899, loss_G: 0.787520, lr_G: 0.000800 2023-06-20 02:28:39,920 - INFO - [Train] step: 33999, loss_D: 0.669661, lr_D: 0.000800 2023-06-20 02:28:40,001 - INFO - [Train] step: 33999, loss_G: 0.766298, lr_G: 0.000800 2023-06-20 02:29:44,216 - INFO - [Eval] step: 33999, fid: 24.696287 2023-06-20 02:31:25,544 - INFO - 
[Train] step: 34099, loss_D: 0.670007, lr_D: 0.000800 2023-06-20 02:31:25,626 - INFO - [Train] step: 34099, loss_G: 0.746905, lr_G: 0.000800 2023-06-20 02:33:07,004 - INFO - [Train] step: 34199, loss_D: 0.668701, lr_D: 0.000800 2023-06-20 02:33:07,085 - INFO - [Train] step: 34199, loss_G: 0.734749, lr_G: 0.000800 2023-06-20 02:34:48,630 - INFO - [Train] step: 34299, loss_D: 0.669490, lr_D: 0.000800 2023-06-20 02:34:48,712 - INFO - [Train] step: 34299, loss_G: 0.772613, lr_G: 0.000800 2023-06-20 02:36:30,051 - INFO - [Train] step: 34399, loss_D: 0.670799, lr_D: 0.000800 2023-06-20 02:36:30,133 - INFO - [Train] step: 34399, loss_G: 0.746601, lr_G: 0.000800 2023-06-20 02:38:11,481 - INFO - [Train] step: 34499, loss_D: 0.671720, lr_D: 0.000800 2023-06-20 02:38:11,563 - INFO - [Train] step: 34499, loss_G: 0.689477, lr_G: 0.000800 2023-06-20 02:39:52,922 - INFO - [Train] step: 34599, loss_D: 0.674174, lr_D: 0.000800 2023-06-20 02:39:53,003 - INFO - [Train] step: 34599, loss_G: 0.657997, lr_G: 0.000800 2023-06-20 02:41:34,341 - INFO - [Train] step: 34699, loss_D: 0.672302, lr_D: 0.000800 2023-06-20 02:41:34,423 - INFO - [Train] step: 34699, loss_G: 0.707302, lr_G: 0.000800 2023-06-20 02:43:15,856 - INFO - [Train] step: 34799, loss_D: 0.671588, lr_D: 0.000800 2023-06-20 02:43:15,938 - INFO - [Train] step: 34799, loss_G: 0.724901, lr_G: 0.000800 2023-06-20 02:44:57,280 - INFO - [Train] step: 34899, loss_D: 0.670525, lr_D: 0.000800 2023-06-20 02:44:57,361 - INFO - [Train] step: 34899, loss_G: 0.756783, lr_G: 0.000800 2023-06-20 02:46:41,739 - INFO - [Train] step: 34999, loss_D: 0.668703, lr_D: 0.000800 2023-06-20 02:46:41,820 - INFO - [Train] step: 34999, loss_G: 0.733471, lr_G: 0.000800 2023-06-20 02:47:49,209 - INFO - [Eval] step: 34999, fid: 24.672526 2023-06-20 02:49:37,822 - INFO - [Train] step: 35099, loss_D: 0.671023, lr_D: 0.000800 2023-06-20 02:49:37,903 - INFO - [Train] step: 35099, loss_G: 0.745824, lr_G: 0.000800 2023-06-20 02:51:19,746 - INFO - [Train] step: 
35199, loss_D: 0.671368, lr_D: 0.000800
2023-06-20 02:51:19,827 - INFO - [Train] step: 35199, loss_G: 0.758614, lr_G: 0.000800
2023-06-20 02:53:01,192 - INFO - [Train] step: 35299, loss_D: 0.671027, lr_D: 0.000800
2023-06-20 02:53:01,281 - INFO - [Train] step: 35299, loss_G: 0.744737, lr_G: 0.000800
2023-06-20 02:54:57,387 - INFO - [Train] step: 35399, loss_D: 0.669881, lr_D: 0.000800
2023-06-20 02:54:57,468 - INFO - [Train] step: 35399, loss_G: 0.754344, lr_G: 0.000800
2023-06-20 02:56:38,842 - INFO - [Train] step: 35499, loss_D: 0.672298, lr_D: 0.000800
2023-06-20 02:56:38,924 - INFO - [Train] step: 35499, loss_G: 0.720733, lr_G: 0.000800
2023-06-20 02:58:20,462 - INFO - [Train] step: 35599, loss_D: 0.669055, lr_D: 0.000800
2023-06-20 02:58:20,543 - INFO - [Train] step: 35599, loss_G: 0.732367, lr_G: 0.000800
2023-06-20 03:00:01,885 - INFO - [Train] step: 35699, loss_D: 0.671211, lr_D: 0.000800
2023-06-20 03:00:01,966 - INFO - [Train] step: 35699, loss_G: 0.752207, lr_G: 0.000800
2023-06-20 03:01:59,294 - INFO - [Train] step: 35799, loss_D: 0.670069, lr_D: 0.000800
2023-06-20 03:01:59,470 - INFO - [Train] step: 35799, loss_G: 0.813688, lr_G: 0.000800
2023-06-20 03:04:12,795 - INFO - [Train] step: 35899, loss_D: 0.670790, lr_D: 0.000800
2023-06-20 03:04:12,876 - INFO - [Train] step: 35899, loss_G: 0.751746, lr_G: 0.000800
2023-06-20 03:05:54,239 - INFO - [Train] step: 35999, loss_D: 0.670486, lr_D: 0.000800
2023-06-20 03:05:54,321 - INFO - [Train] step: 35999, loss_G: 0.739164, lr_G: 0.000800
2023-06-20 03:06:58,573 - INFO - [Eval] step: 35999, fid: 24.811236
2023-06-20 03:08:39,877 - INFO - [Train] step: 36099, loss_D: 0.669243, lr_D: 0.000800
2023-06-20 03:08:39,958 - INFO - [Train] step: 36099, loss_G: 0.739391, lr_G: 0.000800
2023-06-20 03:10:21,329 - INFO - [Train] step: 36199, loss_D: 0.669976, lr_D: 0.000800
2023-06-20 03:10:21,410 - INFO - [Train] step: 36199, loss_G: 0.725955, lr_G: 0.000800
2023-06-20 03:12:02,968 - INFO - [Train] step: 36299, loss_D: 0.672593, lr_D: 0.000800
2023-06-20 03:12:03,050 - INFO - [Train] step: 36299, loss_G: 0.760848, lr_G: 0.000800
2023-06-20 03:13:44,463 - INFO - [Train] step: 36399, loss_D: 0.668740, lr_D: 0.000800
2023-06-20 03:13:44,544 - INFO - [Train] step: 36399, loss_G: 0.734666, lr_G: 0.000800
2023-06-20 03:15:25,890 - INFO - [Train] step: 36499, loss_D: 0.672422, lr_D: 0.000800
2023-06-20 03:15:25,972 - INFO - [Train] step: 36499, loss_G: 0.638324, lr_G: 0.000800
2023-06-20 03:17:07,339 - INFO - [Train] step: 36599, loss_D: 0.669038, lr_D: 0.000800
2023-06-20 03:17:07,421 - INFO - [Train] step: 36599, loss_G: 0.767898, lr_G: 0.000800
2023-06-20 03:18:48,809 - INFO - [Train] step: 36699, loss_D: 0.671060, lr_D: 0.000800
2023-06-20 03:18:48,890 - INFO - [Train] step: 36699, loss_G: 0.760800, lr_G: 0.000800
2023-06-20 03:20:32,043 - INFO - [Train] step: 36799, loss_D: 0.671337, lr_D: 0.000800
2023-06-20 03:20:32,124 - INFO - [Train] step: 36799, loss_G: 0.744092, lr_G: 0.000800
2023-06-20 03:22:17,262 - INFO - [Train] step: 36899, loss_D: 0.666291, lr_D: 0.000800
2023-06-20 03:22:17,343 - INFO - [Train] step: 36899, loss_G: 0.763802, lr_G: 0.000800
2023-06-20 03:24:00,615 - INFO - [Train] step: 36999, loss_D: 0.670673, lr_D: 0.000800
2023-06-20 03:24:00,781 - INFO - [Train] step: 36999, loss_G: 0.769269, lr_G: 0.000800
2023-06-20 03:25:10,186 - INFO - [Eval] step: 36999, fid: 25.084207
2023-06-20 03:26:51,773 - INFO - [Train] step: 37099, loss_D: 0.666662, lr_D: 0.000800
2023-06-20 03:26:51,854 - INFO - [Train] step: 37099, loss_G: 0.751061, lr_G: 0.000800
2023-06-20 03:28:33,508 - INFO - [Train] step: 37199, loss_D: 0.668691, lr_D: 0.000800
2023-06-20 03:28:33,589 - INFO - [Train] step: 37199, loss_G: 0.771317, lr_G: 0.000800
2023-06-20 03:30:18,335 - INFO - [Train] step: 37299, loss_D: 0.670596, lr_D: 0.000800
2023-06-20 03:30:18,416 - INFO - [Train] step: 37299, loss_G: 0.774825, lr_G: 0.000800
2023-06-20 03:32:34,998 - INFO - [Train] step: 37399, loss_D: 0.672537, lr_D: 0.000800
2023-06-20 03:32:35,080 - INFO - [Train] step: 37399, loss_G: 0.743818, lr_G: 0.000800
2023-06-20 03:34:40,038 - INFO - [Train] step: 37499, loss_D: 0.670727, lr_D: 0.000800
2023-06-20 03:34:40,119 - INFO - [Train] step: 37499, loss_G: 0.753849, lr_G: 0.000800
2023-06-20 03:36:36,267 - INFO - [Train] step: 37599, loss_D: 0.669683, lr_D: 0.000800
2023-06-20 03:36:36,355 - INFO - [Train] step: 37599, loss_G: 0.812831, lr_G: 0.000800
2023-06-20 03:38:43,147 - INFO - [Train] step: 37699, loss_D: 0.669860, lr_D: 0.000800
2023-06-20 03:38:43,229 - INFO - [Train] step: 37699, loss_G: 0.739899, lr_G: 0.000800
2023-06-20 03:40:55,828 - INFO - [Train] step: 37799, loss_D: 0.669868, lr_D: 0.000800
2023-06-20 03:40:56,005 - INFO - [Train] step: 37799, loss_G: 0.745494, lr_G: 0.000800
2023-06-20 03:42:51,426 - INFO - [Train] step: 37899, loss_D: 0.671794, lr_D: 0.000800
2023-06-20 03:42:51,515 - INFO - [Train] step: 37899, loss_G: 0.727205, lr_G: 0.000800
2023-06-20 03:44:37,916 - INFO - [Train] step: 37999, loss_D: 0.668559, lr_D: 0.000800
2023-06-20 03:44:37,998 - INFO - [Train] step: 37999, loss_G: 0.717609, lr_G: 0.000800
2023-06-20 03:45:48,462 - INFO - [Eval] step: 37999, fid: 24.572097
2023-06-20 03:47:43,099 - INFO - [Train] step: 38099, loss_D: 0.670732, lr_D: 0.000800
2023-06-20 03:47:43,270 - INFO - [Train] step: 38099, loss_G: 0.762994, lr_G: 0.000800
2023-06-20 03:49:39,335 - INFO - [Train] step: 38199, loss_D: 0.667981, lr_D: 0.000800
2023-06-20 03:49:39,510 - INFO - [Train] step: 38199, loss_G: 0.740011, lr_G: 0.000800
2023-06-20 03:51:32,364 - INFO - [Train] step: 38299, loss_D: 0.668609, lr_D: 0.000800
2023-06-20 03:51:32,445 - INFO - [Train] step: 38299, loss_G: 0.759141, lr_G: 0.000800
2023-06-20 03:53:32,857 - INFO - [Train] step: 38399, loss_D: 0.668314, lr_D: 0.000800
2023-06-20 03:53:32,947 - INFO - [Train] step: 38399, loss_G: 0.752548, lr_G: 0.000800
2023-06-20 03:55:32,891 - INFO - [Train] step: 38499, loss_D: 0.670051, lr_D: 0.000800
2023-06-20 03:55:32,980 - INFO - [Train] step: 38499, loss_G: 0.763181, lr_G: 0.000800
2023-06-20 03:57:28,470 - INFO - [Train] step: 38599, loss_D: 0.670420, lr_D: 0.000800
2023-06-20 03:57:28,560 - INFO - [Train] step: 38599, loss_G: 0.694513, lr_G: 0.000800
2023-06-20 03:59:17,156 - INFO - [Train] step: 38699, loss_D: 0.666280, lr_D: 0.000800
2023-06-20 03:59:17,238 - INFO - [Train] step: 38699, loss_G: 0.756261, lr_G: 0.000800
2023-06-20 04:01:16,938 - INFO - [Train] step: 38799, loss_D: 0.669346, lr_D: 0.000800
2023-06-20 04:01:17,020 - INFO - [Train] step: 38799, loss_G: 0.787572, lr_G: 0.000800
2023-06-20 04:03:28,512 - INFO - [Train] step: 38899, loss_D: 0.667749, lr_D: 0.000800
2023-06-20 04:03:28,633 - INFO - [Train] step: 38899, loss_G: 0.726201, lr_G: 0.000800
2023-06-20 04:05:42,007 - INFO - [Train] step: 38999, loss_D: 0.669545, lr_D: 0.000800
2023-06-20 04:05:42,181 - INFO - [Train] step: 38999, loss_G: 0.710405, lr_G: 0.000800
2023-06-20 04:06:47,531 - INFO - [Eval] step: 38999, fid: 24.781686
2023-06-20 04:08:28,989 - INFO - [Train] step: 39099, loss_D: 0.671136, lr_D: 0.000800
2023-06-20 04:08:29,070 - INFO - [Train] step: 39099, loss_G: 0.758917, lr_G: 0.000800
2023-06-20 04:10:10,734 - INFO - [Train] step: 39199, loss_D: 0.669397, lr_D: 0.000800
2023-06-20 04:10:10,816 - INFO - [Train] step: 39199, loss_G: 0.720267, lr_G: 0.000800
2023-06-20 04:11:52,436 - INFO - [Train] step: 39299, loss_D: 0.669609, lr_D: 0.000800
2023-06-20 04:11:52,518 - INFO - [Train] step: 39299, loss_G: 0.810875, lr_G: 0.000800
2023-06-20 04:13:54,596 - INFO - [Train] step: 39399, loss_D: 0.668179, lr_D: 0.000800
2023-06-20 04:13:54,758 - INFO - [Train] step: 39399, loss_G: 0.739150, lr_G: 0.000800
2023-06-20 04:15:58,768 - INFO - [Train] step: 39499, loss_D: 0.668785, lr_D: 0.000800
2023-06-20 04:15:58,850 - INFO - [Train] step: 39499, loss_G: 0.734237, lr_G: 0.000800
2023-06-20 04:17:45,030 - INFO - [Train] step: 39599, loss_D: 0.668554, lr_D: 0.000800
2023-06-20 04:17:45,112 - INFO - [Train] step: 39599, loss_G: 0.739620, lr_G: 0.000800
2023-06-20 04:19:51,853 - INFO - [Train] step: 39699, loss_D: 0.666775, lr_D: 0.000800
2023-06-20 04:19:51,934 - INFO - [Train] step: 39699, loss_G: 0.772327, lr_G: 0.000800
2023-06-20 04:21:59,223 - INFO - [Train] step: 39799, loss_D: 0.666253, lr_D: 0.000800
2023-06-20 04:21:59,304 - INFO - [Train] step: 39799, loss_G: 0.715114, lr_G: 0.000800
2023-06-20 04:23:42,585 - INFO - [Train] step: 39899, loss_D: 0.666582, lr_D: 0.000800
2023-06-20 04:23:42,667 - INFO - [Train] step: 39899, loss_G: 0.807368, lr_G: 0.000800
2023-06-20 04:25:27,965 - INFO - [Train] step: 39999, loss_D: 0.669170, lr_D: 0.000800
2023-06-20 04:25:28,046 - INFO - [Train] step: 39999, loss_G: 0.783294, lr_G: 0.000800
2023-06-20 04:26:33,144 - INFO - [Eval] step: 39999, fid: 25.029938
2023-06-20 04:26:33,335 - INFO - Best FID score: 24.1602271110674
2023-06-20 04:26:33,335 - INFO - End of training