Diffusion-Models-Implementations / ddpm_celebahq / output-2023-03-14-11-16-58.log
2023-03-14 11:17:00,096 - INFO - Experiment directory: ./runs/ddpm_celebahq/
2023-03-14 11:17:00,098 - INFO - Number of processes: 4
2023-03-14 11:17:00,098 - INFO - Distributed type: MULTI_GPU
2023-03-14 11:17:00,098 - INFO - Mixed precision: fp16
2023-03-14 11:17:00,267 - INFO - Size of training set: 24183
2023-03-14 11:17:00,267 - INFO - Batch size per process: 16
2023-03-14 11:17:00,267 - INFO - Total batch size: 64
2023-03-14 11:17:01,426 - INFO - Resume from ./runs/ddpm_celebahq/ckpt/step349999/
2023-03-14 11:17:02,277 - INFO - Successfully loaded model from ./runs/ddpm_celebahq/ckpt/step349999/
2023-03-14 11:17:02,418 - INFO - Successfully loaded ema from ./runs/ddpm_celebahq/ckpt/step349999/
2023-03-14 11:17:03,421 - INFO - Successfully loaded optimizer from ./runs/ddpm_celebahq/ckpt/step349999/
2023-03-14 11:17:03,424 - INFO - Restart training at step 350000
2023-03-14 11:17:04,704 - INFO - Start training...
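Note: the header above describes an Accelerate-style run (4 processes, MULTI_GPU, fp16, batch size 16 per process / 64 total, lr 2e-5) resumed at step 350000 from ./runs/ddpm_celebahq/ckpt/step349999/ and trained to step 500000, logging the loss every 400 steps. The following is a minimal, self-contained sketch of that kind of loop, not the repository's actual training script: the tiny stand-in model, random placeholder data, and helper names (TinyEpsModel, ddpm_loss) are assumptions chosen only to make the structure of the log reproducible in miniature. In the real run, the model, EMA, and optimizer states would be restored from the checkpoint directory before this loop starts.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from accelerate import Accelerator

T = 1000                                        # diffusion timesteps
betas = torch.linspace(1e-4, 0.02, T)           # standard DDPM linear beta schedule
alphas_bar = torch.cumprod(1.0 - betas, dim=0)  # cumulative product of (1 - beta_t)

def ddpm_loss(model, x0):
    """Noise-prediction MSE: the model sees a noised x_t and t, predicts the injected noise."""
    t = torch.randint(0, T, (x0.shape[0],), device=x0.device)
    noise = torch.randn_like(x0)
    ab = alphas_bar.to(x0.device)[t].view(-1, 1, 1, 1)
    xt = ab.sqrt() * x0 + (1.0 - ab).sqrt() * noise
    return nn.functional.mse_loss(model(xt, t), noise)

class TinyEpsModel(nn.Module):
    """Placeholder for the DDPM U-Net; just enough structure to run the loop."""
    def __init__(self):
        super().__init__()
        self.net = nn.Conv2d(3, 3, kernel_size=3, padding=1)
    def forward(self, x, t):                    # t is ignored by this stand-in
        return self.net(x)

accelerator = Accelerator(mixed_precision="fp16")            # "Mixed precision: fp16"
model = TinyEpsModel()
optimizer = torch.optim.Adam(model.parameters(), lr=2e-5)    # lr 0.000020 in the log

# Placeholder data; the real run iterates over 24183 CelebA-HQ images, 16 per process.
data = TensorDataset(torch.randn(64, 3, 64, 64))
loader = DataLoader(data, batch_size=16, shuffle=True)

model, optimizer, loader = accelerator.prepare(model, optimizer, loader)

step, total_steps = 350000, 500000                           # "Restart training at step 350000"
while step < total_steps:
    for (x0,) in loader:
        if step >= total_steps:
            break
        loss = ddpm_loss(model, x0)
        optimizer.zero_grad()
        accelerator.backward(loss)                           # handles fp16 grad scaling
        optimizer.step()
        if (step + 1) % 400 == 0:                            # matches the 400-step log cadence
            accelerator.print(f"[Train] step: {step}, loss: {loss.item():.6f}")
        step += 1

A sketch like this would be started with something like `accelerate launch --multi_gpu --num_processes 4 train.py` to mirror the 4-process setup reported above; fp16 mixed precision requires GPUs, so it will not run as-is on CPU.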
2023-03-14 11:25:06,687 - INFO - [Train] step: 350399, loss: 0.003970, lr: 0.000020
2023-03-14 11:33:16,264 - INFO - [Train] step: 350799, loss: 0.000875, lr: 0.000020
2023-03-14 11:41:24,597 - INFO - [Train] step: 351199, loss: 0.004025, lr: 0.000020
2023-03-14 11:49:31,747 - INFO - [Train] step: 351599, loss: 0.004447, lr: 0.000020
2023-03-14 11:57:40,689 - INFO - [Train] step: 351999, loss: 0.014676, lr: 0.000020
2023-03-14 12:05:50,358 - INFO - [Train] step: 352399, loss: 0.008828, lr: 0.000020
2023-03-14 12:14:01,095 - INFO - [Train] step: 352799, loss: 0.001600, lr: 0.000020
2023-03-14 12:22:09,845 - INFO - [Train] step: 353199, loss: 0.012238, lr: 0.000020
2023-03-14 12:30:18,735 - INFO - [Train] step: 353599, loss: 0.003647, lr: 0.000020
2023-03-14 12:38:26,101 - INFO - [Train] step: 353999, loss: 0.009305, lr: 0.000020
2023-03-14 12:46:34,116 - INFO - [Train] step: 354399, loss: 0.010076, lr: 0.000020
2023-03-14 12:54:43,234 - INFO - [Train] step: 354799, loss: 0.004197, lr: 0.000020
2023-03-14 13:09:47,797 - INFO - [Train] step: 355199, loss: 0.010474, lr: 0.000020
2023-03-14 13:17:57,405 - INFO - [Train] step: 355599, loss: 0.004587, lr: 0.000020
2023-03-14 13:26:07,040 - INFO - [Train] step: 355999, loss: 0.005947, lr: 0.000020
2023-03-14 13:34:16,193 - INFO - [Train] step: 356399, loss: 0.002616, lr: 0.000020
2023-03-14 13:42:25,348 - INFO - [Train] step: 356799, loss: 0.016632, lr: 0.000020
2023-03-14 13:50:33,093 - INFO - [Train] step: 357199, loss: 0.009248, lr: 0.000020
2023-03-14 13:58:42,468 - INFO - [Train] step: 357599, loss: 0.007584, lr: 0.000020
2023-03-14 14:06:50,610 - INFO - [Train] step: 357999, loss: 0.005306, lr: 0.000020
2023-03-14 14:15:03,336 - INFO - [Train] step: 358399, loss: 0.005756, lr: 0.000020
2023-03-14 14:23:16,259 - INFO - [Train] step: 358799, loss: 0.001198, lr: 0.000020
2023-03-14 14:31:26,412 - INFO - [Train] step: 359199, loss: 0.003400, lr: 0.000020
2023-03-14 14:39:38,412 - INFO - [Train] step: 359599, loss: 0.004573, lr: 0.000020
2023-03-14 14:47:48,307 - INFO - [Train] step: 359999, loss: 0.015435, lr: 0.000020
2023-03-14 15:03:03,182 - INFO - [Train] step: 360399, loss: 0.008100, lr: 0.000020
2023-03-14 15:11:13,949 - INFO - [Train] step: 360799, loss: 0.007414, lr: 0.000020
2023-03-14 15:19:25,718 - INFO - [Train] step: 361199, loss: 0.005287, lr: 0.000020
2023-03-14 15:27:35,606 - INFO - [Train] step: 361599, loss: 0.015283, lr: 0.000020
2023-03-14 15:35:44,026 - INFO - [Train] step: 361999, loss: 0.002844, lr: 0.000020
2023-03-14 15:43:53,943 - INFO - [Train] step: 362399, loss: 0.004362, lr: 0.000020
2023-03-14 15:52:04,410 - INFO - [Train] step: 362799, loss: 0.001901, lr: 0.000020
2023-03-14 16:00:15,684 - INFO - [Train] step: 363199, loss: 0.007688, lr: 0.000020
2023-03-14 16:08:27,463 - INFO - [Train] step: 363599, loss: 0.008041, lr: 0.000020
2023-03-14 16:16:38,799 - INFO - [Train] step: 363999, loss: 0.006486, lr: 0.000020
2023-03-14 16:24:51,158 - INFO - [Train] step: 364399, loss: 0.002724, lr: 0.000020
2023-03-14 16:33:05,180 - INFO - [Train] step: 364799, loss: 0.011561, lr: 0.000020
2023-03-14 16:48:28,436 - INFO - [Train] step: 365199, loss: 0.005723, lr: 0.000020
2023-03-14 16:56:41,884 - INFO - [Train] step: 365599, loss: 0.005510, lr: 0.000020
2023-03-14 17:04:55,520 - INFO - [Train] step: 365999, loss: 0.005707, lr: 0.000020
2023-03-14 17:13:07,239 - INFO - [Train] step: 366399, loss: 0.004155, lr: 0.000020
2023-03-14 17:21:18,938 - INFO - [Train] step: 366799, loss: 0.004802, lr: 0.000020
2023-03-14 17:29:32,502 - INFO - [Train] step: 367199, loss: 0.008124, lr: 0.000020
2023-03-14 17:37:43,365 - INFO - [Train] step: 367599, loss: 0.003649, lr: 0.000020
2023-03-14 17:45:54,223 - INFO - [Train] step: 367999, loss: 0.011362, lr: 0.000020
2023-03-14 17:54:04,890 - INFO - [Train] step: 368399, loss: 0.031951, lr: 0.000020
2023-03-14 18:02:15,900 - INFO - [Train] step: 368799, loss: 0.003813, lr: 0.000020
2023-03-14 18:10:31,202 - INFO - [Train] step: 369199, loss: 0.002422, lr: 0.000020
2023-03-14 18:18:43,809 - INFO - [Train] step: 369599, loss: 0.005812, lr: 0.000020
2023-03-14 18:26:55,555 - INFO - [Train] step: 369999, loss: 0.018369, lr: 0.000020
2023-03-14 18:42:15,184 - INFO - [Train] step: 370399, loss: 0.005125, lr: 0.000020
2023-03-14 18:50:23,785 - INFO - [Train] step: 370799, loss: 0.021020, lr: 0.000020
2023-03-14 18:58:32,836 - INFO - [Train] step: 371199, loss: 0.008835, lr: 0.000020
2023-03-14 19:06:43,919 - INFO - [Train] step: 371599, loss: 0.002034, lr: 0.000020
2023-03-14 19:14:53,667 - INFO - [Train] step: 371999, loss: 0.020777, lr: 0.000020
2023-03-14 19:23:07,917 - INFO - [Train] step: 372399, loss: 0.002071, lr: 0.000020
2023-03-14 19:31:21,843 - INFO - [Train] step: 372799, loss: 0.009126, lr: 0.000020
2023-03-14 19:39:34,751 - INFO - [Train] step: 373199, loss: 0.003135, lr: 0.000020
2023-03-14 19:47:48,484 - INFO - [Train] step: 373599, loss: 0.007430, lr: 0.000020
2023-03-14 19:55:58,608 - INFO - [Train] step: 373999, loss: 0.009583, lr: 0.000020
2023-03-14 20:04:13,040 - INFO - [Train] step: 374399, loss: 0.012695, lr: 0.000020
2023-03-14 20:12:31,314 - INFO - [Train] step: 374799, loss: 0.002118, lr: 0.000020
2023-03-14 20:27:55,375 - INFO - [Train] step: 375199, loss: 0.011640, lr: 0.000020
2023-03-14 20:36:08,654 - INFO - [Train] step: 375599, loss: 0.001690, lr: 0.000020
2023-03-14 20:44:20,883 - INFO - [Train] step: 375999, loss: 0.011716, lr: 0.000020
2023-03-14 20:52:35,045 - INFO - [Train] step: 376399, loss: 0.002537, lr: 0.000020
2023-03-14 21:00:47,046 - INFO - [Train] step: 376799, loss: 0.013246, lr: 0.000020
2023-03-14 21:08:57,871 - INFO - [Train] step: 377199, loss: 0.004110, lr: 0.000020
2023-03-14 21:17:10,546 - INFO - [Train] step: 377599, loss: 0.003193, lr: 0.000020
2023-03-14 21:25:25,092 - INFO - [Train] step: 377999, loss: 0.003690, lr: 0.000020
2023-03-14 21:33:37,689 - INFO - [Train] step: 378399, loss: 0.011243, lr: 0.000020
2023-03-14 21:41:48,362 - INFO - [Train] step: 378799, loss: 0.004159, lr: 0.000020
2023-03-14 21:49:58,734 - INFO - [Train] step: 379199, loss: 0.005250, lr: 0.000020
2023-03-14 21:58:10,780 - INFO - [Train] step: 379599, loss: 0.013149, lr: 0.000020
2023-03-14 22:06:20,342 - INFO - [Train] step: 379999, loss: 0.010080, lr: 0.000020
2023-03-14 22:21:38,432 - INFO - [Train] step: 380399, loss: 0.008977, lr: 0.000020
2023-03-14 22:29:46,367 - INFO - [Train] step: 380799, loss: 0.005365, lr: 0.000020
2023-03-14 22:37:53,676 - INFO - [Train] step: 381199, loss: 0.006641, lr: 0.000020
2023-03-14 22:46:03,400 - INFO - [Train] step: 381599, loss: 0.002233, lr: 0.000020
2023-03-14 22:54:12,769 - INFO - [Train] step: 381999, loss: 0.015945, lr: 0.000020
2023-03-14 23:02:22,251 - INFO - [Train] step: 382399, loss: 0.004349, lr: 0.000020
2023-03-14 23:10:32,192 - INFO - [Train] step: 382799, loss: 0.008812, lr: 0.000020
2023-03-14 23:18:40,364 - INFO - [Train] step: 383199, loss: 0.008462, lr: 0.000020
2023-03-14 23:26:48,205 - INFO - [Train] step: 383599, loss: 0.003358, lr: 0.000020
2023-03-14 23:34:57,142 - INFO - [Train] step: 383999, loss: 0.012008, lr: 0.000020
2023-03-14 23:43:06,996 - INFO - [Train] step: 384399, loss: 0.004462, lr: 0.000020
2023-03-14 23:51:13,677 - INFO - [Train] step: 384799, loss: 0.004331, lr: 0.000020
2023-03-15 00:06:24,706 - INFO - [Train] step: 385199, loss: 0.009878, lr: 0.000020
2023-03-15 00:14:31,763 - INFO - [Train] step: 385599, loss: 0.002496, lr: 0.000020
2023-03-15 00:22:39,773 - INFO - [Train] step: 385999, loss: 0.005399, lr: 0.000020
2023-03-15 00:30:49,008 - INFO - [Train] step: 386399, loss: 0.006340, lr: 0.000020
2023-03-15 00:39:00,480 - INFO - [Train] step: 386799, loss: 0.007803, lr: 0.000020
2023-03-15 00:47:08,317 - INFO - [Train] step: 387199, loss: 0.002311, lr: 0.000020
2023-03-15 00:55:18,850 - INFO - [Train] step: 387599, loss: 0.005942, lr: 0.000020
2023-03-15 01:03:30,548 - INFO - [Train] step: 387999, loss: 0.003447, lr: 0.000020
2023-03-15 01:11:39,221 - INFO - [Train] step: 388399, loss: 0.017024, lr: 0.000020
2023-03-15 01:19:46,784 - INFO - [Train] step: 388799, loss: 0.007986, lr: 0.000020
2023-03-15 01:27:55,045 - INFO - [Train] step: 389199, loss: 0.004215, lr: 0.000020
2023-03-15 01:36:04,034 - INFO - [Train] step: 389599, loss: 0.005298, lr: 0.000020
2023-03-15 01:44:12,692 - INFO - [Train] step: 389999, loss: 0.007917, lr: 0.000020
2023-03-15 01:59:23,734 - INFO - [Train] step: 390399, loss: 0.018029, lr: 0.000020
2023-03-15 02:07:29,525 - INFO - [Train] step: 390799, loss: 0.005119, lr: 0.000020
2023-03-15 02:15:38,339 - INFO - [Train] step: 391199, loss: 0.007850, lr: 0.000020
2023-03-15 02:23:46,061 - INFO - [Train] step: 391599, loss: 0.002627, lr: 0.000020
2023-03-15 02:31:51,971 - INFO - [Train] step: 391999, loss: 0.020313, lr: 0.000020
2023-03-15 02:39:57,364 - INFO - [Train] step: 392399, loss: 0.010299, lr: 0.000020
2023-03-15 02:48:02,887 - INFO - [Train] step: 392799, loss: 0.018258, lr: 0.000020
2023-03-15 02:56:10,980 - INFO - [Train] step: 393199, loss: 0.020890, lr: 0.000020
2023-03-15 03:04:18,178 - INFO - [Train] step: 393599, loss: 0.002848, lr: 0.000020
2023-03-15 03:12:26,272 - INFO - [Train] step: 393999, loss: 0.006369, lr: 0.000020
2023-03-15 03:20:34,978 - INFO - [Train] step: 394399, loss: 0.009678, lr: 0.000020
2023-03-15 03:28:45,807 - INFO - [Train] step: 394799, loss: 0.007569, lr: 0.000020
2023-03-15 03:43:57,145 - INFO - [Train] step: 395199, loss: 0.001792, lr: 0.000020
2023-03-15 03:52:05,478 - INFO - [Train] step: 395599, loss: 0.019708, lr: 0.000020
2023-03-15 04:00:14,368 - INFO - [Train] step: 395999, loss: 0.007647, lr: 0.000020
2023-03-15 04:08:21,262 - INFO - [Train] step: 396399, loss: 0.009972, lr: 0.000020
2023-03-15 04:16:27,612 - INFO - [Train] step: 396799, loss: 0.009197, lr: 0.000020
2023-03-15 04:24:34,218 - INFO - [Train] step: 397199, loss: 0.003339, lr: 0.000020
2023-03-15 04:32:43,139 - INFO - [Train] step: 397599, loss: 0.004498, lr: 0.000020
2023-03-15 04:40:51,586 - INFO - [Train] step: 397999, loss: 0.005879, lr: 0.000020
2023-03-15 04:49:01,083 - INFO - [Train] step: 398399, loss: 0.002673, lr: 0.000020
2023-03-15 04:57:06,598 - INFO - [Train] step: 398799, loss: 0.004977, lr: 0.000020
2023-03-15 05:05:11,394 - INFO - [Train] step: 399199, loss: 0.014280, lr: 0.000020
2023-03-15 05:13:15,814 - INFO - [Train] step: 399599, loss: 0.003304, lr: 0.000020
2023-03-15 05:21:19,901 - INFO - [Train] step: 399999, loss: 0.026339, lr: 0.000020
2023-03-15 05:36:22,558 - INFO - [Train] step: 400399, loss: 0.020097, lr: 0.000020
2023-03-15 05:44:26,977 - INFO - [Train] step: 400799, loss: 0.005036, lr: 0.000020
2023-03-15 05:52:33,474 - INFO - [Train] step: 401199, loss: 0.012237, lr: 0.000020
2023-03-15 06:00:37,712 - INFO - [Train] step: 401599, loss: 0.008641, lr: 0.000020
2023-03-15 06:08:45,691 - INFO - [Train] step: 401999, loss: 0.004047, lr: 0.000020
2023-03-15 06:16:51,019 - INFO - [Train] step: 402399, loss: 0.011072, lr: 0.000020
2023-03-15 06:24:58,755 - INFO - [Train] step: 402799, loss: 0.003696, lr: 0.000020
2023-03-15 06:33:05,510 - INFO - [Train] step: 403199, loss: 0.003335, lr: 0.000020
2023-03-15 06:41:11,889 - INFO - [Train] step: 403599, loss: 0.005935, lr: 0.000020
2023-03-15 06:49:20,797 - INFO - [Train] step: 403999, loss: 0.008968, lr: 0.000020
2023-03-15 06:57:25,748 - INFO - [Train] step: 404399, loss: 0.005360, lr: 0.000020
2023-03-15 07:05:33,591 - INFO - [Train] step: 404799, loss: 0.007498, lr: 0.000020
2023-03-15 07:20:41,123 - INFO - [Train] step: 405199, loss: 0.013348, lr: 0.000020
2023-03-15 07:28:47,738 - INFO - [Train] step: 405599, loss: 0.004439, lr: 0.000020
2023-03-15 07:36:52,471 - INFO - [Train] step: 405999, loss: 0.005605, lr: 0.000020
2023-03-15 07:44:55,872 - INFO - [Train] step: 406399, loss: 0.010902, lr: 0.000020
2023-03-15 07:52:58,844 - INFO - [Train] step: 406799, loss: 0.014158, lr: 0.000020
2023-03-15 08:01:00,921 - INFO - [Train] step: 407199, loss: 0.005914, lr: 0.000020
2023-03-15 08:09:04,378 - INFO - [Train] step: 407599, loss: 0.003083, lr: 0.000020
2023-03-15 08:17:08,859 - INFO - [Train] step: 407999, loss: 0.012349, lr: 0.000020
2023-03-15 08:25:14,140 - INFO - [Train] step: 408399, loss: 0.003212, lr: 0.000020
2023-03-15 08:33:18,087 - INFO - [Train] step: 408799, loss: 0.010946, lr: 0.000020
2023-03-15 08:41:22,240 - INFO - [Train] step: 409199, loss: 0.005972, lr: 0.000020
2023-03-15 08:49:24,083 - INFO - [Train] step: 409599, loss: 0.006212, lr: 0.000020
2023-03-15 08:57:28,472 - INFO - [Train] step: 409999, loss: 0.017361, lr: 0.000020
2023-03-15 09:12:32,515 - INFO - [Train] step: 410399, loss: 0.004148, lr: 0.000020
2023-03-15 09:20:39,184 - INFO - [Train] step: 410799, loss: 0.017749, lr: 0.000020
2023-03-15 09:28:44,423 - INFO - [Train] step: 411199, loss: 0.005733, lr: 0.000020
2023-03-15 09:36:49,320 - INFO - [Train] step: 411599, loss: 0.011234, lr: 0.000020
2023-03-15 09:44:54,679 - INFO - [Train] step: 411999, loss: 0.005849, lr: 0.000020
2023-03-15 09:52:58,720 - INFO - [Train] step: 412399, loss: 0.017280, lr: 0.000020
2023-03-15 10:01:04,020 - INFO - [Train] step: 412799, loss: 0.002729, lr: 0.000020
2023-03-15 10:09:09,743 - INFO - [Train] step: 413199, loss: 0.003359, lr: 0.000020
2023-03-15 10:17:17,113 - INFO - [Train] step: 413599, loss: 0.006960, lr: 0.000020
2023-03-15 10:25:23,756 - INFO - [Train] step: 413999, loss: 0.012596, lr: 0.000020
2023-03-15 10:33:31,378 - INFO - [Train] step: 414399, loss: 0.006310, lr: 0.000020
2023-03-15 10:41:39,501 - INFO - [Train] step: 414799, loss: 0.002113, lr: 0.000020
2023-03-15 10:56:41,263 - INFO - [Train] step: 415199, loss: 0.005366, lr: 0.000020
2023-03-15 11:04:48,711 - INFO - [Train] step: 415599, loss: 0.002271, lr: 0.000020
2023-03-15 11:12:56,165 - INFO - [Train] step: 415999, loss: 0.006292, lr: 0.000020
2023-03-15 11:21:01,602 - INFO - [Train] step: 416399, loss: 0.017095, lr: 0.000020
2023-03-15 11:29:10,034 - INFO - [Train] step: 416799, loss: 0.002979, lr: 0.000020
2023-03-15 11:37:20,333 - INFO - [Train] step: 417199, loss: 0.009114, lr: 0.000020
2023-03-15 11:45:28,298 - INFO - [Train] step: 417599, loss: 0.011359, lr: 0.000020
2023-03-15 11:53:35,986 - INFO - [Train] step: 417999, loss: 0.002857, lr: 0.000020
2023-03-15 12:01:44,430 - INFO - [Train] step: 418399, loss: 0.010854, lr: 0.000020
2023-03-15 12:09:53,071 - INFO - [Train] step: 418799, loss: 0.004510, lr: 0.000020
2023-03-15 12:18:00,028 - INFO - [Train] step: 419199, loss: 0.006313, lr: 0.000020
2023-03-15 12:26:08,143 - INFO - [Train] step: 419599, loss: 0.035875, lr: 0.000020
2023-03-15 12:34:16,395 - INFO - [Train] step: 419999, loss: 0.012615, lr: 0.000020
2023-03-15 12:49:34,479 - INFO - [Train] step: 420399, loss: 0.005185, lr: 0.000020
2023-03-15 12:57:43,078 - INFO - [Train] step: 420799, loss: 0.006211, lr: 0.000020
2023-03-15 13:05:51,940 - INFO - [Train] step: 421199, loss: 0.001979, lr: 0.000020
2023-03-15 13:14:01,142 - INFO - [Train] step: 421599, loss: 0.009351, lr: 0.000020
2023-03-15 13:22:10,572 - INFO - [Train] step: 421999, loss: 0.002577, lr: 0.000020
2023-03-15 13:30:25,241 - INFO - [Train] step: 422399, loss: 0.002945, lr: 0.000020
2023-03-15 13:38:37,333 - INFO - [Train] step: 422799, loss: 0.018145, lr: 0.000020
2023-03-15 13:46:46,695 - INFO - [Train] step: 423199, loss: 0.004967, lr: 0.000020
2023-03-15 13:54:58,460 - INFO - [Train] step: 423599, loss: 0.006262, lr: 0.000020
2023-03-15 14:03:09,617 - INFO - [Train] step: 423999, loss: 0.005094, lr: 0.000020
2023-03-15 14:11:19,840 - INFO - [Train] step: 424399, loss: 0.006577, lr: 0.000020
2023-03-15 14:19:28,474 - INFO - [Train] step: 424799, loss: 0.007523, lr: 0.000020
2023-03-15 14:34:43,997 - INFO - [Train] step: 425199, loss: 0.011514, lr: 0.000020
2023-03-15 14:42:55,049 - INFO - [Train] step: 425599, loss: 0.004693, lr: 0.000020
2023-03-15 14:51:04,984 - INFO - [Train] step: 425999, loss: 0.004831, lr: 0.000020
2023-03-15 14:59:15,447 - INFO - [Train] step: 426399, loss: 0.010705, lr: 0.000020
2023-03-15 15:07:27,797 - INFO - [Train] step: 426799, loss: 0.003160, lr: 0.000020
2023-03-15 15:15:41,731 - INFO - [Train] step: 427199, loss: 0.002188, lr: 0.000020
2023-03-15 15:23:55,124 - INFO - [Train] step: 427599, loss: 0.004224, lr: 0.000020
2023-03-15 15:32:08,237 - INFO - [Train] step: 427999, loss: 0.009060, lr: 0.000020
2023-03-15 15:40:22,339 - INFO - [Train] step: 428399, loss: 0.011026, lr: 0.000020
2023-03-15 15:48:33,841 - INFO - [Train] step: 428799, loss: 0.022159, lr: 0.000020
2023-03-15 15:56:42,414 - INFO - [Train] step: 429199, loss: 0.013550, lr: 0.000020
2023-03-15 16:04:51,205 - INFO - [Train] step: 429599, loss: 0.008044, lr: 0.000020
2023-03-15 16:13:00,315 - INFO - [Train] step: 429999, loss: 0.011494, lr: 0.000020
2023-03-15 16:28:17,176 - INFO - [Train] step: 430399, loss: 0.002339, lr: 0.000020
2023-03-15 16:36:29,158 - INFO - [Train] step: 430799, loss: 0.011962, lr: 0.000020
2023-03-15 16:44:41,140 - INFO - [Train] step: 431199, loss: 0.006153, lr: 0.000020
2023-03-15 16:52:51,685 - INFO - [Train] step: 431599, loss: 0.002892, lr: 0.000020
2023-03-15 17:01:04,679 - INFO - [Train] step: 431999, loss: 0.007584, lr: 0.000020
2023-03-15 17:09:15,658 - INFO - [Train] step: 432399, loss: 0.005999, lr: 0.000020
2023-03-15 17:17:26,295 - INFO - [Train] step: 432799, loss: 0.005320, lr: 0.000020
2023-03-15 17:25:38,436 - INFO - [Train] step: 433199, loss: 0.003523, lr: 0.000020
2023-03-15 17:33:48,759 - INFO - [Train] step: 433599, loss: 0.001650, lr: 0.000020
2023-03-15 17:41:57,459 - INFO - [Train] step: 433999, loss: 0.002698, lr: 0.000020
2023-03-15 17:50:07,248 - INFO - [Train] step: 434399, loss: 0.008538, lr: 0.000020
2023-03-15 17:58:15,205 - INFO - [Train] step: 434799, loss: 0.007910, lr: 0.000020
2023-03-15 18:13:28,944 - INFO - [Train] step: 435199, loss: 0.002899, lr: 0.000020
2023-03-15 18:21:38,366 - INFO - [Train] step: 435599, loss: 0.004671, lr: 0.000020
2023-03-15 18:29:46,626 - INFO - [Train] step: 435999, loss: 0.004759, lr: 0.000020
2023-03-15 18:37:55,656 - INFO - [Train] step: 436399, loss: 0.003279, lr: 0.000020
2023-03-15 18:46:04,091 - INFO - [Train] step: 436799, loss: 0.001266, lr: 0.000020
2023-03-15 18:54:13,598 - INFO - [Train] step: 437199, loss: 0.002427, lr: 0.000020
2023-03-15 19:02:21,406 - INFO - [Train] step: 437599, loss: 0.004603, lr: 0.000020
2023-03-15 19:10:29,238 - INFO - [Train] step: 437999, loss: 0.017669, lr: 0.000020
2023-03-15 19:18:39,172 - INFO - [Train] step: 438399, loss: 0.006385, lr: 0.000020
2023-03-15 19:26:47,588 - INFO - [Train] step: 438799, loss: 0.005335, lr: 0.000020
2023-03-15 19:34:55,685 - INFO - [Train] step: 439199, loss: 0.003371, lr: 0.000020
2023-03-15 19:43:02,619 - INFO - [Train] step: 439599, loss: 0.004986, lr: 0.000020
2023-03-15 19:51:10,777 - INFO - [Train] step: 439999, loss: 0.012315, lr: 0.000020
2023-03-15 20:06:23,009 - INFO - [Train] step: 440399, loss: 0.008615, lr: 0.000020
2023-03-15 20:14:32,101 - INFO - [Train] step: 440799, loss: 0.009749, lr: 0.000020
2023-03-15 20:22:40,089 - INFO - [Train] step: 441199, loss: 0.021648, lr: 0.000020
2023-03-15 20:30:46,854 - INFO - [Train] step: 441599, loss: 0.002508, lr: 0.000020
2023-03-15 20:38:55,070 - INFO - [Train] step: 441999, loss: 0.004494, lr: 0.000020
2023-03-15 20:47:00,705 - INFO - [Train] step: 442399, loss: 0.008107, lr: 0.000020
2023-03-15 20:55:05,266 - INFO - [Train] step: 442799, loss: 0.013383, lr: 0.000020
2023-03-15 21:03:10,466 - INFO - [Train] step: 443199, loss: 0.004446, lr: 0.000020
2023-03-15 21:11:17,681 - INFO - [Train] step: 443599, loss: 0.006845, lr: 0.000020
2023-03-15 21:19:23,194 - INFO - [Train] step: 443999, loss: 0.005682, lr: 0.000020
2023-03-15 21:27:30,061 - INFO - [Train] step: 444399, loss: 0.003734, lr: 0.000020
2023-03-15 21:35:35,251 - INFO - [Train] step: 444799, loss: 0.005131, lr: 0.000020
2023-03-15 21:50:33,847 - INFO - [Train] step: 445199, loss: 0.002460, lr: 0.000020
2023-03-15 21:58:37,130 - INFO - [Train] step: 445599, loss: 0.023272, lr: 0.000020
2023-03-15 22:06:40,702 - INFO - [Train] step: 445999, loss: 0.010094, lr: 0.000020
2023-03-15 22:14:45,299 - INFO - [Train] step: 446399, loss: 0.008404, lr: 0.000020
2023-03-15 22:22:49,858 - INFO - [Train] step: 446799, loss: 0.020274, lr: 0.000020
2023-03-15 22:30:53,756 - INFO - [Train] step: 447199, loss: 0.002159, lr: 0.000020
2023-03-15 22:38:59,892 - INFO - [Train] step: 447599, loss: 0.005666, lr: 0.000020
2023-03-15 22:47:03,952 - INFO - [Train] step: 447999, loss: 0.001415, lr: 0.000020
2023-03-15 22:55:08,636 - INFO - [Train] step: 448399, loss: 0.005021, lr: 0.000020
2023-03-15 23:03:11,683 - INFO - [Train] step: 448799, loss: 0.020255, lr: 0.000020
2023-03-15 23:11:14,365 - INFO - [Train] step: 449199, loss: 0.004544, lr: 0.000020
2023-03-15 23:19:16,741 - INFO - [Train] step: 449599, loss: 0.009513, lr: 0.000020
2023-03-15 23:27:21,415 - INFO - [Train] step: 449999, loss: 0.009109, lr: 0.000020
2023-03-15 23:42:17,908 - INFO - [Train] step: 450399, loss: 0.004398, lr: 0.000020
2023-03-15 23:50:24,669 - INFO - [Train] step: 450799, loss: 0.002902, lr: 0.000020
2023-03-15 23:58:29,495 - INFO - [Train] step: 451199, loss: 0.028268, lr: 0.000020
2023-03-16 00:06:35,214 - INFO - [Train] step: 451599, loss: 0.004421, lr: 0.000020
2023-03-16 00:14:41,091 - INFO - [Train] step: 451999, loss: 0.012619, lr: 0.000020
2023-03-16 00:22:46,091 - INFO - [Train] step: 452399, loss: 0.005001, lr: 0.000020
2023-03-16 00:30:52,181 - INFO - [Train] step: 452799, loss: 0.007043, lr: 0.000020
2023-03-16 00:38:57,886 - INFO - [Train] step: 453199, loss: 0.003605, lr: 0.000020
2023-03-16 00:47:04,366 - INFO - [Train] step: 453599, loss: 0.004472, lr: 0.000020
2023-03-16 00:55:09,465 - INFO - [Train] step: 453999, loss: 0.004288, lr: 0.000020
2023-03-16 01:03:17,424 - INFO - [Train] step: 454399, loss: 0.003376, lr: 0.000020
2023-03-16 01:11:22,604 - INFO - [Train] step: 454799, loss: 0.006506, lr: 0.000020
2023-03-16 01:26:28,016 - INFO - [Train] step: 455199, loss: 0.003877, lr: 0.000020
2023-03-16 01:34:32,845 - INFO - [Train] step: 455599, loss: 0.005713, lr: 0.000020
2023-03-16 01:42:38,854 - INFO - [Train] step: 455999, loss: 0.009040, lr: 0.000020
2023-03-16 01:50:43,707 - INFO - [Train] step: 456399, loss: 0.021199, lr: 0.000020
2023-03-16 01:58:50,213 - INFO - [Train] step: 456799, loss: 0.007619, lr: 0.000020
2023-03-16 02:06:56,396 - INFO - [Train] step: 457199, loss: 0.004366, lr: 0.000020
2023-03-16 02:15:02,765 - INFO - [Train] step: 457599, loss: 0.002817, lr: 0.000020
2023-03-16 02:23:08,102 - INFO - [Train] step: 457999, loss: 0.012071, lr: 0.000020
2023-03-16 02:31:12,289 - INFO - [Train] step: 458399, loss: 0.014297, lr: 0.000020
2023-03-16 02:39:18,480 - INFO - [Train] step: 458799, loss: 0.006995, lr: 0.000020
2023-03-16 02:47:25,191 - INFO - [Train] step: 459199, loss: 0.011847, lr: 0.000020
2023-03-16 02:55:29,484 - INFO - [Train] step: 459599, loss: 0.003298, lr: 0.000020
2023-03-16 03:03:35,474 - INFO - [Train] step: 459999, loss: 0.005389, lr: 0.000020
2023-03-16 03:18:38,161 - INFO - [Train] step: 460399, loss: 0.003666, lr: 0.000020
2023-03-16 03:26:44,565 - INFO - [Train] step: 460799, loss: 0.007477, lr: 0.000020
2023-03-16 03:34:49,754 - INFO - [Train] step: 461199, loss: 0.010859, lr: 0.000020
2023-03-16 03:42:56,029 - INFO - [Train] step: 461599, loss: 0.014711, lr: 0.000020
2023-03-16 03:51:01,598 - INFO - [Train] step: 461999, loss: 0.009889, lr: 0.000020
2023-03-16 03:59:08,998 - INFO - [Train] step: 462399, loss: 0.005520, lr: 0.000020
2023-03-16 04:07:15,328 - INFO - [Train] step: 462799, loss: 0.002978, lr: 0.000020
2023-03-16 04:15:20,104 - INFO - [Train] step: 463199, loss: 0.022579, lr: 0.000020
2023-03-16 04:23:26,253 - INFO - [Train] step: 463599, loss: 0.009094, lr: 0.000020
2023-03-16 04:31:33,296 - INFO - [Train] step: 463999, loss: 0.002004, lr: 0.000020
2023-03-16 04:39:38,288 - INFO - [Train] step: 464399, loss: 0.007121, lr: 0.000020
2023-03-16 04:47:44,263 - INFO - [Train] step: 464799, loss: 0.003139, lr: 0.000020
2023-03-16 05:02:47,348 - INFO - [Train] step: 465199, loss: 0.010771, lr: 0.000020
2023-03-16 05:10:51,966 - INFO - [Train] step: 465599, loss: 0.003798, lr: 0.000020
2023-03-16 05:18:56,638 - INFO - [Train] step: 465999, loss: 0.004734, lr: 0.000020
2023-03-16 05:27:00,918 - INFO - [Train] step: 466399, loss: 0.004762, lr: 0.000020
2023-03-16 05:35:06,063 - INFO - [Train] step: 466799, loss: 0.023615, lr: 0.000020
2023-03-16 05:43:10,137 - INFO - [Train] step: 467199, loss: 0.024223, lr: 0.000020
2023-03-16 05:51:17,410 - INFO - [Train] step: 467599, loss: 0.001413, lr: 0.000020
2023-03-16 05:59:24,144 - INFO - [Train] step: 467999, loss: 0.001962, lr: 0.000020
2023-03-16 06:07:28,937 - INFO - [Train] step: 468399, loss: 0.007139, lr: 0.000020
2023-03-16 06:15:33,704 - INFO - [Train] step: 468799, loss: 0.003064, lr: 0.000020
2023-03-16 06:23:39,295 - INFO - [Train] step: 469199, loss: 0.003509, lr: 0.000020
2023-03-16 06:31:45,330 - INFO - [Train] step: 469599, loss: 0.013686, lr: 0.000020
2023-03-16 06:39:50,019 - INFO - [Train] step: 469999, loss: 0.005026, lr: 0.000020
2023-03-16 06:54:51,967 - INFO - [Train] step: 470399, loss: 0.004154, lr: 0.000020
2023-03-16 07:02:55,661 - INFO - [Train] step: 470799, loss: 0.009365, lr: 0.000020
2023-03-16 07:11:00,984 - INFO - [Train] step: 471199, loss: 0.003761, lr: 0.000020
2023-03-16 07:19:06,055 - INFO - [Train] step: 471599, loss: 0.010502, lr: 0.000020
2023-03-16 07:27:10,143 - INFO - [Train] step: 471999, loss: 0.003188, lr: 0.000020
2023-03-16 07:35:16,150 - INFO - [Train] step: 472399, loss: 0.002208, lr: 0.000020
2023-03-16 07:43:22,565 - INFO - [Train] step: 472799, loss: 0.004596, lr: 0.000020
2023-03-16 07:51:26,933 - INFO - [Train] step: 473199, loss: 0.014811, lr: 0.000020
2023-03-16 07:59:30,771 - INFO - [Train] step: 473599, loss: 0.002519, lr: 0.000020
2023-03-16 08:07:33,520 - INFO - [Train] step: 473999, loss: 0.010115, lr: 0.000020
2023-03-16 08:15:36,006 - INFO - [Train] step: 474399, loss: 0.025330, lr: 0.000020
2023-03-16 08:23:40,546 - INFO - [Train] step: 474799, loss: 0.010386, lr: 0.000020
2023-03-16 08:38:37,687 - INFO - [Train] step: 475199, loss: 0.002988, lr: 0.000020
2023-03-16 08:46:41,108 - INFO - [Train] step: 475599, loss: 0.010869, lr: 0.000020
2023-03-16 08:54:46,347 - INFO - [Train] step: 475999, loss: 0.002006, lr: 0.000020
2023-03-16 09:02:50,254 - INFO - [Train] step: 476399, loss: 0.017607, lr: 0.000020
2023-03-16 09:10:56,286 - INFO - [Train] step: 476799, loss: 0.002159, lr: 0.000020
2023-03-16 09:19:04,915 - INFO - [Train] step: 477199, loss: 0.011133, lr: 0.000020
2023-03-16 09:27:09,084 - INFO - [Train] step: 477599, loss: 0.009595, lr: 0.000020
2023-03-16 09:35:14,138 - INFO - [Train] step: 477999, loss: 0.023977, lr: 0.000020
2023-03-16 09:43:21,207 - INFO - [Train] step: 478399, loss: 0.003024, lr: 0.000020
2023-03-16 09:51:25,775 - INFO - [Train] step: 478799, loss: 0.003689, lr: 0.000020
2023-03-16 09:59:30,078 - INFO - [Train] step: 479199, loss: 0.006851, lr: 0.000020
2023-03-16 10:07:35,651 - INFO - [Train] step: 479599, loss: 0.005116, lr: 0.000020
2023-03-16 10:15:43,199 - INFO - [Train] step: 479999, loss: 0.002679, lr: 0.000020
2023-03-16 10:30:48,719 - INFO - [Train] step: 480399, loss: 0.002056, lr: 0.000020
2023-03-16 10:38:55,526 - INFO - [Train] step: 480799, loss: 0.005128, lr: 0.000020
2023-03-16 10:47:02,721 - INFO - [Train] step: 481199, loss: 0.009313, lr: 0.000020
2023-03-16 10:55:10,669 - INFO - [Train] step: 481599, loss: 0.004314, lr: 0.000020
2023-03-16 11:03:16,956 - INFO - [Train] step: 481999, loss: 0.003181, lr: 0.000020
2023-03-16 11:11:22,962 - INFO - [Train] step: 482399, loss: 0.010505, lr: 0.000020
2023-03-16 11:19:31,245 - INFO - [Train] step: 482799, loss: 0.005266, lr: 0.000020
2023-03-16 11:27:37,856 - INFO - [Train] step: 483199, loss: 0.001510, lr: 0.000020
2023-03-16 11:35:44,707 - INFO - [Train] step: 483599, loss: 0.017742, lr: 0.000020
2023-03-16 11:43:51,238 - INFO - [Train] step: 483999, loss: 0.014786, lr: 0.000020
2023-03-16 11:51:57,390 - INFO - [Train] step: 484399, loss: 0.003933, lr: 0.000020
2023-03-16 12:00:03,887 - INFO - [Train] step: 484799, loss: 0.004019, lr: 0.000020
2023-03-16 12:15:11,303 - INFO - [Train] step: 485199, loss: 0.014691, lr: 0.000020
2023-03-16 12:23:18,857 - INFO - [Train] step: 485599, loss: 0.012383, lr: 0.000020
2023-03-16 12:31:26,071 - INFO - [Train] step: 485999, loss: 0.034199, lr: 0.000020
2023-03-16 12:39:33,907 - INFO - [Train] step: 486399, loss: 0.009188, lr: 0.000020
2023-03-16 12:47:38,516 - INFO - [Train] step: 486799, loss: 0.007082, lr: 0.000020
2023-03-16 12:55:46,119 - INFO - [Train] step: 487199, loss: 0.008936, lr: 0.000020
2023-03-16 13:03:53,511 - INFO - [Train] step: 487599, loss: 0.007188, lr: 0.000020
2023-03-16 13:12:04,124 - INFO - [Train] step: 487999, loss: 0.026298, lr: 0.000020
2023-03-16 13:20:14,252 - INFO - [Train] step: 488399, loss: 0.001881, lr: 0.000020
2023-03-16 13:28:23,608 - INFO - [Train] step: 488799, loss: 0.011646, lr: 0.000020
2023-03-16 13:36:32,847 - INFO - [Train] step: 489199, loss: 0.006114, lr: 0.000020
2023-03-16 13:44:43,367 - INFO - [Train] step: 489599, loss: 0.011290, lr: 0.000020
2023-03-16 13:52:51,516 - INFO - [Train] step: 489999, loss: 0.004321, lr: 0.000020
2023-03-16 14:08:05,248 - INFO - [Train] step: 490399, loss: 0.001973, lr: 0.000020
2023-03-16 14:16:14,008 - INFO - [Train] step: 490799, loss: 0.009576, lr: 0.000020
2023-03-16 14:24:24,099 - INFO - [Train] step: 491199, loss: 0.011714, lr: 0.000020
2023-03-16 14:32:32,477 - INFO - [Train] step: 491599, loss: 0.017272, lr: 0.000020
2023-03-16 14:40:41,678 - INFO - [Train] step: 491999, loss: 0.003934, lr: 0.000020
2023-03-16 14:48:50,762 - INFO - [Train] step: 492399, loss: 0.004091, lr: 0.000020
2023-03-16 14:56:59,591 - INFO - [Train] step: 492799, loss: 0.002672, lr: 0.000020
2023-03-16 15:05:06,727 - INFO - [Train] step: 493199, loss: 0.002655, lr: 0.000020
2023-03-16 15:13:12,498 - INFO - [Train] step: 493599, loss: 0.003188, lr: 0.000020
2023-03-16 15:21:20,084 - INFO - [Train] step: 493999, loss: 0.008530, lr: 0.000020
2023-03-16 15:29:29,480 - INFO - [Train] step: 494399, loss: 0.037123, lr: 0.000020
2023-03-16 15:37:37,057 - INFO - [Train] step: 494799, loss: 0.017426, lr: 0.000020
2023-03-16 15:52:40,043 - INFO - [Train] step: 495199, loss: 0.007916, lr: 0.000020
2023-03-16 16:00:46,491 - INFO - [Train] step: 495599, loss: 0.006630, lr: 0.000020
2023-03-16 16:08:50,627 - INFO - [Train] step: 495999, loss: 0.006427, lr: 0.000020
2023-03-16 16:16:55,630 - INFO - [Train] step: 496399, loss: 0.002252, lr: 0.000020
2023-03-16 16:24:59,976 - INFO - [Train] step: 496799, loss: 0.003233, lr: 0.000020
2023-03-16 16:33:04,518 - INFO - [Train] step: 497199, loss: 0.002532, lr: 0.000020
2023-03-16 16:41:08,627 - INFO - [Train] step: 497599, loss: 0.013784, lr: 0.000020
2023-03-16 16:49:12,418 - INFO - [Train] step: 497999, loss: 0.007258, lr: 0.000020
2023-03-16 16:57:17,603 - INFO - [Train] step: 498399, loss: 0.008979, lr: 0.000020
2023-03-16 17:05:23,798 - INFO - [Train] step: 498799, loss: 0.005983, lr: 0.000020
2023-03-16 17:13:31,231 - INFO - [Train] step: 499199, loss: 0.003115, lr: 0.000020
2023-03-16 17:21:37,292 - INFO - [Train] step: 499599, loss: 0.006188, lr: 0.000020
2023-03-16 17:29:44,844 - INFO - [Train] step: 499999, loss: 0.004860, lr: 0.000020
2023-03-16 17:36:45,308 - INFO - End of training