Diffusion-Models-Implementations / ddpm_cifar10 / output-2023-03-09-14-06-16.log
2023-03-09 14:06:18,569 - INFO - Experiment directory: runs/exp-2023-03-09-14-06-16
2023-03-09 14:06:18,570 - INFO - Number of processes: 2
2023-03-09 14:06:18,570 - INFO - Distributed type: MULTI_GPU
2023-03-09 14:06:18,570 - INFO - Mixed precision: no
2023-03-09 14:06:19,242 - INFO - Size of training set: 50000
2023-03-09 14:06:19,242 - INFO - Batch size per process: 64
2023-03-09 14:06:19,242 - INFO - Total batch size: 128
2023-03-09 14:06:19,785 - INFO - Start training...
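[Note] The preamble above (number of processes, distributed type, mixed precision, per-process and total batch size) is the kind of state a training script reports when built on HuggingFace Accelerate. The snippet below is only a minimal sketch of how such a preamble could be produced; it is not taken from this repository's code, and the logger format, variable names, and structure are illustrative assumptions.

```python
# Minimal sketch, assuming the log above was produced by a script built on
# HuggingFace Accelerate; names and structure here are illustrative, not the
# repository's actual training code.
import logging

from accelerate import Accelerator

# Match the "YYYY-MM-DD HH:MM:SS,mmm - INFO - ..." format seen in this log.
logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s - %(levelname)s - %(message)s")
logger = logging.getLogger(__name__)

accelerator = Accelerator(mixed_precision="no")   # "Mixed precision: no"

batch_size_per_process = 64                       # "Batch size per process: 64"
# With 2 processes this gives 64 * 2 = 128, matching "Total batch size: 128".
total_batch_size = batch_size_per_process * accelerator.num_processes

if accelerator.is_main_process:
    logger.info(f"Number of processes: {accelerator.num_processes}")
    logger.info(f"Distributed type: {accelerator.distributed_type}")
    logger.info(f"Batch size per process: {batch_size_per_process}")
    logger.info(f"Total batch size: {total_batch_size}")
```

Launched with something like `accelerate launch --num_processes 2 train.py`, such a script would report two processes and a MULTI_GPU distributed type, consistent with the header lines above.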
2023-03-09 14:07:50,967 - INFO - [Train] step: 399, loss: 0.044539, lr: 0.000200
2023-03-09 14:09:17,677 - INFO - [Train] step: 799, loss: 0.038532, lr: 0.000200
2023-03-09 14:10:44,806 - INFO - [Train] step: 1199, loss: 0.032850, lr: 0.000200
2023-03-09 14:12:12,080 - INFO - [Train] step: 1599, loss: 0.025919, lr: 0.000200
2023-03-09 14:13:39,271 - INFO - [Train] step: 1999, loss: 0.041484, lr: 0.000200
2023-03-09 14:15:06,078 - INFO - [Train] step: 2399, loss: 0.023296, lr: 0.000200
2023-03-09 14:16:32,862 - INFO - [Train] step: 2799, loss: 0.035387, lr: 0.000200
2023-03-09 14:17:59,978 - INFO - [Train] step: 3199, loss: 0.023308, lr: 0.000200
2023-03-09 14:19:26,841 - INFO - [Train] step: 3599, loss: 0.028493, lr: 0.000200
2023-03-09 14:20:53,776 - INFO - [Train] step: 3999, loss: 0.033161, lr: 0.000200
2023-03-09 14:22:20,761 - INFO - [Train] step: 4399, loss: 0.024756, lr: 0.000200
2023-03-09 14:23:47,601 - INFO - [Train] step: 4799, loss: 0.029977, lr: 0.000200
2023-03-09 14:25:41,045 - INFO - [Train] step: 5199, loss: 0.021460, lr: 0.000200
2023-03-09 14:27:08,239 - INFO - [Train] step: 5599, loss: 0.031051, lr: 0.000200
2023-03-09 14:28:35,302 - INFO - [Train] step: 5999, loss: 0.034409, lr: 0.000200
2023-03-09 14:30:02,010 - INFO - [Train] step: 6399, loss: 0.026581, lr: 0.000200
2023-03-09 14:31:28,935 - INFO - [Train] step: 6799, loss: 0.029266, lr: 0.000200
2023-03-09 14:32:55,949 - INFO - [Train] step: 7199, loss: 0.039323, lr: 0.000200
2023-03-09 14:34:23,020 - INFO - [Train] step: 7599, loss: 0.015679, lr: 0.000200
2023-03-09 14:35:50,064 - INFO - [Train] step: 7999, loss: 0.029932, lr: 0.000200
2023-03-09 14:37:17,040 - INFO - [Train] step: 8399, loss: 0.019474, lr: 0.000200
2023-03-09 14:38:43,954 - INFO - [Train] step: 8799, loss: 0.031194, lr: 0.000200
2023-03-09 14:40:11,377 - INFO - [Train] step: 9199, loss: 0.019577, lr: 0.000200
2023-03-09 14:41:38,397 - INFO - [Train] step: 9599, loss: 0.021533, lr: 0.000200
2023-03-09 14:43:05,421 - INFO - [Train] step: 9999, loss: 0.024277, lr: 0.000200
2023-03-09 14:45:00,358 - INFO - [Train] step: 10399, loss: 0.020180, lr: 0.000200
2023-03-09 14:46:27,502 - INFO - [Train] step: 10799, loss: 0.037848, lr: 0.000200
2023-03-09 14:47:54,444 - INFO - [Train] step: 11199, loss: 0.021133, lr: 0.000200
2023-03-09 14:49:21,398 - INFO - [Train] step: 11599, loss: 0.038549, lr: 0.000200
2023-03-09 14:50:48,703 - INFO - [Train] step: 11999, loss: 0.025512, lr: 0.000200
2023-03-09 14:52:15,564 - INFO - [Train] step: 12399, loss: 0.038650, lr: 0.000200
2023-03-09 14:53:42,596 - INFO - [Train] step: 12799, loss: 0.030476, lr: 0.000200
2023-03-09 14:55:09,609 - INFO - [Train] step: 13199, loss: 0.039428, lr: 0.000200
2023-03-09 14:56:36,531 - INFO - [Train] step: 13599, loss: 0.017158, lr: 0.000200
2023-03-09 14:58:03,379 - INFO - [Train] step: 13999, loss: 0.039617, lr: 0.000200
2023-03-09 14:59:30,380 - INFO - [Train] step: 14399, loss: 0.024104, lr: 0.000200
2023-03-09 15:00:57,175 - INFO - [Train] step: 14799, loss: 0.015743, lr: 0.000200
2023-03-09 15:02:50,868 - INFO - [Train] step: 15199, loss: 0.029400, lr: 0.000200
2023-03-09 15:04:18,535 - INFO - [Train] step: 15599, loss: 0.026385, lr: 0.000200
2023-03-09 15:05:45,735 - INFO - [Train] step: 15999, loss: 0.021004, lr: 0.000200
2023-03-09 15:07:12,808 - INFO - [Train] step: 16399, loss: 0.020272, lr: 0.000200
2023-03-09 15:08:39,716 - INFO - [Train] step: 16799, loss: 0.041588, lr: 0.000200
2023-03-09 15:10:06,800 - INFO - [Train] step: 17199, loss: 0.046294, lr: 0.000200
2023-03-09 15:11:34,059 - INFO - [Train] step: 17599, loss: 0.024473, lr: 0.000200
2023-03-09 15:13:01,383 - INFO - [Train] step: 17999, loss: 0.022674, lr: 0.000200
2023-03-09 15:14:28,558 - INFO - [Train] step: 18399, loss: 0.035359, lr: 0.000200
2023-03-09 15:15:55,805 - INFO - [Train] step: 18799, loss: 0.017307, lr: 0.000200
2023-03-09 15:17:23,018 - INFO - [Train] step: 19199, loss: 0.027929, lr: 0.000200
2023-03-09 15:18:50,431 - INFO - [Train] step: 19599, loss: 0.035974, lr: 0.000200
2023-03-09 15:20:17,581 - INFO - [Train] step: 19999, loss: 0.028759, lr: 0.000200
2023-03-09 15:22:12,315 - INFO - [Train] step: 20399, loss: 0.034130, lr: 0.000200
2023-03-09 15:23:39,490 - INFO - [Train] step: 20799, loss: 0.036232, lr: 0.000200
2023-03-09 15:25:06,346 - INFO - [Train] step: 21199, loss: 0.030127, lr: 0.000200
2023-03-09 15:26:33,921 - INFO - [Train] step: 21599, loss: 0.031430, lr: 0.000200
2023-03-09 15:28:01,340 - INFO - [Train] step: 21999, loss: 0.031953, lr: 0.000200
2023-03-09 15:29:28,708 - INFO - [Train] step: 22399, loss: 0.042993, lr: 0.000200
2023-03-09 15:30:56,237 - INFO - [Train] step: 22799, loss: 0.029513, lr: 0.000200
2023-03-09 15:32:23,395 - INFO - [Train] step: 23199, loss: 0.023812, lr: 0.000200
2023-03-09 15:33:50,499 - INFO - [Train] step: 23599, loss: 0.023508, lr: 0.000200
2023-03-09 15:35:17,824 - INFO - [Train] step: 23999, loss: 0.025308, lr: 0.000200
2023-03-09 15:36:44,895 - INFO - [Train] step: 24399, loss: 0.040370, lr: 0.000200
2023-03-09 15:38:12,279 - INFO - [Train] step: 24799, loss: 0.036279, lr: 0.000200
2023-03-09 15:40:05,954 - INFO - [Train] step: 25199, loss: 0.029746, lr: 0.000200
2023-03-09 15:41:34,110 - INFO - [Train] step: 25599, loss: 0.030113, lr: 0.000200
2023-03-09 15:43:01,169 - INFO - [Train] step: 25999, loss: 0.039244, lr: 0.000200
2023-03-09 15:44:28,336 - INFO - [Train] step: 26399, loss: 0.038599, lr: 0.000200
2023-03-09 15:45:55,299 - INFO - [Train] step: 26799, loss: 0.032788, lr: 0.000200
2023-03-09 15:47:22,506 - INFO - [Train] step: 27199, loss: 0.034818, lr: 0.000200
2023-03-09 15:48:49,519 - INFO - [Train] step: 27599, loss: 0.026840, lr: 0.000200
2023-03-09 15:50:16,596 - INFO - [Train] step: 27999, loss: 0.025580, lr: 0.000200
2023-03-09 15:51:43,504 - INFO - [Train] step: 28399, loss: 0.049092, lr: 0.000200
2023-03-09 15:53:10,449 - INFO - [Train] step: 28799, loss: 0.033611, lr: 0.000200
2023-03-09 15:54:37,460 - INFO - [Train] step: 29199, loss: 0.023374, lr: 0.000200
2023-03-09 15:56:04,443 - INFO - [Train] step: 29599, loss: 0.025906, lr: 0.000200
2023-03-09 15:57:31,404 - INFO - [Train] step: 29999, loss: 0.039622, lr: 0.000200
2023-03-09 15:59:26,246 - INFO - [Train] step: 30399, loss: 0.035803, lr: 0.000200
2023-03-09 16:00:53,241 - INFO - [Train] step: 30799, loss: 0.023155, lr: 0.000200
2023-03-09 16:02:20,674 - INFO - [Train] step: 31199, loss: 0.022511, lr: 0.000200
2023-03-09 16:03:48,093 - INFO - [Train] step: 31599, loss: 0.028939, lr: 0.000200
2023-03-09 16:05:15,275 - INFO - [Train] step: 31999, loss: 0.018875, lr: 0.000200
2023-03-09 16:06:42,605 - INFO - [Train] step: 32399, loss: 0.024996, lr: 0.000200
2023-03-09 16:08:09,867 - INFO - [Train] step: 32799, loss: 0.029017, lr: 0.000200
2023-03-09 16:09:36,985 - INFO - [Train] step: 33199, loss: 0.028837, lr: 0.000200
2023-03-09 16:11:05,189 - INFO - [Train] step: 33599, loss: 0.031191, lr: 0.000200
2023-03-09 16:12:33,131 - INFO - [Train] step: 33999, loss: 0.026399, lr: 0.000200
2023-03-09 16:14:01,322 - INFO - [Train] step: 34399, loss: 0.025929, lr: 0.000200
2023-03-09 16:15:29,470 - INFO - [Train] step: 34799, loss: 0.022161, lr: 0.000200
2023-03-09 16:17:24,181 - INFO - [Train] step: 35199, loss: 0.048776, lr: 0.000200
2023-03-09 16:18:51,970 - INFO - [Train] step: 35599, loss: 0.036409, lr: 0.000200
2023-03-09 16:20:19,338 - INFO - [Train] step: 35999, loss: 0.036610, lr: 0.000200
2023-03-09 16:21:46,723 - INFO - [Train] step: 36399, loss: 0.019579, lr: 0.000200
2023-03-09 16:23:14,085 - INFO - [Train] step: 36799, loss: 0.027746, lr: 0.000200
2023-03-09 16:24:41,066 - INFO - [Train] step: 37199, loss: 0.025283, lr: 0.000200
2023-03-09 16:26:08,270 - INFO - [Train] step: 37599, loss: 0.028714, lr: 0.000200
2023-03-09 16:27:35,994 - INFO - [Train] step: 37999, loss: 0.022071, lr: 0.000200
2023-03-09 16:29:02,991 - INFO - [Train] step: 38399, loss: 0.037466, lr: 0.000200
2023-03-09 16:30:30,098 - INFO - [Train] step: 38799, loss: 0.039971, lr: 0.000200
2023-03-09 16:31:57,174 - INFO - [Train] step: 39199, loss: 0.013587, lr: 0.000200
2023-03-09 16:33:24,168 - INFO - [Train] step: 39599, loss: 0.017170, lr: 0.000200
2023-03-09 16:34:51,491 - INFO - [Train] step: 39999, loss: 0.022330, lr: 0.000200
2023-03-09 16:36:46,487 - INFO - [Train] step: 40399, loss: 0.047306, lr: 0.000200
2023-03-09 16:38:14,140 - INFO - [Train] step: 40799, loss: 0.023292, lr: 0.000200
2023-03-09 16:39:41,730 - INFO - [Train] step: 41199, loss: 0.026564, lr: 0.000200
2023-03-09 16:41:09,293 - INFO - [Train] step: 41599, loss: 0.035256, lr: 0.000200
2023-03-09 16:42:36,807 - INFO - [Train] step: 41999, loss: 0.023594, lr: 0.000200
2023-03-09 16:44:04,272 - INFO - [Train] step: 42399, loss: 0.027259, lr: 0.000200
2023-03-09 16:45:31,494 - INFO - [Train] step: 42799, loss: 0.030045, lr: 0.000200
2023-03-09 16:46:58,635 - INFO - [Train] step: 43199, loss: 0.024257, lr: 0.000200
2023-03-09 16:48:25,956 - INFO - [Train] step: 43599, loss: 0.032525, lr: 0.000200
2023-03-09 16:49:53,167 - INFO - [Train] step: 43999, loss: 0.019926, lr: 0.000200
2023-03-09 16:51:20,783 - INFO - [Train] step: 44399, loss: 0.014138, lr: 0.000200
2023-03-09 16:52:48,765 - INFO - [Train] step: 44799, loss: 0.035772, lr: 0.000200
2023-03-09 16:54:43,541 - INFO - [Train] step: 45199, loss: 0.016736, lr: 0.000200
2023-03-09 16:56:11,325 - INFO - [Train] step: 45599, loss: 0.023219, lr: 0.000200
2023-03-09 16:57:38,789 - INFO - [Train] step: 45999, loss: 0.021038, lr: 0.000200
2023-03-09 16:59:06,186 - INFO - [Train] step: 46399, loss: 0.032698, lr: 0.000200
2023-03-09 17:00:33,592 - INFO - [Train] step: 46799, loss: 0.025139, lr: 0.000200
2023-03-09 17:02:01,417 - INFO - [Train] step: 47199, loss: 0.017444, lr: 0.000200
2023-03-09 17:03:28,581 - INFO - [Train] step: 47599, loss: 0.028883, lr: 0.000200
2023-03-09 17:04:55,742 - INFO - [Train] step: 47999, loss: 0.030306, lr: 0.000200
2023-03-09 17:06:22,864 - INFO - [Train] step: 48399, loss: 0.024130, lr: 0.000200
2023-03-09 17:07:49,982 - INFO - [Train] step: 48799, loss: 0.029639, lr: 0.000200
2023-03-09 17:09:17,758 - INFO - [Train] step: 49199, loss: 0.019275, lr: 0.000200
2023-03-09 17:10:45,856 - INFO - [Train] step: 49599, loss: 0.044106, lr: 0.000200
2023-03-09 17:12:13,311 - INFO - [Train] step: 49999, loss: 0.028206, lr: 0.000200
2023-03-09 17:14:08,182 - INFO - [Train] step: 50399, loss: 0.034490, lr: 0.000200
2023-03-09 17:15:35,237 - INFO - [Train] step: 50799, loss: 0.028586, lr: 0.000200
2023-03-09 17:17:02,370 - INFO - [Train] step: 51199, loss: 0.029648, lr: 0.000200
2023-03-09 17:18:29,753 - INFO - [Train] step: 51599, loss: 0.049934, lr: 0.000200
2023-03-09 17:19:57,016 - INFO - [Train] step: 51999, loss: 0.028237, lr: 0.000200
2023-03-09 17:21:24,362 - INFO - [Train] step: 52399, loss: 0.033089, lr: 0.000200
2023-03-09 17:22:51,492 - INFO - [Train] step: 52799, loss: 0.039306, lr: 0.000200
2023-03-09 17:24:18,588 - INFO - [Train] step: 53199, loss: 0.031023, lr: 0.000200
2023-03-09 17:25:45,896 - INFO - [Train] step: 53599, loss: 0.026490, lr: 0.000200
2023-03-09 17:27:13,148 - INFO - [Train] step: 53999, loss: 0.026173, lr: 0.000200
2023-03-09 17:28:40,465 - INFO - [Train] step: 54399, loss: 0.023287, lr: 0.000200
2023-03-09 17:30:07,696 - INFO - [Train] step: 54799, loss: 0.019572, lr: 0.000200
2023-03-09 17:32:01,400 - INFO - [Train] step: 55199, loss: 0.024027, lr: 0.000200
2023-03-09 17:33:28,686 - INFO - [Train] step: 55599, loss: 0.023080, lr: 0.000200
2023-03-09 17:34:55,842 - INFO - [Train] step: 55999, loss: 0.036378, lr: 0.000200
2023-03-09 17:36:23,200 - INFO - [Train] step: 56399, loss: 0.027692, lr: 0.000200
2023-03-09 17:37:50,718 - INFO - [Train] step: 56799, loss: 0.026766, lr: 0.000200
2023-03-09 17:39:18,379 - INFO - [Train] step: 57199, loss: 0.044549, lr: 0.000200
2023-03-09 17:40:45,785 - INFO - [Train] step: 57599, loss: 0.030922, lr: 0.000200
2023-03-09 17:42:13,368 - INFO - [Train] step: 57999, loss: 0.045497, lr: 0.000200
2023-03-09 17:43:40,834 - INFO - [Train] step: 58399, loss: 0.017107, lr: 0.000200
2023-03-09 17:45:08,013 - INFO - [Train] step: 58799, loss: 0.022132, lr: 0.000200
2023-03-09 17:46:35,388 - INFO - [Train] step: 59199, loss: 0.027542, lr: 0.000200
2023-03-09 17:48:02,657 - INFO - [Train] step: 59599, loss: 0.028851, lr: 0.000200
2023-03-09 17:49:29,785 - INFO - [Train] step: 59999, loss: 0.022655, lr: 0.000200
2023-03-09 17:51:24,647 - INFO - [Train] step: 60399, loss: 0.035809, lr: 0.000200
2023-03-09 17:52:51,842 - INFO - [Train] step: 60799, loss: 0.029354, lr: 0.000200
2023-03-09 17:54:19,085 - INFO - [Train] step: 61199, loss: 0.042004, lr: 0.000200
2023-03-09 17:55:46,230 - INFO - [Train] step: 61599, loss: 0.031612, lr: 0.000200
2023-03-09 17:57:13,432 - INFO - [Train] step: 61999, loss: 0.024458, lr: 0.000200
2023-03-09 17:58:40,832 - INFO - [Train] step: 62399, loss: 0.025744, lr: 0.000200
2023-03-09 18:00:08,276 - INFO - [Train] step: 62799, loss: 0.039846, lr: 0.000200
2023-03-09 18:01:35,530 - INFO - [Train] step: 63199, loss: 0.034399, lr: 0.000200
2023-03-09 18:03:02,797 - INFO - [Train] step: 63599, loss: 0.033151, lr: 0.000200
2023-03-09 18:04:30,086 - INFO - [Train] step: 63999, loss: 0.035231, lr: 0.000200
2023-03-09 18:05:57,807 - INFO - [Train] step: 64399, loss: 0.027117, lr: 0.000200
2023-03-09 18:07:25,222 - INFO - [Train] step: 64799, loss: 0.037623, lr: 0.000200
2023-03-09 18:09:19,102 - INFO - [Train] step: 65199, loss: 0.021111, lr: 0.000200
2023-03-09 18:10:46,513 - INFO - [Train] step: 65599, loss: 0.030473, lr: 0.000200
2023-03-09 18:12:14,092 - INFO - [Train] step: 65999, loss: 0.026690, lr: 0.000200
2023-03-09 18:13:41,890 - INFO - [Train] step: 66399, loss: 0.032371, lr: 0.000200
2023-03-09 18:15:09,205 - INFO - [Train] step: 66799, loss: 0.036100, lr: 0.000200
2023-03-09 18:16:36,783 - INFO - [Train] step: 67199, loss: 0.021442, lr: 0.000200
2023-03-09 18:18:04,591 - INFO - [Train] step: 67599, loss: 0.029996, lr: 0.000200
2023-03-09 18:19:32,450 - INFO - [Train] step: 67999, loss: 0.027888, lr: 0.000200
2023-03-09 18:21:00,239 - INFO - [Train] step: 68399, loss: 0.023727, lr: 0.000200
2023-03-09 18:22:27,456 - INFO - [Train] step: 68799, loss: 0.038632, lr: 0.000200
2023-03-09 18:23:55,674 - INFO - [Train] step: 69199, loss: 0.023667, lr: 0.000200
2023-03-09 18:25:25,334 - INFO - [Train] step: 69599, loss: 0.020128, lr: 0.000200
2023-03-09 18:26:53,405 - INFO - [Train] step: 69999, loss: 0.034723, lr: 0.000200
2023-03-09 18:28:48,418 - INFO - [Train] step: 70399, loss: 0.035473, lr: 0.000200
2023-03-09 18:30:16,485 - INFO - [Train] step: 70799, loss: 0.036050, lr: 0.000200
2023-03-09 18:31:43,964 - INFO - [Train] step: 71199, loss: 0.024091, lr: 0.000200
2023-03-09 18:33:11,594 - INFO - [Train] step: 71599, loss: 0.023437, lr: 0.000200
2023-03-09 18:34:39,139 - INFO - [Train] step: 71999, loss: 0.029532, lr: 0.000200
2023-03-09 18:36:06,461 - INFO - [Train] step: 72399, loss: 0.021075, lr: 0.000200
2023-03-09 18:37:33,825 - INFO - [Train] step: 72799, loss: 0.021312, lr: 0.000200
2023-03-09 18:39:00,889 - INFO - [Train] step: 73199, loss: 0.026155, lr: 0.000200
2023-03-09 18:40:28,521 - INFO - [Train] step: 73599, loss: 0.019001, lr: 0.000200
2023-03-09 18:41:55,919 - INFO - [Train] step: 73999, loss: 0.039862, lr: 0.000200
2023-03-09 18:43:23,334 - INFO - [Train] step: 74399, loss: 0.031809, lr: 0.000200
2023-03-09 18:44:51,158 - INFO - [Train] step: 74799, loss: 0.034578, lr: 0.000200
2023-03-09 18:46:44,856 - INFO - [Train] step: 75199, loss: 0.020252, lr: 0.000200
2023-03-09 18:48:12,151 - INFO - [Train] step: 75599, loss: 0.033421, lr: 0.000200
2023-03-09 18:49:39,471 - INFO - [Train] step: 75999, loss: 0.025387, lr: 0.000200
2023-03-09 18:51:06,468 - INFO - [Train] step: 76399, loss: 0.033484, lr: 0.000200
2023-03-09 18:52:34,532 - INFO - [Train] step: 76799, loss: 0.027450, lr: 0.000200
2023-03-09 18:54:02,094 - INFO - [Train] step: 77199, loss: 0.031306, lr: 0.000200
2023-03-09 18:55:30,104 - INFO - [Train] step: 77599, loss: 0.026585, lr: 0.000200
2023-03-09 18:56:58,024 - INFO - [Train] step: 77999, loss: 0.025157, lr: 0.000200
2023-03-09 18:58:26,381 - INFO - [Train] step: 78399, loss: 0.025352, lr: 0.000200
2023-03-09 18:59:53,940 - INFO - [Train] step: 78799, loss: 0.037401, lr: 0.000200
2023-03-09 19:01:21,205 - INFO - [Train] step: 79199, loss: 0.023657, lr: 0.000200
2023-03-09 19:02:49,635 - INFO - [Train] step: 79599, loss: 0.038879, lr: 0.000200
2023-03-09 19:04:16,710 - INFO - [Train] step: 79999, loss: 0.031129, lr: 0.000200
2023-03-09 19:06:12,293 - INFO - [Train] step: 80399, loss: 0.031057, lr: 0.000200
2023-03-09 19:07:40,350 - INFO - [Train] step: 80799, loss: 0.042971, lr: 0.000200
2023-03-09 19:09:07,637 - INFO - [Train] step: 81199, loss: 0.020662, lr: 0.000200
2023-03-09 19:10:34,939 - INFO - [Train] step: 81599, loss: 0.028613, lr: 0.000200
2023-03-09 19:12:02,459 - INFO - [Train] step: 81999, loss: 0.027819, lr: 0.000200
2023-03-09 19:13:29,680 - INFO - [Train] step: 82399, loss: 0.021333, lr: 0.000200
2023-03-09 19:14:58,588 - INFO - [Train] step: 82799, loss: 0.029813, lr: 0.000200
2023-03-09 19:16:27,265 - INFO - [Train] step: 83199, loss: 0.033423, lr: 0.000200
2023-03-09 19:17:55,527 - INFO - [Train] step: 83599, loss: 0.028544, lr: 0.000200
2023-03-09 19:19:23,236 - INFO - [Train] step: 83999, loss: 0.023554, lr: 0.000200
2023-03-09 19:20:51,627 - INFO - [Train] step: 84399, loss: 0.030243, lr: 0.000200
2023-03-09 19:22:19,869 - INFO - [Train] step: 84799, loss: 0.026634, lr: 0.000200
2023-03-09 19:24:14,118 - INFO - [Train] step: 85199, loss: 0.030788, lr: 0.000200
2023-03-09 19:25:41,785 - INFO - [Train] step: 85599, loss: 0.032955, lr: 0.000200
2023-03-09 19:27:10,418 - INFO - [Train] step: 85999, loss: 0.034835, lr: 0.000200
2023-03-09 19:28:41,838 - INFO - [Train] step: 86399, loss: 0.028850, lr: 0.000200
2023-03-09 19:30:13,339 - INFO - [Train] step: 86799, loss: 0.034773, lr: 0.000200
2023-03-09 19:31:44,502 - INFO - [Train] step: 87199, loss: 0.015015, lr: 0.000200
2023-03-09 19:33:15,987 - INFO - [Train] step: 87599, loss: 0.022733, lr: 0.000200
2023-03-09 19:34:48,280 - INFO - [Train] step: 87999, loss: 0.027938, lr: 0.000200
2023-03-09 19:36:20,457 - INFO - [Train] step: 88399, loss: 0.024254, lr: 0.000200
2023-03-09 19:37:52,156 - INFO - [Train] step: 88799, loss: 0.020376, lr: 0.000200
2023-03-09 19:39:26,062 - INFO - [Train] step: 89199, loss: 0.027007, lr: 0.000200
2023-03-09 19:40:57,347 - INFO - [Train] step: 89599, loss: 0.030689, lr: 0.000200
2023-03-09 19:42:29,431 - INFO - [Train] step: 89999, loss: 0.036491, lr: 0.000200
2023-03-09 19:44:30,425 - INFO - [Train] step: 90399, loss: 0.032319, lr: 0.000200
2023-03-09 19:46:02,668 - INFO - [Train] step: 90799, loss: 0.026472, lr: 0.000200
2023-03-09 19:47:35,070 - INFO - [Train] step: 91199, loss: 0.036473, lr: 0.000200
2023-03-09 19:49:06,333 - INFO - [Train] step: 91599, loss: 0.021226, lr: 0.000200
2023-03-09 19:50:36,906 - INFO - [Train] step: 91999, loss: 0.022695, lr: 0.000200
2023-03-09 19:52:08,778 - INFO - [Train] step: 92399, loss: 0.038818, lr: 0.000200
2023-03-09 19:53:40,546 - INFO - [Train] step: 92799, loss: 0.035182, lr: 0.000200
2023-03-09 19:55:11,517 - INFO - [Train] step: 93199, loss: 0.026252, lr: 0.000200
2023-03-09 19:56:43,902 - INFO - [Train] step: 93599, loss: 0.027917, lr: 0.000200
2023-03-09 19:58:15,825 - INFO - [Train] step: 93999, loss: 0.026628, lr: 0.000200
2023-03-09 19:59:47,897 - INFO - [Train] step: 94399, loss: 0.046173, lr: 0.000200
2023-03-09 20:01:19,078 - INFO - [Train] step: 94799, loss: 0.021811, lr: 0.000200
2023-03-09 20:03:18,971 - INFO - [Train] step: 95199, loss: 0.021947, lr: 0.000200
2023-03-09 20:04:52,267 - INFO - [Train] step: 95599, loss: 0.031330, lr: 0.000200
2023-03-09 20:06:25,760 - INFO - [Train] step: 95999, loss: 0.022438, lr: 0.000200
2023-03-09 20:07:57,114 - INFO - [Train] step: 96399, loss: 0.028847, lr: 0.000200
2023-03-09 20:09:28,962 - INFO - [Train] step: 96799, loss: 0.026545, lr: 0.000200
2023-03-09 20:11:00,287 - INFO - [Train] step: 97199, loss: 0.022310, lr: 0.000200
2023-03-09 20:12:33,631 - INFO - [Train] step: 97599, loss: 0.033755, lr: 0.000200
2023-03-09 20:14:04,994 - INFO - [Train] step: 97999, loss: 0.028371, lr: 0.000200
2023-03-09 20:15:37,995 - INFO - [Train] step: 98399, loss: 0.028743, lr: 0.000200
2023-03-09 20:17:10,004 - INFO - [Train] step: 98799, loss: 0.030738, lr: 0.000200
2023-03-09 20:18:41,392 - INFO - [Train] step: 99199, loss: 0.031871, lr: 0.000200
2023-03-09 20:20:12,533 - INFO - [Train] step: 99599, loss: 0.025671, lr: 0.000200
2023-03-09 20:21:43,833 - INFO - [Train] step: 99999, loss: 0.043930, lr: 0.000200
2023-03-09 20:23:44,377 - INFO - [Train] step: 100399, loss: 0.032925, lr: 0.000200
2023-03-09 20:25:16,189 - INFO - [Train] step: 100799, loss: 0.025345, lr: 0.000200
2023-03-09 20:26:46,728 - INFO - [Train] step: 101199, loss: 0.022483, lr: 0.000200
2023-03-09 20:28:18,818 - INFO - [Train] step: 101599, loss: 0.041834, lr: 0.000200
2023-03-09 20:29:50,751 - INFO - [Train] step: 101999, loss: 0.035397, lr: 0.000200
2023-03-09 20:31:22,811 - INFO - [Train] step: 102399, loss: 0.035153, lr: 0.000200
2023-03-09 20:32:55,325 - INFO - [Train] step: 102799, loss: 0.022809, lr: 0.000200
2023-03-09 20:34:28,744 - INFO - [Train] step: 103199, loss: 0.035234, lr: 0.000200
2023-03-09 20:36:00,976 - INFO - [Train] step: 103599, loss: 0.025756, lr: 0.000200
2023-03-09 20:37:35,654 - INFO - [Train] step: 103999, loss: 0.033397, lr: 0.000200
2023-03-09 20:39:09,061 - INFO - [Train] step: 104399, loss: 0.030561, lr: 0.000200
2023-03-09 20:40:42,314 - INFO - [Train] step: 104799, loss: 0.023844, lr: 0.000200
2023-03-09 20:42:44,237 - INFO - [Train] step: 105199, loss: 0.034745, lr: 0.000200
2023-03-09 20:44:15,161 - INFO - [Train] step: 105599, loss: 0.021259, lr: 0.000200
2023-03-09 20:45:47,034 - INFO - [Train] step: 105999, loss: 0.026110, lr: 0.000200
2023-03-09 20:47:18,569 - INFO - [Train] step: 106399, loss: 0.022897, lr: 0.000200
2023-03-09 20:48:50,627 - INFO - [Train] step: 106799, loss: 0.026258, lr: 0.000200
2023-03-09 20:50:22,336 - INFO - [Train] step: 107199, loss: 0.039115, lr: 0.000200
2023-03-09 20:51:53,989 - INFO - [Train] step: 107599, loss: 0.030341, lr: 0.000200
2023-03-09 20:53:25,396 - INFO - [Train] step: 107999, loss: 0.021139, lr: 0.000200
2023-03-09 20:54:58,117 - INFO - [Train] step: 108399, loss: 0.030942, lr: 0.000200
2023-03-09 20:56:29,531 - INFO - [Train] step: 108799, loss: 0.016713, lr: 0.000200
2023-03-09 20:58:02,793 - INFO - [Train] step: 109199, loss: 0.028325, lr: 0.000200
2023-03-09 20:59:36,590 - INFO - [Train] step: 109599, loss: 0.044665, lr: 0.000200
2023-03-09 21:01:09,158 - INFO - [Train] step: 109999, loss: 0.021034, lr: 0.000200
2023-03-09 21:03:09,198 - INFO - [Train] step: 110399, loss: 0.031340, lr: 0.000200
2023-03-09 21:04:40,039 - INFO - [Train] step: 110799, loss: 0.025956, lr: 0.000200
2023-03-09 21:06:08,999 - INFO - [Train] step: 111199, loss: 0.032407, lr: 0.000200
2023-03-09 21:07:38,087 - INFO - [Train] step: 111599, loss: 0.027155, lr: 0.000200
2023-03-09 21:09:08,815 - INFO - [Train] step: 111999, loss: 0.036476, lr: 0.000200
2023-03-09 21:10:37,724 - INFO - [Train] step: 112399, loss: 0.030462, lr: 0.000200
2023-03-09 21:12:06,890 - INFO - [Train] step: 112799, loss: 0.035129, lr: 0.000200
2023-03-09 21:13:37,840 - INFO - [Train] step: 113199, loss: 0.036087, lr: 0.000200
2023-03-09 21:15:06,756 - INFO - [Train] step: 113599, loss: 0.031668, lr: 0.000200
2023-03-09 21:16:37,574 - INFO - [Train] step: 113999, loss: 0.026323, lr: 0.000200
2023-03-09 21:18:12,552 - INFO - [Train] step: 114399, loss: 0.034540, lr: 0.000200
2023-03-09 21:19:43,012 - INFO - [Train] step: 114799, loss: 0.029676, lr: 0.000200
2023-03-09 21:21:40,555 - INFO - [Train] step: 115199, loss: 0.030896, lr: 0.000200
2023-03-09 21:23:11,883 - INFO - [Train] step: 115599, loss: 0.029134, lr: 0.000200
2023-03-09 21:24:42,627 - INFO - [Train] step: 115999, loss: 0.039449, lr: 0.000200
2023-03-09 21:26:13,098 - INFO - [Train] step: 116399, loss: 0.034998, lr: 0.000200
2023-03-09 21:27:43,115 - INFO - [Train] step: 116799, loss: 0.036554, lr: 0.000200
2023-03-09 21:29:12,679 - INFO - [Train] step: 117199, loss: 0.036415, lr: 0.000200
2023-03-09 21:30:42,579 - INFO - [Train] step: 117599, loss: 0.032077, lr: 0.000200
2023-03-09 21:32:11,991 - INFO - [Train] step: 117999, loss: 0.032357, lr: 0.000200
2023-03-09 21:33:42,043 - INFO - [Train] step: 118399, loss: 0.044933, lr: 0.000200
2023-03-09 21:35:10,936 - INFO - [Train] step: 118799, loss: 0.033368, lr: 0.000200
2023-03-09 21:36:43,376 - INFO - [Train] step: 119199, loss: 0.030828, lr: 0.000200
2023-03-09 21:38:12,651 - INFO - [Train] step: 119599, loss: 0.024561, lr: 0.000200
2023-03-09 21:39:42,515 - INFO - [Train] step: 119999, loss: 0.027642, lr: 0.000200
2023-03-09 21:41:41,342 - INFO - [Train] step: 120399, loss: 0.029098, lr: 0.000200
2023-03-09 21:43:11,315 - INFO - [Train] step: 120799, loss: 0.020717, lr: 0.000200
2023-03-09 21:44:42,678 - INFO - [Train] step: 121199, loss: 0.016451, lr: 0.000200
2023-03-09 21:46:14,179 - INFO - [Train] step: 121599, loss: 0.030557, lr: 0.000200
2023-03-09 21:47:54,417 - INFO - [Train] step: 121999, loss: 0.014577, lr: 0.000200
2023-03-09 21:49:32,454 - INFO - [Train] step: 122399, loss: 0.033551, lr: 0.000200
2023-03-09 21:51:10,615 - INFO - [Train] step: 122799, loss: 0.024625, lr: 0.000200
2023-03-09 21:52:52,949 - INFO - [Train] step: 123199, loss: 0.029094, lr: 0.000200
2023-03-09 21:54:34,315 - INFO - [Train] step: 123599, loss: 0.034558, lr: 0.000200
2023-03-09 21:56:17,176 - INFO - [Train] step: 123999, loss: 0.029596, lr: 0.000200
2023-03-09 21:57:59,430 - INFO - [Train] step: 124399, loss: 0.033992, lr: 0.000200
2023-03-09 21:59:44,168 - INFO - [Train] step: 124799, loss: 0.021554, lr: 0.000200
2023-03-09 22:01:55,378 - INFO - [Train] step: 125199, loss: 0.023200, lr: 0.000200
2023-03-09 22:03:37,754 - INFO - [Train] step: 125599, loss: 0.027534, lr: 0.000200
2023-03-09 22:05:18,814 - INFO - [Train] step: 125999, loss: 0.036059, lr: 0.000200
2023-03-09 22:06:59,957 - INFO - [Train] step: 126399, loss: 0.030131, lr: 0.000200
2023-03-09 22:08:40,311 - INFO - [Train] step: 126799, loss: 0.025584, lr: 0.000200
2023-03-09 22:10:20,789 - INFO - [Train] step: 127199, loss: 0.030600, lr: 0.000200
2023-03-09 22:12:01,093 - INFO - [Train] step: 127599, loss: 0.023431, lr: 0.000200
2023-03-09 22:13:44,460 - INFO - [Train] step: 127999, loss: 0.032989, lr: 0.000200
2023-03-09 22:15:24,777 - INFO - [Train] step: 128399, loss: 0.038935, lr: 0.000200
2023-03-09 22:17:05,483 - INFO - [Train] step: 128799, loss: 0.022959, lr: 0.000200
2023-03-09 22:18:46,743 - INFO - [Train] step: 129199, loss: 0.041719, lr: 0.000200
2023-03-09 22:20:27,324 - INFO - [Train] step: 129599, loss: 0.042388, lr: 0.000200
2023-03-09 22:22:09,878 - INFO - [Train] step: 129999, loss: 0.047307, lr: 0.000200
2023-03-09 22:24:21,765 - INFO - [Train] step: 130399, loss: 0.025702, lr: 0.000200
2023-03-09 22:26:03,285 - INFO - [Train] step: 130799, loss: 0.039371, lr: 0.000200
2023-03-09 22:27:46,335 - INFO - [Train] step: 131199, loss: 0.029000, lr: 0.000200
2023-03-09 22:29:27,422 - INFO - [Train] step: 131599, loss: 0.021838, lr: 0.000200
2023-03-09 22:31:10,379 - INFO - [Train] step: 131999, loss: 0.039455, lr: 0.000200
2023-03-09 22:32:51,829 - INFO - [Train] step: 132399, loss: 0.021915, lr: 0.000200
2023-03-09 22:34:33,159 - INFO - [Train] step: 132799, loss: 0.022729, lr: 0.000200
2023-03-09 22:36:14,290 - INFO - [Train] step: 133199, loss: 0.033454, lr: 0.000200
2023-03-09 22:37:56,416 - INFO - [Train] step: 133599, loss: 0.025849, lr: 0.000200
2023-03-09 22:39:40,514 - INFO - [Train] step: 133999, loss: 0.065661, lr: 0.000200
2023-03-09 22:41:21,423 - INFO - [Train] step: 134399, loss: 0.024252, lr: 0.000200
2023-03-09 22:43:02,016 - INFO - [Train] step: 134799, loss: 0.023226, lr: 0.000200
2023-03-09 22:45:13,476 - INFO - [Train] step: 135199, loss: 0.033973, lr: 0.000200
2023-03-09 22:46:54,396 - INFO - [Train] step: 135599, loss: 0.018387, lr: 0.000200
2023-03-09 22:48:34,688 - INFO - [Train] step: 135999, loss: 0.023742, lr: 0.000200
2023-03-09 22:50:18,368 - INFO - [Train] step: 136399, loss: 0.028949, lr: 0.000200
2023-03-09 22:51:57,427 - INFO - [Train] step: 136799, loss: 0.022610, lr: 0.000200
2023-03-09 22:53:40,166 - INFO - [Train] step: 137199, loss: 0.033495, lr: 0.000200
2023-03-09 22:55:20,299 - INFO - [Train] step: 137599, loss: 0.021933, lr: 0.000200
2023-03-09 22:57:00,931 - INFO - [Train] step: 137999, loss: 0.023726, lr: 0.000200
2023-03-09 22:58:42,149 - INFO - [Train] step: 138399, loss: 0.024226, lr: 0.000200
2023-03-09 23:00:26,353 - INFO - [Train] step: 138799, loss: 0.034429, lr: 0.000200
2023-03-09 23:02:09,409 - INFO - [Train] step: 139199, loss: 0.020629, lr: 0.000200
2023-03-09 23:03:50,342 - INFO - [Train] step: 139599, loss: 0.019886, lr: 0.000200
2023-03-09 23:05:31,438 - INFO - [Train] step: 139999, loss: 0.031554, lr: 0.000200
2023-03-09 23:07:46,093 - INFO - [Train] step: 140399, loss: 0.033878, lr: 0.000200
2023-03-09 23:09:27,664 - INFO - [Train] step: 140799, loss: 0.026731, lr: 0.000200
2023-03-09 23:11:07,751 - INFO - [Train] step: 141199, loss: 0.029061, lr: 0.000200
2023-03-09 23:12:48,240 - INFO - [Train] step: 141599, loss: 0.032445, lr: 0.000200
2023-03-09 23:14:30,268 - INFO - [Train] step: 141999, loss: 0.036696, lr: 0.000200
2023-03-09 23:16:11,590 - INFO - [Train] step: 142399, loss: 0.019809, lr: 0.000200
2023-03-09 23:17:52,642 - INFO - [Train] step: 142799, loss: 0.024663, lr: 0.000200
2023-03-09 23:19:33,381 - INFO - [Train] step: 143199, loss: 0.024786, lr: 0.000200
2023-03-09 23:21:13,078 - INFO - [Train] step: 143599, loss: 0.026673, lr: 0.000200
2023-03-09 23:22:50,928 - INFO - [Train] step: 143999, loss: 0.045812, lr: 0.000200
2023-03-09 23:24:22,090 - INFO - [Train] step: 144399, loss: 0.036902, lr: 0.000200
2023-03-09 23:25:52,868 - INFO - [Train] step: 144799, loss: 0.032674, lr: 0.000200
2023-03-09 23:27:59,690 - INFO - [Train] step: 145199, loss: 0.037047, lr: 0.000200
2023-03-09 23:29:42,414 - INFO - [Train] step: 145599, loss: 0.033979, lr: 0.000200
2023-03-09 23:31:24,716 - INFO - [Train] step: 145999, loss: 0.029343, lr: 0.000200
2023-03-09 23:33:05,183 - INFO - [Train] step: 146399, loss: 0.027845, lr: 0.000200
2023-03-09 23:34:46,520 - INFO - [Train] step: 146799, loss: 0.035224, lr: 0.000200
2023-03-09 23:36:26,707 - INFO - [Train] step: 147199, loss: 0.018337, lr: 0.000200
2023-03-09 23:38:06,716 - INFO - [Train] step: 147599, loss: 0.029659, lr: 0.000200
2023-03-09 23:39:48,058 - INFO - [Train] step: 147999, loss: 0.026179, lr: 0.000200
2023-03-09 23:41:29,052 - INFO - [Train] step: 148399, loss: 0.024099, lr: 0.000200
2023-03-09 23:43:11,105 - INFO - [Train] step: 148799, loss: 0.031342, lr: 0.000200
2023-03-09 23:44:52,520 - INFO - [Train] step: 149199, loss: 0.031503, lr: 0.000200
2023-03-09 23:46:36,511 - INFO - [Train] step: 149599, loss: 0.023915, lr: 0.000200
2023-03-09 23:48:18,887 - INFO - [Train] step: 149999, loss: 0.028455, lr: 0.000200
2023-03-09 23:50:32,101 - INFO - [Train] step: 150399, loss: 0.039537, lr: 0.000200
2023-03-09 23:52:13,480 - INFO - [Train] step: 150799, loss: 0.029162, lr: 0.000200
2023-03-09 23:53:57,050 - INFO - [Train] step: 151199, loss: 0.036871, lr: 0.000200
2023-03-09 23:55:39,098 - INFO - [Train] step: 151599, loss: 0.031814, lr: 0.000200
2023-03-09 23:57:21,884 - INFO - [Train] step: 151999, loss: 0.024486, lr: 0.000200
2023-03-09 23:59:03,465 - INFO - [Train] step: 152399, loss: 0.035199, lr: 0.000200
2023-03-10 00:00:44,613 - INFO - [Train] step: 152799, loss: 0.040368, lr: 0.000200
2023-03-10 00:02:26,579 - INFO - [Train] step: 153199, loss: 0.035033, lr: 0.000200
2023-03-10 00:04:08,884 - INFO - [Train] step: 153599, loss: 0.023162, lr: 0.000200
2023-03-10 00:05:51,897 - INFO - [Train] step: 153999, loss: 0.018438, lr: 0.000200
2023-03-10 00:07:34,136 - INFO - [Train] step: 154399, loss: 0.023555, lr: 0.000200
2023-03-10 00:09:15,718 - INFO - [Train] step: 154799, loss: 0.024073, lr: 0.000200
2023-03-10 00:11:28,650 - INFO - [Train] step: 155199, loss: 0.033107, lr: 0.000200
2023-03-10 00:13:10,963 - INFO - [Train] step: 155599, loss: 0.031013, lr: 0.000200
2023-03-10 00:14:53,942 - INFO - [Train] step: 155999, loss: 0.040165, lr: 0.000200
2023-03-10 00:16:36,821 - INFO - [Train] step: 156399, loss: 0.025974, lr: 0.000200
2023-03-10 00:18:18,728 - INFO - [Train] step: 156799, loss: 0.016481, lr: 0.000200
2023-03-10 00:20:01,765 - INFO - [Train] step: 157199, loss: 0.026839, lr: 0.000200
2023-03-10 00:21:43,788 - INFO - [Train] step: 157599, loss: 0.019666, lr: 0.000200
2023-03-10 00:23:27,058 - INFO - [Train] step: 157999, loss: 0.029025, lr: 0.000200
2023-03-10 00:25:10,909 - INFO - [Train] step: 158399, loss: 0.029363, lr: 0.000200
2023-03-10 00:26:53,567 - INFO - [Train] step: 158799, loss: 0.022448, lr: 0.000200
2023-03-10 00:28:35,978 - INFO - [Train] step: 159199, loss: 0.027946, lr: 0.000200
2023-03-10 00:30:18,743 - INFO - [Train] step: 159599, loss: 0.026334, lr: 0.000200
2023-03-10 00:32:00,413 - INFO - [Train] step: 159999, loss: 0.028865, lr: 0.000200
2023-03-10 00:34:16,282 - INFO - [Train] step: 160399, loss: 0.024996, lr: 0.000200
2023-03-10 00:35:58,350 - INFO - [Train] step: 160799, loss: 0.035039, lr: 0.000200
2023-03-10 00:37:41,368 - INFO - [Train] step: 161199, loss: 0.028597, lr: 0.000200
2023-03-10 00:39:24,384 - INFO - [Train] step: 161599, loss: 0.022564, lr: 0.000200
2023-03-10 00:41:06,822 - INFO - [Train] step: 161999, loss: 0.033539, lr: 0.000200
2023-03-10 00:42:49,048 - INFO - [Train] step: 162399, loss: 0.018609, lr: 0.000200
2023-03-10 00:44:31,005 - INFO - [Train] step: 162799, loss: 0.016175, lr: 0.000200
2023-03-10 00:46:12,382 - INFO - [Train] step: 163199, loss: 0.025417, lr: 0.000200
2023-03-10 00:47:54,502 - INFO - [Train] step: 163599, loss: 0.028302, lr: 0.000200
2023-03-10 00:49:36,931 - INFO - [Train] step: 163999, loss: 0.019926, lr: 0.000200
2023-03-10 00:51:19,451 - INFO - [Train] step: 164399, loss: 0.025916, lr: 0.000200
2023-03-10 00:53:02,384 - INFO - [Train] step: 164799, loss: 0.029364, lr: 0.000200
2023-03-10 00:55:15,862 - INFO - [Train] step: 165199, loss: 0.027691, lr: 0.000200
2023-03-10 00:57:00,356 - INFO - [Train] step: 165599, loss: 0.035383, lr: 0.000200
2023-03-10 00:58:38,342 - INFO - [Train] step: 165999, loss: 0.029949, lr: 0.000200
2023-03-10 01:00:11,511 - INFO - [Train] step: 166399, loss: 0.036646, lr: 0.000200
2023-03-10 01:01:45,154 - INFO - [Train] step: 166799, loss: 0.031859, lr: 0.000200
2023-03-10 01:03:27,072 - INFO - [Train] step: 167199, loss: 0.025912, lr: 0.000200
2023-03-10 01:05:09,874 - INFO - [Train] step: 167599, loss: 0.023259, lr: 0.000200
2023-03-10 01:06:51,669 - INFO - [Train] step: 167999, loss: 0.033878, lr: 0.000200
2023-03-10 01:08:33,325 - INFO - [Train] step: 168399, loss: 0.028420, lr: 0.000200
2023-03-10 01:10:15,924 - INFO - [Train] step: 168799, loss: 0.030782, lr: 0.000200
2023-03-10 01:11:57,860 - INFO - [Train] step: 169199, loss: 0.031092, lr: 0.000200
2023-03-10 01:13:40,530 - INFO - [Train] step: 169599, loss: 0.029847, lr: 0.000200
2023-03-10 01:15:23,847 - INFO - [Train] step: 169999, loss: 0.026024, lr: 0.000200
2023-03-10 01:17:38,132 - INFO - [Train] step: 170399, loss: 0.021521, lr: 0.000200
2023-03-10 01:19:20,481 - INFO - [Train] step: 170799, loss: 0.029932, lr: 0.000200
2023-03-10 01:21:02,705 - INFO - [Train] step: 171199, loss: 0.029493, lr: 0.000200
2023-03-10 01:22:44,906 - INFO - [Train] step: 171599, loss: 0.027732, lr: 0.000200
2023-03-10 01:24:26,570 - INFO - [Train] step: 171999, loss: 0.037091, lr: 0.000200
2023-03-10 01:26:08,941 - INFO - [Train] step: 172399, loss: 0.025030, lr: 0.000200
2023-03-10 01:27:53,756 - INFO - [Train] step: 172799, loss: 0.026364, lr: 0.000200
2023-03-10 01:29:36,055 - INFO - [Train] step: 173199, loss: 0.016763, lr: 0.000200
2023-03-10 01:31:18,961 - INFO - [Train] step: 173599, loss: 0.033593, lr: 0.000200
2023-03-10 01:33:00,561 - INFO - [Train] step: 173999, loss: 0.028282, lr: 0.000200
2023-03-10 01:34:42,303 - INFO - [Train] step: 174399, loss: 0.024610, lr: 0.000200
2023-03-10 01:36:24,372 - INFO - [Train] step: 174799, loss: 0.023321, lr: 0.000200
2023-03-10 01:38:37,208 - INFO - [Train] step: 175199, loss: 0.035553, lr: 0.000200
2023-03-10 01:40:20,592 - INFO - [Train] step: 175599, loss: 0.030630, lr: 0.000200
2023-03-10 01:42:03,183 - INFO - [Train] step: 175999, loss: 0.023458, lr: 0.000200
2023-03-10 01:43:45,856 - INFO - [Train] step: 176399, loss: 0.024968, lr: 0.000200
2023-03-10 01:45:27,825 - INFO - [Train] step: 176799, loss: 0.019370, lr: 0.000200
2023-03-10 01:47:10,306 - INFO - [Train] step: 177199, loss: 0.025392, lr: 0.000200
2023-03-10 01:48:52,071 - INFO - [Train] step: 177599, loss: 0.037739, lr: 0.000200
2023-03-10 01:50:33,763 - INFO - [Train] step: 177999, loss: 0.025984, lr: 0.000200
2023-03-10 01:52:17,435 - INFO - [Train] step: 178399, loss: 0.033799, lr: 0.000200
2023-03-10 01:53:59,247 - INFO - [Train] step: 178799, loss: 0.029630, lr: 0.000200
2023-03-10 01:55:42,784 - INFO - [Train] step: 179199, loss: 0.026261, lr: 0.000200
2023-03-10 01:57:24,759 - INFO - [Train] step: 179599, loss: 0.024912, lr: 0.000200
2023-03-10 01:59:06,647 - INFO - [Train] step: 179999, loss: 0.031366, lr: 0.000200
2023-03-10 02:01:20,574 - INFO - [Train] step: 180399, loss: 0.038505, lr: 0.000200
2023-03-10 02:03:02,166 - INFO - [Train] step: 180799, loss: 0.028082, lr: 0.000200
2023-03-10 02:04:45,816 - INFO - [Train] step: 181199, loss: 0.041101, lr: 0.000200
2023-03-10 02:06:27,745 - INFO - [Train] step: 181599, loss: 0.037133, lr: 0.000200
2023-03-10 02:08:10,455 - INFO - [Train] step: 181999, loss: 0.034104, lr: 0.000200
2023-03-10 02:09:52,643 - INFO - [Train] step: 182399, loss: 0.025599, lr: 0.000200
2023-03-10 02:11:36,093 - INFO - [Train] step: 182799, loss: 0.039024, lr: 0.000200
2023-03-10 02:13:17,254 - INFO - [Train] step: 183199, loss: 0.036738, lr: 0.000200
2023-03-10 02:15:00,957 - INFO - [Train] step: 183599, loss: 0.027325, lr: 0.000200
2023-03-10 02:16:43,545 - INFO - [Train] step: 183999, loss: 0.041419, lr: 0.000200
2023-03-10 02:18:26,181 - INFO - [Train] step: 184399, loss: 0.036626, lr: 0.000200
2023-03-10 02:20:08,978 - INFO - [Train] step: 184799, loss: 0.025019, lr: 0.000200
2023-03-10 02:22:20,018 - INFO - [Train] step: 185199, loss: 0.049231, lr: 0.000200
2023-03-10 02:24:03,271 - INFO - [Train] step: 185599, loss: 0.034675, lr: 0.000200
2023-03-10 02:25:46,867 - INFO - [Train] step: 185999, loss: 0.024819, lr: 0.000200
2023-03-10 02:27:28,655 - INFO - [Train] step: 186399, loss: 0.025197, lr: 0.000200
2023-03-10 02:29:12,175 - INFO - [Train] step: 186799, loss: 0.030583, lr: 0.000200
2023-03-10 02:30:55,535 - INFO - [Train] step: 187199, loss: 0.036666, lr: 0.000200
2023-03-10 02:32:39,799 - INFO - [Train] step: 187599, loss: 0.024968, lr: 0.000200
2023-03-10 02:34:14,976 - INFO - [Train] step: 187999, loss: 0.035012, lr: 0.000200
2023-03-10 02:35:50,103 - INFO - [Train] step: 188399, loss: 0.032352, lr: 0.000200
2023-03-10 02:37:24,580 - INFO - [Train] step: 188799, loss: 0.027216, lr: 0.000200
2023-03-10 02:39:06,753 - INFO - [Train] step: 189199, loss: 0.027375, lr: 0.000200
2023-03-10 02:40:48,454 - INFO - [Train] step: 189599, loss: 0.031352, lr: 0.000200
2023-03-10 02:42:31,394 - INFO - [Train] step: 189999, loss: 0.039463, lr: 0.000200
2023-03-10 02:44:45,564 - INFO - [Train] step: 190399, loss: 0.040266, lr: 0.000200
2023-03-10 02:46:28,438 - INFO - [Train] step: 190799, loss: 0.018714, lr: 0.000200
2023-03-10 02:48:10,610 - INFO - [Train] step: 191199, loss: 0.022617, lr: 0.000200
2023-03-10 02:49:52,938 - INFO - [Train] step: 191599, loss: 0.031665, lr: 0.000200
2023-03-10 02:51:36,151 - INFO - [Train] step: 191999, loss: 0.022041, lr: 0.000200
2023-03-10 02:53:18,728 - INFO - [Train] step: 192399, loss: 0.018627, lr: 0.000200
2023-03-10 02:55:00,804 - INFO - [Train] step: 192799, loss: 0.025924, lr: 0.000200
2023-03-10 02:56:43,043 - INFO - [Train] step: 193199, loss: 0.031569, lr: 0.000200
2023-03-10 02:58:24,794 - INFO - [Train] step: 193599, loss: 0.025861, lr: 0.000200
2023-03-10 03:00:06,806 - INFO - [Train] step: 193999, loss: 0.035916, lr: 0.000200
2023-03-10 03:01:48,781 - INFO - [Train] step: 194399, loss: 0.024911, lr: 0.000200
2023-03-10 03:03:32,428 - INFO - [Train] step: 194799, loss: 0.035332, lr: 0.000200
2023-03-10 03:05:45,544 - INFO - [Train] step: 195199, loss: 0.033250, lr: 0.000200
2023-03-10 03:07:30,043 - INFO - [Train] step: 195599, loss: 0.029307, lr: 0.000200
2023-03-10 03:09:11,643 - INFO - [Train] step: 195999, loss: 0.019168, lr: 0.000200
2023-03-10 03:10:55,109 - INFO - [Train] step: 196399, loss: 0.045117, lr: 0.000200
2023-03-10 03:12:39,108 - INFO - [Train] step: 196799, loss: 0.023229, lr: 0.000200
2023-03-10 03:14:20,589 - INFO - [Train] step: 197199, loss: 0.036279, lr: 0.000200
2023-03-10 03:16:03,201 - INFO - [Train] step: 197599, loss: 0.039774, lr: 0.000200
2023-03-10 03:17:45,053 - INFO - [Train] step: 197999, loss: 0.026351, lr: 0.000200
2023-03-10 03:19:28,269 - INFO - [Train] step: 198399, loss: 0.034417, lr: 0.000200
2023-03-10 03:21:10,650 - INFO - [Train] step: 198799, loss: 0.027942, lr: 0.000200
2023-03-10 03:22:52,533 - INFO - [Train] step: 199199, loss: 0.031109, lr: 0.000200
2023-03-10 03:24:36,141 - INFO - [Train] step: 199599, loss: 0.037434, lr: 0.000200
2023-03-10 03:26:19,058 - INFO - [Train] step: 199999, loss: 0.031046, lr: 0.000200
2023-03-10 03:28:32,829 - INFO - [Train] step: 200399, loss: 0.022006, lr: 0.000200
2023-03-10 03:30:14,727 - INFO - [Train] step: 200799, loss: 0.039642, lr: 0.000200
2023-03-10 03:31:56,039 - INFO - [Train] step: 201199, loss: 0.033515, lr: 0.000200
2023-03-10 03:33:38,502 - INFO - [Train] step: 201599, loss: 0.026759, lr: 0.000200
2023-03-10 03:35:20,197 - INFO - [Train] step: 201999, loss: 0.026804, lr: 0.000200
2023-03-10 03:37:03,090 - INFO - [Train] step: 202399, loss: 0.018686, lr: 0.000200
2023-03-10 03:38:46,655 - INFO - [Train] step: 202799, loss: 0.021364, lr: 0.000200
2023-03-10 03:40:30,113 - INFO - [Train] step: 203199, loss: 0.038647, lr: 0.000200
2023-03-10 03:42:12,892 - INFO - [Train] step: 203599, loss: 0.032271, lr: 0.000200
2023-03-10 03:43:55,339 - INFO - [Train] step: 203999, loss: 0.022389, lr: 0.000200
2023-03-10 03:45:38,067 - INFO - [Train] step: 204399, loss: 0.021905, lr: 0.000200
2023-03-10 03:47:19,664 - INFO - [Train] step: 204799, loss: 0.021538, lr: 0.000200
2023-03-10 03:49:32,738 - INFO - [Train] step: 205199, loss: 0.028235, lr: 0.000200
2023-03-10 03:51:14,878 - INFO - [Train] step: 205599, loss: 0.015754, lr: 0.000200
2023-03-10 03:52:57,241 - INFO - [Train] step: 205999, loss: 0.029749, lr: 0.000200
2023-03-10 03:54:39,161 - INFO - [Train] step: 206399, loss: 0.025439, lr: 0.000200
2023-03-10 03:56:22,534 - INFO - [Train] step: 206799, loss: 0.029372, lr: 0.000200
2023-03-10 03:58:03,742 - INFO - [Train] step: 207199, loss: 0.019069, lr: 0.000200
2023-03-10 03:59:45,744 - INFO - [Train] step: 207599, loss: 0.024427, lr: 0.000200
2023-03-10 04:01:27,885 - INFO - [Train] step: 207999, loss: 0.037683, lr: 0.000200
2023-03-10 04:03:10,568 - INFO - [Train] step: 208399, loss: 0.022792, lr: 0.000200
2023-03-10 04:04:52,336 - INFO - [Train] step: 208799, loss: 0.025340, lr: 0.000200
2023-03-10 04:06:35,359 - INFO - [Train] step: 209199, loss: 0.027941, lr: 0.000200
2023-03-10 04:08:16,322 - INFO - [Train] step: 209599, loss: 0.016592, lr: 0.000200
2023-03-10 04:09:49,710 - INFO - [Train] step: 209999, loss: 0.039039, lr: 0.000200
2023-03-10 04:11:52,886 - INFO - [Train] step: 210399, loss: 0.033185, lr: 0.000200
2023-03-10 04:13:32,490 - INFO - [Train] step: 210799, loss: 0.023812, lr: 0.000200
2023-03-10 04:15:14,131 - INFO - [Train] step: 211199, loss: 0.030388, lr: 0.000200
2023-03-10 04:16:56,301 - INFO - [Train] step: 211599, loss: 0.034102, lr: 0.000200
2023-03-10 04:18:39,011 - INFO - [Train] step: 211999, loss: 0.036605, lr: 0.000200
2023-03-10 04:20:20,342 - INFO - [Train] step: 212399, loss: 0.036497, lr: 0.000200
2023-03-10 04:22:03,517 - INFO - [Train] step: 212799, loss: 0.038053, lr: 0.000200
2023-03-10 04:23:45,678 - INFO - [Train] step: 213199, loss: 0.037234, lr: 0.000200
2023-03-10 04:25:28,607 - INFO - [Train] step: 213599, loss: 0.047030, lr: 0.000200
2023-03-10 04:27:11,860 - INFO - [Train] step: 213999, loss: 0.019798, lr: 0.000200
2023-03-10 04:28:54,713 - INFO - [Train] step: 214399, loss: 0.025352, lr: 0.000200
2023-03-10 04:30:36,997 - INFO - [Train] step: 214799, loss: 0.024941, lr: 0.000200
2023-03-10 04:32:50,529 - INFO - [Train] step: 215199, loss: 0.034531, lr: 0.000200
2023-03-10 04:34:33,249 - INFO - [Train] step: 215599, loss: 0.023468, lr: 0.000200
2023-03-10 04:36:16,257 - INFO - [Train] step: 215999, loss: 0.026755, lr: 0.000200
2023-03-10 04:37:58,064 - INFO - [Train] step: 216399, loss: 0.024511, lr: 0.000200
2023-03-10 04:39:42,246 - INFO - [Train] step: 216799, loss: 0.031343, lr: 0.000200
2023-03-10 04:41:24,522 - INFO - [Train] step: 217199, loss: 0.037721, lr: 0.000200
2023-03-10 04:43:06,783 - INFO - [Train] step: 217599, loss: 0.026522, lr: 0.000200
2023-03-10 04:44:48,582 - INFO - [Train] step: 217999, loss: 0.023794, lr: 0.000200
2023-03-10 04:46:31,901 - INFO - [Train] step: 218399, loss: 0.022496, lr: 0.000200
2023-03-10 04:48:13,968 - INFO - [Train] step: 218799, loss: 0.032140, lr: 0.000200
2023-03-10 04:49:55,313 - INFO - [Train] step: 219199, loss: 0.021867, lr: 0.000200
2023-03-10 04:51:38,092 - INFO - [Train] step: 219599, loss: 0.028960, lr: 0.000200
2023-03-10 04:53:19,666 - INFO - [Train] step: 219999, loss: 0.022915, lr: 0.000200
2023-03-10 04:55:34,461 - INFO - [Train] step: 220399, loss: 0.030871, lr: 0.000200
2023-03-10 04:57:16,996 - INFO - [Train] step: 220799, loss: 0.018342, lr: 0.000200
2023-03-10 04:58:59,343 - INFO - [Train] step: 221199, loss: 0.029354, lr: 0.000200
2023-03-10 05:00:42,261 - INFO - [Train] step: 221599, loss: 0.029865, lr: 0.000200
2023-03-10 05:02:25,456 - INFO - [Train] step: 221999, loss: 0.034414, lr: 0.000200
2023-03-10 05:04:07,002 - INFO - [Train] step: 222399, loss: 0.019672, lr: 0.000200
2023-03-10 05:05:48,953 - INFO - [Train] step: 222799, loss: 0.014865, lr: 0.000200
2023-03-10 05:07:30,880 - INFO - [Train] step: 223199, loss: 0.042688, lr: 0.000200
2023-03-10 05:09:13,611 - INFO - [Train] step: 223599, loss: 0.031884, lr: 0.000200
2023-03-10 05:10:54,960 - INFO - [Train] step: 223999, loss: 0.024052, lr: 0.000200
2023-03-10 05:12:36,175 - INFO - [Train] step: 224399, loss: 0.019718, lr: 0.000200
2023-03-10 05:14:17,813 - INFO - [Train] step: 224799, loss: 0.026546, lr: 0.000200
2023-03-10 05:16:31,198 - INFO - [Train] step: 225199, loss: 0.033928, lr: 0.000200
2023-03-10 05:18:14,120 - INFO - [Train] step: 225599, loss: 0.022707, lr: 0.000200
2023-03-10 05:19:55,441 - INFO - [Train] step: 225999, loss: 0.020176, lr: 0.000200
2023-03-10 05:21:38,153 - INFO - [Train] step: 226399, loss: 0.017369, lr: 0.000200
2023-03-10 05:23:20,405 - INFO - [Train] step: 226799, loss: 0.037943, lr: 0.000200
2023-03-10 05:25:02,916 - INFO - [Train] step: 227199, loss: 0.028686, lr: 0.000200
2023-03-10 05:26:44,570 - INFO - [Train] step: 227599, loss: 0.021267, lr: 0.000200
2023-03-10 05:28:26,019 - INFO - [Train] step: 227999, loss: 0.022144, lr: 0.000200
2023-03-10 05:30:09,218 - INFO - [Train] step: 228399, loss: 0.034733, lr: 0.000200
2023-03-10 05:31:52,437 - INFO - [Train] step: 228799, loss: 0.018843, lr: 0.000200
2023-03-10 05:33:34,643 - INFO - [Train] step: 229199, loss: 0.016094, lr: 0.000200
2023-03-10 05:35:16,923 - INFO - [Train] step: 229599, loss: 0.025651, lr: 0.000200
2023-03-10 05:36:59,231 - INFO - [Train] step: 229999, loss: 0.026799, lr: 0.000200
2023-03-10 05:39:15,865 - INFO - [Train] step: 230399, loss: 0.018638, lr: 0.000200
2023-03-10 05:40:58,102 - INFO - [Train] step: 230799, loss: 0.026228, lr: 0.000200
2023-03-10 05:42:41,267 - INFO - [Train] step: 231199, loss: 0.026459, lr: 0.000200
2023-03-10 05:44:20,726 - INFO - [Train] step: 231599, loss: 0.035763, lr: 0.000200
2023-03-10 05:45:55,165 - INFO - [Train] step: 231999, loss: 0.033501, lr: 0.000200
2023-03-10 05:47:29,121 - INFO - [Train] step: 232399, loss: 0.029164, lr: 0.000200
2023-03-10 05:49:10,567 - INFO - [Train] step: 232799, loss: 0.023071, lr: 0.000200
2023-03-10 05:50:53,284 - INFO - [Train] step: 233199, loss: 0.029669, lr: 0.000200
2023-03-10 05:52:35,132 - INFO - [Train] step: 233599, loss: 0.027694, lr: 0.000200
2023-03-10 05:54:18,761 - INFO - [Train] step: 233999, loss: 0.025418, lr: 0.000200
2023-03-10 05:56:01,531 - INFO - [Train] step: 234399, loss: 0.025564, lr: 0.000200
2023-03-10 05:57:43,131 - INFO - [Train] step: 234799, loss: 0.025980, lr: 0.000200
2023-03-10 05:59:55,726 - INFO - [Train] step: 235199, loss: 0.026480, lr: 0.000200
2023-03-10 06:01:37,524 - INFO - [Train] step: 235599, loss: 0.046574, lr: 0.000200
2023-03-10 06:03:19,165 - INFO - [Train] step: 235999, loss: 0.030813, lr: 0.000200
2023-03-10 06:05:01,439 - INFO - [Train] step: 236399, loss: 0.042594, lr: 0.000200
2023-03-10 06:06:44,526 - INFO - [Train] step: 236799, loss: 0.032730, lr: 0.000200
2023-03-10 06:08:26,787 - INFO - [Train] step: 237199, loss: 0.040634, lr: 0.000200
2023-03-10 06:10:09,169 - INFO - [Train] step: 237599, loss: 0.047283, lr: 0.000200
2023-03-10 06:11:50,855 - INFO - [Train] step: 237999, loss: 0.023856, lr: 0.000200
2023-03-10 06:13:32,756 - INFO - [Train] step: 238399, loss: 0.035902, lr: 0.000200
2023-03-10 06:15:14,900 - INFO - [Train] step: 238799, loss: 0.026503, lr: 0.000200
2023-03-10 06:16:56,603 - INFO - [Train] step: 239199, loss: 0.040201, lr: 0.000200
2023-03-10 06:18:38,536 - INFO - [Train] step: 239599, loss: 0.033147, lr: 0.000200
2023-03-10 06:20:20,343 - INFO - [Train] step: 239999, loss: 0.038976, lr: 0.000200
2023-03-10 06:22:36,407 - INFO - [Train] step: 240399, loss: 0.034788, lr: 0.000200
2023-03-10 06:24:18,068 - INFO - [Train] step: 240799, loss: 0.038927, lr: 0.000200
2023-03-10 06:25:59,796 - INFO - [Train] step: 241199, loss: 0.029250, lr: 0.000200
2023-03-10 06:27:41,763 - INFO - [Train] step: 241599, loss: 0.034674, lr: 0.000200
2023-03-10 06:29:24,365 - INFO - [Train] step: 241999, loss: 0.021077, lr: 0.000200
2023-03-10 06:31:06,497 - INFO - [Train] step: 242399, loss: 0.021194, lr: 0.000200
2023-03-10 06:32:47,503 - INFO - [Train] step: 242799, loss: 0.024140, lr: 0.000200
2023-03-10 06:34:30,301 - INFO - [Train] step: 243199, loss: 0.023293, lr: 0.000200
2023-03-10 06:36:11,621 - INFO - [Train] step: 243599, loss: 0.022449, lr: 0.000200
2023-03-10 06:37:53,956 - INFO - [Train] step: 243999, loss: 0.027678, lr: 0.000200
2023-03-10 06:39:37,625 - INFO - [Train] step: 244399, loss: 0.027398, lr: 0.000200
2023-03-10 06:41:19,966 - INFO - [Train] step: 244799, loss: 0.033189, lr: 0.000200
2023-03-10 06:43:35,154 - INFO - [Train] step: 245199, loss: 0.018291, lr: 0.000200
2023-03-10 06:45:18,014 - INFO - [Train] step: 245599, loss: 0.034465, lr: 0.000200
2023-03-10 06:47:00,773 - INFO - [Train] step: 245999, loss: 0.032980, lr: 0.000200
2023-03-10 06:48:43,022 - INFO - [Train] step: 246399, loss: 0.021986, lr: 0.000200
2023-03-10 06:50:25,172 - INFO - [Train] step: 246799, loss: 0.023088, lr: 0.000200
2023-03-10 06:52:06,966 - INFO - [Train] step: 247199, loss: 0.029343, lr: 0.000200
2023-03-10 06:53:49,309 - INFO - [Train] step: 247599, loss: 0.055864, lr: 0.000200
2023-03-10 06:55:32,145 - INFO - [Train] step: 247999, loss: 0.022775, lr: 0.000200
2023-03-10 06:57:14,793 - INFO - [Train] step: 248399, loss: 0.025074, lr: 0.000200
2023-03-10 06:58:57,223 - INFO - [Train] step: 248799, loss: 0.037395, lr: 0.000200
2023-03-10 07:00:39,739 - INFO - [Train] step: 249199, loss: 0.027647, lr: 0.000200
2023-03-10 07:02:22,041 - INFO - [Train] step: 249599, loss: 0.025437, lr: 0.000200
2023-03-10 07:04:04,320 - INFO - [Train] step: 249999, loss: 0.024529, lr: 0.000200
2023-03-10 07:06:18,697 - INFO - [Train] step: 250399, loss: 0.021884, lr: 0.000200
2023-03-10 07:08:03,241 - INFO - [Train] step: 250799, loss: 0.015015, lr: 0.000200
2023-03-10 07:09:44,425 - INFO - [Train] step: 251199, loss: 0.020269, lr: 0.000200
2023-03-10 07:11:25,865 - INFO - [Train] step: 251599, loss: 0.035350, lr: 0.000200
2023-03-10 07:13:07,886 - INFO - [Train] step: 251999, loss: 0.040928, lr: 0.000200
2023-03-10 07:14:49,716 - INFO - [Train] step: 252399, loss: 0.040951, lr: 0.000200
2023-03-10 07:16:31,019 - INFO - [Train] step: 252799, loss: 0.033957, lr: 0.000200
2023-03-10 07:18:13,251 - INFO - [Train] step: 253199, loss: 0.023884, lr: 0.000200
2023-03-10 07:19:49,772 - INFO - [Train] step: 253599, loss: 0.029575, lr: 0.000200
2023-03-10 07:21:23,921 - INFO - [Train] step: 253999, loss: 0.023899, lr: 0.000200
2023-03-10 07:22:57,180 - INFO - [Train] step: 254399, loss: 0.026341, lr: 0.000200
2023-03-10 07:24:38,822 - INFO - [Train] step: 254799, loss: 0.039660, lr: 0.000200
2023-03-10 07:26:52,065 - INFO - [Train] step: 255199, loss: 0.037106, lr: 0.000200
2023-03-10 07:28:36,169 - INFO - [Train] step: 255599, loss: 0.036109, lr: 0.000200
2023-03-10 07:30:16,888 - INFO - [Train] step: 255999, loss: 0.028531, lr: 0.000200
2023-03-10 07:31:57,649 - INFO - [Train] step: 256399, loss: 0.024659, lr: 0.000200
2023-03-10 07:33:39,558 - INFO - [Train] step: 256799, loss: 0.021979, lr: 0.000200
2023-03-10 07:35:20,692 - INFO - [Train] step: 257199, loss: 0.035312, lr: 0.000200
2023-03-10 07:37:02,563 - INFO - [Train] step: 257599, loss: 0.032969, lr: 0.000200
2023-03-10 07:38:45,615 - INFO - [Train] step: 257999, loss: 0.020198, lr: 0.000200
2023-03-10 07:40:27,228 - INFO - [Train] step: 258399, loss: 0.025704, lr: 0.000200
2023-03-10 07:42:09,435 - INFO - [Train] step: 258799, loss: 0.031965, lr: 0.000200
2023-03-10 07:43:50,976 - INFO - [Train] step: 259199, loss: 0.022031, lr: 0.000200
2023-03-10 07:45:32,855 - INFO - [Train] step: 259599, loss: 0.017394, lr: 0.000200
2023-03-10 07:47:13,941 - INFO - [Train] step: 259999, loss: 0.020949, lr: 0.000200
2023-03-10 07:49:28,028 - INFO - [Train] step: 260399, loss: 0.022621, lr: 0.000200
2023-03-10 07:51:09,369 - INFO - [Train] step: 260799, loss: 0.026359, lr: 0.000200
2023-03-10 07:52:51,021 - INFO - [Train] step: 261199, loss: 0.033813, lr: 0.000200
2023-03-10 07:54:33,144 - INFO - [Train] step: 261599, loss: 0.037010, lr: 0.000200
2023-03-10 07:56:14,148 - INFO - [Train] step: 261999, loss: 0.030196, lr: 0.000200
2023-03-10 07:57:56,627 - INFO - [Train] step: 262399, loss: 0.032036, lr: 0.000200
2023-03-10 07:59:40,602 - INFO - [Train] step: 262799, loss: 0.047250, lr: 0.000200
2023-03-10 08:01:23,631 - INFO - [Train] step: 263199, loss: 0.035544, lr: 0.000200
2023-03-10 08:03:06,151 - INFO - [Train] step: 263599, loss: 0.030585, lr: 0.000200
2023-03-10 08:04:47,993 - INFO - [Train] step: 263999, loss: 0.028642, lr: 0.000200
2023-03-10 08:06:29,400 - INFO - [Train] step: 264399, loss: 0.020036, lr: 0.000200
2023-03-10 08:08:13,501 - INFO - [Train] step: 264799, loss: 0.032572, lr: 0.000200
2023-03-10 08:10:28,127 - INFO - [Train] step: 265199, loss: 0.022352, lr: 0.000200
2023-03-10 08:12:11,158 - INFO - [Train] step: 265599, loss: 0.032918, lr: 0.000200
2023-03-10 08:13:52,584 - INFO - [Train] step: 265999, loss: 0.026448, lr: 0.000200
2023-03-10 08:15:34,919 - INFO - [Train] step: 266399, loss: 0.026885, lr: 0.000200
2023-03-10 08:17:17,602 - INFO - [Train] step: 266799, loss: 0.026321, lr: 0.000200
2023-03-10 08:19:00,175 - INFO - [Train] step: 267199, loss: 0.018399, lr: 0.000200
2023-03-10 08:20:43,745 - INFO - [Train] step: 267599, loss: 0.030799, lr: 0.000200
2023-03-10 08:22:26,784 - INFO - [Train] step: 267999, loss: 0.020843, lr: 0.000200
2023-03-10 08:24:09,693 - INFO - [Train] step: 268399, loss: 0.019232, lr: 0.000200
2023-03-10 08:25:52,194 - INFO - [Train] step: 268799, loss: 0.028118, lr: 0.000200
2023-03-10 08:27:33,907 - INFO - [Train] step: 269199, loss: 0.023300, lr: 0.000200
2023-03-10 08:29:16,527 - INFO - [Train] step: 269599, loss: 0.035711, lr: 0.000200
2023-03-10 08:30:58,772 - INFO - [Train] step: 269999, loss: 0.037159, lr: 0.000200
2023-03-10 08:33:12,458 - INFO - [Train] step: 270399, loss: 0.022685, lr: 0.000200
2023-03-10 08:34:55,350 - INFO - [Train] step: 270799, loss: 0.028147, lr: 0.000200
2023-03-10 08:36:37,304 - INFO - [Train] step: 271199, loss: 0.019503, lr: 0.000200
2023-03-10 08:38:21,135 - INFO - [Train] step: 271599, loss: 0.020870, lr: 0.000200
2023-03-10 08:40:04,217 - INFO - [Train] step: 271999, loss: 0.021273, lr: 0.000200
2023-03-10 08:41:46,391 - INFO - [Train] step: 272399, loss: 0.021489, lr: 0.000200
2023-03-10 08:43:29,047 - INFO - [Train] step: 272799, loss: 0.017466, lr: 0.000200
2023-03-10 08:45:10,780 - INFO - [Train] step: 273199, loss: 0.026652, lr: 0.000200
2023-03-10 08:46:52,578 - INFO - [Train] step: 273599, loss: 0.028801, lr: 0.000200
2023-03-10 08:48:35,241 - INFO - [Train] step: 273999, loss: 0.024779, lr: 0.000200
2023-03-10 08:50:17,339 - INFO - [Train] step: 274399, loss: 0.035002, lr: 0.000200
2023-03-10 08:52:00,603 - INFO - [Train] step: 274799, loss: 0.036625, lr: 0.000200
2023-03-10 08:54:12,627 - INFO - [Train] step: 275199, loss: 0.036961, lr: 0.000200
2023-03-10 08:55:46,098 - INFO - [Train] step: 275599, loss: 0.030734, lr: 0.000200
2023-03-10 08:57:20,387 - INFO - [Train] step: 275999, loss: 0.037386, lr: 0.000200
2023-03-10 08:58:57,996 - INFO - [Train] step: 276399, loss: 0.021385, lr: 0.000200
2023-03-10 09:00:40,360 - INFO - [Train] step: 276799, loss: 0.029112, lr: 0.000200
2023-03-10 09:02:23,297 - INFO - [Train] step: 277199, loss: 0.037207, lr: 0.000200
2023-03-10 09:04:05,027 - INFO - [Train] step: 277599, loss: 0.034404, lr: 0.000200
2023-03-10 09:05:46,517 - INFO - [Train] step: 277999, loss: 0.029568, lr: 0.000200
2023-03-10 09:07:28,560 - INFO - [Train] step: 278399, loss: 0.022071, lr: 0.000200
2023-03-10 09:09:11,203 - INFO - [Train] step: 278799, loss: 0.033401, lr: 0.000200
2023-03-10 09:10:53,389 - INFO - [Train] step: 279199, loss: 0.030007, lr: 0.000200
2023-03-10 09:12:35,142 - INFO - [Train] step: 279599, loss: 0.023750, lr: 0.000200
2023-03-10 09:14:16,677 - INFO - [Train] step: 279999, loss: 0.032450, lr: 0.000200
2023-03-10 09:16:30,666 - INFO - [Train] step: 280399, loss: 0.029939, lr: 0.000200
2023-03-10 09:18:13,069 - INFO - [Train] step: 280799, loss: 0.036587, lr: 0.000200
2023-03-10 09:19:55,406 - INFO - [Train] step: 281199, loss: 0.020232, lr: 0.000200
2023-03-10 09:21:37,700 - INFO - [Train] step: 281599, loss: 0.038672, lr: 0.000200
2023-03-10 09:23:19,499 - INFO - [Train] step: 281999, loss: 0.022418, lr: 0.000200
2023-03-10 09:25:01,848 - INFO - [Train] step: 282399, loss: 0.027218, lr: 0.000200
2023-03-10 09:26:43,803 - INFO - [Train] step: 282799, loss: 0.022790, lr: 0.000200
2023-03-10 09:28:26,043 - INFO - [Train] step: 283199, loss: 0.032155, lr: 0.000200
2023-03-10 09:30:08,711 - INFO - [Train] step: 283599, loss: 0.026490, lr: 0.000200
2023-03-10 09:31:50,569 - INFO - [Train] step: 283999, loss: 0.022833, lr: 0.000200
2023-03-10 09:33:32,063 - INFO - [Train] step: 284399, loss: 0.030181, lr: 0.000200
2023-03-10 09:35:13,020 - INFO - [Train] step: 284799, loss: 0.015181, lr: 0.000200
2023-03-10 09:37:26,425 - INFO - [Train] step: 285199, loss: 0.032445, lr: 0.000200
2023-03-10 09:39:08,785 - INFO - [Train] step: 285599, loss: 0.032089, lr: 0.000200
2023-03-10 09:40:51,309 - INFO - [Train] step: 285999, loss: 0.036690, lr: 0.000200
2023-03-10 09:42:32,686 - INFO - [Train] step: 286399, loss: 0.025351, lr: 0.000200
2023-03-10 09:44:15,181 - INFO - [Train] step: 286799, loss: 0.039954, lr: 0.000200
2023-03-10 09:45:57,598 - INFO - [Train] step: 287199, loss: 0.024563, lr: 0.000200
2023-03-10 09:47:39,663 - INFO - [Train] step: 287599, loss: 0.029550, lr: 0.000200
2023-03-10 09:49:23,351 - INFO - [Train] step: 287999, loss: 0.038669, lr: 0.000200
2023-03-10 09:51:05,502 - INFO - [Train] step: 288399, loss: 0.019955, lr: 0.000200
2023-03-10 09:52:48,142 - INFO - [Train] step: 288799, loss: 0.035678, lr: 0.000200
2023-03-10 09:54:31,868 - INFO - [Train] step: 289199, loss: 0.023465, lr: 0.000200
2023-03-10 09:56:15,033 - INFO - [Train] step: 289599, loss: 0.030825, lr: 0.000200
2023-03-10 09:57:57,033 - INFO - [Train] step: 289999, loss: 0.034128, lr: 0.000200
2023-03-10 10:00:11,174 - INFO - [Train] step: 290399, loss: 0.030092, lr: 0.000200
2023-03-10 10:01:54,155 - INFO - [Train] step: 290799, loss: 0.025265, lr: 0.000200
2023-03-10 10:03:36,964 - INFO - [Train] step: 291199, loss: 0.022674, lr: 0.000200
2023-03-10 10:05:19,579 - INFO - [Train] step: 291599, loss: 0.025827, lr: 0.000200
2023-03-10 10:07:01,597 - INFO - [Train] step: 291999, loss: 0.024462, lr: 0.000200
2023-03-10 10:08:45,468 - INFO - [Train] step: 292399, loss: 0.029087, lr: 0.000200
2023-03-10 10:10:27,378 - INFO - [Train] step: 292799, loss: 0.023307, lr: 0.000200
2023-03-10 10:12:10,802 - INFO - [Train] step: 293199, loss: 0.035562, lr: 0.000200
2023-03-10 10:13:54,931 - INFO - [Train] step: 293599, loss: 0.037671, lr: 0.000200
2023-03-10 10:15:36,575 - INFO - [Train] step: 293999, loss: 0.024606, lr: 0.000200
2023-03-10 10:17:17,893 - INFO - [Train] step: 294399, loss: 0.023206, lr: 0.000200
2023-03-10 10:18:59,013 - INFO - [Train] step: 294799, loss: 0.036423, lr: 0.000200
2023-03-10 10:21:11,251 - INFO - [Train] step: 295199, loss: 0.047419, lr: 0.000200
2023-03-10 10:22:53,343 - INFO - [Train] step: 295599, loss: 0.026624, lr: 0.000200
2023-03-10 10:24:35,494 - INFO - [Train] step: 295999, loss: 0.035137, lr: 0.000200
2023-03-10 10:26:18,784 - INFO - [Train] step: 296399, loss: 0.042826, lr: 0.000200
2023-03-10 10:28:01,310 - INFO - [Train] step: 296799, loss: 0.024643, lr: 0.000200
2023-03-10 10:29:41,551 - INFO - [Train] step: 297199, loss: 0.034167, lr: 0.000200
2023-03-10 10:31:15,046 - INFO - [Train] step: 297599, loss: 0.040746, lr: 0.000200
2023-03-10 10:32:48,083 - INFO - [Train] step: 297999, loss: 0.027481, lr: 0.000200
2023-03-10 10:34:28,835 - INFO - [Train] step: 298399, loss: 0.023171, lr: 0.000200
2023-03-10 10:36:10,601 - INFO - [Train] step: 298799, loss: 0.036747, lr: 0.000200
2023-03-10 10:37:54,119 - INFO - [Train] step: 299199, loss: 0.026300, lr: 0.000200
2023-03-10 10:39:36,214 - INFO - [Train] step: 299599, loss: 0.030690, lr: 0.000200
2023-03-10 10:41:18,029 - INFO - [Train] step: 299999, loss: 0.016649, lr: 0.000200
2023-03-10 10:43:34,680 - INFO - [Train] step: 300399, loss: 0.024690, lr: 0.000200
2023-03-10 10:45:19,109 - INFO - [Train] step: 300799, loss: 0.019667, lr: 0.000200
2023-03-10 10:47:01,008 - INFO - [Train] step: 301199, loss: 0.015816, lr: 0.000200
2023-03-10 10:48:43,048 - INFO - [Train] step: 301599, loss: 0.014847, lr: 0.000200
2023-03-10 10:50:26,057 - INFO - [Train] step: 301999, loss: 0.015023, lr: 0.000200
2023-03-10 10:52:07,944 - INFO - [Train] step: 302399, loss: 0.040401, lr: 0.000200
2023-03-10 10:53:50,538 - INFO - [Train] step: 302799, loss: 0.019601, lr: 0.000200
2023-03-10 10:55:31,947 - INFO - [Train] step: 303199, loss: 0.036169, lr: 0.000200
2023-03-10 10:57:15,199 - INFO - [Train] step: 303599, loss: 0.031094, lr: 0.000200
2023-03-10 10:58:57,114 - INFO - [Train] step: 303999, loss: 0.027723, lr: 0.000200
2023-03-10 11:00:40,003 - INFO - [Train] step: 304399, loss: 0.031244, lr: 0.000200
2023-03-10 11:02:23,638 - INFO - [Train] step: 304799, loss: 0.025609, lr: 0.000200
2023-03-10 11:04:36,246 - INFO - [Train] step: 305199, loss: 0.021086, lr: 0.000200
2023-03-10 11:06:18,381 - INFO - [Train] step: 305599, loss: 0.033204, lr: 0.000200
2023-03-10 11:08:01,062 - INFO - [Train] step: 305999, loss: 0.024129, lr: 0.000200
2023-03-10 11:09:44,183 - INFO - [Train] step: 306399, loss: 0.037839, lr: 0.000200
2023-03-10 11:11:26,160 - INFO - [Train] step: 306799, loss: 0.035701, lr: 0.000200
2023-03-10 11:13:08,844 - INFO - [Train] step: 307199, loss: 0.029769, lr: 0.000200
2023-03-10 11:14:51,575 - INFO - [Train] step: 307599, loss: 0.031374, lr: 0.000200
2023-03-10 11:16:33,838 - INFO - [Train] step: 307999, loss: 0.038320, lr: 0.000200
2023-03-10 11:18:16,290 - INFO - [Train] step: 308399, loss: 0.030920, lr: 0.000200
2023-03-10 11:19:58,671 - INFO - [Train] step: 308799, loss: 0.024210, lr: 0.000200
2023-03-10 11:21:42,888 - INFO - [Train] step: 309199, loss: 0.030529, lr: 0.000200
2023-03-10 11:23:25,999 - INFO - [Train] step: 309599, loss: 0.042874, lr: 0.000200
2023-03-10 11:25:09,236 - INFO - [Train] step: 309999, loss: 0.025595, lr: 0.000200
2023-03-10 11:27:23,922 - INFO - [Train] step: 310399, loss: 0.018534, lr: 0.000200
2023-03-10 11:29:05,595 - INFO - [Train] step: 310799, loss: 0.042966, lr: 0.000200
2023-03-10 11:30:48,163 - INFO - [Train] step: 311199, loss: 0.023472, lr: 0.000200
2023-03-10 11:32:29,471 - INFO - [Train] step: 311599, loss: 0.042655, lr: 0.000200
2023-03-10 11:34:11,258 - INFO - [Train] step: 311999, loss: 0.019340, lr: 0.000200
2023-03-10 11:35:54,577 - INFO - [Train] step: 312399, loss: 0.033291, lr: 0.000200
2023-03-10 11:37:36,789 - INFO - [Train] step: 312799, loss: 0.033939, lr: 0.000200
2023-03-10 11:39:18,374 - INFO - [Train] step: 313199, loss: 0.027928, lr: 0.000200
2023-03-10 11:41:00,530 - INFO - [Train] step: 313599, loss: 0.027680, lr: 0.000200
2023-03-10 11:42:43,687 - INFO - [Train] step: 313999, loss: 0.020440, lr: 0.000200
2023-03-10 11:44:26,831 - INFO - [Train] step: 314399, loss: 0.024034, lr: 0.000200
2023-03-10 11:46:09,107 - INFO - [Train] step: 314799, loss: 0.016330, lr: 0.000200
2023-03-10 11:48:20,962 - INFO - [Train] step: 315199, loss: 0.026296, lr: 0.000200
2023-03-10 11:50:03,758 - INFO - [Train] step: 315599, loss: 0.026573, lr: 0.000200
2023-03-10 11:51:46,533 - INFO - [Train] step: 315999, loss: 0.031060, lr: 0.000200
2023-03-10 11:53:28,428 - INFO - [Train] step: 316399, loss: 0.039446, lr: 0.000200
2023-03-10 11:55:12,378 - INFO - [Train] step: 316799, loss: 0.032524, lr: 0.000200
2023-03-10 11:56:55,405 - INFO - [Train] step: 317199, loss: 0.024450, lr: 0.000200
2023-03-10 11:58:36,481 - INFO - [Train] step: 317599, loss: 0.027188, lr: 0.000200
2023-03-10 12:00:18,726 - INFO - [Train] step: 317999, loss: 0.018941, lr: 0.000200
2023-03-10 12:02:02,509 - INFO - [Train] step: 318399, loss: 0.036740, lr: 0.000200
2023-03-10 12:03:47,007 - INFO - [Train] step: 318799, loss: 0.032515, lr: 0.000200
2023-03-10 12:05:25,067 - INFO - [Train] step: 319199, loss: 0.038687, lr: 0.000200
2023-03-10 12:06:59,007 - INFO - [Train] step: 319599, loss: 0.025664, lr: 0.000200
2023-03-10 12:08:32,957 - INFO - [Train] step: 319999, loss: 0.014214, lr: 0.000200
2023-03-10 12:10:45,671 - INFO - [Train] step: 320399, loss: 0.060878, lr: 0.000200
2023-03-10 12:12:27,908 - INFO - [Train] step: 320799, loss: 0.030685, lr: 0.000200
2023-03-10 12:14:10,417 - INFO - [Train] step: 321199, loss: 0.028028, lr: 0.000200
2023-03-10 12:15:54,065 - INFO - [Train] step: 321599, loss: 0.024088, lr: 0.000200
2023-03-10 12:17:36,834 - INFO - [Train] step: 321999, loss: 0.035455, lr: 0.000200
2023-03-10 12:19:20,117 - INFO - [Train] step: 322399, loss: 0.031546, lr: 0.000200
2023-03-10 12:21:03,429 - INFO - [Train] step: 322799, loss: 0.029549, lr: 0.000200
2023-03-10 12:22:45,925 - INFO - [Train] step: 323199, loss: 0.025931, lr: 0.000200
2023-03-10 12:24:28,249 - INFO - [Train] step: 323599, loss: 0.035383, lr: 0.000200
2023-03-10 12:26:10,991 - INFO - [Train] step: 323999, loss: 0.025172, lr: 0.000200
2023-03-10 12:27:53,271 - INFO - [Train] step: 324399, loss: 0.017796, lr: 0.000200
2023-03-10 12:29:36,091 - INFO - [Train] step: 324799, loss: 0.029296, lr: 0.000200
2023-03-10 12:31:49,633 - INFO - [Train] step: 325199, loss: 0.035442, lr: 0.000200
2023-03-10 12:33:31,558 - INFO - [Train] step: 325599, loss: 0.031816, lr: 0.000200
2023-03-10 12:35:14,193 - INFO - [Train] step: 325999, loss: 0.020834, lr: 0.000200
2023-03-10 12:36:57,628 - INFO - [Train] step: 326399, loss: 0.032061, lr: 0.000200
2023-03-10 12:38:40,320 - INFO - [Train] step: 326799, loss: 0.025899, lr: 0.000200
2023-03-10 12:40:22,322 - INFO - [Train] step: 327199, loss: 0.027747, lr: 0.000200
2023-03-10 12:42:05,760 - INFO - [Train] step: 327599, loss: 0.025277, lr: 0.000200
2023-03-10 12:43:49,261 - INFO - [Train] step: 327999, loss: 0.035637, lr: 0.000200
2023-03-10 12:45:32,569 - INFO - [Train] step: 328399, loss: 0.014514, lr: 0.000200
2023-03-10 12:47:15,253 - INFO - [Train] step: 328799, loss: 0.036135, lr: 0.000200
2023-03-10 12:48:56,364 - INFO - [Train] step: 329199, loss: 0.033146, lr: 0.000200
2023-03-10 12:50:38,637 - INFO - [Train] step: 329599, loss: 0.014968, lr: 0.000200
2023-03-10 12:52:21,790 - INFO - [Train] step: 329999, loss: 0.037072, lr: 0.000200
2023-03-10 12:54:36,200 - INFO - [Train] step: 330399, loss: 0.025580, lr: 0.000200
2023-03-10 12:56:18,687 - INFO - [Train] step: 330799, loss: 0.024549, lr: 0.000200
2023-03-10 12:58:01,577 - INFO - [Train] step: 331199, loss: 0.021797, lr: 0.000200
2023-03-10 12:59:43,991 - INFO - [Train] step: 331599, loss: 0.036093, lr: 0.000200
2023-03-10 13:01:26,643 - INFO - [Train] step: 331999, loss: 0.016386, lr: 0.000200
2023-03-10 13:03:09,324 - INFO - [Train] step: 332399, loss: 0.032612, lr: 0.000200
2023-03-10 13:04:52,168 - INFO - [Train] step: 332799, loss: 0.035644, lr: 0.000200
2023-03-10 13:06:35,552 - INFO - [Train] step: 333199, loss: 0.025766, lr: 0.000200
2023-03-10 13:08:18,398 - INFO - [Train] step: 333599, loss: 0.019169, lr: 0.000200
2023-03-10 13:10:01,089 - INFO - [Train] step: 333999, loss: 0.021728, lr: 0.000200
2023-03-10 13:11:44,820 - INFO - [Train] step: 334399, loss: 0.024367, lr: 0.000200
2023-03-10 13:13:27,618 - INFO - [Train] step: 334799, loss: 0.029162, lr: 0.000200
2023-03-10 13:15:42,844 - INFO - [Train] step: 335199, loss: 0.028631, lr: 0.000200
2023-03-10 13:17:25,028 - INFO - [Train] step: 335599, loss: 0.042865, lr: 0.000200
2023-03-10 13:19:08,254 - INFO - [Train] step: 335999, loss: 0.028417, lr: 0.000200
2023-03-10 13:20:50,736 - INFO - [Train] step: 336399, loss: 0.032194, lr: 0.000200
2023-03-10 13:22:33,778 - INFO - [Train] step: 336799, loss: 0.028410, lr: 0.000200
2023-03-10 13:24:16,234 - INFO - [Train] step: 337199, loss: 0.032601, lr: 0.000200
2023-03-10 13:25:58,394 - INFO - [Train] step: 337599, loss: 0.021189, lr: 0.000200
2023-03-10 13:27:41,370 - INFO - [Train] step: 337999, loss: 0.053751, lr: 0.000200
2023-03-10 13:29:24,600 - INFO - [Train] step: 338399, loss: 0.020550, lr: 0.000200
2023-03-10 13:31:06,622 - INFO - [Train] step: 338799, loss: 0.021701, lr: 0.000200
2023-03-10 13:32:48,704 - INFO - [Train] step: 339199, loss: 0.019025, lr: 0.000200
2023-03-10 13:34:31,484 - INFO - [Train] step: 339599, loss: 0.026928, lr: 0.000200
2023-03-10 13:36:12,672 - INFO - [Train] step: 339999, loss: 0.016827, lr: 0.000200
2023-03-10 13:38:27,915 - INFO - [Train] step: 340399, loss: 0.028262, lr: 0.000200
2023-03-10 13:40:09,594 - INFO - [Train] step: 340799, loss: 0.038741, lr: 0.000200
2023-03-10 13:41:43,939 - INFO - [Train] step: 341199, loss: 0.029496, lr: 0.000200
2023-03-10 13:43:21,219 - INFO - [Train] step: 341599, loss: 0.019480, lr: 0.000200
2023-03-10 13:44:59,723 - INFO - [Train] step: 341999, loss: 0.051265, lr: 0.000200
2023-03-10 13:46:42,722 - INFO - [Train] step: 342399, loss: 0.023305, lr: 0.000200
2023-03-10 13:48:26,358 - INFO - [Train] step: 342799, loss: 0.037763, lr: 0.000200
2023-03-10 13:50:09,646 - INFO - [Train] step: 343199, loss: 0.023637, lr: 0.000200
2023-03-10 13:51:53,146 - INFO - [Train] step: 343599, loss: 0.025797, lr: 0.000200
2023-03-10 13:53:37,836 - INFO - [Train] step: 343999, loss: 0.022229, lr: 0.000200
2023-03-10 13:55:21,870 - INFO - [Train] step: 344399, loss: 0.026025, lr: 0.000200
2023-03-10 13:57:04,673 - INFO - [Train] step: 344799, loss: 0.016660, lr: 0.000200
2023-03-10 13:59:16,919 - INFO - [Train] step: 345199, loss: 0.035834, lr: 0.000200
2023-03-10 14:00:59,690 - INFO - [Train] step: 345599, loss: 0.032512, lr: 0.000200
2023-03-10 14:02:40,973 - INFO - [Train] step: 345999, loss: 0.031157, lr: 0.000200
2023-03-10 14:04:27,482 - INFO - [Train] step: 346399, loss: 0.027957, lr: 0.000200
2023-03-10 14:06:10,529 - INFO - [Train] step: 346799, loss: 0.027068, lr: 0.000200
2023-03-10 14:07:52,863 - INFO - [Train] step: 347199, loss: 0.013523, lr: 0.000200
2023-03-10 14:09:34,700 - INFO - [Train] step: 347599, loss: 0.026319, lr: 0.000200
2023-03-10 14:11:16,704 - INFO - [Train] step: 347999, loss: 0.024923, lr: 0.000200
2023-03-10 14:12:58,887 - INFO - [Train] step: 348399, loss: 0.031871, lr: 0.000200
2023-03-10 14:14:41,137 - INFO - [Train] step: 348799, loss: 0.028514, lr: 0.000200
2023-03-10 14:16:23,364 - INFO - [Train] step: 349199, loss: 0.035477, lr: 0.000200
2023-03-10 14:18:06,480 - INFO - [Train] step: 349599, loss: 0.031506, lr: 0.000200
2023-03-10 14:19:51,448 - INFO - [Train] step: 349999, loss: 0.013771, lr: 0.000200
2023-03-10 14:22:06,570 - INFO - [Train] step: 350399, loss: 0.029451, lr: 0.000200
2023-03-10 14:23:49,230 - INFO - [Train] step: 350799, loss: 0.027352, lr: 0.000200
2023-03-10 14:25:32,616 - INFO - [Train] step: 351199, loss: 0.019392, lr: 0.000200
2023-03-10 14:27:14,918 - INFO - [Train] step: 351599, loss: 0.023308, lr: 0.000200
2023-03-10 14:28:56,965 - INFO - [Train] step: 351999, loss: 0.022139, lr: 0.000200
2023-03-10 14:30:39,985 - INFO - [Train] step: 352399, loss: 0.030674, lr: 0.000200
2023-03-10 14:32:22,535 - INFO - [Train] step: 352799, loss: 0.040542, lr: 0.000200
2023-03-10 14:34:03,759 - INFO - [Train] step: 353199, loss: 0.021595, lr: 0.000200
2023-03-10 14:35:47,070 - INFO - [Train] step: 353599, loss: 0.030127, lr: 0.000200
2023-03-10 14:37:28,826 - INFO - [Train] step: 353999, loss: 0.030716, lr: 0.000200
2023-03-10 14:39:10,871 - INFO - [Train] step: 354399, loss: 0.020815, lr: 0.000200
2023-03-10 14:40:52,321 - INFO - [Train] step: 354799, loss: 0.041460, lr: 0.000200
2023-03-10 14:43:04,932 - INFO - [Train] step: 355199, loss: 0.037689, lr: 0.000200
2023-03-10 14:44:47,351 - INFO - [Train] step: 355599, loss: 0.019737, lr: 0.000200
2023-03-10 14:46:31,890 - INFO - [Train] step: 355999, loss: 0.032996, lr: 0.000200
2023-03-10 14:48:15,680 - INFO - [Train] step: 356399, loss: 0.033524, lr: 0.000200
2023-03-10 14:49:58,607 - INFO - [Train] step: 356799, loss: 0.027061, lr: 0.000200
2023-03-10 14:51:41,857 - INFO - [Train] step: 357199, loss: 0.029481, lr: 0.000200
2023-03-10 14:53:23,537 - INFO - [Train] step: 357599, loss: 0.022982, lr: 0.000200
2023-03-10 14:55:07,342 - INFO - [Train] step: 357999, loss: 0.025093, lr: 0.000200
2023-03-10 14:56:52,636 - INFO - [Train] step: 358399, loss: 0.025035, lr: 0.000200
2023-03-10 14:58:37,152 - INFO - [Train] step: 358799, loss: 0.021274, lr: 0.000200
2023-03-10 15:00:19,875 - INFO - [Train] step: 359199, loss: 0.021313, lr: 0.000200
2023-03-10 15:02:03,122 - INFO - [Train] step: 359599, loss: 0.018964, lr: 0.000200
2023-03-10 15:03:46,907 - INFO - [Train] step: 359999, loss: 0.021291, lr: 0.000200
2023-03-10 15:06:03,873 - INFO - [Train] step: 360399, loss: 0.027143, lr: 0.000200
2023-03-10 15:07:48,223 - INFO - [Train] step: 360799, loss: 0.023704, lr: 0.000200
2023-03-10 15:09:32,869 - INFO - [Train] step: 361199, loss: 0.018795, lr: 0.000200
2023-03-10 15:11:17,105 - INFO - [Train] step: 361599, loss: 0.028667, lr: 0.000200
2023-03-10 15:13:01,823 - INFO - [Train] step: 361999, loss: 0.035749, lr: 0.000200
2023-03-10 15:14:47,519 - INFO - [Train] step: 362399, loss: 0.022481, lr: 0.000200
2023-03-10 15:16:29,369 - INFO - [Train] step: 362799, loss: 0.028018, lr: 0.000200
2023-03-10 15:18:03,492 - INFO - [Train] step: 363199, loss: 0.042739, lr: 0.000200
2023-03-10 15:19:38,302 - INFO - [Train] step: 363599, loss: 0.023154, lr: 0.000200
2023-03-10 15:21:18,199 - INFO - [Train] step: 363999, loss: 0.023028, lr: 0.000200
2023-03-10 15:23:00,508 - INFO - [Train] step: 364399, loss: 0.039000, lr: 0.000200
2023-03-10 15:24:45,691 - INFO - [Train] step: 364799, loss: 0.025135, lr: 0.000200
2023-03-10 15:27:02,765 - INFO - [Train] step: 365199, loss: 0.038952, lr: 0.000200
2023-03-10 15:28:50,164 - INFO - [Train] step: 365599, loss: 0.018549, lr: 0.000200
2023-03-10 15:30:33,059 - INFO - [Train] step: 365999, loss: 0.022129, lr: 0.000200
2023-03-10 15:32:17,794 - INFO - [Train] step: 366399, loss: 0.027228, lr: 0.000200
2023-03-10 15:34:01,538 - INFO - [Train] step: 366799, loss: 0.032223, lr: 0.000200
2023-03-10 15:35:45,065 - INFO - [Train] step: 367199, loss: 0.031922, lr: 0.000200
2023-03-10 15:37:29,651 - INFO - [Train] step: 367599, loss: 0.026263, lr: 0.000200
2023-03-10 15:39:12,667 - INFO - [Train] step: 367999, loss: 0.022136, lr: 0.000200
2023-03-10 15:40:56,515 - INFO - [Train] step: 368399, loss: 0.030063, lr: 0.000200
2023-03-10 15:42:38,665 - INFO - [Train] step: 368799, loss: 0.051341, lr: 0.000200
2023-03-10 15:44:20,574 - INFO - [Train] step: 369199, loss: 0.021911, lr: 0.000200
2023-03-10 15:46:06,935 - INFO - [Train] step: 369599, loss: 0.037807, lr: 0.000200
2023-03-10 15:47:48,320 - INFO - [Train] step: 369999, loss: 0.031412, lr: 0.000200
2023-03-10 15:50:03,146 - INFO - [Train] step: 370399, loss: 0.021149, lr: 0.000200
2023-03-10 15:51:46,717 - INFO - [Train] step: 370799, loss: 0.023836, lr: 0.000200
2023-03-10 15:53:31,579 - INFO - [Train] step: 371199, loss: 0.036983, lr: 0.000200
2023-03-10 15:55:14,564 - INFO - [Train] step: 371599, loss: 0.016131, lr: 0.000200
2023-03-10 15:56:58,445 - INFO - [Train] step: 371999, loss: 0.026495, lr: 0.000200
2023-03-10 15:58:40,589 - INFO - [Train] step: 372399, loss: 0.040617, lr: 0.000200
2023-03-10 16:00:22,370 - INFO - [Train] step: 372799, loss: 0.018888, lr: 0.000200
2023-03-10 16:02:05,252 - INFO - [Train] step: 373199, loss: 0.021950, lr: 0.000200
2023-03-10 16:03:47,584 - INFO - [Train] step: 373599, loss: 0.024717, lr: 0.000200
2023-03-10 16:05:30,726 - INFO - [Train] step: 373999, loss: 0.022129, lr: 0.000200
2023-03-10 16:07:13,445 - INFO - [Train] step: 374399, loss: 0.029347, lr: 0.000200
2023-03-10 16:08:56,347 - INFO - [Train] step: 374799, loss: 0.035691, lr: 0.000200
2023-03-10 16:11:09,273 - INFO - [Train] step: 375199, loss: 0.020397, lr: 0.000200
2023-03-10 16:12:53,287 - INFO - [Train] step: 375599, loss: 0.022544, lr: 0.000200
2023-03-10 16:14:36,402 - INFO - [Train] step: 375999, loss: 0.024228, lr: 0.000200
2023-03-10 16:16:21,371 - INFO - [Train] step: 376399, loss: 0.024926, lr: 0.000200
2023-03-10 16:18:05,643 - INFO - [Train] step: 376799, loss: 0.020744, lr: 0.000200
2023-03-10 16:19:48,686 - INFO - [Train] step: 377199, loss: 0.034363, lr: 0.000200
2023-03-10 16:21:33,553 - INFO - [Train] step: 377599, loss: 0.030884, lr: 0.000200
2023-03-10 16:23:16,033 - INFO - [Train] step: 377999, loss: 0.026088, lr: 0.000200
2023-03-10 16:24:57,756 - INFO - [Train] step: 378399, loss: 0.041546, lr: 0.000200
2023-03-10 16:26:39,306 - INFO - [Train] step: 378799, loss: 0.026836, lr: 0.000200
2023-03-10 16:28:21,002 - INFO - [Train] step: 379199, loss: 0.026563, lr: 0.000200
2023-03-10 16:30:04,101 - INFO - [Train] step: 379599, loss: 0.018966, lr: 0.000200
2023-03-10 16:31:45,241 - INFO - [Train] step: 379999, loss: 0.025374, lr: 0.000200
2023-03-10 16:34:00,814 - INFO - [Train] step: 380399, loss: 0.019979, lr: 0.000200
2023-03-10 16:35:43,284 - INFO - [Train] step: 380799, loss: 0.027666, lr: 0.000200
2023-03-10 16:37:26,163 - INFO - [Train] step: 381199, loss: 0.030459, lr: 0.000200
2023-03-10 16:39:10,153 - INFO - [Train] step: 381599, loss: 0.027061, lr: 0.000200
2023-03-10 16:40:53,242 - INFO - [Train] step: 381999, loss: 0.023279, lr: 0.000200
2023-03-10 16:42:35,601 - INFO - [Train] step: 382399, loss: 0.023703, lr: 0.000200
2023-03-10 16:44:17,636 - INFO - [Train] step: 382799, loss: 0.032738, lr: 0.000200
2023-03-10 16:45:59,418 - INFO - [Train] step: 383199, loss: 0.029217, lr: 0.000200
2023-03-10 16:47:43,390 - INFO - [Train] step: 383599, loss: 0.027534, lr: 0.000200
2023-03-10 16:49:26,173 - INFO - [Train] step: 383999, loss: 0.022151, lr: 0.000200
2023-03-10 16:51:10,714 - INFO - [Train] step: 384399, loss: 0.027255, lr: 0.000200
2023-03-10 16:52:46,296 - INFO - [Train] step: 384799, loss: 0.023971, lr: 0.000200
2023-03-10 16:54:48,786 - INFO - [Train] step: 385199, loss: 0.042993, lr: 0.000200
2023-03-10 16:56:26,572 - INFO - [Train] step: 385599, loss: 0.027602, lr: 0.000200
2023-03-10 16:58:08,428 - INFO - [Train] step: 385999, loss: 0.039287, lr: 0.000200
2023-03-10 16:59:52,315 - INFO - [Train] step: 386399, loss: 0.032820, lr: 0.000200
2023-03-10 17:01:35,092 - INFO - [Train] step: 386799, loss: 0.026216, lr: 0.000200
2023-03-10 17:03:17,071 - INFO - [Train] step: 387199, loss: 0.025115, lr: 0.000200
2023-03-10 17:04:58,310 - INFO - [Train] step: 387599, loss: 0.024573, lr: 0.000200
2023-03-10 17:06:42,198 - INFO - [Train] step: 387999, loss: 0.021860, lr: 0.000200
2023-03-10 17:08:26,006 - INFO - [Train] step: 388399, loss: 0.025521, lr: 0.000200
2023-03-10 17:10:09,563 - INFO - [Train] step: 388799, loss: 0.026433, lr: 0.000200
2023-03-10 17:11:51,767 - INFO - [Train] step: 389199, loss: 0.029369, lr: 0.000200
2023-03-10 17:13:34,933 - INFO - [Train] step: 389599, loss: 0.031923, lr: 0.000200
2023-03-10 17:15:17,409 - INFO - [Train] step: 389999, loss: 0.026438, lr: 0.000200
2023-03-10 17:17:32,707 - INFO - [Train] step: 390399, loss: 0.016615, lr: 0.000200
2023-03-10 17:19:15,301 - INFO - [Train] step: 390799, loss: 0.021874, lr: 0.000200
2023-03-10 17:20:57,208 - INFO - [Train] step: 391199, loss: 0.037202, lr: 0.000200
2023-03-10 17:22:41,649 - INFO - [Train] step: 391599, loss: 0.026732, lr: 0.000200
2023-03-10 17:24:23,350 - INFO - [Train] step: 391999, loss: 0.025426, lr: 0.000200
2023-03-10 17:26:04,738 - INFO - [Train] step: 392399, loss: 0.029743, lr: 0.000200
2023-03-10 17:27:48,636 - INFO - [Train] step: 392799, loss: 0.029883, lr: 0.000200
2023-03-10 17:29:30,363 - INFO - [Train] step: 393199, loss: 0.042839, lr: 0.000200
2023-03-10 17:31:13,298 - INFO - [Train] step: 393599, loss: 0.028344, lr: 0.000200
2023-03-10 17:32:55,114 - INFO - [Train] step: 393999, loss: 0.031369, lr: 0.000200
2023-03-10 17:34:37,229 - INFO - [Train] step: 394399, loss: 0.027497, lr: 0.000200
2023-03-10 17:36:20,609 - INFO - [Train] step: 394799, loss: 0.035720, lr: 0.000200
2023-03-10 17:38:32,386 - INFO - [Train] step: 395199, loss: 0.027157, lr: 0.000200
2023-03-10 17:40:16,064 - INFO - [Train] step: 395599, loss: 0.033364, lr: 0.000200
2023-03-10 17:41:57,970 - INFO - [Train] step: 395999, loss: 0.013800, lr: 0.000200
2023-03-10 17:43:39,736 - INFO - [Train] step: 396399, loss: 0.033356, lr: 0.000200
2023-03-10 17:45:21,272 - INFO - [Train] step: 396799, loss: 0.035015, lr: 0.000200
2023-03-10 17:47:05,246 - INFO - [Train] step: 397199, loss: 0.031735, lr: 0.000200
2023-03-10 17:48:48,597 - INFO - [Train] step: 397599, loss: 0.018940, lr: 0.000200
2023-03-10 17:50:31,789 - INFO - [Train] step: 397999, loss: 0.028669, lr: 0.000200
2023-03-10 17:52:14,279 - INFO - [Train] step: 398399, loss: 0.022696, lr: 0.000200
2023-03-10 17:53:56,080 - INFO - [Train] step: 398799, loss: 0.019376, lr: 0.000200
2023-03-10 17:55:39,725 - INFO - [Train] step: 399199, loss: 0.027300, lr: 0.000200
2023-03-10 17:57:22,236 - INFO - [Train] step: 399599, loss: 0.033151, lr: 0.000200
2023-03-10 17:59:05,770 - INFO - [Train] step: 399999, loss: 0.021288, lr: 0.000200
2023-03-10 18:01:20,457 - INFO - [Train] step: 400399, loss: 0.023961, lr: 0.000200
2023-03-10 18:03:02,781 - INFO - [Train] step: 400799, loss: 0.023341, lr: 0.000200
2023-03-10 18:04:45,576 - INFO - [Train] step: 401199, loss: 0.037324, lr: 0.000200
2023-03-10 18:06:29,494 - INFO - [Train] step: 401599, loss: 0.023463, lr: 0.000200
2023-03-10 18:08:11,144 - INFO - [Train] step: 401999, loss: 0.025830, lr: 0.000200
2023-03-10 18:09:52,315 - INFO - [Train] step: 402399, loss: 0.025738, lr: 0.000200
2023-03-10 18:11:37,879 - INFO - [Train] step: 402799, loss: 0.023220, lr: 0.000200
2023-03-10 18:13:20,548 - INFO - [Train] step: 403199, loss: 0.028134, lr: 0.000200
2023-03-10 18:15:02,409 - INFO - [Train] step: 403599, loss: 0.031173, lr: 0.000200
2023-03-10 18:16:43,811 - INFO - [Train] step: 403999, loss: 0.027824, lr: 0.000200
2023-03-10 18:18:27,290 - INFO - [Train] step: 404399, loss: 0.018199, lr: 0.000200
2023-03-10 18:20:10,370 - INFO - [Train] step: 404799, loss: 0.021688, lr: 0.000200
2023-03-10 18:22:22,846 - INFO - [Train] step: 405199, loss: 0.021608, lr: 0.000200
2023-03-10 18:24:06,178 - INFO - [Train] step: 405599, loss: 0.025170, lr: 0.000200
2023-03-10 18:25:48,503 - INFO - [Train] step: 405999, loss: 0.030580, lr: 0.000200
2023-03-10 18:27:26,642 - INFO - [Train] step: 406399, loss: 0.020444, lr: 0.000200
2023-03-10 18:29:00,299 - INFO - [Train] step: 406799, loss: 0.013582, lr: 0.000200
2023-03-10 18:30:34,298 - INFO - [Train] step: 407199, loss: 0.030125, lr: 0.000200
2023-03-10 18:32:15,154 - INFO - [Train] step: 407599, loss: 0.037625, lr: 0.000200
2023-03-10 18:33:58,603 - INFO - [Train] step: 407999, loss: 0.040903, lr: 0.000200
2023-03-10 18:35:41,237 - INFO - [Train] step: 408399, loss: 0.015564, lr: 0.000200
2023-03-10 18:37:25,385 - INFO - [Train] step: 408799, loss: 0.032588, lr: 0.000200
2023-03-10 18:39:09,222 - INFO - [Train] step: 409199, loss: 0.025522, lr: 0.000200
2023-03-10 18:40:52,300 - INFO - [Train] step: 409599, loss: 0.035405, lr: 0.000200
2023-03-10 18:42:35,202 - INFO - [Train] step: 409999, loss: 0.028996, lr: 0.000200
2023-03-10 18:44:50,244 - INFO - [Train] step: 410399, loss: 0.041242, lr: 0.000200
2023-03-10 18:46:31,865 - INFO - [Train] step: 410799, loss: 0.033680, lr: 0.000200
2023-03-10 18:48:16,653 - INFO - [Train] step: 411199, loss: 0.025289, lr: 0.000200
2023-03-10 18:49:58,360 - INFO - [Train] step: 411599, loss: 0.023296, lr: 0.000200
2023-03-10 18:51:40,581 - INFO - [Train] step: 411999, loss: 0.018378, lr: 0.000200
2023-03-10 18:53:22,761 - INFO - [Train] step: 412399, loss: 0.025359, lr: 0.000200
2023-03-10 18:55:05,505 - INFO - [Train] step: 412799, loss: 0.042454, lr: 0.000200
2023-03-10 18:56:49,274 - INFO - [Train] step: 413199, loss: 0.036518, lr: 0.000200
2023-03-10 18:58:31,037 - INFO - [Train] step: 413599, loss: 0.029643, lr: 0.000200
2023-03-10 19:00:14,393 - INFO - [Train] step: 413999, loss: 0.025312, lr: 0.000200
2023-03-10 19:01:58,024 - INFO - [Train] step: 414399, loss: 0.021798, lr: 0.000200
2023-03-10 19:03:40,651 - INFO - [Train] step: 414799, loss: 0.033422, lr: 0.000200
2023-03-10 19:05:52,713 - INFO - [Train] step: 415199, loss: 0.031548, lr: 0.000200
2023-03-10 19:07:35,558 - INFO - [Train] step: 415599, loss: 0.028099, lr: 0.000200
2023-03-10 19:09:17,901 - INFO - [Train] step: 415999, loss: 0.027923, lr: 0.000200
2023-03-10 19:11:03,024 - INFO - [Train] step: 416399, loss: 0.025036, lr: 0.000200
2023-03-10 19:12:45,050 - INFO - [Train] step: 416799, loss: 0.037986, lr: 0.000200
2023-03-10 19:14:27,725 - INFO - [Train] step: 417199, loss: 0.034799, lr: 0.000200
2023-03-10 19:16:11,692 - INFO - [Train] step: 417599, loss: 0.035247, lr: 0.000200
2023-03-10 19:17:53,651 - INFO - [Train] step: 417999, loss: 0.024990, lr: 0.000200
2023-03-10 19:19:36,013 - INFO - [Train] step: 418399, loss: 0.034105, lr: 0.000200
2023-03-10 19:21:18,004 - INFO - [Train] step: 418799, loss: 0.033857, lr: 0.000200
2023-03-10 19:23:00,120 - INFO - [Train] step: 419199, loss: 0.020779, lr: 0.000200
2023-03-10 19:24:42,630 - INFO - [Train] step: 419599, loss: 0.032761, lr: 0.000200
2023-03-10 19:26:24,433 - INFO - [Train] step: 419999, loss: 0.029654, lr: 0.000200
2023-03-10 19:28:38,654 - INFO - [Train] step: 420399, loss: 0.031273, lr: 0.000200
2023-03-10 19:30:20,505 - INFO - [Train] step: 420799, loss: 0.027450, lr: 0.000200
2023-03-10 19:32:03,777 - INFO - [Train] step: 421199, loss: 0.023619, lr: 0.000200
2023-03-10 19:33:45,981 - INFO - [Train] step: 421599, loss: 0.023806, lr: 0.000200
2023-03-10 19:35:29,473 - INFO - [Train] step: 421999, loss: 0.023131, lr: 0.000200
2023-03-10 19:37:12,205 - INFO - [Train] step: 422399, loss: 0.043014, lr: 0.000200
2023-03-10 19:38:54,395 - INFO - [Train] step: 422799, loss: 0.031128, lr: 0.000200
2023-03-10 19:40:36,843 - INFO - [Train] step: 423199, loss: 0.026797, lr: 0.000200
2023-03-10 19:42:18,386 - INFO - [Train] step: 423599, loss: 0.025878, lr: 0.000200
2023-03-10 19:44:00,739 - INFO - [Train] step: 423999, loss: 0.040456, lr: 0.000200
2023-03-10 19:45:42,975 - INFO - [Train] step: 424399, loss: 0.009181, lr: 0.000200
2023-03-10 19:47:24,419 - INFO - [Train] step: 424799, loss: 0.034713, lr: 0.000200
2023-03-10 19:49:36,159 - INFO - [Train] step: 425199, loss: 0.031522, lr: 0.000200
2023-03-10 19:51:17,981 - INFO - [Train] step: 425599, loss: 0.022441, lr: 0.000200
2023-03-10 19:53:00,082 - INFO - [Train] step: 425999, loss: 0.021501, lr: 0.000200
2023-03-10 19:54:41,342 - INFO - [Train] step: 426399, loss: 0.026957, lr: 0.000200
2023-03-10 19:56:24,464 - INFO - [Train] step: 426799, loss: 0.024249, lr: 0.000200
2023-03-10 19:58:08,345 - INFO - [Train] step: 427199, loss: 0.013641, lr: 0.000200
2023-03-10 19:59:50,571 - INFO - [Train] step: 427599, loss: 0.024098, lr: 0.000200
2023-03-10 20:01:32,939 - INFO - [Train] step: 427999, loss: 0.028528, lr: 0.000200
2023-03-10 20:03:08,779 - INFO - [Train] step: 428399, loss: 0.030292, lr: 0.000200
2023-03-10 20:04:41,297 - INFO - [Train] step: 428799, loss: 0.024271, lr: 0.000200
2023-03-10 20:06:14,083 - INFO - [Train] step: 429199, loss: 0.015570, lr: 0.000200
2023-03-10 20:07:56,262 - INFO - [Train] step: 429599, loss: 0.032142, lr: 0.000200
2023-03-10 20:09:38,436 - INFO - [Train] step: 429999, loss: 0.029501, lr: 0.000200
2023-03-10 20:11:51,759 - INFO - [Train] step: 430399, loss: 0.028533, lr: 0.000200
2023-03-10 20:13:33,837 - INFO - [Train] step: 430799, loss: 0.024351, lr: 0.000200
2023-03-10 20:15:15,647 - INFO - [Train] step: 431199, loss: 0.024076, lr: 0.000200
2023-03-10 20:16:58,469 - INFO - [Train] step: 431599, loss: 0.017913, lr: 0.000200
2023-03-10 20:18:41,110 - INFO - [Train] step: 431999, loss: 0.025017, lr: 0.000200
2023-03-10 20:20:24,314 - INFO - [Train] step: 432399, loss: 0.013439, lr: 0.000200
2023-03-10 20:22:09,339 - INFO - [Train] step: 432799, loss: 0.031664, lr: 0.000200
2023-03-10 20:23:51,337 - INFO - [Train] step: 433199, loss: 0.029507, lr: 0.000200
2023-03-10 20:25:34,081 - INFO - [Train] step: 433599, loss: 0.026521, lr: 0.000200
2023-03-10 20:27:16,652 - INFO - [Train] step: 433999, loss: 0.042946, lr: 0.000200
2023-03-10 20:28:58,955 - INFO - [Train] step: 434399, loss: 0.012240, lr: 0.000200
2023-03-10 20:30:42,681 - INFO - [Train] step: 434799, loss: 0.033340, lr: 0.000200
2023-03-10 20:32:56,729 - INFO - [Train] step: 435199, loss: 0.032275, lr: 0.000200
2023-03-10 20:34:38,750 - INFO - [Train] step: 435599, loss: 0.031759, lr: 0.000200
2023-03-10 20:36:24,550 - INFO - [Train] step: 435999, loss: 0.043802, lr: 0.000200
2023-03-10 20:38:07,716 - INFO - [Train] step: 436399, loss: 0.029648, lr: 0.000200
2023-03-10 20:39:49,833 - INFO - [Train] step: 436799, loss: 0.025855, lr: 0.000200
2023-03-10 20:41:31,511 - INFO - [Train] step: 437199, loss: 0.035825, lr: 0.000200
2023-03-10 20:43:14,489 - INFO - [Train] step: 437599, loss: 0.032463, lr: 0.000200
2023-03-10 20:44:55,269 - INFO - [Train] step: 437999, loss: 0.031178, lr: 0.000200
2023-03-10 20:46:37,385 - INFO - [Train] step: 438399, loss: 0.042982, lr: 0.000200
2023-03-10 20:48:19,135 - INFO - [Train] step: 438799, loss: 0.021897, lr: 0.000200
2023-03-10 20:50:01,506 - INFO - [Train] step: 439199, loss: 0.021245, lr: 0.000200
2023-03-10 20:51:44,237 - INFO - [Train] step: 439599, loss: 0.025613, lr: 0.000200
2023-03-10 20:53:26,696 - INFO - [Train] step: 439999, loss: 0.033961, lr: 0.000200
2023-03-10 20:55:41,167 - INFO - [Train] step: 440399, loss: 0.027076, lr: 0.000200
2023-03-10 20:57:24,977 - INFO - [Train] step: 440799, loss: 0.031290, lr: 0.000200
2023-03-10 20:59:09,120 - INFO - [Train] step: 441199, loss: 0.032401, lr: 0.000200
2023-03-10 21:00:53,795 - INFO - [Train] step: 441599, loss: 0.023366, lr: 0.000200
2023-03-10 21:02:39,367 - INFO - [Train] step: 441999, loss: 0.016649, lr: 0.000200
2023-03-10 21:04:23,578 - INFO - [Train] step: 442399, loss: 0.024848, lr: 0.000200
2023-03-10 21:06:06,929 - INFO - [Train] step: 442799, loss: 0.027014, lr: 0.000200
2023-03-10 21:07:49,776 - INFO - [Train] step: 443199, loss: 0.038616, lr: 0.000200
2023-03-10 21:09:32,484 - INFO - [Train] step: 443599, loss: 0.038735, lr: 0.000200
2023-03-10 21:11:15,304 - INFO - [Train] step: 443999, loss: 0.037054, lr: 0.000200
2023-03-10 21:12:59,114 - INFO - [Train] step: 444399, loss: 0.022802, lr: 0.000200
2023-03-10 21:14:42,583 - INFO - [Train] step: 444799, loss: 0.031240, lr: 0.000200
2023-03-10 21:16:56,097 - INFO - [Train] step: 445199, loss: 0.031301, lr: 0.000200
2023-03-10 21:18:37,713 - INFO - [Train] step: 445599, loss: 0.050352, lr: 0.000200
2023-03-10 21:20:19,400 - INFO - [Train] step: 445999, loss: 0.030057, lr: 0.000200
2023-03-10 21:22:02,529 - INFO - [Train] step: 446399, loss: 0.033640, lr: 0.000200
2023-03-10 21:23:46,155 - INFO - [Train] step: 446799, loss: 0.019549, lr: 0.000200
2023-03-10 21:25:27,971 - INFO - [Train] step: 447199, loss: 0.022429, lr: 0.000200
2023-03-10 21:27:10,921 - INFO - [Train] step: 447599, loss: 0.027096, lr: 0.000200
2023-03-10 21:28:56,284 - INFO - [Train] step: 447999, loss: 0.030744, lr: 0.000200
2023-03-10 21:30:38,670 - INFO - [Train] step: 448399, loss: 0.029772, lr: 0.000200
2023-03-10 21:32:21,802 - INFO - [Train] step: 448799, loss: 0.034951, lr: 0.000200
2023-03-10 21:34:03,814 - INFO - [Train] step: 449199, loss: 0.034505, lr: 0.000200
2023-03-10 21:35:44,901 - INFO - [Train] step: 449599, loss: 0.038586, lr: 0.000200
2023-03-10 21:37:27,933 - INFO - [Train] step: 449999, loss: 0.038345, lr: 0.000200
2023-03-10 21:39:30,956 - INFO - [Train] step: 450399, loss: 0.013052, lr: 0.000200
2023-03-10 21:41:03,814 - INFO - [Train] step: 450799, loss: 0.023229, lr: 0.000200
2023-03-10 21:42:43,637 - INFO - [Train] step: 451199, loss: 0.035016, lr: 0.000200
2023-03-10 21:44:27,014 - INFO - [Train] step: 451599, loss: 0.020742, lr: 0.000200
2023-03-10 21:46:09,141 - INFO - [Train] step: 451999, loss: 0.026348, lr: 0.000200
2023-03-10 21:47:52,239 - INFO - [Train] step: 452399, loss: 0.020703, lr: 0.000200
2023-03-10 21:49:34,864 - INFO - [Train] step: 452799, loss: 0.025129, lr: 0.000200
2023-03-10 21:51:17,884 - INFO - [Train] step: 453199, loss: 0.024020, lr: 0.000200
2023-03-10 21:52:59,847 - INFO - [Train] step: 453599, loss: 0.028096, lr: 0.000200
2023-03-10 21:54:41,603 - INFO - [Train] step: 453999, loss: 0.023354, lr: 0.000200
2023-03-10 21:56:22,823 - INFO - [Train] step: 454399, loss: 0.022575, lr: 0.000200
2023-03-10 21:58:04,393 - INFO - [Train] step: 454799, loss: 0.024618, lr: 0.000200
2023-03-10 22:00:17,263 - INFO - [Train] step: 455199, loss: 0.022457, lr: 0.000200
2023-03-10 22:01:58,244 - INFO - [Train] step: 455599, loss: 0.034556, lr: 0.000200
2023-03-10 22:03:42,583 - INFO - [Train] step: 455999, loss: 0.041829, lr: 0.000200
2023-03-10 22:05:24,520 - INFO - [Train] step: 456399, loss: 0.022625, lr: 0.000200
2023-03-10 22:07:06,986 - INFO - [Train] step: 456799, loss: 0.027538, lr: 0.000200
2023-03-10 22:08:48,677 - INFO - [Train] step: 457199, loss: 0.025559, lr: 0.000200
2023-03-10 22:10:30,995 - INFO - [Train] step: 457599, loss: 0.016978, lr: 0.000200
2023-03-10 22:12:12,902 - INFO - [Train] step: 457999, loss: 0.012517, lr: 0.000200
2023-03-10 22:13:56,680 - INFO - [Train] step: 458399, loss: 0.017018, lr: 0.000200
2023-03-10 22:15:39,666 - INFO - [Train] step: 458799, loss: 0.031181, lr: 0.000200
2023-03-10 22:17:23,440 - INFO - [Train] step: 459199, loss: 0.021951, lr: 0.000200
2023-03-10 22:19:05,327 - INFO - [Train] step: 459599, loss: 0.030455, lr: 0.000200
2023-03-10 22:20:48,079 - INFO - [Train] step: 459999, loss: 0.017150, lr: 0.000200
2023-03-10 22:23:00,892 - INFO - [Train] step: 460399, loss: 0.027841, lr: 0.000200
2023-03-10 22:24:43,694 - INFO - [Train] step: 460799, loss: 0.025771, lr: 0.000200
2023-03-10 22:26:26,168 - INFO - [Train] step: 461199, loss: 0.023044, lr: 0.000200
2023-03-10 22:28:09,164 - INFO - [Train] step: 461599, loss: 0.028935, lr: 0.000200
2023-03-10 22:29:51,638 - INFO - [Train] step: 461999, loss: 0.025923, lr: 0.000200
2023-03-10 22:31:34,840 - INFO - [Train] step: 462399, loss: 0.039752, lr: 0.000200
2023-03-10 22:33:16,779 - INFO - [Train] step: 462799, loss: 0.017242, lr: 0.000200
2023-03-10 22:34:59,670 - INFO - [Train] step: 463199, loss: 0.020470, lr: 0.000200
2023-03-10 22:36:41,965 - INFO - [Train] step: 463599, loss: 0.026836, lr: 0.000200
2023-03-10 22:38:24,922 - INFO - [Train] step: 463999, loss: 0.034893, lr: 0.000200
2023-03-10 22:40:07,154 - INFO - [Train] step: 464399, loss: 0.031905, lr: 0.000200
2023-03-10 22:41:48,239 - INFO - [Train] step: 464799, loss: 0.029437, lr: 0.000200
2023-03-10 22:44:01,040 - INFO - [Train] step: 465199, loss: 0.021069, lr: 0.000200
2023-03-10 22:45:43,140 - INFO - [Train] step: 465599, loss: 0.029145, lr: 0.000200
2023-03-10 22:47:24,188 - INFO - [Train] step: 465999, loss: 0.035546, lr: 0.000200
2023-03-10 22:49:07,683 - INFO - [Train] step: 466399, loss: 0.024657, lr: 0.000200
2023-03-10 22:50:51,196 - INFO - [Train] step: 466799, loss: 0.021906, lr: 0.000200
2023-03-10 22:52:33,227 - INFO - [Train] step: 467199, loss: 0.021075, lr: 0.000200
2023-03-10 22:54:15,817 - INFO - [Train] step: 467599, loss: 0.036184, lr: 0.000200
2023-03-10 22:55:57,849 - INFO - [Train] step: 467999, loss: 0.017704, lr: 0.000200
2023-03-10 22:57:41,394 - INFO - [Train] step: 468399, loss: 0.020127, lr: 0.000200
2023-03-10 22:59:25,178 - INFO - [Train] step: 468799, loss: 0.029565, lr: 0.000200
2023-03-10 23:01:06,955 - INFO - [Train] step: 469199, loss: 0.037371, lr: 0.000200
2023-03-10 23:02:48,634 - INFO - [Train] step: 469599, loss: 0.022167, lr: 0.000200
2023-03-10 23:04:30,463 - INFO - [Train] step: 469999, loss: 0.047451, lr: 0.000200
2023-03-10 23:06:43,594 - INFO - [Train] step: 470399, loss: 0.015834, lr: 0.000200
2023-03-10 23:08:25,171 - INFO - [Train] step: 470799, loss: 0.020712, lr: 0.000200
2023-03-10 23:10:07,640 - INFO - [Train] step: 471199, loss: 0.032194, lr: 0.000200
2023-03-10 23:11:48,729 - INFO - [Train] step: 471599, loss: 0.036296, lr: 0.000200
2023-03-10 23:13:26,852 - INFO - [Train] step: 471999, loss: 0.024809, lr: 0.000200
2023-03-10 23:15:02,297 - INFO - [Train] step: 472399, loss: 0.039000, lr: 0.000200
2023-03-10 23:16:38,045 - INFO - [Train] step: 472799, loss: 0.042202, lr: 0.000200
2023-03-10 23:18:18,770 - INFO - [Train] step: 473199, loss: 0.025496, lr: 0.000200
2023-03-10 23:20:00,149 - INFO - [Train] step: 473599, loss: 0.021370, lr: 0.000200
2023-03-10 23:21:42,713 - INFO - [Train] step: 473999, loss: 0.045534, lr: 0.000200
2023-03-10 23:23:26,035 - INFO - [Train] step: 474399, loss: 0.020978, lr: 0.000200
2023-03-10 23:25:10,076 - INFO - [Train] step: 474799, loss: 0.031871, lr: 0.000200
2023-03-10 23:27:23,854 - INFO - [Train] step: 475199, loss: 0.028090, lr: 0.000200
2023-03-10 23:29:05,918 - INFO - [Train] step: 475599, loss: 0.016506, lr: 0.000200
2023-03-10 23:30:47,830 - INFO - [Train] step: 475999, loss: 0.030463, lr: 0.000200
2023-03-10 23:32:30,128 - INFO - [Train] step: 476399, loss: 0.017393, lr: 0.000200
2023-03-10 23:34:11,843 - INFO - [Train] step: 476799, loss: 0.023469, lr: 0.000200
2023-03-10 23:35:54,191 - INFO - [Train] step: 477199, loss: 0.023373, lr: 0.000200
2023-03-10 23:37:37,090 - INFO - [Train] step: 477599, loss: 0.032134, lr: 0.000200
2023-03-10 23:39:19,850 - INFO - [Train] step: 477999, loss: 0.020879, lr: 0.000200
2023-03-10 23:41:03,275 - INFO - [Train] step: 478399, loss: 0.026544, lr: 0.000200
2023-03-10 23:42:46,377 - INFO - [Train] step: 478799, loss: 0.028985, lr: 0.000200
2023-03-10 23:44:28,658 - INFO - [Train] step: 479199, loss: 0.029176, lr: 0.000200
2023-03-10 23:46:10,644 - INFO - [Train] step: 479599, loss: 0.021878, lr: 0.000200
2023-03-10 23:47:52,621 - INFO - [Train] step: 479999, loss: 0.026000, lr: 0.000200
2023-03-10 23:50:06,505 - INFO - [Train] step: 480399, loss: 0.029289, lr: 0.000200
2023-03-10 23:51:49,800 - INFO - [Train] step: 480799, loss: 0.022680, lr: 0.000200
2023-03-10 23:53:31,076 - INFO - [Train] step: 481199, loss: 0.023891, lr: 0.000200
2023-03-10 23:55:12,399 - INFO - [Train] step: 481599, loss: 0.030456, lr: 0.000200
2023-03-10 23:56:54,797 - INFO - [Train] step: 481999, loss: 0.017138, lr: 0.000200
2023-03-10 23:58:37,271 - INFO - [Train] step: 482399, loss: 0.020398, lr: 0.000200
2023-03-11 00:00:19,303 - INFO - [Train] step: 482799, loss: 0.016322, lr: 0.000200
2023-03-11 00:02:00,145 - INFO - [Train] step: 483199, loss: 0.033316, lr: 0.000200
2023-03-11 00:03:42,975 - INFO - [Train] step: 483599, loss: 0.034274, lr: 0.000200
2023-03-11 00:05:25,243 - INFO - [Train] step: 483999, loss: 0.043766, lr: 0.000200
2023-03-11 00:07:06,255 - INFO - [Train] step: 484399, loss: 0.030277, lr: 0.000200
2023-03-11 00:08:48,119 - INFO - [Train] step: 484799, loss: 0.035444, lr: 0.000200
2023-03-11 00:11:01,162 - INFO - [Train] step: 485199, loss: 0.031326, lr: 0.000200
2023-03-11 00:12:44,304 - INFO - [Train] step: 485599, loss: 0.024171, lr: 0.000200
2023-03-11 00:14:27,492 - INFO - [Train] step: 485999, loss: 0.034419, lr: 0.000200
2023-03-11 00:16:10,144 - INFO - [Train] step: 486399, loss: 0.022506, lr: 0.000200
2023-03-11 00:17:52,001 - INFO - [Train] step: 486799, loss: 0.019630, lr: 0.000200
2023-03-11 00:19:33,245 - INFO - [Train] step: 487199, loss: 0.021345, lr: 0.000200
2023-03-11 00:21:14,860 - INFO - [Train] step: 487599, loss: 0.021322, lr: 0.000200
2023-03-11 00:22:56,461 - INFO - [Train] step: 487999, loss: 0.033395, lr: 0.000200
2023-03-11 00:24:38,604 - INFO - [Train] step: 488399, loss: 0.029070, lr: 0.000200
2023-03-11 00:26:19,870 - INFO - [Train] step: 488799, loss: 0.026873, lr: 0.000200
2023-03-11 00:28:02,156 - INFO - [Train] step: 489199, loss: 0.024174, lr: 0.000200
2023-03-11 00:29:44,167 - INFO - [Train] step: 489599, loss: 0.035889, lr: 0.000200
2023-03-11 00:31:27,058 - INFO - [Train] step: 489999, loss: 0.031467, lr: 0.000200
2023-03-11 00:33:41,929 - INFO - [Train] step: 490399, loss: 0.030309, lr: 0.000200
2023-03-11 00:35:24,551 - INFO - [Train] step: 490799, loss: 0.023509, lr: 0.000200
2023-03-11 00:37:05,972 - INFO - [Train] step: 491199, loss: 0.029083, lr: 0.000200
2023-03-11 00:38:47,122 - INFO - [Train] step: 491599, loss: 0.032163, lr: 0.000200
2023-03-11 00:40:29,931 - INFO - [Train] step: 491999, loss: 0.020218, lr: 0.000200
2023-03-11 00:42:14,013 - INFO - [Train] step: 492399, loss: 0.024855, lr: 0.000200
2023-03-11 00:43:56,404 - INFO - [Train] step: 492799, loss: 0.049681, lr: 0.000200
2023-03-11 00:45:38,921 - INFO - [Train] step: 493199, loss: 0.027381, lr: 0.000200
2023-03-11 00:47:21,876 - INFO - [Train] step: 493599, loss: 0.029986, lr: 0.000200
2023-03-11 00:48:59,127 - INFO - [Train] step: 493999, loss: 0.026004, lr: 0.000200
2023-03-11 00:50:32,459 - INFO - [Train] step: 494399, loss: 0.020072, lr: 0.000200
2023-03-11 00:52:08,527 - INFO - [Train] step: 494799, loss: 0.018208, lr: 0.000200
2023-03-11 00:54:20,490 - INFO - [Train] step: 495199, loss: 0.019521, lr: 0.000200
2023-03-11 00:56:01,924 - INFO - [Train] step: 495599, loss: 0.034215, lr: 0.000200
2023-03-11 00:57:45,635 - INFO - [Train] step: 495999, loss: 0.027655, lr: 0.000200
2023-03-11 00:59:26,850 - INFO - [Train] step: 496399, loss: 0.025161, lr: 0.000200
2023-03-11 01:01:08,959 - INFO - [Train] step: 496799, loss: 0.031968, lr: 0.000200
2023-03-11 01:02:51,589 - INFO - [Train] step: 497199, loss: 0.025443, lr: 0.000200
2023-03-11 01:04:33,771 - INFO - [Train] step: 497599, loss: 0.020604, lr: 0.000200
2023-03-11 01:06:16,417 - INFO - [Train] step: 497999, loss: 0.035153, lr: 0.000200
2023-03-11 01:07:59,755 - INFO - [Train] step: 498399, loss: 0.030182, lr: 0.000200
2023-03-11 01:09:41,989 - INFO - [Train] step: 498799, loss: 0.027526, lr: 0.000200
2023-03-11 01:11:24,857 - INFO - [Train] step: 499199, loss: 0.038287, lr: 0.000200
2023-03-11 01:13:07,837 - INFO - [Train] step: 499599, loss: 0.028442, lr: 0.000200
2023-03-11 01:14:49,502 - INFO - [Train] step: 499999, loss: 0.028576, lr: 0.000200
2023-03-11 01:17:02,906 - INFO - [Train] step: 500399, loss: 0.020362, lr: 0.000200
2023-03-11 01:18:45,068 - INFO - [Train] step: 500799, loss: 0.036856, lr: 0.000200
2023-03-11 01:20:27,415 - INFO - [Train] step: 501199, loss: 0.033719, lr: 0.000200
2023-03-11 01:22:10,721 - INFO - [Train] step: 501599, loss: 0.021055, lr: 0.000200
2023-03-11 01:23:52,820 - INFO - [Train] step: 501999, loss: 0.026492, lr: 0.000200
2023-03-11 01:25:34,881 - INFO - [Train] step: 502399, loss: 0.027340, lr: 0.000200
2023-03-11 01:27:16,793 - INFO - [Train] step: 502799, loss: 0.024699, lr: 0.000200
2023-03-11 01:28:59,225 - INFO - [Train] step: 503199, loss: 0.019780, lr: 0.000200
2023-03-11 01:30:40,878 - INFO - [Train] step: 503599, loss: 0.024131, lr: 0.000200
2023-03-11 01:32:22,530 - INFO - [Train] step: 503999, loss: 0.038246, lr: 0.000200
2023-03-11 01:34:05,066 - INFO - [Train] step: 504399, loss: 0.019212, lr: 0.000200
2023-03-11 01:35:48,382 - INFO - [Train] step: 504799, loss: 0.022100, lr: 0.000200
2023-03-11 01:38:05,088 - INFO - [Train] step: 505199, loss: 0.021934, lr: 0.000200
2023-03-11 01:39:47,260 - INFO - [Train] step: 505599, loss: 0.026686, lr: 0.000200
2023-03-11 01:41:28,951 - INFO - [Train] step: 505999, loss: 0.034988, lr: 0.000200
2023-03-11 01:43:10,478 - INFO - [Train] step: 506399, loss: 0.029692, lr: 0.000200
2023-03-11 01:44:52,225 - INFO - [Train] step: 506799, loss: 0.028317, lr: 0.000200
2023-03-11 01:46:34,892 - INFO - [Train] step: 507199, loss: 0.023073, lr: 0.000200
2023-03-11 01:48:17,145 - INFO - [Train] step: 507599, loss: 0.039102, lr: 0.000200
2023-03-11 01:49:59,274 - INFO - [Train] step: 507999, loss: 0.028901, lr: 0.000200
2023-03-11 01:51:42,033 - INFO - [Train] step: 508399, loss: 0.038829, lr: 0.000200
2023-03-11 01:53:24,984 - INFO - [Train] step: 508799, loss: 0.018607, lr: 0.000200
2023-03-11 01:55:08,078 - INFO - [Train] step: 509199, loss: 0.054066, lr: 0.000200
2023-03-11 01:56:50,274 - INFO - [Train] step: 509599, loss: 0.021172, lr: 0.000200
2023-03-11 01:58:32,739 - INFO - [Train] step: 509999, loss: 0.026961, lr: 0.000200
2023-03-11 02:00:46,471 - INFO - [Train] step: 510399, loss: 0.018270, lr: 0.000200
2023-03-11 02:02:29,850 - INFO - [Train] step: 510799, loss: 0.025722, lr: 0.000200
2023-03-11 02:04:12,128 - INFO - [Train] step: 511199, loss: 0.031351, lr: 0.000200
2023-03-11 02:05:54,556 - INFO - [Train] step: 511599, loss: 0.026921, lr: 0.000200
2023-03-11 02:07:37,397 - INFO - [Train] step: 511999, loss: 0.024276, lr: 0.000200
2023-03-11 02:09:19,058 - INFO - [Train] step: 512399, loss: 0.026941, lr: 0.000200
2023-03-11 02:11:00,772 - INFO - [Train] step: 512799, loss: 0.030720, lr: 0.000200
2023-03-11 02:12:42,262 - INFO - [Train] step: 513199, loss: 0.020747, lr: 0.000200
2023-03-11 02:14:24,564 - INFO - [Train] step: 513599, loss: 0.036467, lr: 0.000200
2023-03-11 02:16:06,516 - INFO - [Train] step: 513999, loss: 0.026044, lr: 0.000200
2023-03-11 02:17:48,971 - INFO - [Train] step: 514399, loss: 0.019958, lr: 0.000200
2023-03-11 02:19:30,743 - INFO - [Train] step: 514799, loss: 0.029629, lr: 0.000200
2023-03-11 02:21:43,705 - INFO - [Train] step: 515199, loss: 0.021715, lr: 0.000200
2023-03-11 02:23:25,447 - INFO - [Train] step: 515599, loss: 0.023783, lr: 0.000200
2023-03-11 02:24:58,524 - INFO - [Train] step: 515999, loss: 0.013915, lr: 0.000200
2023-03-11 02:26:31,790 - INFO - [Train] step: 516399, loss: 0.027201, lr: 0.000200
2023-03-11 02:28:07,893 - INFO - [Train] step: 516799, loss: 0.021591, lr: 0.000200
2023-03-11 02:29:50,755 - INFO - [Train] step: 517199, loss: 0.039288, lr: 0.000200
2023-03-11 02:31:32,357 - INFO - [Train] step: 517599, loss: 0.011968, lr: 0.000200
2023-03-11 02:33:16,243 - INFO - [Train] step: 517999, loss: 0.029244, lr: 0.000200
2023-03-11 02:34:58,004 - INFO - [Train] step: 518399, loss: 0.023959, lr: 0.000200
2023-03-11 02:36:41,654 - INFO - [Train] step: 518799, loss: 0.026320, lr: 0.000200
2023-03-11 02:38:24,403 - INFO - [Train] step: 519199, loss: 0.024104, lr: 0.000200
2023-03-11 02:40:07,218 - INFO - [Train] step: 519599, loss: 0.025987, lr: 0.000200
2023-03-11 02:41:50,606 - INFO - [Train] step: 519999, loss: 0.043320, lr: 0.000200
2023-03-11 02:44:05,886 - INFO - [Train] step: 520399, loss: 0.026904, lr: 0.000200
2023-03-11 02:45:47,083 - INFO - [Train] step: 520799, loss: 0.011966, lr: 0.000200
2023-03-11 02:47:29,263 - INFO - [Train] step: 521199, loss: 0.019935, lr: 0.000200
2023-03-11 02:49:11,188 - INFO - [Train] step: 521599, loss: 0.030593, lr: 0.000200
2023-03-11 02:50:54,379 - INFO - [Train] step: 521999, loss: 0.037690, lr: 0.000200
2023-03-11 02:52:37,318 - INFO - [Train] step: 522399, loss: 0.023554, lr: 0.000200
2023-03-11 02:54:21,044 - INFO - [Train] step: 522799, loss: 0.025800, lr: 0.000200
2023-03-11 02:56:03,098 - INFO - [Train] step: 523199, loss: 0.025518, lr: 0.000200
2023-03-11 02:57:44,710 - INFO - [Train] step: 523599, loss: 0.032718, lr: 0.000200
2023-03-11 02:59:27,540 - INFO - [Train] step: 523999, loss: 0.026441, lr: 0.000200
2023-03-11 03:01:09,061 - INFO - [Train] step: 524399, loss: 0.031789, lr: 0.000200
2023-03-11 03:02:51,834 - INFO - [Train] step: 524799, loss: 0.024400, lr: 0.000200
2023-03-11 03:05:05,061 - INFO - [Train] step: 525199, loss: 0.022073, lr: 0.000200
2023-03-11 03:06:47,677 - INFO - [Train] step: 525599, loss: 0.023217, lr: 0.000200
2023-03-11 03:08:31,687 - INFO - [Train] step: 525999, loss: 0.025623, lr: 0.000200
2023-03-11 03:10:15,012 - INFO - [Train] step: 526399, loss: 0.026305, lr: 0.000200
2023-03-11 03:11:56,661 - INFO - [Train] step: 526799, loss: 0.029734, lr: 0.000200
2023-03-11 03:13:39,311 - INFO - [Train] step: 527199, loss: 0.036114, lr: 0.000200
2023-03-11 03:15:20,341 - INFO - [Train] step: 527599, loss: 0.017975, lr: 0.000200
2023-03-11 03:17:03,023 - INFO - [Train] step: 527999, loss: 0.027799, lr: 0.000200
2023-03-11 03:18:46,020 - INFO - [Train] step: 528399, loss: 0.031106, lr: 0.000200
2023-03-11 03:20:28,426 - INFO - [Train] step: 528799, loss: 0.018561, lr: 0.000200
2023-03-11 03:22:12,031 - INFO - [Train] step: 529199, loss: 0.025763, lr: 0.000200
2023-03-11 03:23:54,061 - INFO - [Train] step: 529599, loss: 0.030601, lr: 0.000200
2023-03-11 03:25:37,621 - INFO - [Train] step: 529999, loss: 0.031678, lr: 0.000200
2023-03-11 03:27:51,638 - INFO - [Train] step: 530399, loss: 0.044532, lr: 0.000200
2023-03-11 03:29:34,363 - INFO - [Train] step: 530799, loss: 0.018691, lr: 0.000200
2023-03-11 03:31:16,631 - INFO - [Train] step: 531199, loss: 0.039127, lr: 0.000200
2023-03-11 03:32:58,739 - INFO - [Train] step: 531599, loss: 0.026812, lr: 0.000200
2023-03-11 03:34:41,501 - INFO - [Train] step: 531999, loss: 0.023788, lr: 0.000200
2023-03-11 03:36:22,504 - INFO - [Train] step: 532399, loss: 0.018293, lr: 0.000200
2023-03-11 03:38:04,644 - INFO - [Train] step: 532799, loss: 0.024379, lr: 0.000200
2023-03-11 03:39:48,262 - INFO - [Train] step: 533199, loss: 0.035836, lr: 0.000200
2023-03-11 03:41:30,583 - INFO - [Train] step: 533599, loss: 0.043838, lr: 0.000200
2023-03-11 03:43:12,590 - INFO - [Train] step: 533999, loss: 0.024618, lr: 0.000200
2023-03-11 03:44:54,470 - INFO - [Train] step: 534399, loss: 0.029253, lr: 0.000200
2023-03-11 03:46:36,675 - INFO - [Train] step: 534799, loss: 0.024839, lr: 0.000200
2023-03-11 03:48:49,244 - INFO - [Train] step: 535199, loss: 0.036920, lr: 0.000200
2023-03-11 03:50:31,857 - INFO - [Train] step: 535599, loss: 0.034313, lr: 0.000200
2023-03-11 03:52:15,900 - INFO - [Train] step: 535999, loss: 0.028148, lr: 0.000200
2023-03-11 03:53:59,310 - INFO - [Train] step: 536399, loss: 0.038342, lr: 0.000200
2023-03-11 03:55:42,353 - INFO - [Train] step: 536799, loss: 0.027760, lr: 0.000200
2023-03-11 03:57:24,646 - INFO - [Train] step: 537199, loss: 0.021819, lr: 0.000200
2023-03-11 03:59:05,245 - INFO - [Train] step: 537599, loss: 0.019809, lr: 0.000200
2023-03-11 04:00:39,137 - INFO - [Train] step: 537999, loss: 0.017628, lr: 0.000200
2023-03-11 04:02:11,561 - INFO - [Train] step: 538399, loss: 0.043671, lr: 0.000200
2023-03-11 04:03:51,147 - INFO - [Train] step: 538799, loss: 0.032103, lr: 0.000200
2023-03-11 04:05:32,672 - INFO - [Train] step: 539199, loss: 0.024616, lr: 0.000200
2023-03-11 04:07:14,800 - INFO - [Train] step: 539599, loss: 0.026302, lr: 0.000200
2023-03-11 04:08:55,971 - INFO - [Train] step: 539999, loss: 0.020861, lr: 0.000200
2023-03-11 04:11:09,229 - INFO - [Train] step: 540399, loss: 0.032685, lr: 0.000200
2023-03-11 04:12:50,671 - INFO - [Train] step: 540799, loss: 0.020326, lr: 0.000200
2023-03-11 04:14:33,020 - INFO - [Train] step: 541199, loss: 0.036796, lr: 0.000200
2023-03-11 04:16:15,533 - INFO - [Train] step: 541599, loss: 0.039057, lr: 0.000200
2023-03-11 04:17:58,564 - INFO - [Train] step: 541999, loss: 0.025228, lr: 0.000200
2023-03-11 04:19:42,853 - INFO - [Train] step: 542399, loss: 0.024108, lr: 0.000200
2023-03-11 04:21:25,851 - INFO - [Train] step: 542799, loss: 0.035436, lr: 0.000200
2023-03-11 04:23:09,555 - INFO - [Train] step: 543199, loss: 0.031479, lr: 0.000200
2023-03-11 04:24:51,735 - INFO - [Train] step: 543599, loss: 0.015816, lr: 0.000200
2023-03-11 04:26:33,457 - INFO - [Train] step: 543999, loss: 0.025242, lr: 0.000200
2023-03-11 04:28:18,324 - INFO - [Train] step: 544399, loss: 0.023158, lr: 0.000200
2023-03-11 04:30:01,359 - INFO - [Train] step: 544799, loss: 0.026339, lr: 0.000200
2023-03-11 04:32:15,278 - INFO - [Train] step: 545199, loss: 0.044575, lr: 0.000200
2023-03-11 04:33:58,030 - INFO - [Train] step: 545599, loss: 0.024351, lr: 0.000200
2023-03-11 04:35:40,382 - INFO - [Train] step: 545999, loss: 0.030484, lr: 0.000200
2023-03-11 04:37:23,068 - INFO - [Train] step: 546399, loss: 0.018178, lr: 0.000200
2023-03-11 04:39:04,844 - INFO - [Train] step: 546799, loss: 0.019922, lr: 0.000200
2023-03-11 04:40:49,581 - INFO - [Train] step: 547199, loss: 0.032777, lr: 0.000200
2023-03-11 04:42:31,665 - INFO - [Train] step: 547599, loss: 0.031229, lr: 0.000200
2023-03-11 04:44:15,132 - INFO - [Train] step: 547999, loss: 0.030593, lr: 0.000200
2023-03-11 04:45:57,449 - INFO - [Train] step: 548399, loss: 0.021047, lr: 0.000200
2023-03-11 04:47:39,184 - INFO - [Train] step: 548799, loss: 0.034316, lr: 0.000200
2023-03-11 04:49:21,180 - INFO - [Train] step: 549199, loss: 0.036999, lr: 0.000200
2023-03-11 04:51:03,450 - INFO - [Train] step: 549599, loss: 0.018877, lr: 0.000200
2023-03-11 04:52:45,833 - INFO - [Train] step: 549999, loss: 0.014030, lr: 0.000200
2023-03-11 04:55:00,409 - INFO - [Train] step: 550399, loss: 0.021667, lr: 0.000200
2023-03-11 04:56:42,230 - INFO - [Train] step: 550799, loss: 0.041801, lr: 0.000200
2023-03-11 04:58:24,691 - INFO - [Train] step: 551199, loss: 0.032784, lr: 0.000200
2023-03-11 05:00:07,915 - INFO - [Train] step: 551599, loss: 0.034218, lr: 0.000200
2023-03-11 05:01:50,040 - INFO - [Train] step: 551999, loss: 0.040939, lr: 0.000200
2023-03-11 05:03:32,509 - INFO - [Train] step: 552399, loss: 0.031203, lr: 0.000200
2023-03-11 05:05:13,903 - INFO - [Train] step: 552799, loss: 0.025906, lr: 0.000200
2023-03-11 05:06:56,924 - INFO - [Train] step: 553199, loss: 0.024916, lr: 0.000200
2023-03-11 05:08:39,368 - INFO - [Train] step: 553599, loss: 0.043218, lr: 0.000200
2023-03-11 05:10:22,538 - INFO - [Train] step: 553999, loss: 0.026039, lr: 0.000200
2023-03-11 05:12:04,904 - INFO - [Train] step: 554399, loss: 0.028689, lr: 0.000200
2023-03-11 05:13:46,497 - INFO - [Train] step: 554799, loss: 0.036337, lr: 0.000200
2023-03-11 05:15:58,855 - INFO - [Train] step: 555199, loss: 0.028384, lr: 0.000200
2023-03-11 05:17:41,210 - INFO - [Train] step: 555599, loss: 0.028245, lr: 0.000200
2023-03-11 05:19:23,730 - INFO - [Train] step: 555999, loss: 0.046476, lr: 0.000200
2023-03-11 05:21:06,983 - INFO - [Train] step: 556399, loss: 0.042028, lr: 0.000200
2023-03-11 05:22:49,199 - INFO - [Train] step: 556799, loss: 0.031898, lr: 0.000200
2023-03-11 05:24:30,606 - INFO - [Train] step: 557199, loss: 0.035840, lr: 0.000200
2023-03-11 05:26:12,497 - INFO - [Train] step: 557599, loss: 0.034491, lr: 0.000200
2023-03-11 05:27:54,908 - INFO - [Train] step: 557999, loss: 0.026980, lr: 0.000200
2023-03-11 05:29:37,743 - INFO - [Train] step: 558399, loss: 0.037118, lr: 0.000200
2023-03-11 05:31:19,800 - INFO - [Train] step: 558799, loss: 0.022685, lr: 0.000200
2023-03-11 05:33:02,031 - INFO - [Train] step: 559199, loss: 0.029149, lr: 0.000200
2023-03-11 05:34:41,953 - INFO - [Train] step: 559599, loss: 0.028087, lr: 0.000200
2023-03-11 05:36:16,277 - INFO - [Train] step: 559999, loss: 0.041195, lr: 0.000200
2023-03-11 05:38:20,599 - INFO - [Train] step: 560399, loss: 0.033756, lr: 0.000200
2023-03-11 05:40:03,790 - INFO - [Train] step: 560799, loss: 0.032650, lr: 0.000200
2023-03-11 05:41:47,713 - INFO - [Train] step: 561199, loss: 0.027553, lr: 0.000200
2023-03-11 05:43:28,383 - INFO - [Train] step: 561599, loss: 0.036035, lr: 0.000200
2023-03-11 05:45:10,344 - INFO - [Train] step: 561999, loss: 0.023004, lr: 0.000200
2023-03-11 05:46:53,807 - INFO - [Train] step: 562399, loss: 0.028566, lr: 0.000200
2023-03-11 05:48:35,615 - INFO - [Train] step: 562799, loss: 0.031960, lr: 0.000200
2023-03-11 05:50:19,774 - INFO - [Train] step: 563199, loss: 0.028875, lr: 0.000200
2023-03-11 05:52:01,192 - INFO - [Train] step: 563599, loss: 0.030183, lr: 0.000200
2023-03-11 05:53:45,192 - INFO - [Train] step: 563999, loss: 0.023365, lr: 0.000200
2023-03-11 05:55:26,370 - INFO - [Train] step: 564399, loss: 0.044530, lr: 0.000200
2023-03-11 05:57:07,839 - INFO - [Train] step: 564799, loss: 0.029014, lr: 0.000200
2023-03-11 05:59:18,644 - INFO - [Train] step: 565199, loss: 0.023890, lr: 0.000200
2023-03-11 06:01:00,835 - INFO - [Train] step: 565599, loss: 0.034989, lr: 0.000200
2023-03-11 06:02:46,015 - INFO - [Train] step: 565999, loss: 0.025574, lr: 0.000200
2023-03-11 06:04:26,954 - INFO - [Train] step: 566399, loss: 0.024861, lr: 0.000200
2023-03-11 06:06:06,922 - INFO - [Train] step: 566799, loss: 0.027739, lr: 0.000200
2023-03-11 06:07:49,920 - INFO - [Train] step: 567199, loss: 0.033717, lr: 0.000200
2023-03-11 06:09:31,883 - INFO - [Train] step: 567599, loss: 0.027975, lr: 0.000200
2023-03-11 06:11:14,426 - INFO - [Train] step: 567999, loss: 0.022727, lr: 0.000200
2023-03-11 06:12:57,243 - INFO - [Train] step: 568399, loss: 0.023426, lr: 0.000200
2023-03-11 06:14:42,898 - INFO - [Train] step: 568799, loss: 0.032998, lr: 0.000200
2023-03-11 06:16:24,190 - INFO - [Train] step: 569199, loss: 0.020795, lr: 0.000200
2023-03-11 06:18:06,561 - INFO - [Train] step: 569599, loss: 0.047512, lr: 0.000200
2023-03-11 06:19:49,027 - INFO - [Train] step: 569999, loss: 0.041555, lr: 0.000200
2023-03-11 06:22:02,581 - INFO - [Train] step: 570399, loss: 0.037739, lr: 0.000200
2023-03-11 06:23:44,178 - INFO - [Train] step: 570799, loss: 0.019152, lr: 0.000200
2023-03-11 06:25:26,206 - INFO - [Train] step: 571199, loss: 0.032457, lr: 0.000200
2023-03-11 06:27:07,675 - INFO - [Train] step: 571599, loss: 0.034695, lr: 0.000200
2023-03-11 06:28:50,729 - INFO - [Train] step: 571999, loss: 0.028818, lr: 0.000200
2023-03-11 06:30:34,201 - INFO - [Train] step: 572399, loss: 0.029297, lr: 0.000200
2023-03-11 06:32:16,132 - INFO - [Train] step: 572799, loss: 0.022229, lr: 0.000200
2023-03-11 06:33:57,368 - INFO - [Train] step: 573199, loss: 0.028682, lr: 0.000200
2023-03-11 06:35:39,398 - INFO - [Train] step: 573599, loss: 0.028887, lr: 0.000200
2023-03-11 06:37:23,053 - INFO - [Train] step: 573999, loss: 0.015962, lr: 0.000200
2023-03-11 06:39:04,983 - INFO - [Train] step: 574399, loss: 0.013580, lr: 0.000200
2023-03-11 06:40:48,767 - INFO - [Train] step: 574799, loss: 0.031883, lr: 0.000200
2023-03-11 06:43:02,558 - INFO - [Train] step: 575199, loss: 0.034119, lr: 0.000200
2023-03-11 06:44:44,674 - INFO - [Train] step: 575599, loss: 0.028721, lr: 0.000200
2023-03-11 06:46:27,294 - INFO - [Train] step: 575999, loss: 0.023155, lr: 0.000200
2023-03-11 06:48:09,908 - INFO - [Train] step: 576399, loss: 0.022812, lr: 0.000200
2023-03-11 06:49:52,520 - INFO - [Train] step: 576799, loss: 0.037230, lr: 0.000200
2023-03-11 06:51:34,081 - INFO - [Train] step: 577199, loss: 0.027587, lr: 0.000200
2023-03-11 06:53:15,005 - INFO - [Train] step: 577599, loss: 0.036926, lr: 0.000200
2023-03-11 06:54:59,747 - INFO - [Train] step: 577999, loss: 0.025333, lr: 0.000200
2023-03-11 06:56:44,359 - INFO - [Train] step: 578399, loss: 0.028770, lr: 0.000200
2023-03-11 06:58:26,089 - INFO - [Train] step: 578799, loss: 0.022871, lr: 0.000200
2023-03-11 07:00:06,777 - INFO - [Train] step: 579199, loss: 0.025927, lr: 0.000200
2023-03-11 07:01:47,995 - INFO - [Train] step: 579599, loss: 0.031692, lr: 0.000200
2023-03-11 07:03:29,038 - INFO - [Train] step: 579999, loss: 0.013183, lr: 0.000200
2023-03-11 07:05:42,598 - INFO - [Train] step: 580399, loss: 0.020532, lr: 0.000200
2023-03-11 07:07:25,172 - INFO - [Train] step: 580799, loss: 0.030154, lr: 0.000200
2023-03-11 07:09:09,264 - INFO - [Train] step: 581199, loss: 0.029561, lr: 0.000200
2023-03-11 07:10:43,705 - INFO - [Train] step: 581599, loss: 0.022415, lr: 0.000200
2023-03-11 07:12:18,877 - INFO - [Train] step: 581999, loss: 0.025307, lr: 0.000200
2023-03-11 07:13:52,868 - INFO - [Train] step: 582399, loss: 0.033874, lr: 0.000200
2023-03-11 07:15:35,405 - INFO - [Train] step: 582799, loss: 0.027663, lr: 0.000200
2023-03-11 07:17:16,035 - INFO - [Train] step: 583199, loss: 0.014650, lr: 0.000200
2023-03-11 07:18:57,296 - INFO - [Train] step: 583599, loss: 0.036729, lr: 0.000200
2023-03-11 07:20:38,865 - INFO - [Train] step: 583999, loss: 0.028482, lr: 0.000200
2023-03-11 07:22:21,351 - INFO - [Train] step: 584399, loss: 0.027786, lr: 0.000200
2023-03-11 07:24:05,379 - INFO - [Train] step: 584799, loss: 0.031978, lr: 0.000200
2023-03-11 07:26:17,148 - INFO - [Train] step: 585199, loss: 0.024457, lr: 0.000200
2023-03-11 07:27:58,620 - INFO - [Train] step: 585599, loss: 0.033274, lr: 0.000200
2023-03-11 07:29:39,608 - INFO - [Train] step: 585999, loss: 0.038868, lr: 0.000200
2023-03-11 07:31:21,365 - INFO - [Train] step: 586399, loss: 0.045269, lr: 0.000200
2023-03-11 07:33:02,862 - INFO - [Train] step: 586799, loss: 0.027654, lr: 0.000200
2023-03-11 07:34:46,237 - INFO - [Train] step: 587199, loss: 0.029394, lr: 0.000200
2023-03-11 07:36:26,839 - INFO - [Train] step: 587599, loss: 0.032044, lr: 0.000200
2023-03-11 07:38:08,133 - INFO - [Train] step: 587999, loss: 0.023844, lr: 0.000200
2023-03-11 07:39:49,845 - INFO - [Train] step: 588399, loss: 0.024209, lr: 0.000200
2023-03-11 07:41:30,637 - INFO - [Train] step: 588799, loss: 0.033854, lr: 0.000200
2023-03-11 07:43:13,262 - INFO - [Train] step: 589199, loss: 0.018583, lr: 0.000200
2023-03-11 07:44:56,089 - INFO - [Train] step: 589599, loss: 0.021904, lr: 0.000200
2023-03-11 07:46:36,821 - INFO - [Train] step: 589999, loss: 0.024118, lr: 0.000200
2023-03-11 07:48:50,505 - INFO - [Train] step: 590399, loss: 0.019807, lr: 0.000200
2023-03-11 07:50:30,945 - INFO - [Train] step: 590799, loss: 0.022174, lr: 0.000200
2023-03-11 07:52:12,262 - INFO - [Train] step: 591199, loss: 0.034561, lr: 0.000200
2023-03-11 07:53:52,839 - INFO - [Train] step: 591599, loss: 0.021357, lr: 0.000200
2023-03-11 07:55:35,498 - INFO - [Train] step: 591999, loss: 0.021965, lr: 0.000200
2023-03-11 07:57:17,484 - INFO - [Train] step: 592399, loss: 0.020675, lr: 0.000200
2023-03-11 07:59:00,907 - INFO - [Train] step: 592799, loss: 0.017530, lr: 0.000200
2023-03-11 08:00:47,166 - INFO - [Train] step: 593199, loss: 0.031778, lr: 0.000200
2023-03-11 08:02:30,955 - INFO - [Train] step: 593599, loss: 0.020031, lr: 0.000200
2023-03-11 08:04:13,389 - INFO - [Train] step: 593999, loss: 0.032173, lr: 0.000200
2023-03-11 08:05:55,717 - INFO - [Train] step: 594399, loss: 0.034938, lr: 0.000200
2023-03-11 08:07:39,852 - INFO - [Train] step: 594799, loss: 0.022038, lr: 0.000200
2023-03-11 08:09:54,054 - INFO - [Train] step: 595199, loss: 0.026107, lr: 0.000200
2023-03-11 08:11:35,725 - INFO - [Train] step: 595599, loss: 0.036778, lr: 0.000200
2023-03-11 08:13:19,633 - INFO - [Train] step: 595999, loss: 0.024553, lr: 0.000200
2023-03-11 08:15:01,014 - INFO - [Train] step: 596399, loss: 0.019418, lr: 0.000200
2023-03-11 08:16:42,167 - INFO - [Train] step: 596799, loss: 0.019888, lr: 0.000200
2023-03-11 08:18:23,595 - INFO - [Train] step: 597199, loss: 0.019003, lr: 0.000200
2023-03-11 08:20:05,877 - INFO - [Train] step: 597599, loss: 0.020268, lr: 0.000200
2023-03-11 08:21:47,664 - INFO - [Train] step: 597999, loss: 0.029934, lr: 0.000200
2023-03-11 08:23:31,782 - INFO - [Train] step: 598399, loss: 0.034068, lr: 0.000200
2023-03-11 08:25:15,969 - INFO - [Train] step: 598799, loss: 0.028451, lr: 0.000200
2023-03-11 08:26:56,734 - INFO - [Train] step: 599199, loss: 0.019481, lr: 0.000200
2023-03-11 08:28:37,544 - INFO - [Train] step: 599599, loss: 0.024932, lr: 0.000200
2023-03-11 08:30:20,629 - INFO - [Train] step: 599999, loss: 0.036056, lr: 0.000200
2023-03-11 08:32:35,378 - INFO - [Train] step: 600399, loss: 0.034433, lr: 0.000200
2023-03-11 08:34:17,981 - INFO - [Train] step: 600799, loss: 0.029444, lr: 0.000200
2023-03-11 08:36:04,541 - INFO - [Train] step: 601199, loss: 0.026627, lr: 0.000200
2023-03-11 08:37:45,610 - INFO - [Train] step: 601599, loss: 0.033329, lr: 0.000200
2023-03-11 08:39:27,748 - INFO - [Train] step: 601999, loss: 0.022973, lr: 0.000200
2023-03-11 08:41:09,222 - INFO - [Train] step: 602399, loss: 0.040853, lr: 0.000200
2023-03-11 08:42:50,015 - INFO - [Train] step: 602799, loss: 0.034383, lr: 0.000200
2023-03-11 08:44:31,931 - INFO - [Train] step: 603199, loss: 0.025868, lr: 0.000200
2023-03-11 08:46:09,044 - INFO - [Train] step: 603599, loss: 0.017535, lr: 0.000200
2023-03-11 08:47:41,919 - INFO - [Train] step: 603999, loss: 0.018786, lr: 0.000200
2023-03-11 08:49:13,881 - INFO - [Train] step: 604399, loss: 0.029384, lr: 0.000200
2023-03-11 08:50:55,363 - INFO - [Train] step: 604799, loss: 0.053556, lr: 0.000200
2023-03-11 08:53:07,998 - INFO - [Train] step: 605199, loss: 0.019740, lr: 0.000200
2023-03-11 08:54:49,915 - INFO - [Train] step: 605599, loss: 0.022707, lr: 0.000200
2023-03-11 08:56:33,784 - INFO - [Train] step: 605999, loss: 0.017088, lr: 0.000200
2023-03-11 08:58:17,303 - INFO - [Train] step: 606399, loss: 0.028577, lr: 0.000200
2023-03-11 08:59:59,863 - INFO - [Train] step: 606799, loss: 0.032209, lr: 0.000200
2023-03-11 09:01:42,534 - INFO - [Train] step: 607199, loss: 0.031443, lr: 0.000200
2023-03-11 09:03:25,789 - INFO - [Train] step: 607599, loss: 0.020793, lr: 0.000200
2023-03-11 09:05:07,507 - INFO - [Train] step: 607999, loss: 0.020269, lr: 0.000200
2023-03-11 09:06:51,216 - INFO - [Train] step: 608399, loss: 0.033688, lr: 0.000200
2023-03-11 09:08:33,848 - INFO - [Train] step: 608799, loss: 0.024937, lr: 0.000200
2023-03-11 09:10:16,075 - INFO - [Train] step: 609199, loss: 0.023234, lr: 0.000200
2023-03-11 09:11:58,080 - INFO - [Train] step: 609599, loss: 0.036706, lr: 0.000200
2023-03-11 09:13:40,416 - INFO - [Train] step: 609999, loss: 0.026468, lr: 0.000200
2023-03-11 09:15:54,667 - INFO - [Train] step: 610399, loss: 0.020134, lr: 0.000200
2023-03-11 09:17:37,420 - INFO - [Train] step: 610799, loss: 0.030003, lr: 0.000200
2023-03-11 09:19:20,975 - INFO - [Train] step: 611199, loss: 0.021181, lr: 0.000200
2023-03-11 09:21:02,433 - INFO - [Train] step: 611599, loss: 0.023781, lr: 0.000200
2023-03-11 09:22:44,348 - INFO - [Train] step: 611999, loss: 0.024399, lr: 0.000200
2023-03-11 09:24:27,068 - INFO - [Train] step: 612399, loss: 0.019462, lr: 0.000200
2023-03-11 09:26:10,541 - INFO - [Train] step: 612799, loss: 0.028412, lr: 0.000200
2023-03-11 09:27:53,036 - INFO - [Train] step: 613199, loss: 0.031217, lr: 0.000200
2023-03-11 09:29:36,669 - INFO - [Train] step: 613599, loss: 0.040279, lr: 0.000200
2023-03-11 09:31:20,195 - INFO - [Train] step: 613999, loss: 0.020326, lr: 0.000200
2023-03-11 09:33:02,733 - INFO - [Train] step: 614399, loss: 0.038998, lr: 0.000200
2023-03-11 09:34:44,898 - INFO - [Train] step: 614799, loss: 0.024797, lr: 0.000200
2023-03-11 09:36:57,422 - INFO - [Train] step: 615199, loss: 0.027749, lr: 0.000200
2023-03-11 09:38:38,982 - INFO - [Train] step: 615599, loss: 0.019570, lr: 0.000200
2023-03-11 09:40:21,642 - INFO - [Train] step: 615999, loss: 0.023239, lr: 0.000200
2023-03-11 09:42:03,952 - INFO - [Train] step: 616399, loss: 0.031540, lr: 0.000200
2023-03-11 09:43:46,719 - INFO - [Train] step: 616799, loss: 0.043576, lr: 0.000200
2023-03-11 09:45:29,710 - INFO - [Train] step: 617199, loss: 0.030454, lr: 0.000200
2023-03-11 09:47:12,361 - INFO - [Train] step: 617599, loss: 0.040364, lr: 0.000200
2023-03-11 09:48:53,509 - INFO - [Train] step: 617999, loss: 0.017104, lr: 0.000200
2023-03-11 09:50:37,400 - INFO - [Train] step: 618399, loss: 0.013977, lr: 0.000200
2023-03-11 09:52:19,397 - INFO - [Train] step: 618799, loss: 0.046829, lr: 0.000200
2023-03-11 09:54:01,590 - INFO - [Train] step: 619199, loss: 0.026054, lr: 0.000200
2023-03-11 09:55:43,998 - INFO - [Train] step: 619599, loss: 0.029696, lr: 0.000200
2023-03-11 09:57:25,926 - INFO - [Train] step: 619999, loss: 0.025604, lr: 0.000200
2023-03-11 09:59:41,542 - INFO - [Train] step: 620399, loss: 0.026659, lr: 0.000200
2023-03-11 10:01:25,421 - INFO - [Train] step: 620799, loss: 0.018033, lr: 0.000200
2023-03-11 10:03:10,444 - INFO - [Train] step: 621199, loss: 0.017370, lr: 0.000200
2023-03-11 10:04:53,199 - INFO - [Train] step: 621599, loss: 0.029696, lr: 0.000200
2023-03-11 10:06:36,542 - INFO - [Train] step: 621999, loss: 0.030345, lr: 0.000200
2023-03-11 10:08:20,279 - INFO - [Train] step: 622399, loss: 0.033708, lr: 0.000200
2023-03-11 10:10:02,706 - INFO - [Train] step: 622799, loss: 0.024131, lr: 0.000200
2023-03-11 10:11:44,980 - INFO - [Train] step: 623199, loss: 0.025211, lr: 0.000200
2023-03-11 10:13:27,591 - INFO - [Train] step: 623599, loss: 0.027561, lr: 0.000200
2023-03-11 10:15:09,660 - INFO - [Train] step: 623999, loss: 0.019732, lr: 0.000200
2023-03-11 10:16:52,031 - INFO - [Train] step: 624399, loss: 0.021730, lr: 0.000200
2023-03-11 10:18:35,064 - INFO - [Train] step: 624799, loss: 0.027277, lr: 0.000200
2023-03-11 10:20:46,982 - INFO - [Train] step: 625199, loss: 0.030199, lr: 0.000200
2023-03-11 10:22:23,869 - INFO - [Train] step: 625599, loss: 0.035674, lr: 0.000200
2023-03-11 10:23:58,540 - INFO - [Train] step: 625999, loss: 0.024725, lr: 0.000200
2023-03-11 10:25:37,526 - INFO - [Train] step: 626399, loss: 0.022903, lr: 0.000200
2023-03-11 10:27:19,898 - INFO - [Train] step: 626799, loss: 0.027332, lr: 0.000200
2023-03-11 10:29:01,810 - INFO - [Train] step: 627199, loss: 0.030427, lr: 0.000200
2023-03-11 10:30:43,072 - INFO - [Train] step: 627599, loss: 0.015058, lr: 0.000200
2023-03-11 10:32:23,699 - INFO - [Train] step: 627999, loss: 0.033187, lr: 0.000200
2023-03-11 10:34:05,074 - INFO - [Train] step: 628399, loss: 0.036494, lr: 0.000200
2023-03-11 10:35:47,214 - INFO - [Train] step: 628799, loss: 0.025357, lr: 0.000200
2023-03-11 10:37:27,939 - INFO - [Train] step: 629199, loss: 0.023888, lr: 0.000200
2023-03-11 10:39:09,615 - INFO - [Train] step: 629599, loss: 0.021485, lr: 0.000200
2023-03-11 10:40:50,945 - INFO - [Train] step: 629999, loss: 0.024377, lr: 0.000200
2023-03-11 10:43:03,839 - INFO - [Train] step: 630399, loss: 0.022459, lr: 0.000200
2023-03-11 10:44:45,107 - INFO - [Train] step: 630799, loss: 0.021180, lr: 0.000200
2023-03-11 10:46:27,260 - INFO - [Train] step: 631199, loss: 0.034365, lr: 0.000200
2023-03-11 10:48:07,678 - INFO - [Train] step: 631599, loss: 0.024301, lr: 0.000200
2023-03-11 10:49:49,090 - INFO - [Train] step: 631999, loss: 0.027982, lr: 0.000200
2023-03-11 10:51:29,409 - INFO - [Train] step: 632399, loss: 0.020724, lr: 0.000200
2023-03-11 10:53:10,349 - INFO - [Train] step: 632799, loss: 0.032020, lr: 0.000200
2023-03-11 10:54:53,360 - INFO - [Train] step: 633199, loss: 0.026623, lr: 0.000200
2023-03-11 10:56:35,484 - INFO - [Train] step: 633599, loss: 0.026977, lr: 0.000200
2023-03-11 10:58:16,847 - INFO - [Train] step: 633999, loss: 0.022649, lr: 0.000200
2023-03-11 10:59:58,657 - INFO - [Train] step: 634399, loss: 0.037866, lr: 0.000200
2023-03-11 11:01:40,343 - INFO - [Train] step: 634799, loss: 0.025185, lr: 0.000200
2023-03-11 11:03:51,535 - INFO - [Train] step: 635199, loss: 0.035249, lr: 0.000200
2023-03-11 11:05:32,738 - INFO - [Train] step: 635599, loss: 0.027681, lr: 0.000200
2023-03-11 11:07:14,637 - INFO - [Train] step: 635999, loss: 0.021740, lr: 0.000200
2023-03-11 11:08:56,199 - INFO - [Train] step: 636399, loss: 0.029804, lr: 0.000200
2023-03-11 11:10:36,887 - INFO - [Train] step: 636799, loss: 0.027725, lr: 0.000200
2023-03-11 11:12:17,974 - INFO - [Train] step: 637199, loss: 0.041190, lr: 0.000200
2023-03-11 11:13:59,396 - INFO - [Train] step: 637599, loss: 0.027713, lr: 0.000200
2023-03-11 11:15:41,008 - INFO - [Train] step: 637999, loss: 0.029194, lr: 0.000200
2023-03-11 11:17:22,253 - INFO - [Train] step: 638399, loss: 0.027151, lr: 0.000200
2023-03-11 11:19:03,778 - INFO - [Train] step: 638799, loss: 0.030267, lr: 0.000200
2023-03-11 11:20:45,636 - INFO - [Train] step: 639199, loss: 0.017288, lr: 0.000200
2023-03-11 11:22:27,394 - INFO - [Train] step: 639599, loss: 0.038679, lr: 0.000200
2023-03-11 11:24:09,475 - INFO - [Train] step: 639999, loss: 0.027263, lr: 0.000200
2023-03-11 11:26:23,625 - INFO - [Train] step: 640399, loss: 0.038133, lr: 0.000200
2023-03-11 11:28:05,895 - INFO - [Train] step: 640799, loss: 0.029852, lr: 0.000200
2023-03-11 11:29:46,842 - INFO - [Train] step: 641199, loss: 0.033590, lr: 0.000200
2023-03-11 11:31:27,693 - INFO - [Train] step: 641599, loss: 0.029197, lr: 0.000200
2023-03-11 11:33:08,706 - INFO - [Train] step: 641999, loss: 0.027726, lr: 0.000200
2023-03-11 11:34:50,421 - INFO - [Train] step: 642399, loss: 0.021385, lr: 0.000200
2023-03-11 11:36:31,955 - INFO - [Train] step: 642799, loss: 0.023019, lr: 0.000200
2023-03-11 11:38:13,724 - INFO - [Train] step: 643199, loss: 0.032990, lr: 0.000200
2023-03-11 11:39:55,878 - INFO - [Train] step: 643599, loss: 0.024085, lr: 0.000200
2023-03-11 11:41:37,819 - INFO - [Train] step: 643999, loss: 0.018190, lr: 0.000200
2023-03-11 11:43:19,489 - INFO - [Train] step: 644399, loss: 0.020602, lr: 0.000200
2023-03-11 11:45:02,416 - INFO - [Train] step: 644799, loss: 0.034843, lr: 0.000200
2023-03-11 11:47:13,689 - INFO - [Train] step: 645199, loss: 0.029662, lr: 0.000200
2023-03-11 11:48:56,090 - INFO - [Train] step: 645599, loss: 0.026179, lr: 0.000200
2023-03-11 11:50:37,723 - INFO - [Train] step: 645999, loss: 0.021185, lr: 0.000200
2023-03-11 11:52:19,395 - INFO - [Train] step: 646399, loss: 0.014077, lr: 0.000200
2023-03-11 11:54:01,375 - INFO - [Train] step: 646799, loss: 0.017754, lr: 0.000200
2023-03-11 11:55:42,986 - INFO - [Train] step: 647199, loss: 0.028940, lr: 0.000200
2023-03-11 11:57:17,639 - INFO - [Train] step: 647599, loss: 0.029789, lr: 0.000200
2023-03-11 11:58:51,083 - INFO - [Train] step: 647999, loss: 0.033200, lr: 0.000200
2023-03-11 12:00:24,466 - INFO - [Train] step: 648399, loss: 0.019686, lr: 0.000200
2023-03-11 12:02:05,778 - INFO - [Train] step: 648799, loss: 0.029754, lr: 0.000200
2023-03-11 12:03:46,414 - INFO - [Train] step: 649199, loss: 0.020192, lr: 0.000200
2023-03-11 12:05:28,447 - INFO - [Train] step: 649599, loss: 0.021089, lr: 0.000200
2023-03-11 12:07:08,956 - INFO - [Train] step: 649999, loss: 0.021228, lr: 0.000200
2023-03-11 12:09:22,381 - INFO - [Train] step: 650399, loss: 0.028012, lr: 0.000200
2023-03-11 12:11:04,761 - INFO - [Train] step: 650799, loss: 0.027154, lr: 0.000200
2023-03-11 12:12:47,541 - INFO - [Train] step: 651199, loss: 0.045288, lr: 0.000200
2023-03-11 12:14:29,899 - INFO - [Train] step: 651599, loss: 0.033305, lr: 0.000200
2023-03-11 12:16:12,027 - INFO - [Train] step: 651999, loss: 0.032637, lr: 0.000200
2023-03-11 12:17:54,134 - INFO - [Train] step: 652399, loss: 0.021133, lr: 0.000200
2023-03-11 12:19:36,125 - INFO - [Train] step: 652799, loss: 0.026186, lr: 0.000200
2023-03-11 12:21:17,342 - INFO - [Train] step: 653199, loss: 0.023521, lr: 0.000200
2023-03-11 12:22:58,643 - INFO - [Train] step: 653599, loss: 0.023728, lr: 0.000200
2023-03-11 12:24:40,416 - INFO - [Train] step: 653999, loss: 0.025953, lr: 0.000200
2023-03-11 12:26:21,739 - INFO - [Train] step: 654399, loss: 0.034117, lr: 0.000200
2023-03-11 12:28:04,343 - INFO - [Train] step: 654799, loss: 0.027163, lr: 0.000200
2023-03-11 12:30:17,379 - INFO - [Train] step: 655199, loss: 0.028301, lr: 0.000200
2023-03-11 12:31:58,646 - INFO - [Train] step: 655599, loss: 0.035132, lr: 0.000200
2023-03-11 12:33:41,601 - INFO - [Train] step: 655999, loss: 0.040897, lr: 0.000200
2023-03-11 12:35:23,330 - INFO - [Train] step: 656399, loss: 0.018656, lr: 0.000200
2023-03-11 12:37:05,599 - INFO - [Train] step: 656799, loss: 0.021079, lr: 0.000200
2023-03-11 12:38:48,011 - INFO - [Train] step: 657199, loss: 0.032284, lr: 0.000200
2023-03-11 12:40:30,253 - INFO - [Train] step: 657599, loss: 0.016157, lr: 0.000200
2023-03-11 12:42:12,248 - INFO - [Train] step: 657999, loss: 0.044988, lr: 0.000200
2023-03-11 12:43:53,976 - INFO - [Train] step: 658399, loss: 0.022838, lr: 0.000200
2023-03-11 12:45:36,124 - INFO - [Train] step: 658799, loss: 0.033444, lr: 0.000200
2023-03-11 12:47:19,027 - INFO - [Train] step: 659199, loss: 0.022387, lr: 0.000200
2023-03-11 12:49:00,083 - INFO - [Train] step: 659599, loss: 0.028225, lr: 0.000200
2023-03-11 12:50:38,747 - INFO - [Train] step: 659999, loss: 0.022953, lr: 0.000200
2023-03-11 12:52:47,172 - INFO - [Train] step: 660399, loss: 0.026243, lr: 0.000200
2023-03-11 12:54:25,609 - INFO - [Train] step: 660799, loss: 0.035368, lr: 0.000200
2023-03-11 12:56:03,249 - INFO - [Train] step: 661199, loss: 0.037079, lr: 0.000200
2023-03-11 12:57:40,629 - INFO - [Train] step: 661599, loss: 0.032874, lr: 0.000200
2023-03-11 12:59:17,695 - INFO - [Train] step: 661999, loss: 0.020694, lr: 0.000200
2023-03-11 13:00:55,623 - INFO - [Train] step: 662399, loss: 0.035667, lr: 0.000200
2023-03-11 13:02:33,281 - INFO - [Train] step: 662799, loss: 0.040163, lr: 0.000200
2023-03-11 13:04:11,009 - INFO - [Train] step: 663199, loss: 0.025526, lr: 0.000200
2023-03-11 13:05:48,446 - INFO - [Train] step: 663599, loss: 0.019813, lr: 0.000200
2023-03-11 13:07:25,746 - INFO - [Train] step: 663999, loss: 0.022131, lr: 0.000200
2023-03-11 13:09:03,090 - INFO - [Train] step: 664399, loss: 0.023099, lr: 0.000200
2023-03-11 13:10:40,584 - INFO - [Train] step: 664799, loss: 0.019992, lr: 0.000200
2023-03-11 13:12:46,819 - INFO - [Train] step: 665199, loss: 0.032564, lr: 0.000200
2023-03-11 13:14:24,545 - INFO - [Train] step: 665599, loss: 0.024908, lr: 0.000200
2023-03-11 13:16:02,331 - INFO - [Train] step: 665999, loss: 0.026782, lr: 0.000200
2023-03-11 13:17:39,715 - INFO - [Train] step: 666399, loss: 0.022448, lr: 0.000200
2023-03-11 13:19:17,690 - INFO - [Train] step: 666799, loss: 0.024673, lr: 0.000200
2023-03-11 13:20:55,042 - INFO - [Train] step: 667199, loss: 0.026585, lr: 0.000200
2023-03-11 13:22:31,937 - INFO - [Train] step: 667599, loss: 0.025643, lr: 0.000200
2023-03-11 13:24:08,351 - INFO - [Train] step: 667999, loss: 0.045578, lr: 0.000200
2023-03-11 13:25:45,596 - INFO - [Train] step: 668399, loss: 0.026987, lr: 0.000200
2023-03-11 13:27:22,621 - INFO - [Train] step: 668799, loss: 0.029801, lr: 0.000200
2023-03-11 13:28:59,417 - INFO - [Train] step: 669199, loss: 0.025734, lr: 0.000200
2023-03-11 13:30:40,377 - INFO - [Train] step: 669599, loss: 0.030589, lr: 0.000200
2023-03-11 13:32:10,132 - INFO - [Train] step: 669999, loss: 0.029081, lr: 0.000200
2023-03-11 13:34:09,567 - INFO - [Train] step: 670399, loss: 0.033528, lr: 0.000200
2023-03-11 13:35:43,542 - INFO - [Train] step: 670799, loss: 0.024310, lr: 0.000200
2023-03-11 13:37:20,971 - INFO - [Train] step: 671199, loss: 0.044351, lr: 0.000200
2023-03-11 13:38:59,014 - INFO - [Train] step: 671599, loss: 0.023761, lr: 0.000200
2023-03-11 13:40:36,488 - INFO - [Train] step: 671999, loss: 0.022698, lr: 0.000200
2023-03-11 13:42:13,703 - INFO - [Train] step: 672399, loss: 0.028979, lr: 0.000200
2023-03-11 13:43:50,741 - INFO - [Train] step: 672799, loss: 0.031342, lr: 0.000200
2023-03-11 13:45:27,491 - INFO - [Train] step: 673199, loss: 0.023728, lr: 0.000200
2023-03-11 13:47:04,416 - INFO - [Train] step: 673599, loss: 0.023337, lr: 0.000200
2023-03-11 13:48:41,193 - INFO - [Train] step: 673999, loss: 0.029630, lr: 0.000200
2023-03-11 13:50:18,847 - INFO - [Train] step: 674399, loss: 0.021677, lr: 0.000200
2023-03-11 13:51:57,116 - INFO - [Train] step: 674799, loss: 0.027168, lr: 0.000200
2023-03-11 13:54:02,170 - INFO - [Train] step: 675199, loss: 0.021905, lr: 0.000200
2023-03-11 13:55:39,108 - INFO - [Train] step: 675599, loss: 0.023994, lr: 0.000200
2023-03-11 13:57:16,010 - INFO - [Train] step: 675999, loss: 0.026740, lr: 0.000200
2023-03-11 13:58:52,473 - INFO - [Train] step: 676399, loss: 0.040478, lr: 0.000200
2023-03-11 14:00:28,381 - INFO - [Train] step: 676799, loss: 0.029416, lr: 0.000200
2023-03-11 14:02:04,340 - INFO - [Train] step: 677199, loss: 0.030927, lr: 0.000200
2023-03-11 14:03:40,324 - INFO - [Train] step: 677599, loss: 0.036323, lr: 0.000200
2023-03-11 14:05:16,919 - INFO - [Train] step: 677999, loss: 0.023855, lr: 0.000200
2023-03-11 14:06:52,811 - INFO - [Train] step: 678399, loss: 0.033418, lr: 0.000200
2023-03-11 14:08:28,592 - INFO - [Train] step: 678799, loss: 0.024681, lr: 0.000200
2023-03-11 14:10:05,104 - INFO - [Train] step: 679199, loss: 0.040133, lr: 0.000200
2023-03-11 14:11:42,034 - INFO - [Train] step: 679599, loss: 0.024791, lr: 0.000200
2023-03-11 14:13:18,620 - INFO - [Train] step: 679999, loss: 0.043328, lr: 0.000200
2023-03-11 14:15:25,282 - INFO - [Train] step: 680399, loss: 0.020262, lr: 0.000200
2023-03-11 14:17:01,161 - INFO - [Train] step: 680799, loss: 0.035362, lr: 0.000200
2023-03-11 14:18:38,855 - INFO - [Train] step: 681199, loss: 0.015725, lr: 0.000200
2023-03-11 14:20:14,561 - INFO - [Train] step: 681599, loss: 0.032690, lr: 0.000200
2023-03-11 14:21:53,316 - INFO - [Train] step: 681999, loss: 0.032959, lr: 0.000200
2023-03-11 14:23:31,286 - INFO - [Train] step: 682399, loss: 0.031391, lr: 0.000200
2023-03-11 14:25:08,886 - INFO - [Train] step: 682799, loss: 0.041821, lr: 0.000200
2023-03-11 14:26:44,823 - INFO - [Train] step: 683199, loss: 0.025970, lr: 0.000200
2023-03-11 14:28:20,937 - INFO - [Train] step: 683599, loss: 0.023375, lr: 0.000200
2023-03-11 14:29:56,935 - INFO - [Train] step: 683999, loss: 0.019695, lr: 0.000200
2023-03-11 14:31:33,681 - INFO - [Train] step: 684399, loss: 0.042229, lr: 0.000200
2023-03-11 14:33:13,199 - INFO - [Train] step: 684799, loss: 0.033096, lr: 0.000200
2023-03-11 14:35:20,615 - INFO - [Train] step: 685199, loss: 0.021039, lr: 0.000200
2023-03-11 14:36:57,557 - INFO - [Train] step: 685599, loss: 0.034370, lr: 0.000200
2023-03-11 14:38:34,012 - INFO - [Train] step: 685999, loss: 0.022179, lr: 0.000200
2023-03-11 14:40:10,557 - INFO - [Train] step: 686399, loss: 0.023262, lr: 0.000200
2023-03-11 14:41:47,723 - INFO - [Train] step: 686799, loss: 0.020580, lr: 0.000200
2023-03-11 14:43:24,260 - INFO - [Train] step: 687199, loss: 0.019394, lr: 0.000200
2023-03-11 14:45:00,751 - INFO - [Train] step: 687599, loss: 0.020122, lr: 0.000200
2023-03-11 14:46:37,205 - INFO - [Train] step: 687999, loss: 0.028423, lr: 0.000200
2023-03-11 14:48:16,673 - INFO - [Train] step: 688399, loss: 0.035651, lr: 0.000200
2023-03-11 14:49:53,476 - INFO - [Train] step: 688799, loss: 0.018811, lr: 0.000200
2023-03-11 14:51:29,729 - INFO - [Train] step: 689199, loss: 0.034775, lr: 0.000200
2023-03-11 14:53:05,691 - INFO - [Train] step: 689599, loss: 0.015823, lr: 0.000200
2023-03-11 14:54:42,292 - INFO - [Train] step: 689999, loss: 0.029138, lr: 0.000200
2023-03-11 14:56:51,783 - INFO - [Train] step: 690399, loss: 0.032522, lr: 0.000200
2023-03-11 14:58:28,860 - INFO - [Train] step: 690799, loss: 0.028218, lr: 0.000200
2023-03-11 15:00:05,021 - INFO - [Train] step: 691199, loss: 0.017482, lr: 0.000200
2023-03-11 15:01:42,393 - INFO - [Train] step: 691599, loss: 0.032440, lr: 0.000200
2023-03-11 15:03:20,874 - INFO - [Train] step: 691999, loss: 0.036984, lr: 0.000200
2023-03-11 15:04:52,232 - INFO - [Train] step: 692399, loss: 0.032954, lr: 0.000200
2023-03-11 15:06:21,173 - INFO - [Train] step: 692799, loss: 0.041477, lr: 0.000200
2023-03-11 15:07:50,214 - INFO - [Train] step: 693199, loss: 0.029813, lr: 0.000200
2023-03-11 15:09:25,466 - INFO - [Train] step: 693599, loss: 0.023526, lr: 0.000200
2023-03-11 15:11:01,796 - INFO - [Train] step: 693999, loss: 0.045721, lr: 0.000200
2023-03-11 15:12:40,751 - INFO - [Train] step: 694399, loss: 0.037347, lr: 0.000200
2023-03-11 15:14:16,695 - INFO - [Train] step: 694799, loss: 0.026430, lr: 0.000200
2023-03-11 15:16:27,768 - INFO - [Train] step: 695199, loss: 0.032169, lr: 0.000200
2023-03-11 15:18:08,349 - INFO - [Train] step: 695599, loss: 0.020447, lr: 0.000200
2023-03-11 15:19:44,632 - INFO - [Train] step: 695999, loss: 0.020403, lr: 0.000200
2023-03-11 15:21:20,791 - INFO - [Train] step: 696399, loss: 0.039534, lr: 0.000200
2023-03-11 15:22:59,830 - INFO - [Train] step: 696799, loss: 0.029465, lr: 0.000200
2023-03-11 15:24:36,453 - INFO - [Train] step: 697199, loss: 0.028104, lr: 0.000200
2023-03-11 15:26:12,731 - INFO - [Train] step: 697599, loss: 0.028032, lr: 0.000200
2023-03-11 15:27:48,966 - INFO - [Train] step: 697999, loss: 0.026295, lr: 0.000200
2023-03-11 15:29:25,659 - INFO - [Train] step: 698399, loss: 0.029504, lr: 0.000200
2023-03-11 15:31:01,546 - INFO - [Train] step: 698799, loss: 0.028615, lr: 0.000200
2023-03-11 15:32:37,597 - INFO - [Train] step: 699199, loss: 0.040645, lr: 0.000200
2023-03-11 15:34:13,695 - INFO - [Train] step: 699599, loss: 0.020659, lr: 0.000200
2023-03-11 15:35:50,340 - INFO - [Train] step: 699999, loss: 0.023629, lr: 0.000200
2023-03-11 15:37:56,333 - INFO - [Train] step: 700399, loss: 0.024643, lr: 0.000200
2023-03-11 15:39:32,115 - INFO - [Train] step: 700799, loss: 0.040633, lr: 0.000200
2023-03-11 15:41:08,944 - INFO - [Train] step: 701199, loss: 0.029616, lr: 0.000200
2023-03-11 15:42:47,950 - INFO - [Train] step: 701599, loss: 0.021220, lr: 0.000200
2023-03-11 15:44:24,478 - INFO - [Train] step: 701999, loss: 0.032828, lr: 0.000200
2023-03-11 15:46:02,132 - INFO - [Train] step: 702399, loss: 0.033954, lr: 0.000200
2023-03-11 15:47:38,561 - INFO - [Train] step: 702799, loss: 0.020044, lr: 0.000200
2023-03-11 15:49:14,930 - INFO - [Train] step: 703199, loss: 0.030539, lr: 0.000200
2023-03-11 15:50:51,863 - INFO - [Train] step: 703599, loss: 0.021148, lr: 0.000200
2023-03-11 15:52:28,607 - INFO - [Train] step: 703999, loss: 0.019598, lr: 0.000200
2023-03-11 15:54:04,341 - INFO - [Train] step: 704399, loss: 0.024464, lr: 0.000200
2023-03-11 15:55:40,026 - INFO - [Train] step: 704799, loss: 0.027774, lr: 0.000200
2023-03-11 15:57:47,070 - INFO - [Train] step: 705199, loss: 0.035028, lr: 0.000200
2023-03-11 15:59:26,709 - INFO - [Train] step: 705599, loss: 0.050006, lr: 0.000200
2023-03-11 16:01:02,818 - INFO - [Train] step: 705999, loss: 0.025425, lr: 0.000200
2023-03-11 16:02:39,100 - INFO - [Train] step: 706399, loss: 0.024729, lr: 0.000200
2023-03-11 16:04:18,484 - INFO - [Train] step: 706799, loss: 0.022356, lr: 0.000200
2023-03-11 16:05:59,517 - INFO - [Train] step: 707199, loss: 0.030051, lr: 0.000200
2023-03-11 16:07:37,630 - INFO - [Train] step: 707599, loss: 0.020404, lr: 0.000200
2023-03-11 16:09:13,402 - INFO - [Train] step: 707999, loss: 0.024444, lr: 0.000200
2023-03-11 16:10:49,518 - INFO - [Train] step: 708399, loss: 0.020621, lr: 0.000200
2023-03-11 16:12:25,805 - INFO - [Train] step: 708799, loss: 0.026375, lr: 0.000200
2023-03-11 16:14:01,920 - INFO - [Train] step: 709199, loss: 0.024545, lr: 0.000200
2023-03-11 16:15:37,647 - INFO - [Train] step: 709599, loss: 0.018270, lr: 0.000200
2023-03-11 16:17:14,039 - INFO - [Train] step: 709999, loss: 0.012060, lr: 0.000200
2023-03-11 16:19:19,533 - INFO - [Train] step: 710399, loss: 0.038812, lr: 0.000200
2023-03-11 16:20:55,586 - INFO - [Train] step: 710799, loss: 0.017557, lr: 0.000200
2023-03-11 16:22:32,308 - INFO - [Train] step: 711199, loss: 0.038133, lr: 0.000200
2023-03-11 16:24:08,392 - INFO - [Train] step: 711599, loss: 0.026684, lr: 0.000200
2023-03-11 16:25:44,295 - INFO - [Train] step: 711999, loss: 0.027888, lr: 0.000200
2023-03-11 16:27:20,013 - INFO - [Train] step: 712399, loss: 0.029072, lr: 0.000200
2023-03-11 16:28:56,081 - INFO - [Train] step: 712799, loss: 0.019270, lr: 0.000200
2023-03-11 16:30:33,197 - INFO - [Train] step: 713199, loss: 0.017143, lr: 0.000200
2023-03-11 16:32:09,072 - INFO - [Train] step: 713599, loss: 0.034878, lr: 0.000200
2023-03-11 16:33:47,493 - INFO - [Train] step: 713999, loss: 0.027093, lr: 0.000200
2023-03-11 16:35:31,423 - INFO - [Train] step: 714399, loss: 0.032469, lr: 0.000200
2023-03-11 16:37:06,565 - INFO - [Train] step: 714799, loss: 0.031753, lr: 0.000200
2023-03-11 16:39:03,418 - INFO - [Train] step: 715199, loss: 0.016751, lr: 0.000200
2023-03-11 16:40:32,387 - INFO - [Train] step: 715599, loss: 0.015586, lr: 0.000200
2023-03-11 16:42:05,525 - INFO - [Train] step: 715999, loss: 0.024926, lr: 0.000200
2023-03-11 16:43:41,439 - INFO - [Train] step: 716399, loss: 0.025472, lr: 0.000200
2023-03-11 16:45:17,515 - INFO - [Train] step: 716799, loss: 0.032756, lr: 0.000200
2023-03-11 16:46:54,308 - INFO - [Train] step: 717199, loss: 0.033714, lr: 0.000200
2023-03-11 16:48:31,363 - INFO - [Train] step: 717599, loss: 0.026811, lr: 0.000200
2023-03-11 16:50:13,170 - INFO - [Train] step: 717999, loss: 0.043881, lr: 0.000200
2023-03-11 16:51:49,493 - INFO - [Train] step: 718399, loss: 0.030460, lr: 0.000200
2023-03-11 16:53:25,596 - INFO - [Train] step: 718799, loss: 0.033229, lr: 0.000200
2023-03-11 16:55:02,063 - INFO - [Train] step: 719199, loss: 0.029334, lr: 0.000200
2023-03-11 16:56:38,849 - INFO - [Train] step: 719599, loss: 0.017208, lr: 0.000200
2023-03-11 16:58:15,279 - INFO - [Train] step: 719999, loss: 0.028755, lr: 0.000200
2023-03-11 17:00:21,558 - INFO - [Train] step: 720399, loss: 0.017080, lr: 0.000200
2023-03-11 17:01:57,990 - INFO - [Train] step: 720799, loss: 0.035380, lr: 0.000200
2023-03-11 17:03:33,990 - INFO - [Train] step: 721199, loss: 0.030132, lr: 0.000200
2023-03-11 17:05:10,068 - INFO - [Train] step: 721599, loss: 0.025332, lr: 0.000200
2023-03-11 17:06:45,989 - INFO - [Train] step: 721999, loss: 0.028613, lr: 0.000200
2023-03-11 17:08:22,352 - INFO - [Train] step: 722399, loss: 0.020692, lr: 0.000200
2023-03-11 17:09:59,082 - INFO - [Train] step: 722799, loss: 0.028633, lr: 0.000200
2023-03-11 17:11:35,146 - INFO - [Train] step: 723199, loss: 0.028553, lr: 0.000200
2023-03-11 17:13:11,369 - INFO - [Train] step: 723599, loss: 0.027309, lr: 0.000200
2023-03-11 17:14:48,483 - INFO - [Train] step: 723999, loss: 0.028650, lr: 0.000200
2023-03-11 17:16:24,675 - INFO - [Train] step: 724399, loss: 0.030232, lr: 0.000200
2023-03-11 17:18:01,114 - INFO - [Train] step: 724799, loss: 0.019701, lr: 0.000200
2023-03-11 17:20:06,769 - INFO - [Train] step: 725199, loss: 0.031947, lr: 0.000200
2023-03-11 17:21:43,277 - INFO - [Train] step: 725599, loss: 0.039649, lr: 0.000200
2023-03-11 17:23:19,711 - INFO - [Train] step: 725999, loss: 0.022339, lr: 0.000200
2023-03-11 17:24:55,276 - INFO - [Train] step: 726399, loss: 0.032783, lr: 0.000200
2023-03-11 17:26:31,445 - INFO - [Train] step: 726799, loss: 0.027477, lr: 0.000200
2023-03-11 17:28:07,959 - INFO - [Train] step: 727199, loss: 0.041099, lr: 0.000200
2023-03-11 17:29:44,528 - INFO - [Train] step: 727599, loss: 0.015255, lr: 0.000200
2023-03-11 17:31:20,445 - INFO - [Train] step: 727999, loss: 0.025289, lr: 0.000200
2023-03-11 17:32:58,860 - INFO - [Train] step: 728399, loss: 0.029833, lr: 0.000200
2023-03-11 17:34:37,557 - INFO - [Train] step: 728799, loss: 0.032124, lr: 0.000200
2023-03-11 17:36:21,844 - INFO - [Train] step: 729199, loss: 0.018121, lr: 0.000200
2023-03-11 17:38:00,943 - INFO - [Train] step: 729599, loss: 0.024414, lr: 0.000200
2023-03-11 17:39:37,334 - INFO - [Train] step: 729999, loss: 0.034940, lr: 0.000200
2023-03-11 17:41:43,903 - INFO - [Train] step: 730399, loss: 0.022785, lr: 0.000200
2023-03-11 17:43:20,388 - INFO - [Train] step: 730799, loss: 0.030746, lr: 0.000200
2023-03-11 17:44:56,764 - INFO - [Train] step: 731199, loss: 0.023488, lr: 0.000200
2023-03-11 17:46:33,731 - INFO - [Train] step: 731599, loss: 0.031220, lr: 0.000200
2023-03-11 17:48:12,833 - INFO - [Train] step: 731999, loss: 0.025234, lr: 0.000200
2023-03-11 17:49:51,894 - INFO - [Train] step: 732399, loss: 0.028701, lr: 0.000200
2023-03-11 17:51:27,861 - INFO - [Train] step: 732799, loss: 0.020954, lr: 0.000200
2023-03-11 17:53:03,940 - INFO - [Train] step: 733199, loss: 0.050810, lr: 0.000200
2023-03-11 17:54:40,650 - INFO - [Train] step: 733599, loss: 0.024322, lr: 0.000200
2023-03-11 17:56:17,127 - INFO - [Train] step: 733999, loss: 0.034836, lr: 0.000200
2023-03-11 17:57:53,154 - INFO - [Train] step: 734399, loss: 0.037728, lr: 0.000200
2023-03-11 17:59:29,265 - INFO - [Train] step: 734799, loss: 0.030631, lr: 0.000200
2023-03-11 18:01:34,808 - INFO - [Train] step: 735199, loss: 0.017503, lr: 0.000200
2023-03-11 18:03:14,422 - INFO - [Train] step: 735599, loss: 0.024680, lr: 0.000200
2023-03-11 18:04:50,946 - INFO - [Train] step: 735999, loss: 0.025911, lr: 0.000200
2023-03-11 18:06:27,643 - INFO - [Train] step: 736399, loss: 0.016981, lr: 0.000200
2023-03-11 18:08:08,369 - INFO - [Train] step: 736799, loss: 0.025372, lr: 0.000200
2023-03-11 18:09:45,098 - INFO - [Train] step: 737199, loss: 0.027639, lr: 0.000200
2023-03-11 18:11:14,807 - INFO - [Train] step: 737599, loss: 0.029107, lr: 0.000200
2023-03-11 18:12:43,719 - INFO - [Train] step: 737999, loss: 0.033488, lr: 0.000200
2023-03-11 18:14:14,266 - INFO - [Train] step: 738399, loss: 0.017498, lr: 0.000200
2023-03-11 18:15:52,127 - INFO - [Train] step: 738799, loss: 0.023094, lr: 0.000200
2023-03-11 18:17:28,394 - INFO - [Train] step: 739199, loss: 0.027532, lr: 0.000200
2023-03-11 18:19:10,832 - INFO - [Train] step: 739599, loss: 0.018705, lr: 0.000200
2023-03-11 18:20:48,600 - INFO - [Train] step: 739999, loss: 0.031637, lr: 0.000200
2023-03-11 18:22:55,791 - INFO - [Train] step: 740399, loss: 0.037170, lr: 0.000200
2023-03-11 18:24:32,045 - INFO - [Train] step: 740799, loss: 0.029740, lr: 0.000200
2023-03-11 18:26:08,854 - INFO - [Train] step: 741199, loss: 0.017359, lr: 0.000200
2023-03-11 18:27:48,088 - INFO - [Train] step: 741599, loss: 0.023522, lr: 0.000200
2023-03-11 18:29:25,538 - INFO - [Train] step: 741999, loss: 0.031849, lr: 0.000200
2023-03-11 18:31:01,665 - INFO - [Train] step: 742399, loss: 0.021684, lr: 0.000200
2023-03-11 18:32:38,033 - INFO - [Train] step: 742799, loss: 0.038310, lr: 0.000200
2023-03-11 18:34:14,566 - INFO - [Train] step: 743199, loss: 0.024135, lr: 0.000200
2023-03-11 18:35:50,406 - INFO - [Train] step: 743599, loss: 0.026854, lr: 0.000200
2023-03-11 18:37:26,688 - INFO - [Train] step: 743999, loss: 0.025373, lr: 0.000200
2023-03-11 18:39:03,555 - INFO - [Train] step: 744399, loss: 0.039479, lr: 0.000200
2023-03-11 18:40:40,196 - INFO - [Train] step: 744799, loss: 0.029375, lr: 0.000200
2023-03-11 18:42:45,674 - INFO - [Train] step: 745199, loss: 0.024029, lr: 0.000200
2023-03-11 18:44:22,150 - INFO - [Train] step: 745599, loss: 0.035645, lr: 0.000200
2023-03-11 18:45:58,704 - INFO - [Train] step: 745999, loss: 0.034387, lr: 0.000200
2023-03-11 18:47:35,633 - INFO - [Train] step: 746399, loss: 0.022965, lr: 0.000200
2023-03-11 18:49:12,369 - INFO - [Train] step: 746799, loss: 0.029780, lr: 0.000200
2023-03-11 18:50:48,299 - INFO - [Train] step: 747199, loss: 0.031841, lr: 0.000200
2023-03-11 18:52:24,858 - INFO - [Train] step: 747599, loss: 0.027105, lr: 0.000200
2023-03-11 18:54:00,682 - INFO - [Train] step: 747999, loss: 0.032458, lr: 0.000200
2023-03-11 18:55:36,733 - INFO - [Train] step: 748399, loss: 0.034356, lr: 0.000200
2023-03-11 18:57:13,307 - INFO - [Train] step: 748799, loss: 0.021898, lr: 0.000200
2023-03-11 18:58:49,740 - INFO - [Train] step: 749199, loss: 0.019681, lr: 0.000200
2023-03-11 19:00:25,987 - INFO - [Train] step: 749599, loss: 0.024056, lr: 0.000200
2023-03-11 19:02:02,266 - INFO - [Train] step: 749999, loss: 0.038373, lr: 0.000200
2023-03-11 19:04:08,323 - INFO - [Train] step: 750399, loss: 0.024289, lr: 0.000200
2023-03-11 19:05:44,575 - INFO - [Train] step: 750799, loss: 0.042348, lr: 0.000200
2023-03-11 19:07:20,930 - INFO - [Train] step: 751199, loss: 0.025763, lr: 0.000200
2023-03-11 19:08:57,632 - INFO - [Train] step: 751599, loss: 0.033419, lr: 0.000200
2023-03-11 19:10:33,877 - INFO - [Train] step: 751999, loss: 0.035033, lr: 0.000200
2023-03-11 19:12:10,438 - INFO - [Train] step: 752399, loss: 0.015793, lr: 0.000200
2023-03-11 19:13:46,445 - INFO - [Train] step: 752799, loss: 0.034587, lr: 0.000200
2023-03-11 19:15:21,963 - INFO - [Train] step: 753199, loss: 0.033066, lr: 0.000200
2023-03-11 19:16:57,340 - INFO - [Train] step: 753599, loss: 0.023565, lr: 0.000200
2023-03-11 19:18:36,796 - INFO - [Train] step: 753999, loss: 0.029453, lr: 0.000200
2023-03-11 19:20:13,637 - INFO - [Train] step: 754399, loss: 0.037548, lr: 0.000200
2023-03-11 19:21:52,615 - INFO - [Train] step: 754799, loss: 0.028376, lr: 0.000200
2023-03-11 19:23:59,712 - INFO - [Train] step: 755199, loss: 0.019059, lr: 0.000200
2023-03-11 19:25:37,992 - INFO - [Train] step: 755599, loss: 0.050130, lr: 0.000200
2023-03-11 19:27:14,672 - INFO - [Train] step: 755999, loss: 0.038899, lr: 0.000200
2023-03-11 19:28:50,430 - INFO - [Train] step: 756399, loss: 0.024085, lr: 0.000200
2023-03-11 19:30:28,382 - INFO - [Train] step: 756799, loss: 0.034056, lr: 0.000200
2023-03-11 19:32:05,148 - INFO - [Train] step: 757199, loss: 0.044921, lr: 0.000200
2023-03-11 19:33:41,011 - INFO - [Train] step: 757599, loss: 0.020874, lr: 0.000200
2023-03-11 19:35:17,390 - INFO - [Train] step: 757999, loss: 0.024126, lr: 0.000200
2023-03-11 19:36:53,648 - INFO - [Train] step: 758399, loss: 0.024475, lr: 0.000200
2023-03-11 19:38:30,095 - INFO - [Train] step: 758799, loss: 0.038106, lr: 0.000200
2023-03-11 19:40:06,551 - INFO - [Train] step: 759199, loss: 0.026293, lr: 0.000200
2023-03-11 19:41:42,988 - INFO - [Train] step: 759599, loss: 0.026307, lr: 0.000200
2023-03-11 19:43:17,397 - INFO - [Train] step: 759999, loss: 0.028809, lr: 0.000200
2023-03-11 19:45:19,664 - INFO - [Train] step: 760399, loss: 0.013769, lr: 0.000200
2023-03-11 19:46:48,626 - INFO - [Train] step: 760799, loss: 0.034521, lr: 0.000200
2023-03-11 19:48:22,945 - INFO - [Train] step: 761199, loss: 0.029821, lr: 0.000200
2023-03-11 19:49:59,278 - INFO - [Train] step: 761599, loss: 0.021467, lr: 0.000200
2023-03-11 19:51:36,620 - INFO - [Train] step: 761999, loss: 0.018224, lr: 0.000200
2023-03-11 19:53:12,312 - INFO - [Train] step: 762399, loss: 0.033974, lr: 0.000200
2023-03-11 19:54:48,054 - INFO - [Train] step: 762799, loss: 0.035782, lr: 0.000200
2023-03-11 19:56:25,190 - INFO - [Train] step: 763199, loss: 0.014186, lr: 0.000200
2023-03-11 19:58:01,077 - INFO - [Train] step: 763599, loss: 0.015066, lr: 0.000200
2023-03-11 19:59:36,906 - INFO - [Train] step: 763999, loss: 0.028308, lr: 0.000200
2023-03-11 20:01:17,737 - INFO - [Train] step: 764399, loss: 0.035953, lr: 0.000200
2023-03-11 20:02:54,243 - INFO - [Train] step: 764799, loss: 0.029906, lr: 0.000200
2023-03-11 20:04:59,356 - INFO - [Train] step: 765199, loss: 0.026399, lr: 0.000200
2023-03-11 20:06:35,232 - INFO - [Train] step: 765599, loss: 0.035844, lr: 0.000200
2023-03-11 20:08:11,564 - INFO - [Train] step: 765999, loss: 0.028848, lr: 0.000200
2023-03-11 20:09:47,696 - INFO - [Train] step: 766399, loss: 0.023812, lr: 0.000200
2023-03-11 20:11:24,097 - INFO - [Train] step: 766799, loss: 0.022392, lr: 0.000200
2023-03-11 20:13:00,639 - INFO - [Train] step: 767199, loss: 0.032679, lr: 0.000200
2023-03-11 20:14:38,179 - INFO - [Train] step: 767599, loss: 0.034492, lr: 0.000200
2023-03-11 20:16:17,326 - INFO - [Train] step: 767999, loss: 0.028531, lr: 0.000200
2023-03-11 20:17:56,543 - INFO - [Train] step: 768399, loss: 0.017895, lr: 0.000200
2023-03-11 20:19:32,591 - INFO - [Train] step: 768799, loss: 0.021054, lr: 0.000200
2023-03-11 20:21:08,718 - INFO - [Train] step: 769199, loss: 0.014756, lr: 0.000200
2023-03-11 20:22:44,547 - INFO - [Train] step: 769599, loss: 0.027979, lr: 0.000200
2023-03-11 20:24:20,425 - INFO - [Train] step: 769999, loss: 0.023378, lr: 0.000200
2023-03-11 20:26:25,524 - INFO - [Train] step: 770399, loss: 0.025287, lr: 0.000200
2023-03-11 20:28:01,658 - INFO - [Train] step: 770799, loss: 0.027986, lr: 0.000200
2023-03-11 20:29:39,002 - INFO - [Train] step: 771199, loss: 0.035915, lr: 0.000200
2023-03-11 20:31:15,906 - INFO - [Train] step: 771599, loss: 0.031995, lr: 0.000200
2023-03-11 20:32:51,735 - INFO - [Train] step: 771999, loss: 0.024663, lr: 0.000200
2023-03-11 20:34:27,337 - INFO - [Train] step: 772399, loss: 0.021219, lr: 0.000200
2023-03-11 20:36:03,024 - INFO - [Train] step: 772799, loss: 0.021674, lr: 0.000200
2023-03-11 20:37:39,405 - INFO - [Train] step: 773199, loss: 0.013911, lr: 0.000200
2023-03-11 20:39:15,428 - INFO - [Train] step: 773599, loss: 0.026987, lr: 0.000200
2023-03-11 20:40:51,903 - INFO - [Train] step: 773999, loss: 0.031791, lr: 0.000200
2023-03-11 20:42:27,757 - INFO - [Train] step: 774399, loss: 0.032313, lr: 0.000200
2023-03-11 20:44:03,665 - INFO - [Train] step: 774799, loss: 0.013749, lr: 0.000200
2023-03-11 20:46:08,456 - INFO - [Train] step: 775199, loss: 0.020219, lr: 0.000200
2023-03-11 20:47:44,775 - INFO - [Train] step: 775599, loss: 0.031643, lr: 0.000200
2023-03-11 20:49:20,526 - INFO - [Train] step: 775999, loss: 0.033191, lr: 0.000200
2023-03-11 20:50:56,618 - INFO - [Train] step: 776399, loss: 0.042735, lr: 0.000200
2023-03-11 20:52:32,921 - INFO - [Train] step: 776799, loss: 0.031314, lr: 0.000200
2023-03-11 20:54:09,624 - INFO - [Train] step: 777199, loss: 0.031753, lr: 0.000200
2023-03-11 20:55:45,677 - INFO - [Train] step: 777599, loss: 0.019752, lr: 0.000200
2023-03-11 20:57:21,883 - INFO - [Train] step: 777999, loss: 0.030870, lr: 0.000200
2023-03-11 20:58:57,922 - INFO - [Train] step: 778399, loss: 0.045564, lr: 0.000200
2023-03-11 21:00:34,595 - INFO - [Train] step: 778799, loss: 0.022861, lr: 0.000200
2023-03-11 21:02:10,220 - INFO - [Train] step: 779199, loss: 0.024543, lr: 0.000200
2023-03-11 21:03:46,329 - INFO - [Train] step: 779599, loss: 0.042216, lr: 0.000200
2023-03-11 21:05:22,972 - INFO - [Train] step: 779999, loss: 0.018143, lr: 0.000200
2023-03-11 21:07:31,014 - INFO - [Train] step: 780399, loss: 0.025397, lr: 0.000200
2023-03-11 21:09:07,453 - INFO - [Train] step: 780799, loss: 0.029465, lr: 0.000200
2023-03-11 21:10:43,958 - INFO - [Train] step: 781199, loss: 0.030006, lr: 0.000200
2023-03-11 21:12:20,795 - INFO - [Train] step: 781599, loss: 0.029038, lr: 0.000200
2023-03-11 21:13:59,817 - INFO - [Train] step: 781999, loss: 0.022382, lr: 0.000200
2023-03-11 21:15:35,754 - INFO - [Train] step: 782399, loss: 0.026610, lr: 0.000200
2023-03-11 21:17:05,564 - INFO - [Train] step: 782799, loss: 0.020092, lr: 0.000200
2023-03-11 21:18:34,597 - INFO - [Train] step: 783199, loss: 0.033060, lr: 0.000200
2023-03-11 21:20:03,948 - INFO - [Train] step: 783599, loss: 0.019394, lr: 0.000200
2023-03-11 21:21:42,251 - INFO - [Train] step: 783999, loss: 0.018846, lr: 0.000200
2023-03-11 21:23:18,333 - INFO - [Train] step: 784399, loss: 0.019472, lr: 0.000200
2023-03-11 21:24:54,370 - INFO - [Train] step: 784799, loss: 0.028416, lr: 0.000200
2023-03-11 21:26:59,361 - INFO - [Train] step: 785199, loss: 0.026607, lr: 0.000200
2023-03-11 21:28:35,250 - INFO - [Train] step: 785599, loss: 0.026305, lr: 0.000200
2023-03-11 21:30:11,240 - INFO - [Train] step: 785999, loss: 0.018944, lr: 0.000200
2023-03-11 21:31:47,279 - INFO - [Train] step: 786399, loss: 0.026515, lr: 0.000200
2023-03-11 21:33:24,356 - INFO - [Train] step: 786799, loss: 0.031231, lr: 0.000200
2023-03-11 21:35:05,198 - INFO - [Train] step: 787199, loss: 0.031089, lr: 0.000200
2023-03-11 21:36:41,096 - INFO - [Train] step: 787599, loss: 0.015997, lr: 0.000200
2023-03-11 21:38:16,673 - INFO - [Train] step: 787999, loss: 0.020729, lr: 0.000200
2023-03-11 21:39:52,695 - INFO - [Train] step: 788399, loss: 0.028715, lr: 0.000200
2023-03-11 21:41:28,597 - INFO - [Train] step: 788799, loss: 0.017167, lr: 0.000200
2023-03-11 21:43:04,868 - INFO - [Train] step: 789199, loss: 0.017309, lr: 0.000200
2023-03-11 21:44:41,521 - INFO - [Train] step: 789599, loss: 0.033966, lr: 0.000200
2023-03-11 21:46:17,479 - INFO - [Train] step: 789999, loss: 0.021309, lr: 0.000200
2023-03-11 21:48:23,781 - INFO - [Train] step: 790399, loss: 0.022919, lr: 0.000200
2023-03-11 21:49:59,764 - INFO - [Train] step: 790799, loss: 0.031348, lr: 0.000200
2023-03-11 21:51:35,952 - INFO - [Train] step: 791199, loss: 0.030470, lr: 0.000200
2023-03-11 21:53:14,717 - INFO - [Train] step: 791599, loss: 0.023486, lr: 0.000200
2023-03-11 21:54:51,024 - INFO - [Train] step: 791999, loss: 0.040894, lr: 0.000200
2023-03-11 21:56:29,325 - INFO - [Train] step: 792399, loss: 0.031775, lr: 0.000200
2023-03-11 21:58:05,678 - INFO - [Train] step: 792799, loss: 0.019100, lr: 0.000200
2023-03-11 21:59:41,908 - INFO - [Train] step: 793199, loss: 0.020895, lr: 0.000200
2023-03-11 22:01:18,052 - INFO - [Train] step: 793599, loss: 0.023675, lr: 0.000200
2023-03-11 22:02:54,504 - INFO - [Train] step: 793999, loss: 0.031120, lr: 0.000200
2023-03-11 22:04:30,911 - INFO - [Train] step: 794399, loss: 0.033231, lr: 0.000200
2023-03-11 22:06:07,850 - INFO - [Train] step: 794799, loss: 0.030955, lr: 0.000200
2023-03-11 22:08:14,657 - INFO - [Train] step: 795199, loss: 0.022984, lr: 0.000200
2023-03-11 22:09:51,374 - INFO - [Train] step: 795599, loss: 0.054542, lr: 0.000200
2023-03-11 22:11:27,941 - INFO - [Train] step: 795999, loss: 0.029098, lr: 0.000200
2023-03-11 22:13:03,467 - INFO - [Train] step: 796399, loss: 0.024714, lr: 0.000200
2023-03-11 22:14:40,205 - INFO - [Train] step: 796799, loss: 0.021979, lr: 0.000200
2023-03-11 22:16:16,381 - INFO - [Train] step: 797199, loss: 0.023458, lr: 0.000200
2023-03-11 22:17:52,188 - INFO - [Train] step: 797599, loss: 0.045971, lr: 0.000200
2023-03-11 22:19:28,134 - INFO - [Train] step: 797999, loss: 0.023918, lr: 0.000200
2023-03-11 22:21:04,160 - INFO - [Train] step: 798399, loss: 0.022313, lr: 0.000200
2023-03-11 22:22:40,194 - INFO - [Train] step: 798799, loss: 0.019067, lr: 0.000200
2023-03-11 22:24:16,319 - INFO - [Train] step: 799199, loss: 0.027809, lr: 0.000200
2023-03-11 22:25:52,364 - INFO - [Train] step: 799599, loss: 0.029273, lr: 0.000200
2023-03-11 22:27:28,521 - INFO - [Train] step: 799999, loss: 0.026791, lr: 0.000200
2023-03-11 22:27:58,411 - INFO - End of training