2023-12-16 15:04:41,022 44k INFO {'train': {'log_interval': 200, 'eval_interval': 800, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 4, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 10240, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 512, 'port': '8001', 'keep_ckpts': 10, 'all_in_mem': False}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 44100, 'filter_length': 2048, 'hop_length': 512, 'win_length': 2048, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': 22050}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [8, 8, 2, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 768, 'ssl_dim': 768, 'n_speakers': 1, 'speech_encoder': 'vec768l12', 'speaker_embedding': False}, 'spk': {'wdlm': 0}, 'model_dir': './logs\\44k'}
2023-12-16 15:04:41,023 44k WARNING F:\GitHub\ai\So-VITS-SVC\аæÕûºÏ°ü\so-vits-svc is not a git repository, therefore hash value comparison will be ignored.
2023-12-16 15:04:43,550 44k INFO emb_g.weight is not in the checkpoint
2023-12-16 15:04:43,634 44k INFO Loaded checkpoint './logs\44k\G_0.pth' (iteration 0)
2023-12-16 15:04:43,750 44k INFO Loaded checkpoint './logs\44k\D_0.pth' (iteration 0)
2023-12-16 15:05:20,752 44k INFO ====> Epoch: 1, cost 39.73 s
2023-12-16 15:05:40,307 44k INFO ====> Epoch: 2, cost 19.55 s
2023-12-16 15:05:59,694 44k INFO ====> Epoch: 3, cost 19.39 s
2023-12-16 15:06:19,142 44k INFO ====> Epoch: 4, cost 19.45 s
2023-12-16 15:06:38,678 44k INFO ====> Epoch: 5, cost 19.54 s
2023-12-16 15:06:57,869 44k INFO ====> Epoch: 6, cost 19.19 s
2023-12-16 15:07:17,453 44k INFO ====> Epoch: 7, cost 19.58 s
2023-12-16 15:07:37,042 44k INFO ====> Epoch: 8, cost 19.59 s
2023-12-16 15:07:56,804 44k INFO ====> Epoch: 9, cost 19.76 s
2023-12-16 15:08:16,352 44k INFO ====> Epoch: 10, cost 19.55 s
2023-12-16 15:08:35,830 44k INFO ====> Epoch: 11, cost 19.48 s
2023-12-16 15:08:55,383 44k INFO ====> Epoch: 12, cost 19.55 s
2023-12-16 15:09:15,077 44k INFO ====> Epoch: 13, cost 19.69 s
2023-12-16 15:09:25,950 44k INFO Train Epoch: 14 [27%]
2023-12-16 15:09:25,951 44k INFO Losses: [2.7133049964904785, 2.5206117630004883, 13.179529190063477, 26.031518936157227, 1.5554561614990234], step: 200, lr: 9.983762181915804e-05, reference_loss: 46.00041961669922
2023-12-16 15:09:35,599 44k INFO ====> Epoch: 14, cost 20.52 s
2023-12-16 15:09:55,081 44k INFO ====> Epoch: 15, cost 19.48 s
2023-12-16 15:10:14,788 44k INFO ====> Epoch: 16, cost 19.71 s
2023-12-16 15:10:34,315 44k INFO ====> Epoch: 17, cost 19.53 s
2023-12-16 15:10:53,989 44k INFO ====> Epoch: 18, cost 19.67 s
2023-12-16 15:11:13,580 44k INFO ====> Epoch: 19, cost 19.59 s
2023-12-16 15:11:33,136 44k INFO ====> Epoch: 20, cost 19.56 s
2023-12-16 15:11:52,753 44k INFO ====> Epoch: 21, cost 19.62 s
2023-12-16 15:12:12,334 44k INFO ====> Epoch: 22, cost 19.58 s
2023-12-16 15:12:32,372 44k INFO ====> Epoch: 23, cost 20.04 s
2023-12-16 15:12:51,943 44k INFO ====> Epoch: 24, cost 19.57 s
2023-12-16 15:13:11,569 44k INFO ====> Epoch: 25, cost 19.63 s
2023-12-16 15:13:31,325 44k INFO ====> Epoch: 26, cost 19.76 s
2023-12-16 15:13:46,590 44k INFO Train Epoch: 27 [60%]
2023-12-16 15:13:46,591 44k INFO Losses: [2.5245213508605957, 2.2742867469787598, 14.102811813354492, 29.21864891052246, 1.0917563438415527], step: 400, lr: 9.967550730505221e-05, reference_loss: 49.2120246887207
2023-12-16 15:13:51,717 44k INFO ====> Epoch: 27, cost 20.39 s
2023-12-16 15:14:11,286 44k INFO ====> Epoch: 28, cost 19.57 s
2023-12-16 15:14:30,702 44k INFO ====> Epoch: 29, cost 19.42 s
2023-12-16 15:14:50,274 44k INFO ====> Epoch: 30, cost 19.57 s
2023-12-16 15:15:09,671 44k INFO ====> Epoch: 31, cost 19.40 s
2023-12-16 15:15:29,009 44k INFO ====> Epoch: 32, cost 19.34 s
2023-12-16 15:15:48,421 44k INFO ====> Epoch: 33, cost 19.41 s
2023-12-16 15:16:07,845 44k INFO ====> Epoch: 34, cost 19.42 s
2023-12-16 15:16:27,367 44k INFO ====> Epoch: 35, cost 19.52 s
2023-12-16 15:16:46,775 44k INFO ====> Epoch: 36, cost 19.41 s
2023-12-16 15:17:06,200 44k INFO ====> Epoch: 37, cost 19.42 s
2023-12-16 15:17:25,567 44k INFO ====> Epoch: 38, cost 19.37 s
2023-12-16 15:17:45,025 44k INFO ====> Epoch: 39, cost 19.46 s
2023-12-16 15:18:04,089 44k INFO Train Epoch: 40 [93%]
2023-12-16 15:18:04,090 44k INFO Losses: [2.4473180770874023, 2.847722291946411, 11.493583679199219, 23.61637306213379, 1.0417319536209106], step: 600, lr: 9.951365602954526e-05, reference_loss: 41.44673156738281
2023-12-16 15:18:05,181 44k INFO ====> Epoch: 40, cost 20.16 s
2023-12-16 15:18:24,849 44k INFO ====> Epoch: 41, cost 19.67 s
2023-12-16 15:18:44,401 44k INFO ====> Epoch: 42, cost 19.55 s
2023-12-16 15:19:04,037 44k INFO ====> Epoch: 43, cost 19.64 s
2023-12-16 15:19:23,422 44k INFO ====> Epoch: 44, cost 19.39 s
2023-12-16 15:19:42,993 44k INFO ====> Epoch: 45, cost 19.57 s
2023-12-16 15:20:02,610 44k INFO ====> Epoch: 46, cost 19.62 s
2023-12-16 15:20:21,948 44k INFO ====> Epoch: 47, cost 19.34 s
2023-12-16 15:20:41,400 44k INFO ====> Epoch: 48, cost 19.45 s
2023-12-16 15:21:01,017 44k INFO ====> Epoch: 49, cost 19.62 s
2023-12-16 15:21:20,617 44k INFO ====> Epoch: 50, cost 19.60 s
2023-12-16 15:21:40,010 44k INFO ====> Epoch: 51, cost 19.39 s
2023-12-16 15:21:59,399 44k INFO ====> Epoch: 52, cost 19.39 s
2023-12-16 15:22:18,958 44k INFO ====> Epoch: 53, cost 19.56 s
2023-12-16 15:22:29,788 44k INFO Train Epoch: 54 [27%]
2023-12-16 15:22:29,789 44k INFO Losses: [2.9208569526672363, 2.2710936069488525, 9.842162132263184, 24.396034240722656, 0.9827083945274353], step: 800, lr: 9.933964855674948e-05, reference_loss: 40.4128532409668
2023-12-16 15:22:37,669 44k INFO Saving model and optimizer state at iteration 54 to ./logs\44k\G_800.pth
2023-12-16 15:22:38,843 44k INFO Saving model and optimizer state at iteration 54 to ./logs\44k\D_800.pth
2023-12-16 15:22:54,576 44k INFO ====> Epoch: 54, cost 35.62 s
2023-12-16 15:23:13,964 44k INFO ====> Epoch: 55, cost 19.39 s
2023-12-16 15:23:33,390 44k INFO ====> Epoch: 56, cost 19.43 s
2023-12-16 15:23:53,150 44k INFO ====> Epoch: 57, cost 19.76 s
2023-12-16 15:24:12,495 44k INFO ====> Epoch: 58, cost 19.34 s
2023-12-16 15:24:31,881 44k INFO ====> Epoch: 59, cost 19.39 s
2023-12-16 15:24:51,232 44k INFO ====> Epoch: 60, cost 19.35 s
2023-12-16 15:25:10,629 44k INFO ====> Epoch: 61, cost 19.40 s
2023-12-16 15:25:30,070 44k INFO ====> Epoch: 62, cost 19.44 s
2023-12-16 15:25:49,473 44k INFO ====> Epoch: 63, cost 19.40 s
2023-12-16 15:26:08,857 44k INFO ====> Epoch: 64, cost 19.38 s
2023-12-16 15:26:28,395 44k INFO ====> Epoch: 65, cost 19.54 s
2023-12-16 15:26:47,925 44k INFO ====> Epoch: 66, cost 19.53 s
2023-12-16 15:27:03,096 44k INFO Train Epoch: 67 [60%]
2023-12-16 15:27:03,097 44k INFO Losses: [2.458609104156494, 2.661210775375366, 10.685734748840332, 27.952234268188477, 0.9748334884643555], step: 1000, lr: 9.917834264256819e-05, reference_loss: 44.73262405395508
2023-12-16 15:27:08,009 44k INFO ====> Epoch: 67, cost 20.08 s
2023-12-16 15:27:27,429 44k INFO ====> Epoch: 68, cost 19.42 s
2023-12-16 15:27:47,007 44k INFO ====> Epoch: 69, cost 19.58 s
2023-12-16 15:28:06,297 44k INFO ====> Epoch: 70, cost 19.29 s
2023-12-16 15:28:25,878 44k INFO ====> Epoch: 71, cost 19.58 s
2023-12-16 15:28:45,321 44k INFO ====> Epoch: 72, cost 19.44 s
2023-12-16 15:29:05,157 44k INFO ====> Epoch: 73, cost 19.84 s
2023-12-16 15:29:24,709 44k INFO ====> Epoch: 74, cost 19.55 s
2023-12-16 15:29:44,141 44k INFO ====> Epoch: 75, cost 19.43 s
2023-12-16 15:30:03,675 44k INFO ====> Epoch: 76, cost 19.53 s
2023-12-16 15:30:23,155 44k INFO ====> Epoch: 77, cost 19.48 s
2023-12-16 15:30:42,525 44k INFO ====> Epoch: 78, cost 19.37 s
2023-12-16 15:31:01,863 44k INFO ====> Epoch: 79, cost 19.34 s
2023-12-16 15:31:20,865 44k INFO Train Epoch: 80 [93%]
2023-12-16 15:31:20,865 44k INFO Losses: [2.663787364959717, 3.247288465499878, 8.122236251831055, 20.93960189819336, 1.0062497854232788], step: 1200, lr: 9.901729865399597e-05, reference_loss: 35.979164123535156
2023-12-16 15:31:21,967 44k INFO ====> Epoch: 80, cost 20.10 s
2023-12-16 15:31:41,674 44k INFO ====> Epoch: 81, cost 19.71 s
2023-12-16 15:32:01,293 44k INFO ====> Epoch: 82, cost 19.62 s
2023-12-16 15:32:20,806 44k INFO ====> Epoch: 83, cost 19.51 s
2023-12-16 15:32:40,420 44k INFO ====> Epoch: 84, cost 19.61 s
2023-12-16 15:32:59,930 44k INFO ====> Epoch: 85, cost 19.51 s
2023-12-16 15:33:19,674 44k INFO ====> Epoch: 86, cost 19.74 s
2023-12-16 15:33:39,168 44k INFO ====> Epoch: 87, cost 19.49 s
2023-12-16 15:33:58,787 44k INFO ====> Epoch: 88, cost 19.62 s
2023-12-16 15:34:18,402 44k INFO ====> Epoch: 89, cost 19.61 s
2023-12-16 15:34:37,991 44k INFO ====> Epoch: 90, cost 19.59 s
2023-12-16 15:34:57,640 44k INFO ====> Epoch: 91, cost 19.65 s
2023-12-16 15:35:17,108 44k INFO ====> Epoch: 92, cost 19.47 s
2023-12-16 15:35:36,639 44k INFO ====> Epoch: 93, cost 19.53 s
2023-12-16 15:35:47,466 44k INFO Train Epoch: 94 [27%]
2023-12-16 15:35:47,467 44k INFO Losses: [2.3108437061309814, 2.647261381149292, 9.941719055175781, 24.150951385498047, 1.0645778179168701], step: 1400, lr: 9.884415910120204e-05, reference_loss: 40.115352630615234
2023-12-16 15:35:56,623 44k INFO ====> Epoch: 94, cost 19.98 s
2023-12-16 15:36:17,207 44k INFO ====> Epoch: 95, cost 20.58 s
2023-12-16 15:36:36,949 44k INFO ====> Epoch: 96, cost 19.74 s
2023-12-16 15:36:56,492 44k INFO ====> Epoch: 97, cost 19.54 s
2023-12-16 15:37:15,984 44k INFO ====> Epoch: 98, cost 19.49 s
2023-12-16 15:37:35,477 44k INFO ====> Epoch: 99, cost 19.49 s
2023-12-16 15:37:54,962 44k INFO ====> Epoch: 100, cost 19.48 s
2023-12-16 15:38:14,427 44k INFO ====> Epoch: 101, cost 19.47 s
2023-12-16 15:38:33,960 44k INFO ====> Epoch: 102, cost 19.53 s
2023-12-16 15:38:53,427 44k INFO ====> Epoch: 103, cost 19.47 s
2023-12-16 15:39:13,006 44k INFO ====> Epoch: 104, cost 19.58 s
2023-12-16 15:39:32,474 44k INFO ====> Epoch: 105, cost 19.47 s
2023-12-16 15:39:52,098 44k INFO ====> Epoch: 106, cost 19.62 s
2023-12-16 15:40:07,035 44k INFO Train Epoch: 107 [60%]
2023-12-16 15:40:07,035 44k INFO Losses: [2.45098876953125, 2.4585273265838623, 9.254941940307617, 23.134368896484375, 1.1095484495162964], step: 1600, lr: 9.868365775378495e-05, reference_loss: 38.40837478637695
2023-12-16 15:40:13,004 44k INFO Saving model and optimizer state at iteration 107 to ./logs\44k\G_1600.pth
2023-12-16 15:40:14,156 44k INFO Saving model and optimizer state at iteration 107 to ./logs\44k\D_1600.pth
2023-12-16 15:40:26,704 44k INFO ====> Epoch: 107, cost 34.61 s
2023-12-16 15:40:46,241 44k INFO ====> Epoch: 108, cost 19.54 s
2023-12-16 15:41:05,834 44k INFO ====> Epoch: 109, cost 19.59 s
2023-12-16 15:41:25,663 44k INFO ====> Epoch: 110, cost 19.83 s
2023-12-16 15:41:45,023 44k INFO ====> Epoch: 111, cost 19.36 s
2023-12-16 15:42:04,616 44k INFO ====> Epoch: 112, cost 19.59 s
2023-12-16 15:42:24,096 44k INFO ====> Epoch: 113, cost 19.48 s
2023-12-16 15:42:43,552 44k INFO ====> Epoch: 114, cost 19.46 s
2023-12-16 15:43:03,031 44k INFO ====> Epoch: 115, cost 19.48 s
2023-12-16 15:43:22,554 44k INFO ====> Epoch: 116, cost 19.52 s
2023-12-16 15:43:42,150 44k INFO ====> Epoch: 117, cost 19.60 s
2023-12-16 15:44:01,854 44k INFO ====> Epoch: 118, cost 19.70 s
2023-12-16 15:44:21,325 44k INFO ====> Epoch: 119, cost 19.47 s
2023-12-16 15:44:40,170 44k INFO Train Epoch: 120 [93%]
2023-12-16 15:44:40,170 44k INFO Losses: [2.4771015644073486, 2.853649854660034, 11.530167579650879, 24.376300811767578, 0.5404216647148132], step: 1800, lr: 9.8523417025536e-05, reference_loss: 41.77764129638672
2023-12-16 15:44:41,229 44k INFO ====> Epoch: 120, cost 19.90 s
2023-12-16 15:45:00,942 44k INFO ====> Epoch: 121, cost 19.71 s
2023-12-16 15:45:20,280 44k INFO ====> Epoch: 122, cost 19.34 s
2023-12-16 15:45:39,893 44k INFO ====> Epoch: 123, cost 19.61 s
2023-12-16 15:45:59,234 44k INFO ====> Epoch: 124, cost 19.34 s
2023-12-16 15:46:18,699 44k INFO ====> Epoch: 125, cost 19.46 s
2023-12-16 15:46:38,264 44k INFO ====> Epoch: 126, cost 19.57 s
2023-12-16 15:46:57,854 44k INFO ====> Epoch: 127, cost 19.59 s
2023-12-16 15:47:17,411 44k INFO ====> Epoch: 128, cost 19.56 s
2023-12-16 15:47:36,828 44k INFO ====> Epoch: 129, cost 19.42 s
2023-12-16 15:47:56,359 44k INFO ====> Epoch: 130, cost 19.53 s
2023-12-16 15:48:15,827 44k INFO ====> Epoch: 131, cost 19.47 s
2023-12-16 15:48:35,133 44k INFO ====> Epoch: 132, cost 19.30 s
2023-12-16 15:48:54,565 44k INFO ====> Epoch: 133, cost 19.43 s
2023-12-16 15:49:05,575 44k INFO Train Epoch: 134 [27%]
2023-12-16 15:49:05,576 44k INFO Losses: [2.5679867267608643, 2.2793493270874023, 7.851959228515625, 20.863174438476562, 0.6462520360946655], step: 2000, lr: 9.835114106370493e-05, reference_loss: 34.20872116088867
2023-12-16 15:49:14,707 44k INFO ====> Epoch: 134, cost 20.14 s
2023-12-16 15:49:34,111 44k INFO ====> Epoch: 135, cost 19.40 s
2023-12-16 15:49:53,567 44k INFO ====> Epoch: 136, cost 19.46 s
2023-12-16 15:50:12,940 44k INFO ====> Epoch: 137, cost 19.37 s
2023-12-16 15:50:32,346 44k INFO ====> Epoch: 138, cost 19.41 s
2023-12-16 15:50:51,737 44k INFO ====> Epoch: 139, cost 19.39 s
2023-12-16 15:51:11,084 44k INFO ====> Epoch: 140, cost 19.35 s
2023-12-16 15:51:30,487 44k INFO ====> Epoch: 141, cost 19.40 s
2023-12-16 15:51:49,943 44k INFO ====> Epoch: 142, cost 19.46 s
2023-12-16 15:52:09,267 44k INFO ====> Epoch: 143, cost 19.32 s
2023-12-16 15:52:28,667 44k INFO ====> Epoch: 144, cost 19.40 s
2023-12-16 15:52:48,041 44k INFO ====> Epoch: 145, cost 19.37 s
2023-12-16 15:53:07,455 44k INFO ====> Epoch: 146, cost 19.41 s
2023-12-16 15:53:22,494 44k INFO Train Epoch: 147 [60%]
2023-12-16 15:53:22,495 44k INFO Losses: [2.527799606323242, 2.258890151977539, 7.608017921447754, 19.962331771850586, 0.5410811901092529], step: 2200, lr: 9.819144027000834e-05, reference_loss: 32.89812088012695
2023-12-16 15:53:27,410 44k INFO ====> Epoch: 147, cost 19.95 s
2023-12-16 15:53:46,842 44k INFO ====> Epoch: 148, cost 19.43 s
2023-12-16 15:54:06,183 44k INFO ====> Epoch: 149, cost 19.34 s
2023-12-16 15:54:25,633 44k INFO ====> Epoch: 150, cost 19.45 s
2023-12-16 15:54:45,022 44k INFO ====> Epoch: 151, cost 19.39 s
2023-12-16 15:55:04,427 44k INFO ====> Epoch: 152, cost 19.41 s
2023-12-16 15:55:23,826 44k INFO ====> Epoch: 153, cost 19.40 s
2023-12-16 15:55:43,255 44k INFO ====> Epoch: 154, cost 19.43 s
2023-12-16 15:56:02,827 44k INFO ====> Epoch: 155, cost 19.57 s
2023-12-16 15:56:22,196 44k INFO ====> Epoch: 156, cost 19.37 s
2023-12-16 15:56:41,582 44k INFO ====> Epoch: 157, cost 19.39 s
2023-12-16 15:57:00,942 44k INFO ====> Epoch: 158, cost 19.36 s
2023-12-16 15:57:20,430 44k INFO ====> Epoch: 159, cost 19.49 s
2023-12-16 15:57:39,208 44k INFO Train Epoch: 160 [93%]
2023-12-16 15:57:39,209 44k INFO Losses: [2.898914337158203, 2.1981236934661865, 3.2429311275482178, 16.30094337463379, 0.4817192256450653], step: 2400, lr: 9.803199879555537e-05, reference_loss: 25.12263298034668
2023-12-16 15:57:45,179 44k INFO Saving model and optimizer state at iteration 160 to ./logs\44k\G_2400.pth
2023-12-16 15:57:46,463 44k INFO Saving model and optimizer state at iteration 160 to ./logs\44k\D_2400.pth
2023-12-16 15:57:53,592 44k INFO ====> Epoch: 160, cost 33.16 s
2023-12-16 15:58:13,664 44k INFO ====> Epoch: 161, cost 20.07 s
2023-12-16 15:58:33,743 44k INFO ====> Epoch: 162, cost 20.08 s
2023-12-16 15:58:53,249 44k INFO ====> Epoch: 163, cost 19.51 s
2023-12-16 15:59:13,104 44k INFO ====> Epoch: 164, cost 19.85 s
2023-12-16 15:59:32,464 44k INFO ====> Epoch: 165, cost 19.36 s
2023-12-16 15:59:52,197 44k INFO ====> Epoch: 166, cost 19.73 s
2023-12-16 16:00:11,606 44k INFO ====> Epoch: 167, cost 19.41 s
2023-12-16 16:00:31,166 44k INFO ====> Epoch: 168, cost 19.56 s
2023-12-16 16:00:50,583 44k INFO ====> Epoch: 169, cost 19.42 s
2023-12-16 16:01:10,060 44k INFO ====> Epoch: 170, cost 19.48 s
2023-12-16 16:01:29,670 44k INFO ====> Epoch: 171, cost 19.61 s
2023-12-16 16:01:49,211 44k INFO ====> Epoch: 172, cost 19.54 s
2023-12-16 16:02:08,850 44k INFO ====> Epoch: 173, cost 19.64 s
2023-12-16 16:02:19,789 44k INFO Train Epoch: 174 [27%]
2023-12-16 16:02:19,790 44k INFO Losses: [2.1893320083618164, 2.635559320449829, 11.227259635925293, 25.670455932617188, 0.6490182280540466], step: 2600, lr: 9.786058211724074e-05, reference_loss: 42.37162399291992
2023-12-16 16:02:29,001 44k INFO ====> Epoch: 174, cost 20.15 s
2023-12-16 16:02:48,647 44k INFO ====> Epoch: 175, cost 19.65 s
2023-12-16 16:03:08,256 44k INFO ====> Epoch: 176, cost 19.61 s
2023-12-16 16:03:27,795 44k INFO ====> Epoch: 177, cost 19.54 s
2023-12-16 16:03:47,377 44k INFO ====> Epoch: 178, cost 19.58 s
2023-12-16 16:04:06,754 44k INFO ====> Epoch: 179, cost 19.38 s
2023-12-16 16:04:26,378 44k INFO ====> Epoch: 180, cost 19.62 s
2023-12-16 16:04:46,184 44k INFO ====> Epoch: 181, cost 19.81 s
2023-12-16 16:05:05,947 44k INFO ====> Epoch: 182, cost 19.76 s
2023-12-16 16:05:25,523 44k INFO ====> Epoch: 183, cost 19.58 s
2023-12-16 16:05:45,250 44k INFO ====> Epoch: 184, cost 19.73 s
2023-12-16 16:06:04,718 44k INFO ====> Epoch: 185, cost 19.47 s
2023-12-16 16:06:24,431 44k INFO ====> Epoch: 186, cost 19.71 s
2023-12-16 16:06:39,485 44k INFO Train Epoch: 187 [60%]
2023-12-16 16:06:39,486 44k INFO Losses: [2.6317648887634277, 2.7047290802001953, 9.169486999511719, 26.55154037475586, 0.713068425655365], step: 2800, lr: 9.77016778842374e-05, reference_loss: 41.770591735839844
2023-12-16 16:06:44,419 44k INFO ====> Epoch: 187, cost 19.99 s
2023-12-16 16:07:03,940 44k INFO ====> Epoch: 188, cost 19.52 s
2023-12-16 16:07:23,717 44k INFO ====> Epoch: 189, cost 19.78 s
2023-12-16 16:07:43,332 44k INFO ====> Epoch: 190, cost 19.62 s
2023-12-16 16:08:02,969 44k INFO ====> Epoch: 191, cost 19.64 s
2023-12-16 16:08:22,442 44k INFO ====> Epoch: 192, cost 19.47 s
2023-12-16 16:08:42,190 44k INFO ====> Epoch: 193, cost 19.75 s
2023-12-16 16:09:01,815 44k INFO ====> Epoch: 194, cost 19.63 s
2023-12-16 16:09:21,540 44k INFO ====> Epoch: 195, cost 19.73 s
2023-12-16 16:09:41,099 44k INFO ====> Epoch: 196, cost 19.56 s
2023-12-16 16:10:00,985 44k INFO ====> Epoch: 197, cost 19.89 s
2023-12-16 16:10:20,790 44k INFO ====> Epoch: 198, cost 19.80 s
2023-12-16 16:10:40,207 44k INFO ====> Epoch: 199, cost 19.42 s
2023-12-16 16:10:59,326 44k INFO Train Epoch: 200 [93%]
2023-12-16 16:10:59,326 44k INFO Losses: [1.576963186264038, 3.343235492706299, 17.190486907958984, 32.088375091552734, 0.1742485910654068], step: 3000, lr: 9.754303167703689e-05, reference_loss: 54.37330627441406
2023-12-16 16:11:00,404 44k INFO ====> Epoch: 200, cost 20.20 s
2023-12-16 16:11:19,904 44k INFO ====> Epoch: 201, cost 19.50 s
2023-12-16 16:11:39,305 44k INFO ====> Epoch: 202, cost 19.40 s
2023-12-16 16:11:58,686 44k INFO ====> Epoch: 203, cost 19.38 s
2023-12-16 16:12:18,046 44k INFO ====> Epoch: 204, cost 19.36 s
2023-12-16 16:12:37,564 44k INFO ====> Epoch: 205, cost 19.52 s
2023-12-16 16:12:57,070 44k INFO ====> Epoch: 206, cost 19.51 s
2023-12-16 16:13:16,485 44k INFO ====> Epoch: 207, cost 19.42 s
2023-12-16 16:13:35,818 44k INFO ====> Epoch: 208, cost 19.33 s
2023-12-16 16:13:55,344 44k INFO ====> Epoch: 209, cost 19.53 s
2023-12-16 16:14:15,262 44k INFO ====> Epoch: 210, cost 19.92 s
2023-12-16 16:14:34,633 44k INFO ====> Epoch: 211, cost 19.37 s
2023-12-16 16:14:54,041 44k INFO ====> Epoch: 212, cost 19.41 s
2023-12-16 16:15:13,420 44k INFO ====> Epoch: 213, cost 19.38 s
2023-12-16 16:15:24,235 44k INFO Train Epoch: 214 [27%]
2023-12-16 16:15:24,235 44k INFO Losses: [2.163398265838623, 3.2876458168029785, 10.961349487304688, 25.006256103515625, 0.9355498552322388], step: 3200, lr: 9.7372469996277e-05, reference_loss: 42.35419845581055
2023-12-16 16:15:30,199 44k INFO Saving model and optimizer state at iteration 214 to ./logs\44k\G_3200.pth
2023-12-16 16:15:31,476 44k INFO Saving model and optimizer state at iteration 214 to ./logs\44k\D_3200.pth
2023-12-16 16:15:49,598 44k INFO ====> Epoch: 214, cost 36.18 s
2023-12-16 16:16:09,069 44k INFO ====> Epoch: 215, cost 19.47 s
2023-12-16 16:16:28,411 44k INFO ====> Epoch: 216, cost 19.34 s
2023-12-16 16:16:48,057 44k INFO ====> Epoch: 217, cost 19.65 s
2023-12-16 16:17:07,428 44k INFO ====> Epoch: 218, cost 19.37 s
2023-12-16 16:17:26,827 44k INFO ====> Epoch: 219, cost 19.40 s
2023-12-16 16:17:46,406 44k INFO ====> Epoch: 220, cost 19.58 s
2023-12-16 16:18:05,807 44k INFO ====> Epoch: 221, cost 19.40 s
2023-12-16 16:18:25,453 44k INFO ====> Epoch: 222, cost 19.65 s
2023-12-16 16:18:44,787 44k INFO ====> Epoch: 223, cost 19.33 s
2023-12-16 16:19:04,253 44k INFO ====> Epoch: 224, cost 19.47 s
2023-12-16 16:19:23,868 44k INFO ====> Epoch: 225, cost 19.62 s
2023-12-16 16:19:43,360 44k INFO ====> Epoch: 226, cost 19.49 s
2023-12-16 16:19:58,621 44k INFO Train Epoch: 227 [60%]
2023-12-16 16:19:58,622 44k INFO Losses: [2.7509920597076416, 2.2684364318847656, 7.127928733825684, 20.45960807800293, 0.9491675496101379], step: 3400, lr: 9.721435835085619e-05, reference_loss: 33.55613327026367
2023-12-16 16:20:03,549 44k INFO ====> Epoch: 227, cost 20.19 s
2023-12-16 16:20:23,108 44k INFO ====> Epoch: 228, cost 19.56 s
2023-12-16 16:20:42,546 44k INFO ====> Epoch: 229, cost 19.44 s
2023-12-16 16:21:02,189 44k INFO ====> Epoch: 230, cost 19.64 s
2023-12-16 16:21:21,982 44k INFO ====> Epoch: 231, cost 19.79 s
2023-12-16 16:21:41,737 44k INFO ====> Epoch: 232, cost 19.76 s
2023-12-16 16:22:01,301 44k INFO ====> Epoch: 233, cost 19.56 s
2023-12-16 16:22:20,962 44k INFO ====> Epoch: 234, cost 19.66 s
2023-12-16 16:22:40,546 44k INFO ====> Epoch: 235, cost 19.58 s
2023-12-16 16:23:00,092 44k INFO ====> Epoch: 236, cost 19.55 s
2023-12-16 16:23:19,519 44k INFO ====> Epoch: 237, cost 19.43 s
2023-12-16 16:23:39,107 44k INFO ====> Epoch: 238, cost 19.59 s
2023-12-16 16:23:58,657 44k INFO ====> Epoch: 239, cost 19.55 s
2023-12-16 16:24:17,465 44k INFO Train Epoch: 240 [93%]
2023-12-16 16:24:17,466 44k INFO Losses: [2.48555326461792, 2.4281094074249268, 10.029970169067383, 21.831945419311523, -0.0773247703909874], step: 3600, lr: 9.705650344424885e-05, reference_loss: 36.6982536315918
2023-12-16 16:24:18,582 44k INFO ====> Epoch: 240, cost 19.93 s
2023-12-16 16:24:38,240 44k INFO ====> Epoch: 241, cost 19.66 s
2023-12-16 16:24:57,803 44k INFO ====> Epoch: 242, cost 19.56 s
2023-12-16 16:25:17,415 44k INFO ====> Epoch: 243, cost 19.61 s
2023-12-16 16:25:37,013 44k INFO ====> Epoch: 244, cost 19.60 s
2023-12-16 16:25:56,570 44k INFO ====> Epoch: 245, cost 19.56 s
2023-12-16 16:26:16,368 44k INFO ====> Epoch: 246, cost 19.80 s
2023-12-16 16:26:35,976 44k INFO ====> Epoch: 247, cost 19.61 s
2023-12-16 16:26:55,725 44k INFO ====> Epoch: 248, cost 19.75 s
2023-12-16 16:27:15,356 44k INFO ====> Epoch: 249, cost 19.63 s
2023-12-16 16:27:35,031 44k INFO ====> Epoch: 250, cost 19.68 s
2023-12-16 16:27:54,542 44k INFO ====> Epoch: 251, cost 19.51 s
2023-12-16 16:28:14,175 44k INFO ====> Epoch: 252, cost 19.63 s
2023-12-16 16:28:33,900 44k INFO ====> Epoch: 253, cost 19.72 s
2023-12-16 16:28:45,281 44k INFO Train Epoch: 254 [27%]
2023-12-16 16:28:45,281 44k INFO Losses: [2.467670202255249, 2.504182815551758, 8.2728853225708, 21.709918975830078, 0.7850950360298157], step: 3800, lr: 9.68867924964598e-05, reference_loss: 35.73975372314453
2023-12-16 16:28:54,691 44k INFO ====> Epoch: 254, cost 20.79 s
2023-12-16 16:29:14,839 44k INFO ====> Epoch: 255, cost 20.15 s
2023-12-16 16:29:34,734 44k INFO ====> Epoch: 256, cost 19.90 s
2023-12-16 16:29:54,253 44k INFO ====> Epoch: 257, cost 19.52 s
2023-12-16 16:30:13,948 44k INFO ====> Epoch: 258, cost 19.69 s
2023-12-16 16:30:34,427 44k INFO ====> Epoch: 259, cost 20.48 s
2023-12-16 16:30:54,377 44k INFO ====> Epoch: 260, cost 19.95 s
2023-12-16 16:31:14,088 44k INFO ====> Epoch: 261, cost 19.71 s
2023-12-16 16:31:35,075 44k INFO ====> Epoch: 262, cost 20.99 s
2023-12-16 16:31:55,725 44k INFO ====> Epoch: 263, cost 20.65 s
2023-12-16 16:32:17,461 44k INFO ====> Epoch: 264, cost 21.74 s
2023-12-16 16:32:39,360 44k INFO ====> Epoch: 265, cost 21.90 s
2023-12-16 16:32:59,912 44k INFO ====> Epoch: 266, cost 20.55 s
2023-12-16 16:33:17,787 44k INFO Train Epoch: 267 [60%]
2023-12-16 16:33:17,788 44k INFO Losses: [2.3996403217315674, 2.36592960357666, 4.848520755767822, 17.112092971801758, 0.5949386358261108], step: 4000, lr: 9.67294694853279e-05, reference_loss: 27.321121215820312
2023-12-16 16:33:24,788 44k INFO Saving model and optimizer state at iteration 267 to ./logs\44k\G_4000.pth
2023-12-16 16:33:26,480 44k INFO Saving model and optimizer state at iteration 267 to ./logs\44k\D_4000.pth
2023-12-16 16:33:36,216 44k INFO ====> Epoch: 267, cost 36.30 s
2023-12-16 16:33:56,739 44k INFO ====> Epoch: 268, cost 20.52 s
2023-12-16 16:34:17,676 44k INFO ====> Epoch: 269, cost 20.94 s
2023-12-16 16:34:39,200 44k INFO ====> Epoch: 270, cost 21.52 s
2023-12-16 16:35:00,391 44k INFO ====> Epoch: 271, cost 21.19 s
2023-12-16 16:35:21,459 44k INFO ====> Epoch: 272, cost 21.07 s
2023-12-16 16:35:43,592 44k INFO ====> Epoch: 273, cost 22.13 s
2023-12-16 16:36:04,412 44k INFO ====> Epoch: 274, cost 20.82 s
2023-12-16 16:36:26,054 44k INFO ====> Epoch: 275, cost 21.64 s
2023-12-16 16:36:47,288 44k INFO ====> Epoch: 276, cost 21.23 s
2023-12-16 16:37:08,113 44k INFO ====> Epoch: 277, cost 20.82 s
2023-12-16 19:04:48,201 44k INFO {'train': {'log_interval': 200, 'eval_interval': 800, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 6, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 10240, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 512, 'port': '8001', 'keep_ckpts': 3, 'all_in_mem': False}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 44100, 'filter_length': 2048, 'hop_length': 512, 'win_length': 2048, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': 22050}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [8, 8, 2, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 768, 'ssl_dim': 768, 'n_speakers': 1, 'speech_encoder': 'vec768l12', 'speaker_embedding': False}, 'spk': {'wdlm': 0}, 'model_dir': './logs\\44k'}
2023-12-16 19:05:04,189 44k INFO Loaded checkpoint './logs\44k\G_4000.pth' (iteration 267)
2023-12-16 19:05:14,820 44k INFO Loaded checkpoint './logs\44k\D_4000.pth' (iteration 267)
2023-12-18 00:45:59,792 44k INFO {'train': {'log_interval': 200, 'eval_interval': 800, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 4, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 10240, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 512, 'port': '8001', 'keep_ckpts': 10, 'all_in_mem': False}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 44100, 'filter_length': 2048, 'hop_length': 512, 'win_length': 2048, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': 22050}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [8, 8, 2, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 768, 'ssl_dim': 768, 'n_speakers': 1, 'speech_encoder': 'vec768l12', 'speaker_embedding': False}, 'spk': {'wdlm': 0}, 'model_dir': './logs\\44k'}
2023-12-18 00:46:35,634 44k INFO Loaded checkpoint './logs\44k\G_4000.pth' (iteration 267)
2023-12-18 00:46:54,299 44k INFO Loaded checkpoint './logs\44k\D_4000.pth' (iteration 267)
2023-12-18 00:48:34,142 44k INFO ====> Epoch: 267, cost 154.35 s
2023-12-18 00:48:53,267 44k INFO ====> Epoch: 268, cost 19.12 s
2023-12-18 00:49:12,551 44k INFO ====> Epoch: 269, cost 19.28 s
2023-12-18 00:49:31,680 44k INFO ====> Epoch: 270, cost 19.13 s
2023-12-18 00:49:50,651 44k INFO ====> Epoch: 271, cost 18.97 s
2023-12-18 00:50:09,505 44k INFO ====> Epoch: 272, cost 18.85 s
2023-12-18 00:50:28,428 44k INFO ====> Epoch: 273, cost 18.92 s
2023-12-18 00:50:47,761 44k INFO ====> Epoch: 274, cost 19.33 s
2023-12-18 00:51:11,246 44k INFO ====> Epoch: 275, cost 23.48 s
2023-12-18 00:51:30,823 44k INFO ====> Epoch: 276, cost 19.58 s
2023-12-18 00:51:50,723 44k INFO ====> Epoch: 277, cost 19.90 s
2023-12-18 00:52:10,645 44k INFO ====> Epoch: 278, cost 19.92 s
2023-12-18 00:52:30,473 44k INFO ====> Epoch: 279, cost 19.83 s
2023-12-18 00:52:41,059 44k INFO Train Epoch: 280 [27%]
2023-12-18 00:52:41,059 44k INFO Losses: [2.4351158142089844, 2.1838958263397217, 9.900190353393555, 21.247800827026367, 1.0966874361038208], step: 4200, lr: 9.65482603409002e-05, reference_loss: 36.86368942260742
2023-12-18 00:52:51,136 44k INFO ====> Epoch: 280, cost 20.66 s
2023-12-18 00:53:10,723 44k INFO ====> Epoch: 281, cost 19.59 s
2023-12-18 00:53:30,481 44k INFO ====> Epoch: 282, cost 19.76 s
2023-12-18 00:53:50,416 44k INFO ====> Epoch: 283, cost 19.94 s
2023-12-18 00:54:10,522 44k INFO ====> Epoch: 284, cost 20.11 s
2023-12-18 00:54:30,201 44k INFO ====> Epoch: 285, cost 19.68 s
2023-12-18 00:54:50,054 44k INFO ====> Epoch: 286, cost 19.85 s
2023-12-18 00:55:09,939 44k INFO ====> Epoch: 287, cost 19.89 s
2023-12-18 00:55:29,714 44k INFO ====> Epoch: 288, cost 19.77 s
2023-12-18 00:55:49,604 44k INFO ====> Epoch: 289, cost 19.89 s
2023-12-18 00:56:09,384 44k INFO ====> Epoch: 290, cost 19.78 s
2023-12-18 00:56:29,295 44k INFO ====> Epoch: 291, cost 19.91 s
2023-12-18 00:56:49,019 44k INFO ====> Epoch: 292, cost 19.72 s
2023-12-18 00:57:03,857 44k INFO Train Epoch: 293 [60%]
2023-12-18 00:57:03,857 44k INFO Losses: [2.3994483947753906, 2.495651960372925, 10.407428741455078, 21.881799697875977, 0.7986530661582947], step: 4400, lr: 9.639148703212408e-05, reference_loss: 37.98298263549805
2023-12-18 00:57:09,169 44k INFO ====> Epoch: 293, cost 20.15 s
2023-12-18 00:57:28,997 44k INFO ====> Epoch: 294, cost 19.83 s
2023-12-18 00:57:48,787 44k INFO ====> Epoch: 295, cost 19.79 s
2023-12-18 00:58:08,528 44k INFO ====> Epoch: 296, cost 19.74 s
2023-12-18 00:58:28,515 44k INFO ====> Epoch: 297, cost 19.99 s
2023-12-18 00:58:48,419 44k INFO ====> Epoch: 298, cost 19.90 s
2023-12-18 00:59:08,353 44k INFO ====> Epoch: 299, cost 19.93 s
2023-12-18 00:59:27,970 44k INFO ====> Epoch: 300, cost 19.62 s
2023-12-18 01:00:12,275 44k INFO ====> Epoch: 301, cost 44.31 s
2023-12-18 02:00:18,227 44k INFO {'train': {'log_interval': 200, 'eval_interval': 800, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 4, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 10240, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 512, 'port': '8001', 'keep_ckpts': 10, 'all_in_mem': False}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 44100, 'filter_length': 2048, 'hop_length': 512, 'win_length': 2048, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': 22050}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [8, 8, 2, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 768, 'ssl_dim': 768, 'n_speakers': 1, 'speech_encoder': 'vec768l12', 'speaker_embedding': False}, 'spk': {'wdlm': 0}, 'model_dir': './logs\\44k'}
2023-12-18 02:05:54,785 44k INFO {'train': {'log_interval': 200, 'eval_interval': 800, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 4, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 10240, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 512, 'port': '8001', 'keep_ckpts': 10, 'all_in_mem': False}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 44100, 'filter_length': 2048, 'hop_length': 512, 'win_length': 2048, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': 22050}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [8, 8, 2, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 768, 'ssl_dim': 768, 'n_speakers': 1, 'speech_encoder': 'vec768l12', 'speaker_embedding': False}, 'spk': {'wdlm': 0}, 'model_dir': './logs\\44k'}
2023-12-18 02:08:41,510 44k INFO Loaded checkpoint './logs\44k\G_4000.pth' (iteration 267)
2023-12-18 02:08:52,614 44k INFO Loaded checkpoint './logs\44k\D_4000.pth' (iteration 267)
2023-12-18 02:09:30,741 44k INFO ====> Epoch: 267, cost 215.96 s
2023-12-18 02:09:46,518 44k INFO ====> Epoch: 268, cost 15.78 s
2023-12-18 02:10:02,505 44k INFO ====> Epoch: 269, cost 15.99 s
2023-12-18 02:10:18,581 44k INFO ====> Epoch: 270, cost 16.08 s
2023-12-18 02:10:34,547 44k INFO ====> Epoch: 271, cost 15.97 s
2023-12-18 02:10:50,587 44k INFO ====> Epoch: 272, cost 16.04 s
2023-12-18 02:11:06,655 44k INFO ====> Epoch: 273, cost 16.07 s
2023-12-18 02:11:22,763 44k INFO ====> Epoch: 274, cost 16.11 s
2023-12-18 02:11:39,154 44k INFO ====> Epoch: 275, cost 16.39 s
2023-12-18 02:11:55,540 44k INFO ====> Epoch: 276, cost 16.39 s
2023-12-18 02:12:11,953 44k INFO ====> Epoch: 277, cost 16.41 s
2023-12-18 02:12:28,143 44k INFO ====> Epoch: 278, cost 16.19 s
2023-12-18 02:12:44,336 44k INFO ====> Epoch: 279, cost 16.19 s
2023-12-18 02:12:54,121 44k INFO Train Epoch: 280 [27%]
2023-12-18 02:12:54,121 44k INFO Losses: [2.4663281440734863, 2.9571053981781006, 10.239912986755371, 23.176321029663086, 0.9816058278083801], step: 4200, lr: 9.65482603409002e-05, reference_loss: 39.82127380371094
2023-12-18 02:13:01,891 44k INFO ====> Epoch: 280, cost 17.56 s
2023-12-18 02:13:18,193 44k INFO ====> Epoch: 281, cost 16.30 s
2023-12-18 02:13:34,582 44k INFO ====> Epoch: 282, cost 16.39 s
2023-12-18 02:13:51,082 44k INFO ====> Epoch: 283, cost 16.50 s
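The lr values above track the per-epoch exponential decay implied by the config (learning_rate 0.0001, lr_decay 0.999875). A minimal sketch, assuming the scheduler steps once per epoch so that lr at "Train Epoch: N" is learning_rate * lr_decay**(N - 1); this matches the fresh run to many digits, while the resumed runs later in the log are offset by a step or two (e.g. the epoch-280 value is slightly below the formula).

# Sketch: check logged lr against learning_rate * lr_decay**(epoch - 1).
# Assumption: one scheduler step per epoch, inferred from the matching
# values below, not stated anywhere in this log.
base_lr, decay = 1e-4, 0.999875

def expected_lr(epoch: int) -> float:
    return base_lr * decay ** (epoch - 1)

for epoch, logged in [(14, 9.983762181915804e-05),
                      (27, 9.967550730505221e-05),
                      (54, 9.933964855674948e-05)]:
    print(epoch, expected_lr(epoch), logged)  # agree to ~10 significant digits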
2023-12-18 02:14:07,469 44k INFO ====> Epoch: 284, cost 16.39 s
2023-12-18 02:14:23,972 44k INFO ====> Epoch: 285, cost 16.50 s
2023-12-18 02:14:40,360 44k INFO ====> Epoch: 286, cost 16.39 s
2023-12-18 02:14:56,796 44k INFO ====> Epoch: 287, cost 16.44 s
2023-12-18 02:15:13,577 44k INFO ====> Epoch: 288, cost 16.78 s
2023-12-18 02:15:30,124 44k INFO ====> Epoch: 289, cost 16.55 s
2023-12-18 02:15:46,510 44k INFO ====> Epoch: 290, cost 16.39 s
2023-12-18 02:16:03,027 44k INFO ====> Epoch: 291, cost 16.52 s
2023-12-18 02:16:19,575 44k INFO ====> Epoch: 292, cost 16.55 s
2023-12-18 02:16:32,430 44k INFO Train Epoch: 293 [60%]
2023-12-18 02:16:32,430 44k INFO Losses: [2.667595863342285, 2.330214500427246, 6.057078838348389, 18.593589782714844, 1.3511298894882202], step: 4400, lr: 9.639148703212408e-05, reference_loss: 30.999608993530273
2023-12-18 02:16:36,543 44k INFO ====> Epoch: 293, cost 16.97 s
2023-12-18 02:16:52,842 44k INFO ====> Epoch: 294, cost 16.30 s
2023-12-18 02:17:09,176 44k INFO ====> Epoch: 295, cost 16.33 s
2023-12-18 02:17:25,514 44k INFO ====> Epoch: 296, cost 16.34 s
2023-12-18 02:17:41,947 44k INFO ====> Epoch: 297, cost 16.43 s
2023-12-18 02:17:58,412 44k INFO ====> Epoch: 298, cost 16.47 s
2023-12-18 02:18:14,915 44k INFO ====> Epoch: 299, cost 16.50 s
2023-12-18 02:18:31,340 44k INFO ====> Epoch: 300, cost 16.42 s
2023-12-18 02:18:47,846 44k INFO ====> Epoch: 301, cost 16.51 s
2023-12-18 02:19:04,382 44k INFO ====> Epoch: 302, cost 16.54 s
2023-12-18 02:19:20,685 44k INFO ====> Epoch: 303, cost 16.30 s
2023-12-18 02:19:37,054 44k INFO ====> Epoch: 304, cost 16.37 s
2023-12-18 02:19:53,540 44k INFO ====> Epoch: 305, cost 16.49 s
2023-12-18 02:20:09,324 44k INFO Train Epoch: 306 [93%]
2023-12-18 02:20:09,324 44k INFO Losses: [2.795588493347168, 2.8995909690856934, 10.883658409118652, 20.771377563476562, 0.9570157527923584], step: 4600, lr: 9.62349682889948e-05, reference_loss: 38.30723190307617
2023-12-18 02:20:10,371 44k INFO ====> Epoch: 306, cost 16.83 s
2023-12-18 02:20:26,942 44k INFO ====> Epoch: 307, cost 16.57 s
2023-12-18 02:20:43,349 44k INFO ====> Epoch: 308, cost 16.41 s
2023-12-18 02:20:59,629 44k INFO ====> Epoch: 309, cost 16.28 s
2023-12-18 02:21:15,881 44k INFO ====> Epoch: 310, cost 16.25 s
2023-12-18 02:21:32,404 44k INFO ====> Epoch: 311, cost 16.52 s
2023-12-18 02:21:48,810 44k INFO ====> Epoch: 312, cost 16.41 s
2023-12-18 02:22:05,302 44k INFO ====> Epoch: 313, cost 16.49 s
2023-12-18 02:22:21,683 44k INFO ====> Epoch: 314, cost 16.38 s
2023-12-18 02:22:38,067 44k INFO ====> Epoch: 315, cost 16.38 s
2023-12-18 02:22:54,438 44k INFO ====> Epoch: 316, cost 16.37 s
2023-12-18 02:23:10,781 44k INFO ====> Epoch: 317, cost 16.34 s
2023-12-18 02:23:27,078 44k INFO ====> Epoch: 318, cost 16.30 s
2023-12-18 02:23:43,579 44k INFO ====> Epoch: 319, cost 16.50 s
2023-12-18 02:23:53,346 44k INFO Train Epoch: 320 [27%]
2023-12-18 02:23:53,346 44k INFO Losses: [2.7768359184265137, 2.3091416358947754, 7.972097873687744, 21.824607849121094, 0.5270208716392517], step: 4800, lr: 9.606669386019102e-05, reference_loss: 35.40970230102539
2023-12-18 02:24:01,008 44k INFO Saving model and optimizer state at iteration 320 to ./logs\44k\G_4800.pth
2023-12-18 02:24:02,138 44k INFO Saving model and optimizer state at iteration 320 to ./logs\44k\D_4800.pth
2023-12-18 02:24:19,670 44k INFO ====> Epoch: 320, cost 36.09 s
2023-12-18 02:24:36,052 44k INFO ====> Epoch: 321, cost 16.38 s
2023-12-18 02:24:52,551 44k INFO ====> Epoch: 322, cost 16.50 s
2023-12-18 02:25:08,751 44k INFO ====> Epoch: 323, cost 16.20 s
2023-12-18 02:25:25,097 44k INFO ====> Epoch: 324, cost 16.35 s
2023-12-18 02:25:41,471 44k INFO ====> Epoch: 325, cost 16.37 s
2023-12-18 02:25:57,762 44k INFO ====> Epoch: 326, cost 16.29 s
2023-12-18 02:26:14,118 44k INFO ====> Epoch: 327, cost 16.36 s
2023-12-18 02:26:30,463 44k INFO ====> Epoch: 328, cost 16.35 s
2023-12-18 02:26:46,820 44k INFO ====> Epoch: 329, cost 16.36 s
2023-12-18 02:27:03,103 44k INFO ====> Epoch: 330, cost 16.28 s
2023-12-18 02:27:19,370 44k INFO ====> Epoch: 331, cost 16.27 s
2023-12-18 02:27:35,721 44k INFO ====> Epoch: 332, cost 16.35 s
2023-12-18 02:27:48,479 44k INFO Train Epoch: 333 [60%]
2023-12-18 02:27:48,479 44k INFO Losses: [2.273578643798828, 2.649947166442871, 12.30436897277832, 27.678691864013672, 0.6731786131858826], step: 5000, lr: 9.591070251030582e-05, reference_loss: 45.57976531982422
2023-12-18 02:27:52,571 44k INFO ====> Epoch: 333, cost 16.85 s
2023-12-18 02:28:08,901 44k INFO ====> Epoch: 334, cost 16.33 s
2023-12-18 02:28:25,101 44k INFO ====> Epoch: 335, cost 16.20 s
2023-12-18 02:28:41,250 44k INFO ====> Epoch: 336, cost 16.15 s
2023-12-18 02:28:57,537 44k INFO ====> Epoch: 337, cost 16.29 s
2023-12-18 02:29:13,658 44k INFO ====> Epoch: 338, cost 16.12 s
2023-12-18 02:29:29,851 44k INFO ====> Epoch: 339, cost 16.19 s
2023-12-18 02:29:46,124 44k INFO ====> Epoch: 340, cost 16.27 s
2023-12-18 02:30:02,446 44k INFO ====> Epoch: 341, cost 16.32 s
2023-12-18 02:30:18,962 44k INFO ====> Epoch: 342, cost 16.52 s
2023-12-18 02:30:35,484 44k INFO ====> Epoch: 343, cost 16.52 s
2023-12-18 02:30:51,698 44k INFO ====> Epoch: 344, cost 16.21 s
2023-12-18 02:31:08,049 44k INFO ====> Epoch: 345, cost 16.35 s
2023-12-18 02:31:23,702 44k INFO Train Epoch: 346 [93%]
2023-12-18 02:31:23,702 44k INFO Losses: [2.3827779293060303, 2.7761096954345703, 6.374064922332764, 24.456802368164062, -0.039804231375455856], step: 5200, lr: 9.575496445633683e-05, reference_loss: 35.949951171875
2023-12-18 02:31:24,774 44k INFO ====> Epoch: 346, cost 16.72 s
2023-12-18 02:31:41,156 44k INFO ====> Epoch: 347, cost 16.38 s
2023-12-18 02:31:57,510 44k INFO ====> Epoch: 348, cost 16.35 s
2023-12-18 02:32:13,840 44k INFO ====> Epoch: 349, cost 16.33 s
2023-12-18 02:32:30,116 44k INFO ====> Epoch: 350, cost 16.28 s
2023-12-18 02:32:46,417 44k INFO ====> Epoch: 351, cost 16.30 s
2023-12-18 02:33:02,639 44k INFO ====> Epoch: 352, cost 16.22 s
2023-12-18 02:33:19,044 44k INFO ====> Epoch: 353, cost 16.40 s
2023-12-18 02:33:35,377 44k INFO ====> Epoch: 354, cost 16.33 s
2023-12-18 02:33:51,715 44k INFO ====> Epoch: 355, cost 16.34 s
2023-12-18 02:34:07,986 44k INFO ====> Epoch: 356, cost 16.27 s
2023-12-18 02:34:24,495 44k INFO ====> Epoch: 357, cost 16.51 s
2023-12-18 02:34:40,766 44k INFO ====> Epoch: 358, cost 16.27 s
2023-12-18 02:34:56,935 44k INFO ====> Epoch: 359, cost 16.17 s
2023-12-18 02:35:06,532 44k INFO Train Epoch: 360 [27%]
2023-12-18 02:35:06,532 44k INFO Losses: [2.7126941680908203, 2.2726285457611084, 7.150015354156494, 19.460657119750977, 0.37591493129730225], step: 5400, lr: 9.558752935207586e-05, reference_loss: 31.97191047668457
2023-12-18 02:35:13,924 44k INFO ====> Epoch: 360, cost 16.99 s
2023-12-18 02:35:30,103 44k INFO ====> Epoch: 361, cost 16.18 s
2023-12-18 02:35:46,427 44k INFO ====> Epoch: 362, cost 16.32 s
2023-12-18 02:36:02,809 44k INFO ====> Epoch: 363, cost 16.38 s
2023-12-18 02:36:19,221 44k INFO ====> Epoch: 364, cost 16.41 s
2023-12-18 02:36:35,556 44k INFO ====> Epoch: 365, cost 16.33 s
2023-12-18 02:36:51,787 44k INFO ====> Epoch: 366, cost 16.23 s
2023-12-18 02:37:08,253 44k INFO ====> Epoch: 367, cost 16.47 s
2023-12-18 02:37:24,532 44k INFO ====> Epoch: 368, cost 16.28 s
2023-12-18 02:37:40,672 44k INFO ====> Epoch: 369, cost 16.14 s
2023-12-18 02:37:57,214 44k INFO ====> Epoch: 370, cost 16.54 s
2023-12-18 02:38:13,543 44k INFO ====> Epoch: 371, cost 16.33 s
2023-12-18 02:38:29,952 44k INFO ====> Epoch: 372, cost 16.41 s
2023-12-18 02:38:42,805 44k INFO Train Epoch: 373 [60%]
2023-12-18 02:38:42,805 44k INFO Losses: [2.1800777912139893, 3.0790042877197266, 10.158679008483887, 23.722606658935547, 1.0680791139602661], step: 5600, lr: 9.543231606080218e-05, reference_loss: 40.20844650268555
2023-12-18 02:38:48,100 44k INFO Saving model and optimizer state at iteration 373 to ./logs\44k\G_5600.pth
2023-12-18 02:38:49,324 44k INFO Saving model and optimizer state at iteration 373 to ./logs\44k\D_5600.pth
2023-12-18 02:38:58,448 44k INFO ====> Epoch: 373, cost 28.50 s
2023-12-18 02:39:14,761 44k INFO ====> Epoch: 374, cost 16.31 s
2023-12-18 02:39:31,321 44k INFO ====> Epoch: 375, cost 16.56 s
2023-12-18 02:39:47,720 44k INFO ====> Epoch: 376, cost 16.40 s
2023-12-18 02:40:04,286 44k INFO ====> Epoch: 377, cost 16.57 s
2023-12-18 02:40:20,562 44k INFO ====> Epoch: 378, cost 16.28 s
2023-12-18 02:40:36,985 44k INFO ====> Epoch: 379, cost 16.42 s
2023-12-18 02:40:53,247 44k INFO ====> Epoch: 380, cost 16.26 s
2023-12-18 02:41:09,529 44k INFO ====> Epoch: 381, cost 16.28 s
2023-12-18 02:41:25,823 44k INFO ====> Epoch: 382, cost 16.29 s
2023-12-18 02:41:42,042 44k INFO ====> Epoch: 383, cost 16.22 s
2023-12-18 02:41:58,287 44k INFO ====> Epoch: 384, cost 16.24 s
2023-12-18 02:42:14,527 44k INFO ====> Epoch: 385, cost 16.24 s
2023-12-18 02:42:30,462 44k INFO Train Epoch: 386 [93%]
2023-12-18 02:42:30,462 44k INFO Losses: [1.64399254322052, 3.2245638370513916, 14.835515022277832, 31.363067626953125, 0.4534532427787781], step: 5800, lr: 9.527735480204728e-05, reference_loss: 51.520591735839844
2023-12-18 02:42:31,481 44k INFO ====> Epoch: 386, cost 16.94 s
2023-12-18 02:42:47,839 44k INFO ====> Epoch: 387, cost 16.37 s
2023-12-18 02:43:04,234 44k INFO ====> Epoch: 388, cost 16.40 s
2023-12-18 02:43:20,388 44k INFO ====> Epoch: 389, cost 16.15 s
2023-12-18 02:43:36,732 44k INFO ====> Epoch: 390, cost 16.34 s
2023-12-18 02:43:52,937 44k INFO ====> Epoch: 391, cost 16.21 s
2023-12-18 02:44:09,217 44k INFO ====> Epoch: 392, cost 16.28 s
2023-12-18 02:44:25,565 44k INFO ====> Epoch: 393, cost 16.35 s
2023-12-18 02:44:41,848 44k INFO ====> Epoch: 394, cost 16.28 s
2023-12-18 02:44:58,138 44k INFO ====> Epoch: 395, cost 16.29 s
2023-12-18 02:45:14,472 44k INFO ====> Epoch: 396, cost 16.33 s
2023-12-18 02:45:31,018 44k INFO ====> Epoch: 397, cost 16.55 s
2023-12-18 02:45:47,313 44k INFO ====> Epoch: 398, cost 16.30 s
2023-12-18 02:46:03,625 44k INFO ====> Epoch: 399, cost 16.31 s
2023-12-18 02:46:13,149 44k INFO Train Epoch: 400 [27%]
2023-12-18 02:46:13,149 44k INFO Losses: [2.2236180305480957, 2.5851387977600098, 9.906683921813965, 22.398427963256836, 0.3596521317958832], step: 6000, lr: 9.511075483591955e-05, reference_loss: 37.4735221862793
2023-12-18 02:46:20,507 44k INFO ====> Epoch: 400, cost 16.88 s
2023-12-18 02:46:36,797 44k INFO ====> Epoch: 401, cost 16.29 s
2023-12-18 02:46:53,006 44k INFO ====> Epoch: 402, cost 16.21 s
2023-12-18 02:47:09,275 44k INFO ====> Epoch: 403, cost 16.27 s
2023-12-18 02:47:25,528 44k INFO ====> Epoch: 404, cost 16.25 s
2023-12-18 02:47:41,778 44k INFO ====> Epoch: 405, cost 16.25 s
2023-12-18 02:47:58,039 44k INFO ====> Epoch: 406, cost 16.26 s
2023-12-18 02:48:14,312 44k INFO ====> Epoch: 407, cost 16.27 s
2023-12-18 02:48:30,485 44k INFO ====> Epoch: 408, cost 16.17 s
2023-12-18 02:48:46,733 44k INFO ====> Epoch: 409, cost 16.25 s
2023-12-18 02:49:03,152 44k INFO ====> Epoch: 410, cost 16.42 s
2023-12-18 02:49:19,471 44k INFO ====> Epoch: 411, cost 16.32 s
2023-12-18 02:49:35,690 44k INFO ====> Epoch: 412, cost 16.22 s
2023-12-18 02:49:48,472 44k INFO Train Epoch: 413 [60%]
2023-12-18 02:49:48,472 44k INFO Losses: [2.440399408340454, 2.545651912689209, 8.020552635192871, 21.467039108276367, 0.9791371822357178], step: 6200, lr: 9.495631572243191e-05, reference_loss: 35.452781677246094
2023-12-18 02:49:52,547 44k INFO ====> Epoch: 413, cost 16.86 s
2023-12-18 02:50:08,745 44k INFO ====> Epoch: 414, cost 16.20 s
2023-12-18 02:50:24,928 44k INFO ====> Epoch: 415, cost 16.18 s
2023-12-18 02:50:41,057 44k INFO ====> Epoch: 416, cost 16.13 s
2023-12-18 02:50:57,282 44k INFO ====> Epoch: 417, cost 16.22 s
2023-12-18 02:51:13,531 44k INFO ====> Epoch: 418, cost 16.25 s
2023-12-18 02:51:29,739 44k INFO ====> Epoch: 419, cost 16.21 s
2023-12-18 02:51:46,038 44k INFO ====> Epoch: 420, cost 16.30 s
2023-12-18 02:52:02,203 44k INFO ====> Epoch: 421, cost 16.16 s
2023-12-18 02:52:18,412 44k INFO ====> Epoch: 422, cost 16.21 s
2023-12-18 02:52:34,659 44k INFO ====> Epoch: 423, cost 16.25 s
2023-12-18 02:52:50,736 44k INFO ====> Epoch: 424, cost 16.08 s
2023-12-18 02:53:06,824 44k INFO ====> Epoch: 425, cost 16.09 s
2023-12-18 02:53:22,414 44k INFO Train Epoch: 426 [93%]
2023-12-18 02:53:22,414 44k INFO Losses: [2.115166187286377, 2.556946277618408, 13.124458312988281, 27.000316619873047, 0.6712955832481384], step: 6400, lr: 9.480212738436729e-05, reference_loss: 45.46818161010742
2023-12-18 02:53:27,768 44k INFO Saving model and optimizer state at iteration 426 to ./logs\44k\G_6400.pth
2023-12-18 02:53:28,966 44k INFO Saving model and optimizer state at iteration 426 to ./logs\44k\D_6400.pth
2023-12-18 02:53:38,147 44k INFO ====> Epoch: 426, cost 31.32 s
2023-12-18 02:53:54,776 44k INFO ====> Epoch: 427, cost 16.63 s
2023-12-18 02:54:11,007 44k INFO ====> Epoch: 428, cost 16.23 s
2023-12-18 02:54:27,589 44k INFO ====> Epoch: 429, cost 16.58 s
2023-12-18 02:54:43,860 44k INFO ====> Epoch: 430, cost 16.27 s
2023-12-18 02:54:59,990 44k INFO ====> Epoch: 431, cost 16.13 s
2023-12-18 02:55:16,108 44k INFO ====> Epoch: 432, cost 16.12 s
2023-12-18 02:55:32,342 44k INFO ====> Epoch: 433, cost 16.23 s
2023-12-18 02:55:48,464 44k INFO ====> Epoch: 434, cost 16.12 s
2023-12-18 02:56:04,714 44k INFO ====> Epoch: 435, cost 16.25 s
2023-12-18 02:56:20,863 44k INFO ====> Epoch: 436, cost 16.15 s
2023-12-18 02:56:37,321 44k INFO ====> Epoch: 437, cost 16.46 s
2023-12-18 02:56:53,531 44k INFO ====> Epoch: 438, cost 16.21 s
2023-12-18 02:57:09,676 44k INFO ====> Epoch: 439, cost 16.14 s
2023-12-18 02:57:19,261 44k INFO Train Epoch: 440 [27%]
2023-12-18 02:57:19,261 44k INFO Losses: [2.4228711128234863, 2.587799549102783, 7.584394454956055, 21.408559799194336, 0.8824259042739868], step: 6600, lr: 9.463635839084426e-05, reference_loss: 34.886051177978516
2023-12-18 02:57:26,676 44k INFO ====> Epoch: 440, cost 17.00 s
2023-12-18 02:57:42,881 44k INFO ====> Epoch: 441, cost 16.20 s
2023-12-18 02:57:59,165 44k INFO ====> Epoch: 442, cost 16.28 s
2023-12-18 02:58:15,280 44k INFO ====> Epoch: 443, cost 16.11 s
2023-12-18 02:58:31,512 44k INFO ====> Epoch: 444, cost 16.23 s
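Each "Losses:" record lists five components; in so-vits-svc's train.py these should be the discriminator, generator, feature-matching, mel (weighted by c_mel = 45) and KL (weighted by c_kl = 1.0) terms, in that order (that naming is my reading of the code, not something this log states). What the log itself does confirm is that reference_loss is simply their sum, which the sketch below checks on the epoch-440 record:

import re

# Sketch: parse one "Losses:" record and confirm reference_loss equals the
# sum of the five components; this identity holds for every record above.
line = ("2023-12-18 02:57:19,261 44k INFO Losses: [2.4228711128234863, "
        "2.587799549102783, 7.584394454956055, 21.408559799194336, "
        "0.8824259042739868], step: 6600, lr: 9.463635839084426e-05, "
        "reference_loss: 34.886051177978516")
losses = [float(x) for x in re.search(r"\[(.*?)\]", line).group(1).split(",")]
ref = float(re.search(r"reference_loss: ([\d.eE+-]+)", line).group(1))
assert abs(sum(losses) - ref) < 1e-3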
2023-12-18 02:58:47,920 44k INFO ====> Epoch: 445, cost 16.41 s 2023-12-18 02:59:04,009 44k INFO ====> Epoch: 446, cost 16.09 s 2023-12-18 02:59:20,292 44k INFO ====> Epoch: 447, cost 16.28 s 2023-12-18 02:59:36,415 44k INFO ====> Epoch: 448, cost 16.12 s 2023-12-18 02:59:52,867 44k INFO ====> Epoch: 449, cost 16.45 s 2023-12-18 03:00:09,080 44k INFO ====> Epoch: 450, cost 16.21 s 2023-12-18 03:00:25,942 44k INFO ====> Epoch: 451, cost 16.86 s 2023-12-18 03:00:42,177 44k INFO ====> Epoch: 452, cost 16.23 s 2023-12-18 03:00:54,963 44k INFO Train Epoch: 453 [60%] 2023-12-18 03:00:54,963 44k INFO Losses: [2.2092530727386475, 2.8768882751464844, 10.324399948120117, 21.902294158935547, 0.6800149083137512], step: 6800, lr: 9.448268959367411e-05, reference_loss: 37.99285125732422 2023-12-18 03:00:59,073 44k INFO ====> Epoch: 453, cost 16.90 s 2023-12-18 03:01:15,284 44k INFO ====> Epoch: 454, cost 16.21 s 2023-12-18 03:01:31,425 44k INFO ====> Epoch: 455, cost 16.14 s 2023-12-18 03:01:47,850 44k INFO ====> Epoch: 456, cost 16.42 s 2023-12-18 03:02:04,043 44k INFO ====> Epoch: 457, cost 16.19 s 2023-12-18 03:02:20,122 44k INFO ====> Epoch: 458, cost 16.08 s 2023-12-18 03:02:36,475 44k INFO ====> Epoch: 459, cost 16.35 s 2023-12-18 03:02:53,057 44k INFO ====> Epoch: 460, cost 16.58 s 2023-12-18 03:03:09,345 44k INFO ====> Epoch: 461, cost 16.29 s 2023-12-18 03:03:25,607 44k INFO ====> Epoch: 462, cost 16.26 s 2023-12-18 03:03:42,077 44k INFO ====> Epoch: 463, cost 16.47 s 2023-12-18 03:03:58,332 44k INFO ====> Epoch: 464, cost 16.26 s 2023-12-18 03:04:14,591 44k INFO ====> Epoch: 465, cost 16.26 s 2023-12-18 03:04:30,309 44k INFO Train Epoch: 466 [93%] 2023-12-18 03:04:30,309 44k INFO Losses: [2.9419851303100586, 1.9626998901367188, 1.2887744903564453, 12.46977710723877, 0.17783141136169434], step: 7000, lr: 9.432927032110133e-05, reference_loss: 18.841068267822266 2023-12-18 03:04:31,338 44k INFO ====> Epoch: 466, cost 16.75 s 2023-12-18 03:04:47,605 44k INFO ====> Epoch: 467, cost 16.27 s 2023-12-18 03:05:04,070 44k INFO ====> Epoch: 468, cost 16.47 s 2023-12-18 03:05:20,380 44k INFO ====> Epoch: 469, cost 16.31 s 2023-12-18 03:05:36,740 44k INFO ====> Epoch: 470, cost 16.36 s 2023-12-18 03:05:52,944 44k INFO ====> Epoch: 471, cost 16.20 s 2023-12-18 03:06:09,118 44k INFO ====> Epoch: 472, cost 16.17 s 2023-12-18 03:06:25,516 44k INFO ====> Epoch: 473, cost 16.40 s 2023-12-18 03:06:41,720 44k INFO ====> Epoch: 474, cost 16.20 s 2023-12-18 03:06:57,853 44k INFO ====> Epoch: 475, cost 16.13 s 2023-12-18 03:07:14,131 44k INFO ====> Epoch: 476, cost 16.28 s 2023-12-18 03:07:30,672 44k INFO ====> Epoch: 477, cost 16.54 s 2023-12-18 03:07:46,904 44k INFO ====> Epoch: 478, cost 16.23 s 2023-12-18 03:08:03,386 44k INFO ====> Epoch: 479, cost 16.48 s 2023-12-18 03:08:13,369 44k INFO Train Epoch: 480 [27%] 2023-12-18 03:08:13,369 44k INFO Losses: [2.371331214904785, 2.164797067642212, 10.70827865600586, 23.26729393005371, 0.7630030512809753], step: 7200, lr: 9.416432815543143e-05, reference_loss: 39.27470397949219 2023-12-18 03:08:18,722 44k INFO Saving model and optimizer state at iteration 480 to ./logs\44k\G_7200.pth 2023-12-18 03:08:19,872 44k INFO Saving model and optimizer state at iteration 480 to ./logs\44k\D_7200.pth 2023-12-18 03:08:32,657 44k INFO ====> Epoch: 480, cost 29.27 s 2023-12-18 03:08:49,069 44k INFO ====> Epoch: 481, cost 16.41 s 2023-12-18 03:09:05,633 44k INFO ====> Epoch: 482, cost 16.56 s 2023-12-18 03:09:21,776 44k INFO ====> Epoch: 483, cost 16.14 s 2023-12-18 03:09:38,020 44k 
INFO ====> Epoch: 484, cost 16.24 s 2023-12-18 03:09:54,544 44k INFO ====> Epoch: 485, cost 16.52 s 2023-12-18 03:10:11,138 44k INFO ====> Epoch: 486, cost 16.59 s 2023-12-18 03:10:27,528 44k INFO ====> Epoch: 487, cost 16.39 s 2023-12-18 03:10:43,799 44k INFO ====> Epoch: 488, cost 16.27 s 2023-12-18 03:11:00,218 44k INFO ====> Epoch: 489, cost 16.42 s 2023-12-18 03:11:16,761 44k INFO ====> Epoch: 490, cost 16.54 s 2023-12-18 03:11:33,054 44k INFO ====> Epoch: 491, cost 16.29 s 2023-12-18 03:11:49,258 44k INFO ====> Epoch: 492, cost 16.20 s 2023-12-18 03:12:02,043 44k INFO Train Epoch: 493 [60%] 2023-12-18 03:12:02,043 44k INFO Losses: [2.5731394290924072, 2.0920982360839844, 8.718671798706055, 23.59229278564453, 0.7969865202903748], step: 7400, lr: 9.401142583237059e-05, reference_loss: 37.773189544677734 2023-12-18 03:12:06,167 44k INFO ====> Epoch: 493, cost 16.91 s 2023-12-18 03:12:22,432 44k INFO ====> Epoch: 494, cost 16.27 s 2023-12-18 03:12:38,628 44k INFO ====> Epoch: 495, cost 16.20 s 2023-12-18 03:12:54,618 44k INFO ====> Epoch: 496, cost 15.99 s 2023-12-18 03:13:10,799 44k INFO ====> Epoch: 497, cost 16.18 s 2023-12-18 03:13:27,104 44k INFO ====> Epoch: 498, cost 16.31 s 2023-12-18 03:13:43,419 44k INFO ====> Epoch: 499, cost 16.31 s 2023-12-18 03:13:59,683 44k INFO ====> Epoch: 500, cost 16.26 s 2023-12-18 03:14:16,015 44k INFO ====> Epoch: 501, cost 16.33 s 2023-12-18 03:14:32,282 44k INFO ====> Epoch: 502, cost 16.27 s 2023-12-18 03:14:48,776 44k INFO ====> Epoch: 503, cost 16.49 s 2023-12-18 03:15:04,986 44k INFO ====> Epoch: 504, cost 16.21 s 2023-12-18 03:15:21,572 44k INFO ====> Epoch: 505, cost 16.59 s 2023-12-18 03:15:37,166 44k INFO Train Epoch: 506 [93%] 2023-12-18 03:15:37,166 44k INFO Losses: [2.1747968196868896, 2.423266649246216, 9.536327362060547, 23.640836715698242, 1.4842207431793213], step: 7600, lr: 9.385877178932038e-05, reference_loss: 39.25944900512695 2023-12-18 03:15:38,190 44k INFO ====> Epoch: 506, cost 16.62 s 2023-12-18 03:15:54,543 44k INFO ====> Epoch: 507, cost 16.35 s 2023-12-18 03:16:10,721 44k INFO ====> Epoch: 508, cost 16.18 s 2023-12-18 03:16:27,035 44k INFO ====> Epoch: 509, cost 16.31 s 2023-12-18 03:16:43,354 44k INFO ====> Epoch: 510, cost 16.32 s 2023-12-18 03:16:59,521 44k INFO ====> Epoch: 511, cost 16.17 s 2023-12-18 03:17:15,776 44k INFO ====> Epoch: 512, cost 16.25 s 2023-12-18 03:17:32,108 44k INFO ====> Epoch: 513, cost 16.33 s 2023-12-18 03:17:48,422 44k INFO ====> Epoch: 514, cost 16.31 s 2023-12-18 03:18:04,653 44k INFO ====> Epoch: 515, cost 16.23 s 2023-12-18 03:18:20,890 44k INFO ====> Epoch: 516, cost 16.24 s 2023-12-18 03:18:37,130 44k INFO ====> Epoch: 517, cost 16.24 s 2023-12-18 03:18:53,257 44k INFO ====> Epoch: 518, cost 16.13 s 2023-12-18 03:19:09,542 44k INFO ====> Epoch: 519, cost 16.29 s 2023-12-18 03:19:19,228 44k INFO Train Epoch: 520 [27%] 2023-12-18 03:19:19,228 44k INFO Losses: [2.44846248626709, 2.524541139602661, 10.14451789855957, 24.795408248901367, 1.2752424478530884], step: 7800, lr: 9.36946523274254e-05, reference_loss: 41.18817138671875 2023-12-18 03:19:26,405 44k INFO ====> Epoch: 520, cost 16.86 s 2023-12-18 03:19:42,632 44k INFO ====> Epoch: 521, cost 16.23 s 2023-12-18 03:19:58,709 44k INFO ====> Epoch: 522, cost 16.08 s 2023-12-18 03:20:14,958 44k INFO ====> Epoch: 523, cost 16.25 s 2023-12-18 03:20:31,163 44k INFO ====> Epoch: 524, cost 16.21 s 2023-12-18 03:20:47,502 44k INFO ====> Epoch: 525, cost 16.34 s 2023-12-18 03:21:03,719 44k INFO ====> Epoch: 526, cost 16.22 s 2023-12-18 
03:21:20,092 44k INFO ====> Epoch: 527, cost 16.37 s 2023-12-18 03:21:36,224 44k INFO ====> Epoch: 528, cost 16.13 s 2023-12-18 03:21:52,548 44k INFO ====> Epoch: 529, cost 16.32 s 2023-12-18 03:22:08,909 44k INFO ====> Epoch: 530, cost 16.36 s 2023-12-18 03:22:25,089 44k INFO ====> Epoch: 531, cost 16.18 s 2023-12-18 03:22:41,399 44k INFO ====> Epoch: 532, cost 16.31 s 2023-12-18 03:22:54,307 44k INFO Train Epoch: 533 [60%] 2023-12-18 03:22:54,307 44k INFO Losses: [2.2243690490722656, 2.556581974029541, 9.608068466186523, 22.271312713623047, 0.8414894342422485], step: 8000, lr: 9.35425126554299e-05, reference_loss: 37.5018196105957 2023-12-18 03:22:59,580 44k INFO Saving model and optimizer state at iteration 533 to ./logs\44k\G_8000.pth 2023-12-18 03:23:00,866 44k INFO Saving model and optimizer state at iteration 533 to ./logs\44k\D_8000.pth 2023-12-18 03:23:10,155 44k INFO ====> Epoch: 533, cost 28.76 s 2023-12-18 03:23:27,684 44k INFO ====> Epoch: 534, cost 17.53 s 2023-12-18 03:23:43,782 44k INFO ====> Epoch: 535, cost 16.10 s 2023-12-18 03:23:59,834 44k INFO ====> Epoch: 536, cost 16.05 s 2023-12-18 03:24:16,079 44k INFO ====> Epoch: 537, cost 16.25 s 2023-12-18 03:24:32,125 44k INFO ====> Epoch: 538, cost 16.05 s 2023-12-18 03:24:48,580 44k INFO ====> Epoch: 539, cost 16.46 s 2023-12-18 03:25:04,774 44k INFO ====> Epoch: 540, cost 16.19 s 2023-12-18 03:25:21,111 44k INFO ====> Epoch: 541, cost 16.34 s 2023-12-18 03:25:37,107 44k INFO ====> Epoch: 542, cost 16.00 s 2023-12-18 03:25:53,498 44k INFO ====> Epoch: 543, cost 16.39 s 2023-12-18 03:26:09,673 44k INFO ====> Epoch: 544, cost 16.17 s 2023-12-18 03:26:25,869 44k INFO ====> Epoch: 545, cost 16.20 s 2023-12-18 03:26:41,747 44k INFO Train Epoch: 546 [93%] 2023-12-18 03:26:41,747 44k INFO Losses: [1.6758639812469482, 3.2283408641815186, 14.777837753295898, 29.46989631652832, 0.7132466435432434], step: 8200, lr: 9.339062002506615e-05, reference_loss: 49.86518478393555 2023-12-18 03:26:42,804 44k INFO ====> Epoch: 546, cost 16.94 s 2023-12-18 03:26:59,131 44k INFO ====> Epoch: 547, cost 16.33 s 2023-12-18 03:27:15,325 44k INFO ====> Epoch: 548, cost 16.19 s 2023-12-18 03:27:31,444 44k INFO ====> Epoch: 549, cost 16.12 s 2023-12-18 03:27:47,553 44k INFO ====> Epoch: 550, cost 16.11 s 2023-12-18 03:28:03,774 44k INFO ====> Epoch: 551, cost 16.22 s 2023-12-18 03:28:20,055 44k INFO ====> Epoch: 552, cost 16.28 s 2023-12-18 03:28:36,421 44k INFO ====> Epoch: 553, cost 16.37 s 2023-12-18 03:28:52,633 44k INFO ====> Epoch: 554, cost 16.21 s 2023-12-18 03:29:08,807 44k INFO ====> Epoch: 555, cost 16.17 s 2023-12-18 03:29:25,093 44k INFO ====> Epoch: 556, cost 16.29 s 2023-12-18 03:29:41,262 44k INFO ====> Epoch: 557, cost 16.17 s 2023-12-18 03:29:57,593 44k INFO ====> Epoch: 558, cost 16.33 s 2023-12-18 03:30:13,853 44k INFO ====> Epoch: 559, cost 16.26 s 2023-12-18 03:30:23,669 44k INFO Train Epoch: 560 [27%] 2023-12-18 03:30:23,669 44k INFO Losses: [2.5246875286102295, 2.4349358081817627, 8.764331817626953, 22.128585815429688, 0.7562029361724854], step: 8400, lr: 9.322731916343797e-05, reference_loss: 36.608741760253906 2023-12-18 03:30:30,752 44k INFO ====> Epoch: 560, cost 16.90 s 2023-12-18 03:30:46,838 44k INFO ====> Epoch: 561, cost 16.09 s 2023-12-18 03:31:03,003 44k INFO ====> Epoch: 562, cost 16.17 s 2023-12-18 03:31:19,123 44k INFO ====> Epoch: 563, cost 16.12 s 2023-12-18 03:31:35,368 44k INFO ====> Epoch: 564, cost 16.24 s 2023-12-18 03:31:51,650 44k INFO ====> Epoch: 565, cost 16.28 s 2023-12-18 03:32:07,824 44k INFO ====> 
2023-12-18 03:33:45,986 44k INFO ====> Epochs 566-572, cost 16.15-16.71 s each
2023-12-18 03:33:58,609 44k INFO Train Epoch: 573 [60%]
2023-12-18 03:33:58,609 44k INFO Losses: [2.0330, 2.7742, 12.1856, 23.1221, 0.5223], step: 8600, lr: 9.30759e-05, reference_loss: 40.6373
2023-12-18 03:34:02,628 44k INFO ====> Epoch: 573, cost 16.64 s
2023-12-18 03:37:17,541 44k INFO ====> Epochs 574-585, cost 16.11-16.45 s each
2023-12-18 03:37:33,167 44k INFO Train Epoch: 586 [93%]
2023-12-18 03:37:33,167 44k INFO Losses: [2.3674, 2.2942, 9.0311, 21.5350, 0.6183], step: 8800, lr: 9.29248e-05, reference_loss: 35.8460
2023-12-18 03:37:38,352 44k INFO Saving model and optimizer state at iteration 586 to ./logs\44k\G_8800.pth
2023-12-18 03:37:39,664 44k INFO Saving model and optimizer state at iteration 586 to ./logs\44k\D_8800.pth
2023-12-18 03:37:44,395 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_800.pth
2023-12-18 03:37:44,395 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_800.pth
2023-12-18 03:37:44,939 44k INFO ====> Epoch: 586, cost 27.40 s
2023-12-18 03:41:16,980 44k INFO ====> Epochs 587-599, cost 16.10-16.80 s each
2023-12-18 03:41:26,559 44k INFO Train Epoch: 600 [27%]
2023-12-18 03:41:26,559 44k INFO Losses: [2.6867, 2.2477, 7.3208, 18.7191, 0.4054], step: 9000, lr: 9.27623e-05, reference_loss: 31.3797
2023-12-18 03:41:33,716 44k INFO ====> Epoch: 600, cost 16.74 s
2023-12-18 03:44:48,782 44k INFO ====> Epochs 601-612, cost 16.13-16.49 s each
2023-12-18 03:45:01,661 44k INFO Train Epoch: 613 [60%]
2023-12-18 03:45:01,661 44k INFO Losses: [2.3935, 2.2610, 7.4680, 19.6771, 0.6358], step: 9200, lr: 9.26117e-05, reference_loss: 32.4354
2023-12-18 03:45:05,726 44k INFO ====> Epoch: 613, cost 16.94 s
2023-12-18 03:48:20,464 44k INFO ====> Epochs 614-625, cost 16.14-16.35 s each
2023-12-18 03:48:36,155 44k INFO Train Epoch: 626 [93%]
2023-12-18 03:48:36,155 44k INFO Losses: [2.3657, 2.5225, 10.6520, 23.3172, -0.0537], step: 9400, lr: 9.24613e-05, reference_loss: 38.8038
2023-12-18 03:48:37,219 44k INFO ====> Epoch: 626, cost 16.76 s
2023-12-18 03:48:53,470 44k INFO ====> Epoch: 627, cost 16.25 s
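From this point on, each save of a new G_/D_ checkpoint pair is followed by deletion of the oldest pair on disk (G_8800/D_8800 arrive, G_800/D_800 go), i.e. a fixed-size rolling window of recent checkpoints. A minimal sketch of such a retention policy, assuming the trainer simply keeps the newest N step-numbered files; prune_checkpoints, ckpt_dir and keep_last are hypothetical names, not taken from the training code:

    import os, re

    def prune_checkpoints(ckpt_dir: str, keep_last: int = 10) -> None:
        # For the generator (G_) and discriminator (D_) families alike, keep the
        # keep_last newest step-numbered .pth files and delete everything older.
        for prefix in ("G_", "D_"):
            ckpts = sorted(
                (int(m.group(1)), f)
                for f in os.listdir(ckpt_dir)
                if (m := re.fullmatch(rf"{prefix}(\d+)\.pth", f))
            )
            for _, fname in ckpts[:-keep_last]:
                os.remove(os.path.join(ckpt_dir, fname))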
2023-12-18 03:52:07,975 44k INFO ====> Epochs 628-639, cost 16.08-16.33 s each
2023-12-18 03:52:17,580 44k INFO Train Epoch: 640 [27%]
2023-12-18 03:52:17,590 44k INFO Losses: [2.5325, 2.3539, 6.1868, 17.0720, 0.7912], step: 9600, lr: 9.22996e-05, reference_loss: 28.9364
2023-12-18 03:52:22,907 44k INFO Saving model and optimizer state at iteration 640 to ./logs\44k\G_9600.pth
2023-12-18 03:52:24,118 44k INFO Saving model and optimizer state at iteration 640 to ./logs\44k\D_9600.pth
2023-12-18 03:52:29,045 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_1600.pth
2023-12-18 03:52:29,045 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_1600.pth
2023-12-18 03:52:36,354 44k INFO ====> Epoch: 640, cost 28.38 s
2023-12-18 03:55:52,410 44k INFO ====> Epochs 641-652, cost 16.18-16.60 s each
2023-12-18 03:56:05,085 44k INFO Train Epoch: 653 [60%]
2023-12-18 03:56:05,085 44k INFO Losses: [2.4709, 2.4131, 9.6795, 22.1901, 0.5475], step: 9800, lr: 9.21498e-05, reference_loss: 37.3011
2023-12-18 03:56:09,133 44k INFO ====> Epoch: 653, cost 16.72 s
2023-12-18 03:59:24,851 44k INFO ====> Epochs 654-665, cost 16.17-16.58 s each
2023-12-18 03:59:40,548 44k INFO Train Epoch: 666 [93%]
2023-12-18 03:59:40,548 44k INFO Losses: [2.5068, 2.4131, 9.1940, 21.4343, 0.2774], step: 10000, lr: 9.20001e-05, reference_loss: 35.8257
2023-12-18 03:59:41,587 44k INFO ====> Epoch: 666, cost 16.74 s
2023-12-18 04:03:13,721 44k INFO ====> Epochs 667-679, cost 16.17-16.63 s each
2023-12-18 04:03:23,308 44k INFO Train Epoch: 680 [27%]
2023-12-18 04:03:23,308 44k INFO Losses: [2.6443, 2.3036, 5.2496, 20.2076, 0.6375], step: 10200, lr: 9.18393e-05, reference_loss: 31.0427
2023-12-18 04:03:30,446 44k INFO ====> Epoch: 680, cost 16.73 s
2023-12-18 04:06:45,371 44k INFO ====> Epochs 681-692, cost 16.11-16.35 s each
2023-12-18 04:06:58,129 44k INFO Train Epoch: 693 [60%]
2023-12-18 04:06:58,129 44k INFO Losses: [2.4250, 2.5094, 8.1225, 19.2306, 0.8282], step: 10400, lr: 9.16901e-05, reference_loss: 33.1156
2023-12-18 04:07:03,419 44k INFO Saving model and optimizer state at iteration 693 to ./logs\44k\G_10400.pth
2023-12-18 04:07:04,624 44k INFO Saving model and optimizer state at iteration 693 to ./logs\44k\D_10400.pth
2023-12-18 04:07:09,069 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_2400.pth
2023-12-18 04:07:09,069 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_2400.pth
2023-12-18 04:07:13,277 44k INFO ====> Epoch: 693, cost 27.91 s
2023-12-18 04:10:48,395 44k INFO ====> Epochs 694-705, cost 16.13-21.59 s each
2023-12-18 04:11:04,835 44k INFO Train Epoch: 706 [93%]
2023-12-18 04:11:04,835 44k INFO Losses: [2.4290, 2.6008, 9.8895, 21.1187, 1.2661], step: 10600, lr: 9.15412e-05, reference_loss: 37.3041
2023-12-18 04:11:05,860 44k INFO ====> Epoch: 706, cost 17.47 s
2023-12-18 04:14:58,832 44k INFO ====> Epochs 707-719, cost 16.12-25.51 s each
2023-12-18 04:15:09,509 44k INFO Train Epoch: 720 [27%]
2023-12-18 04:15:09,509 44k INFO Losses: [2.1473, 2.5154, 7.6563, 23.1809, 0.7945], step: 10800, lr: 9.13812e-05, reference_loss: 36.2942
2023-12-18 04:15:16,680 44k INFO ====> Epoch: 720, cost 17.85 s
2023-12-18 04:18:32,483 44k INFO ====> Epochs 721-732, cost 16.18-16.49 s each
2023-12-18 04:18:45,297 44k INFO Train Epoch: 733 [60%]
2023-12-18 04:18:45,297 44k INFO Losses: [2.0838, 2.8245, 9.9686, 21.9431, 0.9256], step: 11000, lr: 9.12328e-05, reference_loss: 37.7457
2023-12-18 04:18:49,346 44k INFO ====> Epoch: 733, cost 16.86 s
2023-12-18 04:22:05,154 44k INFO ====> Epochs 734-745, cost 16.19-16.47 s each
2023-12-18 04:22:20,897 44k INFO Train Epoch: 746 [93%]
2023-12-18 04:22:20,897 44k INFO Losses: [2.1507, 2.9614, 15.3744, 18.3374, 0.8749], step: 11200, lr: 9.10847e-05, reference_loss: 39.6988
2023-12-18 04:22:26,210 44k INFO Saving model and optimizer state at iteration 746 to ./logs\44k\G_11200.pth
2023-12-18 04:22:27,368 44k INFO Saving model and optimizer state at iteration 746 to ./logs\44k\D_11200.pth
2023-12-18 04:22:32,351 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_3200.pth
2023-12-18 04:22:32,351 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_3200.pth
2023-12-18 04:22:32,913 44k INFO ====> Epoch: 746, cost 27.76 s
2023-12-18 04:26:05,271 44k INFO ====> Epochs 747-759, cost 16.19-16.49 s each
2023-12-18 04:26:14,972 44k INFO Train Epoch: 760 [27%]
2023-12-18 04:26:14,972 44k INFO Losses: [2.1726, 2.7895, 6.8014, 20.0533, 0.6327], step: 11400, lr: 9.09254e-05, reference_loss: 32.4495
2023-12-18 04:26:22,097 44k INFO ====> Epoch: 760, cost 16.83 s
2023-12-18 04:29:38,589 44k INFO ====> Epochs 761-772, cost 16.24-16.50 s each
2023-12-18 04:29:51,381 44k INFO Train Epoch: 773 [60%]
2023-12-18 04:29:51,381 44k INFO Losses: [2.5177, 2.6396, 9.5740, 18.8289, 0.8886], step: 11600, lr: 9.07777e-05, reference_loss: 34.4488
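The Losses records arrive every 200 optimizer steps, and their epoch stamps advance by about 13⅓ epochs each time (746 [93%] -> 760 [27%] -> 773 [60%] above), so this run is doing roughly 15 steps per epoch. A one-line check, reading the bracketed percentages as fractional positions within an epoch:

    # 200 steps span (760 + 0.27) - (746 + 0.93) = 13.34 epochs => ~15 steps/epoch
    print(200 / ((760 + 0.27) - (746 + 0.93)))  # ~14.99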
2023-12-18 04:29:55,424 44k INFO ====> Epoch: 773, cost 16.83 s
2023-12-18 04:33:10,810 44k INFO ====> Epochs 774-785, cost 16.07-16.44 s each
2023-12-18 04:33:26,624 44k INFO Train Epoch: 786 [93%]
2023-12-18 04:33:26,624 44k INFO Losses: [1.2848, 3.8371, 14.6472, 29.7904, 1.2162], step: 11800, lr: 9.06303e-05, reference_loss: 50.7757
2023-12-18 04:33:27,652 44k INFO ====> Epoch: 786, cost 16.84 s
2023-12-18 04:36:58,649 44k INFO ====> Epochs 787-799, cost 16.10-16.36 s each
2023-12-18 04:37:08,240 44k INFO Train Epoch: 800 [27%]
2023-12-18 04:37:08,240 44k INFO Losses: [2.3202, 2.5542, 5.8242, 18.1695, 0.5034], step: 12000, lr: 9.04719e-05, reference_loss: 29.3715
2023-12-18 04:37:13,560 44k INFO Saving model and optimizer state at iteration 800 to ./logs\44k\G_12000.pth
2023-12-18 04:37:14,796 44k INFO Saving model and optimizer state at iteration 800 to ./logs\44k\D_12000.pth
2023-12-18 04:37:19,860 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_4000.pth
2023-12-18 04:37:19,860 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_4000.pth
2023-12-18 04:37:27,208 44k INFO ====> Epoch: 800, cost 28.56 s
2023-12-18 04:40:44,002 44k INFO ====> Epochs 801-812, cost 16.12-16.97 s each
2023-12-18 04:40:56,711 44k INFO Train Epoch: 813 [60%]
2023-12-18 04:40:56,711 44k INFO Losses: [2.2774, 2.3571, 9.4480, 20.7837, 0.6980], step: 12200, lr: 9.03250e-05, reference_loss: 35.5643
2023-12-18 04:41:00,809 44k INFO ====> Epoch: 813, cost 16.81 s
2023-12-18 04:44:16,565 44k INFO ====> Epochs 814-825, cost 16.18-16.46 s each
2023-12-18 04:44:32,281 44k INFO Train Epoch: 826 [93%]
2023-12-18 04:44:32,281 44k INFO Losses: [2.5697, 1.9767, 4.0030, 18.9014, 0.7685], step: 12400, lr: 9.01783e-05, reference_loss: 28.2194
2023-12-18 04:44:33,309 44k INFO ====> Epoch: 826, cost 16.74 s
2023-12-18 04:48:05,711 44k INFO ====> Epochs 827-839, cost 16.17-16.51 s each
2023-12-18 04:48:15,306 44k INFO Train Epoch: 840 [27%]
2023-12-18 04:48:15,306 44k INFO Losses: [2.4772, 2.9818, 10.5023, 19.1071, 0.8395], step: 12600, lr: 9.00206e-05, reference_loss: 35.9079
2023-12-18 04:48:22,402 44k INFO ====> Epoch: 840, cost 16.69 s
2023-12-18 04:48:38,652 44k INFO ====> Epoch: 841, cost 16.25 s
2023-12-18 04:51:38,154 44k INFO ====> Epochs 842-852, cost 16.23-16.41 s each
2023-12-18 04:51:50,893 44k INFO Train Epoch: 853 [60%]
2023-12-18 04:51:50,893 44k INFO Losses: [2.0945, 2.8708, 10.9429, 22.1125, 0.9065], step: 12800, lr: 8.98744e-05, reference_loss: 38.9272
2023-12-18 04:51:56,236 44k INFO Saving model and optimizer state at iteration 853 to ./logs\44k\G_12800.pth
2023-12-18 04:51:57,355 44k INFO Saving model and optimizer state at iteration 853 to ./logs\44k\D_12800.pth
2023-12-18 04:52:02,740 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_4800.pth
2023-12-18 04:52:02,740 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_4800.pth
2023-12-18 04:52:06,974 44k INFO ====> Epoch: 853, cost 28.82 s
2023-12-18 04:55:24,813 44k INFO ====> Epochs 854-865, cost 16.25-17.11 s each
2023-12-18 04:55:40,656 44k INFO Train Epoch: 866 [93%]
2023-12-18 04:55:40,656 44k INFO Losses: [1.9186, 2.9146, 14.7856, 20.9678, 0.6037], step: 13000, lr: 8.97285e-05, reference_loss: 41.1902
2023-12-18 04:55:41,725 44k INFO ====> Epoch: 866, cost 16.91 s
2023-12-18 04:59:14,320 44k INFO ====> Epochs 867-879, cost 16.26-16.56 s each
2023-12-18 04:59:24,084 44k INFO Train Epoch: 880 [27%]
2023-12-18 04:59:24,084 44k INFO Losses: [2.2482, 3.1556, 12.3076, 24.5237, 0.6064], step: 13200, lr: 8.95716e-05, reference_loss: 42.8414
2023-12-18 04:59:31,221 44k INFO ====> Epoch: 880, cost 16.90 s
2023-12-18 05:02:47,438 44k INFO ====> Epochs 881-892, cost 16.22-16.45 s each
2023-12-18 05:03:00,125 44k INFO Train Epoch: 893 [60%]
2023-12-18 05:03:00,125 44k INFO Losses: [2.5325, 2.4851, 10.8780, 22.4854, 0.8786], step: 13400, lr: 8.94262e-05, reference_loss: 39.2596
2023-12-18 05:03:04,205 44k INFO ====> Epoch: 893, cost 16.77 s
2023-12-18 05:06:19,246 44k INFO ====> Epochs 894-905, cost 16.08-16.40 s each
2023-12-18 05:06:35,006 44k INFO Train Epoch: 906 [93%]
2023-12-18 05:06:35,006 44k INFO Losses: [1.9247, 2.9284, 11.9719, 22.4526, 0.3925], step: 13600, lr: 8.92809e-05, reference_loss: 39.6701
2023-12-18 05:06:40,317 44k INFO Saving model and optimizer state at iteration 906 to ./logs\44k\G_13600.pth
2023-12-18 05:06:41,627 44k INFO Saving model and optimizer state at iteration 906 to ./logs\44k\D_13600.pth
2023-12-18 05:06:45,790 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_5600.pth
2023-12-18 05:06:45,790 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_5600.pth
2023-12-18 05:06:46,351 44k INFO ====> Epoch: 906, cost 27.10 s
2023-12-18 05:10:18,704 44k INFO ====> Epochs 907-919, cost 16.16-16.66 s each
2023-12-18 05:10:28,460 44k INFO Train Epoch: 920 [27%]
2023-12-18 05:10:28,460 44k INFO Losses: [2.2059, 2.5114, 7.8712, 21.1719, 0.7103], step: 13800, lr: 8.91248e-05, reference_loss: 34.4707
2023-12-18 05:10:35,653 44k INFO ====> Epoch: 920, cost 16.95 s
2023-12-18 05:13:51,496 44k INFO ====> Epochs 921-932, cost 16.13-16.73 s each
2023-12-18 05:14:04,309 44k INFO Train Epoch: 933 [60%]
2023-12-18 05:14:04,309 44k INFO Losses: [1.9371, 2.9746, 9.6842, 19.6431, 0.7068], step: 14000, lr: 8.89801e-05, reference_loss: 34.9458
2023-12-18 05:14:08,399 44k INFO ====> Epoch: 933, cost 16.90 s
2023-12-18 05:17:23,493 44k INFO ====> Epochs 934-945, cost 16.11-16.41 s each
2023-12-18 05:17:39,116 44k INFO Train Epoch: 946 [93%]
2023-12-18 05:17:39,116 44k INFO Losses: [1.2757, 3.6431, 16.1609, 25.8249, 0.7454], step: 14200, lr: 8.88356e-05, reference_loss: 47.6501
2023-12-18 05:17:40,131 44k INFO ====> Epoch: 946, cost 16.64 s
2023-12-18 05:21:11,428 44k INFO ====> Epochs 947-959, cost 16.15-16.40 s each
2023-12-18 05:21:20,959 44k INFO Train Epoch: 960 [27%]
2023-12-18 05:21:20,959 44k INFO Losses: [2.1197, 2.7190, 10.5870, 22.3818, 0.5612], step: 14400, lr: 8.86803e-05, reference_loss: 38.3687
2023-12-18 05:21:26,251 44k INFO Saving model and optimizer state at iteration 960 to ./logs\44k\G_14400.pth
2023-12-18 05:21:27,553 44k INFO Saving model and optimizer state at iteration 960 to ./logs\44k\D_14400.pth
2023-12-18 05:21:31,405 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_6400.pth
2023-12-18 05:21:31,405 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_6400.pth
2023-12-18 05:21:38,739 44k INFO ====> Epoch: 960, cost 27.31 s
2023-12-18 05:24:53,782 44k INFO ====> Epochs 961-972, cost 16.04-16.56 s each
2023-12-18 05:25:06,588 44k INFO Train Epoch: 973 [60%]
2023-12-18 05:25:06,588 44k INFO Losses: [2.1154, 2.8322, 10.7189, 23.8256, 0.7411], step: 14600, lr: 8.85363e-05, reference_loss: 40.2333
2023-12-18 05:25:10,771 44k INFO ====> Epoch: 973, cost 16.99 s
2023-12-18 05:28:25,376 44k INFO ====> Epochs 974-985, cost 16.10-16.47 s each
2023-12-18 05:28:40,986 44k INFO Train Epoch: 986 [93%]
2023-12-18 05:28:40,986 44k INFO Losses: [1.8821, 3.2044, 11.3328, 19.2381, 0.5338], step: 14800, lr: 8.83925e-05, reference_loss: 36.1913
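reference_loss swings widely between consecutive records (47.6501 at step 14200 against 34.9458 two hundred steps earlier), which is expected for adversarial losses; a smoothed series is easier to judge. A minimal sketch, with the values copied from the records above and alpha chosen arbitrarily:

    # Exponential moving average over logged reference_loss values.
    def ema(values, alpha=0.1):
        smoothed, acc = [], None
        for v in values:
            acc = v if acc is None else alpha * v + (1 - alpha) * acc
            smoothed.append(acc)
        return smoothed

    print(ema([34.4707, 34.9458, 47.6501, 38.3687, 40.2333, 36.1913])[-1])  # ~36.5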
2023-12-18 05:28:41,991 44k INFO ====> Epoch: 986, cost 16.61 s
2023-12-18 05:32:12,222 44k INFO ====> Epochs 987-999, cost 16.05-16.33 s each
2023-12-18 05:32:21,987 44k INFO Train Epoch: 1000 [27%]
2023-12-18 05:32:21,987 44k INFO Losses: [2.0599, 2.7758, 9.8634, 20.4972, 0.7952], step: 15000, lr: 8.82380e-05, reference_loss: 35.9914
2023-12-18 05:32:29,195 44k INFO ====> Epoch: 1000, cost 16.97 s
2023-12-18 05:35:44,491 44k INFO ====> Epochs 1001-1012, cost 16.14-16.65 s each
2023-12-18 05:35:57,239 44k INFO Train Epoch: 1013 [60%]
2023-12-18 05:35:57,239 44k INFO Losses: [2.2125, 2.9521, 8.9650, 19.3291, 1.3262], step: 15200, lr: 8.80947e-05, reference_loss: 34.7849
2023-12-18 05:36:02,533 44k INFO Saving model and optimizer state at iteration 1013 to ./logs\44k\G_15200.pth
2023-12-18 05:36:03,648 44k INFO Saving model and optimizer state at iteration 1013 to ./logs\44k\D_15200.pth
2023-12-18 05:36:08,447 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_7200.pth
2023-12-18 05:36:08,447 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_7200.pth
2023-12-18 05:36:12,467 44k INFO ====> Epoch: 1013, cost 27.98 s
2023-12-18 05:39:28,323 44k INFO ====> Epochs 1014-1025, cost 16.12-16.93 s each
2023-12-18 05:39:44,145 44k INFO Train Epoch: 1026 [93%]
2023-12-18 05:39:44,145 44k INFO Losses: [1.8876, 3.0400, 9.7346, 20.8997, 0.6639], step: 15400, lr: 8.79516e-05, reference_loss: 36.2259
2023-12-18 05:39:45,233 44k INFO ====> Epoch: 1026, cost 16.91 s
2023-12-18 05:43:17,009 44k INFO ====> Epochs 1027-1039, cost 16.09-16.54 s each
2023-12-18 05:43:26,520 44k INFO Train Epoch: 1040 [27%]
2023-12-18 05:43:26,520 44k INFO Losses: [2.1083, 2.8425, 10.7152, 22.1131, 0.6595], step: 15600, lr: 8.77979e-05, reference_loss: 38.4386
2023-12-18 05:43:33,673 44k INFO ====> Epoch: 1040, cost 16.66 s
2023-12-18 05:46:48,451 44k INFO ====> Epochs 1041-1052, cost 16.02-16.59 s each
2023-12-18 05:47:01,179 44k INFO Train Epoch: 1053 [60%]
2023-12-18 05:47:01,179 44k INFO Losses: [2.1775, 2.8471, 11.8371, 23.3862, 0.8974], step: 15800, lr: 8.76553e-05, reference_loss: 41.1453
2023-12-18 05:47:05,261 44k INFO ====> Epoch: 1053, cost 16.81 s
2023-12-18 05:50:20,389 44k INFO ====> Epochs 1054-1065, cost 16.18-16.40 s each
2023-12-18 05:50:36,256 44k INFO Train Epoch: 1066 [93%]
2023-12-18 05:50:36,256 44k INFO Losses: [1.6846, 3.0176, 12.8269, 21.8474, 0.5135], step: 16000, lr: 8.75130e-05, reference_loss: 39.8900
2023-12-18 05:50:41,507 44k INFO Saving model and optimizer state at iteration 1066 to ./logs\44k\G_16000.pth
2023-12-18 05:50:42,587 44k INFO Saving model and optimizer state at iteration 1066 to ./logs\44k\D_16000.pth
2023-12-18 05:50:49,816 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_8000.pth
2023-12-18 05:50:49,816 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_8000.pth
2023-12-18 05:50:50,401 44k INFO ====> Epoch: 1066, cost 30.01 s
2023-12-18 05:54:22,970 44k INFO ====> Epochs 1067-1079, cost 16.07-16.97 s each
2023-12-18 05:54:32,597 44k INFO Train Epoch: 1080 [27%]
2023-12-18 05:54:32,597 44k INFO Losses: [2.1330, 2.7094, 9.3345, 20.1746, 0.4960], step: 16200, lr: 8.73599e-05, reference_loss: 34.8475
2023-12-18 05:54:39,751 44k INFO ====> Epoch: 1080, cost 16.78 s
2023-12-18 05:57:55,636 44k INFO ====> Epochs 1081-1092, cost 16.09-16.51 s each
2023-12-18 05:58:08,397 44k INFO Train Epoch: 1093 [60%]
2023-12-18 05:58:08,397 44k INFO Losses: [2.3139, 2.7502, 12.5691, 22.7892, 0.5938], step: 16400, lr: 8.72181e-05, reference_loss: 41.0163
2023-12-18 05:58:12,453 44k INFO ====> Epoch: 1093, cost 16.82 s
2023-12-18 06:01:28,188 44k INFO ====> Epochs 1094-1105, cost 16.19-16.54 s each
2023-12-18 06:01:43,918 44k INFO Train Epoch: 1106 [93%]
2023-12-18 06:01:43,918 44k INFO Losses: [1.7387, 3.5735, 12.6021, 20.4612, -0.1098], step: 16600, lr: 8.70765e-05, reference_loss: 38.2658
2023-12-18 06:01:44,926 44k INFO ====> Epoch: 1106, cost 16.74 s
2023-12-18 06:05:16,808 44k INFO ====> Epochs 1107-1119, cost 16.12-16.58 s each
2023-12-18 06:05:26,368 44k INFO Train Epoch: 1120 [27%]
2023-12-18 06:05:26,368 44k INFO Losses: [2.2186, 2.7626, 9.0271, 19.9685, 0.4880], step: 16800, lr: 8.69242e-05, reference_loss: 34.4647
2023-12-18 06:05:31,747 44k INFO Saving model and optimizer state at iteration 1120 to ./logs\44k\G_16800.pth
2023-12-18 06:05:33,015 44k INFO Saving model and optimizer state at iteration 1120 to ./logs\44k\D_16800.pth
2023-12-18 06:05:40,565 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_8800.pth
2023-12-18 06:05:40,565 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_8800.pth
2023-12-18 06:05:47,804 44k INFO ====> Epoch: 1120, cost 31.00 s
2023-12-18 06:09:02,585 44k INFO ====> Epochs 1121-1132, cost 16.08-16.39 s each
2023-12-18 06:09:15,376 44k INFO Train Epoch: 1133 [60%]
2023-12-18 06:09:15,376 44k INFO Losses: [2.0892, 3.0275, 10.8040, 23.5764, 0.6100], step: 17000, lr: 8.67830e-05, reference_loss: 40.1070
2023-12-18 06:09:19,480 44k INFO ====> Epoch: 1133, cost 16.89 s
2023-12-18 06:12:36,226 44k INFO ====> Epochs 1134-1145, cost 16.15-16.89 s each
2023-12-18 06:12:51,906 44k INFO Train Epoch: 1146 [93%]
2023-12-18 06:12:51,906 44k INFO Losses: [2.4534, 2.6071, 11.0664, 24.7653, 0.6897], step: 17200, lr: 8.66421e-05, reference_loss: 41.5818
2023-12-18 06:12:52,911 44k INFO ====> Epoch: 1146, cost 16.68 s
2023-12-18 06:16:24,855 44k INFO ====> Epochs 1147-1159, cost 16.12-16.45 s each
2023-12-18 06:16:34,523 44k INFO Train Epoch: 1160 [27%]
2023-12-18 06:16:34,523 44k INFO Losses: [2.1303, 3.2163, 10.6541, 21.0434, 0.7343], step: 17400, lr: 8.64906e-05, reference_loss: 37.7782
2023-12-18 06:16:41,681 44k INFO ====> Epoch: 1160, cost 16.83 s
2023-12-18 06:19:57,628 44k INFO ====> Epochs 1161-1172, cost 16.11-16.51 s each
2023-12-18 06:20:10,369 44k INFO Train Epoch: 1173 [60%]
2023-12-18 06:20:10,369 44k INFO Losses: [2.2615, 2.9894, 11.4736, 23.0374, 0.8678], step: 17600, lr: 8.63502e-05, reference_loss: 40.6297
2023-12-18 06:20:15,654 44k INFO Saving model and optimizer state at iteration 1173 to ./logs\44k\G_17600.pth
2023-12-18 06:20:16,993 44k INFO Saving model and optimizer state at iteration 1173 to ./logs\44k\D_17600.pth
2023-12-18 06:20:23,158 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_9600.pth
2023-12-18 06:20:23,158 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_9600.pth
2023-12-18 06:20:27,200 44k INFO ====> Epoch: 1173, cost 29.57 s
2023-12-18 06:23:44,255 44k INFO ====> Epochs 1174-1185, cost 16.21-16.82 s each
2023-12-18 06:24:00,191 44k INFO Train Epoch: 1186 [93%]
2023-12-18 06:24:00,191 44k INFO Losses: [2.1685, 3.1337, 8.8512, 16.0142, 1.0384], step: 17800, lr: 8.62100e-05, reference_loss: 31.2060
2023-12-18 06:24:01,186 44k INFO ====> Epoch: 1186, cost 16.93 s
2023-12-18 06:27:33,650 44k INFO ====> Epochs 1187-1199, cost 16.20-16.58 s each
2023-12-18 06:27:43,331 44k INFO Train Epoch: 1200 [27%]
Losses: [2.088735342025757, 2.6107234954833984, 11.18794059753418, 20.640644073486328, 0.7308809161186218], step: 18000, lr: 8.605923191647444e-05, reference_loss: 37.25892639160156 2023-12-18 06:27:50,525 44k INFO ====> Epoch: 1200, cost 16.88 s 2023-12-18 06:28:06,979 44k INFO ====> Epoch: 1201, cost 16.45 s 2023-12-18 06:28:23,254 44k INFO ====> Epoch: 1202, cost 16.28 s 2023-12-18 06:28:39,699 44k INFO ====> Epoch: 1203, cost 16.44 s 2023-12-18 06:28:56,087 44k INFO ====> Epoch: 1204, cost 16.39 s 2023-12-18 06:29:12,346 44k INFO ====> Epoch: 1205, cost 16.26 s 2023-12-18 06:29:28,572 44k INFO ====> Epoch: 1206, cost 16.23 s 2023-12-18 06:29:44,955 44k INFO ====> Epoch: 1207, cost 16.38 s 2023-12-18 06:30:01,323 44k INFO ====> Epoch: 1208, cost 16.37 s 2023-12-18 06:30:17,768 44k INFO ====> Epoch: 1209, cost 16.45 s 2023-12-18 06:30:34,223 44k INFO ====> Epoch: 1210, cost 16.45 s 2023-12-18 06:30:50,767 44k INFO ====> Epoch: 1211, cost 16.54 s 2023-12-18 06:31:07,058 44k INFO ====> Epoch: 1212, cost 16.29 s 2023-12-18 06:31:19,931 44k INFO Train Epoch: 1213 [60%] 2023-12-18 06:31:19,931 44k INFO Losses: [2.028057098388672, 2.8344717025756836, 9.369321823120117, 19.576229095458984, 0.7934724688529968], step: 18200, lr: 8.591949050124189e-05, reference_loss: 34.6015510559082 2023-12-18 06:31:24,056 44k INFO ====> Epoch: 1213, cost 17.00 s 2023-12-18 06:31:40,317 44k INFO ====> Epoch: 1214, cost 16.26 s 2023-12-18 06:31:56,661 44k INFO ====> Epoch: 1215, cost 16.34 s 2023-12-18 06:32:12,954 44k INFO ====> Epoch: 1216, cost 16.29 s 2023-12-18 06:32:29,423 44k INFO ====> Epoch: 1217, cost 16.47 s 2023-12-18 06:32:45,955 44k INFO ====> Epoch: 1218, cost 16.53 s 2023-12-18 06:33:02,267 44k INFO ====> Epoch: 1219, cost 16.31 s 2023-12-18 06:33:18,697 44k INFO ====> Epoch: 1220, cost 16.43 s 2023-12-18 06:33:35,102 44k INFO ====> Epoch: 1221, cost 16.40 s 2023-12-18 06:33:51,461 44k INFO ====> Epoch: 1222, cost 16.36 s 2023-12-18 06:34:07,843 44k INFO ====> Epoch: 1223, cost 16.38 s 2023-12-18 06:34:24,286 44k INFO ====> Epoch: 1224, cost 16.44 s 2023-12-18 06:34:40,665 44k INFO ====> Epoch: 1225, cost 16.38 s 2023-12-18 06:34:56,533 44k INFO Train Epoch: 1226 [93%] 2023-12-18 06:34:56,533 44k INFO Losses: [2.3795242309570312, 2.5027527809143066, 10.052430152893066, 18.268102645874023, 0.3964497148990631], step: 18400, lr: 8.577997599557726e-05, reference_loss: 33.59926223754883 2023-12-18 06:35:01,804 44k INFO Saving model and optimizer state at iteration 1226 to ./logs\44k\G_18400.pth 2023-12-18 06:35:02,964 44k INFO Saving model and optimizer state at iteration 1226 to ./logs\44k\D_18400.pth 2023-12-18 06:35:07,201 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_10400.pth 2023-12-18 06:35:07,201 44k INFO .. 
Free up space by deleting ckpt ./logs\44k\D_10400.pth 2023-12-18 06:35:07,746 44k INFO ====> Epoch: 1226, cost 27.08 s 2023-12-18 06:35:25,621 44k INFO ====> Epoch: 1227, cost 17.88 s 2023-12-18 06:35:41,999 44k INFO ====> Epoch: 1228, cost 16.38 s 2023-12-18 06:35:58,221 44k INFO ====> Epoch: 1229, cost 16.22 s 2023-12-18 06:36:14,619 44k INFO ====> Epoch: 1230, cost 16.40 s 2023-12-18 06:36:30,981 44k INFO ====> Epoch: 1231, cost 16.36 s 2023-12-18 06:36:47,426 44k INFO ====> Epoch: 1232, cost 16.44 s 2023-12-18 06:37:03,720 44k INFO ====> Epoch: 1233, cost 16.29 s 2023-12-18 06:37:20,265 44k INFO ====> Epoch: 1234, cost 16.54 s 2023-12-18 06:37:36,604 44k INFO ====> Epoch: 1235, cost 16.34 s 2023-12-18 06:37:52,773 44k INFO ====> Epoch: 1236, cost 16.17 s 2023-12-18 06:38:09,029 44k INFO ====> Epoch: 1237, cost 16.26 s 2023-12-18 06:38:25,362 44k INFO ====> Epoch: 1238, cost 16.33 s 2023-12-18 06:38:41,651 44k INFO ====> Epoch: 1239, cost 16.29 s 2023-12-18 06:38:51,360 44k INFO Train Epoch: 1240 [27%] 2023-12-18 06:38:51,360 44k INFO Losses: [2.323397636413574, 2.7230424880981445, 9.150565147399902, 20.704212188720703, 0.37942272424697876], step: 18600, lr: 8.562998294502507e-05, reference_loss: 35.2806396484375 2023-12-18 06:38:58,537 44k INFO ====> Epoch: 1240, cost 16.89 s 2023-12-18 06:39:14,740 44k INFO ====> Epoch: 1241, cost 16.20 s 2023-12-18 06:39:31,057 44k INFO ====> Epoch: 1242, cost 16.32 s 2023-12-18 06:39:47,430 44k INFO ====> Epoch: 1243, cost 16.37 s 2023-12-18 06:40:03,946 44k INFO ====> Epoch: 1244, cost 16.52 s 2023-12-18 06:40:20,226 44k INFO ====> Epoch: 1245, cost 16.28 s 2023-12-18 06:40:36,473 44k INFO ====> Epoch: 1246, cost 16.25 s 2023-12-18 06:40:52,884 44k INFO ====> Epoch: 1247, cost 16.41 s 2023-12-18 06:41:09,309 44k INFO ====> Epoch: 1248, cost 16.43 s 2023-12-18 06:41:25,478 44k INFO ====> Epoch: 1249, cost 16.17 s 2023-12-18 06:41:41,863 44k INFO ====> Epoch: 1250, cost 16.39 s 2023-12-18 06:41:58,054 44k INFO ====> Epoch: 1251, cost 16.19 s 2023-12-18 06:42:14,305 44k INFO ====> Epoch: 1252, cost 16.25 s 2023-12-18 06:42:27,133 44k INFO Train Epoch: 1253 [60%] 2023-12-18 06:42:27,133 44k INFO Losses: [2.0730786323547363, 2.7422306537628174, 10.379049301147461, 20.902223587036133, 0.534008264541626], step: 18800, lr: 8.549093853646363e-05, reference_loss: 36.63058853149414 2023-12-18 06:42:31,192 44k INFO ====> Epoch: 1253, cost 16.89 s 2023-12-18 06:42:47,563 44k INFO ====> Epoch: 1254, cost 16.37 s 2023-12-18 06:43:03,941 44k INFO ====> Epoch: 1255, cost 16.38 s 2023-12-18 06:43:20,200 44k INFO ====> Epoch: 1256, cost 16.26 s 2023-12-18 06:43:36,634 44k INFO ====> Epoch: 1257, cost 16.43 s 2023-12-18 06:43:53,139 44k INFO ====> Epoch: 1258, cost 16.51 s 2023-12-18 06:44:09,380 44k INFO ====> Epoch: 1259, cost 16.24 s 2023-12-18 06:44:25,669 44k INFO ====> Epoch: 1260, cost 16.29 s 2023-12-18 06:44:42,025 44k INFO ====> Epoch: 1261, cost 16.36 s 2023-12-18 06:44:58,401 44k INFO ====> Epoch: 1262, cost 16.38 s 2023-12-18 06:45:14,723 44k INFO ====> Epoch: 1263, cost 16.32 s 2023-12-18 06:45:31,016 44k INFO ====> Epoch: 1264, cost 16.29 s 2023-12-18 06:45:47,368 44k INFO ====> Epoch: 1265, cost 16.35 s 2023-12-18 06:46:03,312 44k INFO Train Epoch: 1266 [93%] 2023-12-18 06:46:03,312 44k INFO Losses: [1.8506581783294678, 2.8672995567321777, 11.080596923828125, 20.960647583007812, 0.9160330295562744], step: 19000, lr: 8.535211990568338e-05, reference_loss: 37.675235748291016 2023-12-18 06:46:04,378 44k INFO ====> Epoch: 1266, cost 17.01 s 2023-12-18 
06:46:20,733 44k INFO ====> Epoch: 1267, cost 16.36 s 2023-12-18 06:46:37,034 44k INFO ====> Epoch: 1268, cost 16.30 s 2023-12-18 06:46:53,343 44k INFO ====> Epoch: 1269, cost 16.31 s 2023-12-18 06:47:09,676 44k INFO ====> Epoch: 1270, cost 16.33 s 2023-12-18 06:47:26,241 44k INFO ====> Epoch: 1271, cost 16.57 s 2023-12-18 06:47:42,580 44k INFO ====> Epoch: 1272, cost 16.34 s 2023-12-18 06:47:59,031 44k INFO ====> Epoch: 1273, cost 16.45 s 2023-12-18 06:48:15,428 44k INFO ====> Epoch: 1274, cost 16.40 s 2023-12-18 06:48:31,746 44k INFO ====> Epoch: 1275, cost 16.32 s 2023-12-18 06:48:48,065 44k INFO ====> Epoch: 1276, cost 16.32 s 2023-12-18 06:49:04,584 44k INFO ====> Epoch: 1277, cost 16.52 s 2023-12-18 06:49:20,928 44k INFO ====> Epoch: 1278, cost 16.34 s 2023-12-18 06:49:37,157 44k INFO ====> Epoch: 1279, cost 16.23 s 2023-12-18 06:49:46,769 44k INFO Train Epoch: 1280 [27%] 2023-12-18 06:49:46,769 44k INFO Losses: [2.0041909217834473, 2.7132248878479004, 10.882377624511719, 25.7646427154541, 0.5638467669487], step: 19200, lr: 8.52028749952347e-05, reference_loss: 41.92828369140625 2023-12-18 06:49:52,158 44k INFO Saving model and optimizer state at iteration 1280 to ./logs\44k\G_19200.pth 2023-12-18 06:49:53,347 44k INFO Saving model and optimizer state at iteration 1280 to ./logs\44k\D_19200.pth 2023-12-18 06:49:58,080 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_11200.pth 2023-12-18 06:49:58,080 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_11200.pth 2023-12-18 06:50:05,644 44k INFO ====> Epoch: 1280, cost 28.49 s 2023-12-18 06:50:23,495 44k INFO ====> Epoch: 1281, cost 17.85 s 2023-12-18 06:50:43,256 44k INFO ====> Epoch: 1282, cost 19.76 s 2023-12-18 06:51:02,860 44k INFO ====> Epoch: 1283, cost 19.60 s 2023-12-18 06:51:22,237 44k INFO ====> Epoch: 1284, cost 19.38 s 2023-12-18 06:51:41,205 44k INFO ====> Epoch: 1285, cost 18.97 s 2023-12-18 06:52:00,510 44k INFO ====> Epoch: 1286, cost 19.31 s 2023-12-18 06:52:19,714 44k INFO ====> Epoch: 1287, cost 19.20 s 2023-12-18 06:52:38,541 44k INFO ====> Epoch: 1288, cost 18.83 s 2023-12-18 06:52:57,482 44k INFO ====> Epoch: 1289, cost 18.94 s 2023-12-18 06:53:15,895 44k INFO ====> Epoch: 1290, cost 18.41 s 2023-12-18 06:53:35,247 44k INFO ====> Epoch: 1291, cost 19.35 s 2023-12-18 06:53:53,475 44k INFO ====> Epoch: 1292, cost 18.23 s 2023-12-18 06:54:08,216 44k INFO Train Epoch: 1293 [60%] 2023-12-18 06:54:08,216 44k INFO Losses: [2.2594776153564453, 2.5694639682769775, 10.90573501586914, 22.7269229888916, 0.6819525957107544], step: 19400, lr: 8.506452411679236e-05, reference_loss: 39.1435546875 2023-12-18 06:54:12,791 44k INFO ====> Epoch: 1293, cost 19.32 s 2023-12-18 06:54:31,013 44k INFO ====> Epoch: 1294, cost 18.22 s 2023-12-18 06:54:49,034 44k INFO ====> Epoch: 1295, cost 18.02 s 2023-12-18 06:55:07,396 44k INFO ====> Epoch: 1296, cost 18.36 s 2023-12-18 06:55:25,961 44k INFO ====> Epoch: 1297, cost 18.57 s 2023-12-18 06:55:44,202 44k INFO ====> Epoch: 1298, cost 18.24 s 2023-12-18 06:56:02,596 44k INFO ====> Epoch: 1299, cost 18.39 s 2023-12-18 06:56:20,948 44k INFO ====> Epoch: 1300, cost 18.35 s 2023-12-18 06:56:39,909 44k INFO ====> Epoch: 1301, cost 18.96 s 2023-12-18 06:56:58,075 44k INFO ====> Epoch: 1302, cost 18.17 s 2023-12-18 06:57:16,568 44k INFO ====> Epoch: 1303, cost 18.49 s 2023-12-18 06:57:34,909 44k INFO ====> Epoch: 1304, cost 18.34 s 2023-12-18 06:57:52,834 44k INFO ====> Epoch: 1305, cost 17.93 s 2023-12-18 06:58:10,815 44k INFO Train Epoch: 1306 [93%] 2023-12-18 06:58:10,815 44k INFO 
Losses: [2.4444050788879395, 2.827528238296509, 5.222635746002197, 15.31334400177002, 0.24387672543525696], step: 19600, lr: 8.492639788998965e-05, reference_loss: 26.05179214477539 2023-12-18 06:58:12,019 44k INFO ====> Epoch: 1306, cost 19.18 s 2023-12-18 06:58:31,163 44k INFO ====> Epoch: 1307, cost 19.14 s 2023-12-18 06:58:49,892 44k INFO ====> Epoch: 1308, cost 18.73 s 2023-12-18 06:59:08,355 44k INFO ====> Epoch: 1309, cost 18.46 s 2023-12-18 06:59:27,309 44k INFO ====> Epoch: 1310, cost 18.95 s 2023-12-18 06:59:45,826 44k INFO ====> Epoch: 1311, cost 18.52 s 2023-12-18 07:00:04,030 44k INFO ====> Epoch: 1312, cost 18.20 s 2023-12-18 07:00:22,694 44k INFO ====> Epoch: 1313, cost 18.66 s 2023-12-18 07:00:41,358 44k INFO ====> Epoch: 1314, cost 18.66 s 2023-12-18 07:01:00,647 44k INFO ====> Epoch: 1315, cost 19.29 s 2023-12-18 07:01:18,957 44k INFO ====> Epoch: 1316, cost 18.31 s 2023-12-18 07:01:37,598 44k INFO ====> Epoch: 1317, cost 18.64 s 2023-12-18 07:01:56,350 44k INFO ====> Epoch: 1318, cost 18.75 s 2023-12-18 07:02:14,874 44k INFO ====> Epoch: 1319, cost 18.52 s 2023-12-18 07:02:26,230 44k INFO Train Epoch: 1320 [27%] 2023-12-18 07:02:26,230 44k INFO Losses: [1.9742059707641602, 2.920365571975708, 10.084603309631348, 21.356775283813477, 0.9709144234657288], step: 19800, lr: 8.477789738804749e-05, reference_loss: 37.306861877441406 2023-12-18 07:02:34,270 44k INFO ====> Epoch: 1320, cost 19.40 s 2023-12-18 07:02:52,532 44k INFO ====> Epoch: 1321, cost 18.26 s 2023-12-18 07:03:11,647 44k INFO ====> Epoch: 1322, cost 19.12 s 2023-12-18 07:03:29,656 44k INFO ====> Epoch: 1323, cost 18.01 s 2023-12-18 07:03:48,023 44k INFO ====> Epoch: 1324, cost 18.37 s 2023-12-18 07:04:06,148 44k INFO ====> Epoch: 1325, cost 18.13 s 2023-12-18 07:04:24,374 44k INFO ====> Epoch: 1326, cost 18.23 s 2023-12-18 07:04:42,642 44k INFO ====> Epoch: 1327, cost 18.27 s 2023-12-18 07:05:01,104 44k INFO ====> Epoch: 1328, cost 18.46 s 2023-12-18 07:05:19,588 44k INFO ====> Epoch: 1329, cost 18.48 s 2023-12-18 07:05:37,921 44k INFO ====> Epoch: 1330, cost 18.33 s 2023-12-18 07:05:56,019 44k INFO ====> Epoch: 1331, cost 18.10 s 2023-12-18 07:06:14,579 44k INFO ====> Epoch: 1332, cost 18.56 s 2023-12-18 07:06:28,907 44k INFO Train Epoch: 1333 [60%] 2023-12-18 07:06:28,907 44k INFO Losses: [2.1944973468780518, 2.8952925205230713, 10.332586288452148, 21.561479568481445, 0.8830958604812622], step: 20000, lr: 8.464023658051271e-05, reference_loss: 37.86695098876953 2023-12-18 07:06:35,117 44k INFO Saving model and optimizer state at iteration 1333 to ./logs\44k\G_20000.pth 2023-12-18 07:06:36,666 44k INFO Saving model and optimizer state at iteration 1333 to ./logs\44k\D_20000.pth 2023-12-18 07:06:45,742 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_12000.pth 2023-12-18 07:06:45,742 44k INFO .. 
Free up space by deleting ckpt ./logs\44k\D_12000.pth 2023-12-18 07:06:50,033 44k INFO ====> Epoch: 1333, cost 35.45 s 2023-12-18 07:07:08,756 44k INFO ====> Epoch: 1334, cost 18.72 s 2023-12-18 07:07:27,578 44k INFO ====> Epoch: 1335, cost 18.82 s 2023-12-18 07:07:45,991 44k INFO ====> Epoch: 1336, cost 18.41 s 2023-12-18 07:08:04,659 44k INFO ====> Epoch: 1337, cost 18.67 s 2023-12-18 07:08:23,408 44k INFO ====> Epoch: 1338, cost 18.75 s 2023-12-18 07:08:41,084 44k INFO ====> Epoch: 1339, cost 17.68 s 2023-12-18 07:08:57,498 44k INFO ====> Epoch: 1340, cost 16.41 s 2023-12-18 07:09:13,958 44k INFO ====> Epoch: 1341, cost 16.46 s 2023-12-18 07:09:30,361 44k INFO ====> Epoch: 1342, cost 16.40 s 2023-12-18 07:09:46,939 44k INFO ====> Epoch: 1343, cost 16.58 s 2023-12-18 07:10:03,615 44k INFO ====> Epoch: 1344, cost 16.68 s 2023-12-18 07:10:21,036 44k INFO ====> Epoch: 1345, cost 17.42 s 2023-12-18 07:10:37,070 44k INFO Train Epoch: 1346 [93%] 2023-12-18 07:10:37,070 44k INFO Losses: [2.239088296890259, 2.9878766536712646, 9.63883113861084, 20.143115997314453, 0.5448383688926697], step: 20200, lr: 8.450279930409292e-05, reference_loss: 35.553749084472656 2023-12-18 07:10:38,149 44k INFO ====> Epoch: 1346, cost 17.11 s 2023-12-18 07:10:54,613 44k INFO ====> Epoch: 1347, cost 16.46 s 2023-12-18 07:11:11,011 44k INFO ====> Epoch: 1348, cost 16.40 s 2023-12-18 07:11:27,568 44k INFO ====> Epoch: 1349, cost 16.56 s 2023-12-18 07:11:43,976 44k INFO ====> Epoch: 1350, cost 16.41 s 2023-12-18 07:12:00,669 44k INFO ====> Epoch: 1351, cost 16.69 s 2023-12-18 07:12:17,218 44k INFO ====> Epoch: 1352, cost 16.55 s 2023-12-18 07:12:33,667 44k INFO ====> Epoch: 1353, cost 16.45 s 2023-12-18 07:12:50,199 44k INFO ====> Epoch: 1354, cost 16.53 s 2023-12-18 07:13:06,559 44k INFO ====> Epoch: 1355, cost 16.36 s 2023-12-18 07:13:23,215 44k INFO ====> Epoch: 1356, cost 16.66 s 2023-12-18 07:13:39,685 44k INFO ====> Epoch: 1357, cost 16.47 s 2023-12-18 07:13:56,255 44k INFO ====> Epoch: 1358, cost 16.57 s 2023-12-18 07:14:12,848 44k INFO ====> Epoch: 1359, cost 16.59 s 2023-12-18 07:14:22,659 44k INFO Train Epoch: 1360 [27%] 2023-12-18 07:14:22,659 44k INFO Losses: [2.3740181922912598, 2.825003147125244, 4.8809685707092285, 15.673274040222168, 0.6318756937980652], step: 20400, lr: 8.43550394976729e-05, reference_loss: 26.385141372680664 2023-12-18 07:14:29,985 44k INFO ====> Epoch: 1360, cost 17.14 s 2023-12-18 07:14:46,615 44k INFO ====> Epoch: 1361, cost 16.63 s 2023-12-18 07:15:03,174 44k INFO ====> Epoch: 1362, cost 16.56 s 2023-12-18 07:15:19,460 44k INFO ====> Epoch: 1363, cost 16.29 s 2023-12-18 07:15:36,033 44k INFO ====> Epoch: 1364, cost 16.57 s 2023-12-18 07:15:52,729 44k INFO ====> Epoch: 1365, cost 16.70 s 2023-12-18 07:16:09,370 44k INFO ====> Epoch: 1366, cost 16.64 s 2023-12-18 07:16:25,838 44k INFO ====> Epoch: 1367, cost 16.46 s 2023-12-18 07:16:43,277 44k INFO ====> Epoch: 1368, cost 17.45 s 2023-12-18 07:16:59,743 44k INFO ====> Epoch: 1369, cost 16.47 s 2023-12-18 07:17:16,053 44k INFO ====> Epoch: 1370, cost 16.31 s 2023-12-18 07:17:32,469 44k INFO ====> Epoch: 1371, cost 16.42 s 2023-12-18 07:17:49,005 44k INFO ====> Epoch: 1372, cost 16.54 s 2023-12-18 07:18:01,921 44k INFO Train Epoch: 1373 [60%] 2023-12-18 07:18:01,931 44k INFO Losses: [1.9314253330230713, 2.923743963241577, 11.924635887145996, 21.917116165161133, 1.0592107772827148], step: 20600, lr: 8.421806531908801e-05, reference_loss: 39.756134033203125 2023-12-18 07:18:06,063 44k INFO ====> Epoch: 1373, cost 17.06 s 2023-12-18 
07:18:22,493 44k INFO ====> Epoch: 1374, cost 16.43 s 2023-12-18 07:18:39,038 44k INFO ====> Epoch: 1375, cost 16.54 s 2023-12-18 07:18:55,422 44k INFO ====> Epoch: 1376, cost 16.38 s 2023-12-18 07:19:11,834 44k INFO ====> Epoch: 1377, cost 16.41 s 2023-12-18 07:19:28,303 44k INFO ====> Epoch: 1378, cost 16.47 s 2023-12-18 07:19:44,878 44k INFO ====> Epoch: 1379, cost 16.57 s 2023-12-18 07:20:01,322 44k INFO ====> Epoch: 1380, cost 16.44 s 2023-12-18 07:20:17,888 44k INFO ====> Epoch: 1381, cost 16.57 s 2023-12-18 07:20:34,859 44k INFO ====> Epoch: 1382, cost 16.97 s 2023-12-18 07:20:51,418 44k INFO ====> Epoch: 1383, cost 16.56 s 2023-12-18 07:21:07,872 44k INFO ====> Epoch: 1384, cost 16.45 s 2023-12-18 07:21:24,472 44k INFO ====> Epoch: 1385, cost 16.60 s 2023-12-18 07:21:40,390 44k INFO Train Epoch: 1386 [93%] 2023-12-18 07:21:40,390 44k INFO Losses: [1.5468571186065674, 3.2942280769348145, 15.407390594482422, 20.118900299072266, -0.4497150480747223], step: 20800, lr: 8.40813135566826e-05, reference_loss: 39.91766357421875 2023-12-18 07:21:45,885 44k INFO Saving model and optimizer state at iteration 1386 to ./logs\44k\G_20800.pth 2023-12-18 07:21:47,635 44k INFO Saving model and optimizer state at iteration 1386 to ./logs\44k\D_20800.pth 2023-12-18 07:21:55,545 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_12800.pth 2023-12-18 07:21:55,545 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_12800.pth 2023-12-18 07:21:56,078 44k INFO ====> Epoch: 1386, cost 31.61 s 2023-12-18 07:22:12,941 44k INFO ====> Epoch: 1387, cost 16.86 s 2023-12-18 07:22:29,615 44k INFO ====> Epoch: 1388, cost 16.67 s 2023-12-18 07:22:46,129 44k INFO ====> Epoch: 1389, cost 16.51 s 2023-12-18 07:23:02,519 44k INFO ====> Epoch: 1390, cost 16.39 s 2023-12-18 07:23:18,779 44k INFO ====> Epoch: 1391, cost 16.26 s 2023-12-18 07:23:36,491 44k INFO ====> Epoch: 1392, cost 17.71 s 2023-12-18 07:23:53,607 44k INFO ====> Epoch: 1393, cost 17.12 s 2023-12-18 07:24:10,062 44k INFO ====> Epoch: 1394, cost 16.46 s 2023-12-18 07:24:26,551 44k INFO ====> Epoch: 1395, cost 16.49 s 2023-12-18 07:24:42,946 44k INFO ====> Epoch: 1396, cost 16.39 s 2023-12-18 07:24:59,371 44k INFO ====> Epoch: 1397, cost 16.43 s 2023-12-18 07:25:15,734 44k INFO ====> Epoch: 1398, cost 16.36 s 2023-12-18 07:25:32,071 44k INFO ====> Epoch: 1399, cost 16.34 s 2023-12-18 07:25:41,861 44k INFO Train Epoch: 1400 [27%] 2023-12-18 07:25:41,861 44k INFO Losses: [2.112722396850586, 2.753509283065796, 10.690067291259766, 21.47221565246582, 0.5585850477218628], step: 21000, lr: 8.393429075132006e-05, reference_loss: 37.587100982666016 2023-12-18 07:25:49,081 44k INFO ====> Epoch: 1400, cost 17.01 s 2023-12-18 07:26:05,765 44k INFO ====> Epoch: 1401, cost 16.68 s 2023-12-18 07:26:22,369 44k INFO ====> Epoch: 1402, cost 16.60 s 2023-12-18 07:26:38,810 44k INFO ====> Epoch: 1403, cost 16.44 s 2023-12-18 07:26:55,489 44k INFO ====> Epoch: 1404, cost 16.68 s 2023-12-18 07:27:11,897 44k INFO ====> Epoch: 1405, cost 16.41 s 2023-12-18 07:27:28,335 44k INFO ====> Epoch: 1406, cost 16.44 s 2023-12-18 07:27:44,823 44k INFO ====> Epoch: 1407, cost 16.49 s 2023-12-18 07:28:02,056 44k INFO ====> Epoch: 1408, cost 17.23 s 2023-12-18 07:28:18,833 44k INFO ====> Epoch: 1409, cost 16.78 s 2023-12-18 07:28:35,543 44k INFO ====> Epoch: 1410, cost 16.71 s 2023-12-18 07:28:52,242 44k INFO ====> Epoch: 1411, cost 16.70 s 2023-12-18 07:29:08,690 44k INFO ====> Epoch: 1412, cost 16.45 s 2023-12-18 07:29:21,821 44k INFO Train Epoch: 1413 [60%] 2023-12-18 07:29:21,821 44k 
INFO Losses: [2.038316249847412, 3.2380669116973877, 11.795734405517578, 22.933055877685547, 0.6940712332725525], step: 21200, lr: 8.379799977689547e-05, reference_loss: 40.69924545288086 2023-12-18 07:29:25,984 44k INFO ====> Epoch: 1413, cost 17.29 s 2023-12-18 07:29:42,500 44k INFO ====> Epoch: 1414, cost 16.52 s 2023-12-18 07:29:59,254 44k INFO ====> Epoch: 1415, cost 16.75 s 2023-12-18 07:30:15,848 44k INFO ====> Epoch: 1416, cost 16.59 s 2023-12-18 07:30:32,345 44k INFO ====> Epoch: 1417, cost 16.50 s 2023-12-18 07:30:48,911 44k INFO ====> Epoch: 1418, cost 16.57 s 2023-12-18 07:31:05,946 44k INFO ====> Epoch: 1419, cost 17.04 s 2023-12-18 07:31:22,459 44k INFO ====> Epoch: 1420, cost 16.51 s 2023-12-18 07:31:39,092 44k INFO ====> Epoch: 1421, cost 16.63 s 2023-12-18 07:31:55,721 44k INFO ====> Epoch: 1422, cost 16.63 s 2023-12-18 07:32:12,131 44k INFO ====> Epoch: 1423, cost 16.41 s 2023-12-18 07:32:28,443 44k INFO ====> Epoch: 1424, cost 16.31 s 2023-12-18 07:32:44,854 44k INFO ====> Epoch: 1425, cost 16.41 s 2023-12-18 07:33:01,561 44k INFO Train Epoch: 1426 [93%] 2023-12-18 07:33:01,561 44k INFO Losses: [1.6781567335128784, 3.322093963623047, 9.943658828735352, 18.00555419921875, 0.5710685849189758], step: 21400, lr: 8.36619301092758e-05, reference_loss: 33.520530700683594 2023-12-18 07:33:02,644 44k INFO ====> Epoch: 1426, cost 17.79 s 2023-12-18 07:33:19,337 44k INFO ====> Epoch: 1427, cost 16.69 s 2023-12-18 07:33:35,814 44k INFO ====> Epoch: 1428, cost 16.48 s 2023-12-18 07:33:52,348 44k INFO ====> Epoch: 1429, cost 16.53 s 2023-12-18 07:34:08,731 44k INFO ====> Epoch: 1430, cost 16.38 s 2023-12-18 07:34:25,197 44k INFO ====> Epoch: 1431, cost 16.47 s 2023-12-18 07:34:41,713 44k INFO ====> Epoch: 1432, cost 16.52 s 2023-12-18 07:34:58,478 44k INFO ====> Epoch: 1433, cost 16.77 s 2023-12-18 07:35:14,922 44k INFO ====> Epoch: 1434, cost 16.44 s 2023-12-18 07:35:31,414 44k INFO ====> Epoch: 1435, cost 16.49 s 2023-12-18 07:35:47,989 44k INFO ====> Epoch: 1436, cost 16.58 s 2023-12-18 07:36:04,364 44k INFO ====> Epoch: 1437, cost 16.37 s 2023-12-18 07:36:20,761 44k INFO ====> Epoch: 1438, cost 16.40 s 2023-12-18 07:36:37,478 44k INFO ====> Epoch: 1439, cost 16.72 s 2023-12-18 07:36:47,223 44k INFO Train Epoch: 1440 [27%] 2023-12-18 07:36:47,223 44k INFO Losses: [2.126579523086548, 2.743391752243042, 10.293197631835938, 19.87753677368164, 0.7963422536849976], step: 21600, lr: 8.351564062893342e-05, reference_loss: 35.8370475769043 2023-12-18 07:36:52,414 44k INFO Saving model and optimizer state at iteration 1440 to ./logs\44k\G_21600.pth 2023-12-18 07:36:54,359 44k INFO Saving model and optimizer state at iteration 1440 to ./logs\44k\D_21600.pth 2023-12-18 07:37:05,154 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_13600.pth 2023-12-18 07:37:05,154 44k INFO .. 
Free up space by deleting ckpt ./logs\44k\D_13600.pth 2023-12-18 07:37:12,412 44k INFO ====> Epoch: 1440, cost 34.93 s 2023-12-18 07:37:28,645 44k INFO ====> Epoch: 1441, cost 16.23 s 2023-12-18 07:37:45,078 44k INFO ====> Epoch: 1442, cost 16.43 s 2023-12-18 07:38:01,484 44k INFO ====> Epoch: 1443, cost 16.41 s 2023-12-18 07:38:18,073 44k INFO ====> Epoch: 1444, cost 16.59 s 2023-12-18 07:38:34,325 44k INFO ====> Epoch: 1445, cost 16.25 s 2023-12-18 07:38:51,656 44k INFO ====> Epoch: 1446, cost 17.33 s 2023-12-18 07:39:08,009 44k INFO ====> Epoch: 1447, cost 16.35 s 2023-12-18 07:39:24,144 44k INFO ====> Epoch: 1448, cost 16.14 s 2023-12-18 07:39:40,766 44k INFO ====> Epoch: 1449, cost 16.62 s 2023-12-18 07:39:57,280 44k INFO ====> Epoch: 1450, cost 16.51 s 2023-12-18 07:40:13,916 44k INFO ====> Epoch: 1451, cost 16.64 s 2023-12-18 07:40:30,481 44k INFO ====> Epoch: 1452, cost 16.57 s 2023-12-18 07:40:43,375 44k INFO Train Epoch: 1453 [60%] 2023-12-18 07:40:43,375 44k INFO Losses: [2.1548244953155518, 2.9541666507720947, 9.660812377929688, 20.433759689331055, 0.666740357875824], step: 21800, lr: 8.338002945096165e-05, reference_loss: 35.870304107666016 2023-12-18 07:40:47,417 44k INFO ====> Epoch: 1453, cost 16.94 s 2023-12-18 07:41:03,928 44k INFO ====> Epoch: 1454, cost 16.51 s 2023-12-18 07:41:20,264 44k INFO ====> Epoch: 1455, cost 16.34 s 2023-12-18 07:41:36,726 44k INFO ====> Epoch: 1456, cost 16.46 s 2023-12-18 07:41:53,783 44k INFO ====> Epoch: 1457, cost 17.06 s 2023-12-18 07:42:10,287 44k INFO ====> Epoch: 1458, cost 16.50 s 2023-12-18 07:42:26,819 44k INFO ====> Epoch: 1459, cost 16.53 s 2023-12-18 07:42:43,286 44k INFO ====> Epoch: 1460, cost 16.47 s 2023-12-18 07:42:59,636 44k INFO ====> Epoch: 1461, cost 16.35 s 2023-12-18 07:43:16,202 44k INFO ====> Epoch: 1462, cost 16.57 s 2023-12-18 07:43:32,926 44k INFO ====> Epoch: 1463, cost 16.72 s 2023-12-18 07:43:49,772 44k INFO ====> Epoch: 1464, cost 16.85 s 2023-12-18 07:44:06,322 44k INFO ====> Epoch: 1465, cost 16.55 s 2023-12-18 07:44:22,484 44k INFO Train Epoch: 1466 [93%] 2023-12-18 07:44:22,484 44k INFO Losses: [2.019838333129883, 3.134286642074585, 15.117714881896973, 19.533857345581055, 0.9029085636138916], step: 22000, lr: 8.324463847595367e-05, reference_loss: 40.70860290527344 2023-12-18 07:44:23,496 44k INFO ====> Epoch: 1466, cost 17.17 s 2023-12-18 07:44:39,960 44k INFO ====> Epoch: 1467, cost 16.46 s 2023-12-18 07:44:56,331 44k INFO ====> Epoch: 1468, cost 16.37 s 2023-12-18 07:45:12,874 44k INFO ====> Epoch: 1469, cost 16.54 s 2023-12-18 07:45:29,492 44k INFO ====> Epoch: 1470, cost 16.62 s 2023-12-18 07:45:46,238 44k INFO ====> Epoch: 1471, cost 16.75 s 2023-12-18 07:46:03,003 44k INFO ====> Epoch: 1472, cost 16.77 s 2023-12-18 07:46:19,984 44k INFO ====> Epoch: 1473, cost 16.98 s 2023-12-18 07:46:38,018 44k INFO ====> Epoch: 1474, cost 18.03 s 2023-12-18 07:46:55,905 44k INFO ====> Epoch: 1475, cost 17.89 s 2023-12-18 07:47:12,151 44k INFO ====> Epoch: 1476, cost 16.25 s 2023-12-18 07:47:28,484 44k INFO ====> Epoch: 1477, cost 16.33 s 2023-12-18 07:47:44,754 44k INFO ====> Epoch: 1478, cost 16.27 s 2023-12-18 07:48:00,953 44k INFO ====> Epoch: 1479, cost 16.20 s 2023-12-18 07:48:10,724 44k INFO Train Epoch: 1480 [27%] 2023-12-18 07:48:10,724 44k INFO Losses: [2.199357271194458, 2.438746690750122, 8.288128852844238, 19.75950050354004, 0.5457698106765747], step: 22200, lr: 8.309907866292964e-05, reference_loss: 33.231502532958984 2023-12-18 07:48:17,870 44k INFO ====> Epoch: 1480, cost 16.92 s 2023-12-18 
07:48:34,291 44k INFO ====> Epoch: 1481, cost 16.42 s 2023-12-18 07:48:50,666 44k INFO ====> Epoch: 1482, cost 16.37 s 2023-12-18 07:49:06,825 44k INFO ====> Epoch: 1483, cost 16.16 s 2023-12-18 07:49:23,083 44k INFO ====> Epoch: 1484, cost 16.26 s 2023-12-18 07:49:39,457 44k INFO ====> Epoch: 1485, cost 16.37 s 2023-12-18 07:49:55,931 44k INFO ====> Epoch: 1486, cost 16.47 s 2023-12-18 07:50:12,357 44k INFO ====> Epoch: 1487, cost 16.43 s 2023-12-18 07:50:28,733 44k INFO ====> Epoch: 1488, cost 16.38 s 2023-12-18 07:50:44,912 44k INFO ====> Epoch: 1489, cost 16.18 s 2023-12-18 07:51:01,145 44k INFO ====> Epoch: 1490, cost 16.23 s 2023-12-18 07:51:17,475 44k INFO ====> Epoch: 1491, cost 16.33 s 2023-12-18 07:51:33,692 44k INFO ====> Epoch: 1492, cost 16.22 s 2023-12-18 07:51:46,292 44k INFO Train Epoch: 1493 [60%] 2023-12-18 07:51:46,292 44k INFO Losses: [2.3542392253875732, 2.526057481765747, 6.919995307922363, 19.040037155151367, 1.002414584159851], step: 22400, lr: 8.296414389070031e-05, reference_loss: 31.842742919921875 2023-12-18 07:51:51,583 44k INFO Saving model and optimizer state at iteration 1493 to ./logs\44k\G_22400.pth 2023-12-18 07:51:52,775 44k INFO Saving model and optimizer state at iteration 1493 to ./logs\44k\D_22400.pth 2023-12-18 07:51:59,925 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_14400.pth 2023-12-18 07:51:59,925 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_14400.pth 2023-12-18 07:52:03,680 44k INFO ====> Epoch: 1493, cost 29.99 s 2023-12-18 07:52:20,383 44k INFO ====> Epoch: 1494, cost 16.70 s 2023-12-18 07:52:36,713 44k INFO ====> Epoch: 1495, cost 16.33 s 2023-12-18 07:52:53,013 44k INFO ====> Epoch: 1496, cost 16.30 s 2023-12-18 07:53:09,497 44k INFO ====> Epoch: 1497, cost 16.48 s 2023-12-18 07:53:25,834 44k INFO ====> Epoch: 1498, cost 16.34 s 2023-12-18 07:53:41,964 44k INFO ====> Epoch: 1499, cost 16.13 s 2023-12-18 07:53:58,089 44k INFO ====> Epoch: 1500, cost 16.12 s 2023-12-18 07:54:14,388 44k INFO ====> Epoch: 1501, cost 16.30 s 2023-12-18 07:54:30,591 44k INFO ====> Epoch: 1502, cost 16.20 s 2023-12-18 07:54:46,688 44k INFO ====> Epoch: 1503, cost 16.10 s 2023-12-18 07:55:03,105 44k INFO ====> Epoch: 1504, cost 16.42 s 2023-12-18 07:55:19,162 44k INFO ====> Epoch: 1505, cost 16.06 s 2023-12-18 07:55:34,910 44k INFO Train Epoch: 1506 [93%] 2023-12-18 07:55:34,910 44k INFO Losses: [2.2275707721710205, 3.0072028636932373, 14.59841251373291, 21.830427169799805, 0.4631933569908142], step: 22600, lr: 8.282942822309947e-05, reference_loss: 42.12680435180664 2023-12-18 07:55:35,929 44k INFO ====> Epoch: 1506, cost 16.77 s 2023-12-18 07:55:52,290 44k INFO ====> Epoch: 1507, cost 16.36 s 2023-12-18 07:56:08,514 44k INFO ====> Epoch: 1508, cost 16.22 s 2023-12-18 07:56:24,720 44k INFO ====> Epoch: 1509, cost 16.21 s 2023-12-18 07:56:40,941 44k INFO ====> Epoch: 1510, cost 16.22 s 2023-12-18 07:56:57,193 44k INFO ====> Epoch: 1511, cost 16.25 s 2023-12-18 07:57:13,239 44k INFO ====> Epoch: 1512, cost 16.05 s 2023-12-18 07:57:29,457 44k INFO ====> Epoch: 1513, cost 16.22 s 2023-12-18 07:57:45,459 44k INFO ====> Epoch: 1514, cost 16.00 s 2023-12-18 07:58:01,640 44k INFO ====> Epoch: 1515, cost 16.18 s 2023-12-18 07:58:18,079 44k INFO ====> Epoch: 1516, cost 16.44 s 2023-12-18 07:58:34,284 44k INFO ====> Epoch: 1517, cost 16.21 s 2023-12-18 07:58:50,503 44k INFO ====> Epoch: 1518, cost 16.22 s 2023-12-18 07:59:06,710 44k INFO ====> Epoch: 1519, cost 16.21 s 2023-12-18 07:59:16,258 44k INFO Train Epoch: 1520 [27%] 2023-12-18 07:59:16,258 44k 
INFO Losses: [1.71194589138031, 3.363090753555298, 13.567883491516113, 27.540983200073242, 0.6672484874725342], step: 22800, lr: 8.268459443793592e-05, reference_loss: 46.85115051269531 2023-12-18 07:59:23,419 44k INFO ====> Epoch: 1520, cost 16.71 s 2023-12-18 07:59:39,621 44k INFO ====> Epoch: 1521, cost 16.20 s 2023-12-18 07:59:55,832 44k INFO ====> Epoch: 1522, cost 16.21 s 2023-12-18 08:00:12,171 44k INFO ====> Epoch: 1523, cost 16.34 s 2023-12-18 08:00:28,533 44k INFO ====> Epoch: 1524, cost 16.36 s 2023-12-18 08:00:44,757 44k INFO ====> Epoch: 1525, cost 16.22 s 2023-12-18 08:01:01,146 44k INFO ====> Epoch: 1526, cost 16.39 s 2023-12-18 08:01:17,250 44k INFO ====> Epoch: 1527, cost 16.10 s 2023-12-18 08:01:33,416 44k INFO ====> Epoch: 1528, cost 16.17 s 2023-12-18 08:01:49,564 44k INFO ====> Epoch: 1529, cost 16.15 s 2023-12-18 08:02:05,693 44k INFO ====> Epoch: 1530, cost 16.13 s 2023-12-18 08:02:21,910 44k INFO ====> Epoch: 1531, cost 16.22 s 2023-12-18 08:02:38,003 44k INFO ====> Epoch: 1532, cost 16.09 s 2023-12-18 08:02:50,673 44k INFO Train Epoch: 1533 [60%] 2023-12-18 08:02:50,683 44k INFO Losses: [1.8090155124664307, 3.122904062271118, 12.41435718536377, 23.92654037475586, 0.7505899667739868], step: 23000, lr: 8.255033269765102e-05, reference_loss: 42.02341079711914 2023-12-18 08:02:54,779 44k INFO ====> Epoch: 1533, cost 16.78 s 2023-12-18 08:03:11,028 44k INFO ====> Epoch: 1534, cost 16.25 s 2023-12-18 08:03:27,404 44k INFO ====> Epoch: 1535, cost 16.38 s 2023-12-18 08:03:43,516 44k INFO ====> Epoch: 1536, cost 16.11 s 2023-12-18 08:03:59,903 44k INFO ====> Epoch: 1537, cost 16.39 s 2023-12-18 08:04:16,126 44k INFO ====> Epoch: 1538, cost 16.22 s 2023-12-18 08:04:32,343 44k INFO ====> Epoch: 1539, cost 16.22 s 2023-12-18 08:04:48,698 44k INFO ====> Epoch: 1540, cost 16.36 s 2023-12-18 08:05:04,880 44k INFO ====> Epoch: 1541, cost 16.18 s 2023-12-18 08:05:21,330 44k INFO ====> Epoch: 1542, cost 16.45 s 2023-12-18 08:05:37,507 44k INFO ====> Epoch: 1543, cost 16.18 s 2023-12-18 08:05:53,633 44k INFO ====> Epoch: 1544, cost 16.13 s 2023-12-18 08:06:09,794 44k INFO ====> Epoch: 1545, cost 16.16 s 2023-12-18 08:06:25,506 44k INFO Train Epoch: 1546 [93%] 2023-12-18 08:06:25,506 44k INFO Losses: [2.1339004039764404, 2.8378357887268066, 6.137299060821533, 13.768893241882324, -0.35813963413238525], step: 23200, lr: 8.241628896913756e-05, reference_loss: 24.519790649414062 2023-12-18 08:06:30,748 44k INFO Saving model and optimizer state at iteration 1546 to ./logs\44k\G_23200.pth 2023-12-18 08:06:32,227 44k INFO Saving model and optimizer state at iteration 1546 to ./logs\44k\D_23200.pth 2023-12-18 08:06:37,118 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_15200.pth 2023-12-18 08:06:37,118 44k INFO .. 
Free up space by deleting ckpt ./logs\44k\D_15200.pth 2023-12-18 08:06:37,653 44k INFO ====> Epoch: 1546, cost 27.86 s 2023-12-18 08:06:54,818 44k INFO ====> Epoch: 1547, cost 17.17 s 2023-12-18 08:07:11,411 44k INFO ====> Epoch: 1548, cost 16.59 s 2023-12-18 08:07:27,802 44k INFO ====> Epoch: 1549, cost 16.39 s 2023-12-18 08:07:44,036 44k INFO ====> Epoch: 1550, cost 16.23 s 2023-12-18 08:08:00,319 44k INFO ====> Epoch: 1551, cost 16.28 s 2023-12-18 08:08:16,437 44k INFO ====> Epoch: 1552, cost 16.12 s 2023-12-18 08:08:32,715 44k INFO ====> Epoch: 1553, cost 16.28 s 2023-12-18 08:08:48,854 44k INFO ====> Epoch: 1554, cost 16.14 s 2023-12-18 08:09:04,979 44k INFO ====> Epoch: 1555, cost 16.13 s 2023-12-18 08:09:21,255 44k INFO ====> Epoch: 1556, cost 16.28 s 2023-12-18 08:09:37,422 44k INFO ====> Epoch: 1557, cost 16.17 s 2023-12-18 08:09:53,631 44k INFO ====> Epoch: 1558, cost 16.21 s 2023-12-18 08:10:10,141 44k INFO ====> Epoch: 1559, cost 16.51 s 2023-12-18 08:10:20,024 44k INFO Train Epoch: 1560 [27%] 2023-12-18 08:10:20,024 44k INFO Losses: [2.3457705974578857, 2.811798572540283, 10.345260620117188, 19.562538146972656, 0.20345059037208557], step: 23400, lr: 8.227217759052969e-05, reference_loss: 35.26881790161133 2023-12-18 08:10:27,156 44k INFO ====> Epoch: 1560, cost 17.01 s 2023-12-18 08:10:43,512 44k INFO ====> Epoch: 1561, cost 16.36 s 2023-12-18 08:10:59,555 44k INFO ====> Epoch: 1562, cost 16.04 s 2023-12-18 08:11:15,941 44k INFO ====> Epoch: 1563, cost 16.39 s 2023-12-18 08:11:32,083 44k INFO ====> Epoch: 1564, cost 16.14 s 2023-12-18 08:11:48,284 44k INFO ====> Epoch: 1565, cost 16.20 s 2023-12-18 08:12:04,747 44k INFO ====> Epoch: 1566, cost 16.46 s 2023-12-18 08:12:20,987 44k INFO ====> Epoch: 1567, cost 16.24 s 2023-12-18 08:12:37,280 44k INFO ====> Epoch: 1568, cost 16.29 s 2023-12-18 08:12:53,435 44k INFO ====> Epoch: 1569, cost 16.15 s 2023-12-18 08:13:09,617 44k INFO ====> Epoch: 1570, cost 16.18 s 2023-12-18 08:13:25,798 44k INFO ====> Epoch: 1571, cost 16.18 s 2023-12-18 08:13:42,061 44k INFO ====> Epoch: 1572, cost 16.26 s 2023-12-18 08:13:54,741 44k INFO Train Epoch: 1573 [60%] 2023-12-18 08:13:54,741 44k INFO Losses: [2.0476620197296143, 2.8168067932128906, 14.1802978515625, 21.461339950561523, 0.7835718989372253], step: 23600, lr: 8.21385855252191e-05, reference_loss: 41.289676666259766 2023-12-18 08:13:58,799 44k INFO ====> Epoch: 1573, cost 16.74 s 2023-12-18 08:14:15,062 44k INFO ====> Epoch: 1574, cost 16.26 s 2023-12-18 08:14:31,368 44k INFO ====> Epoch: 1575, cost 16.31 s 2023-12-18 08:14:47,623 44k INFO ====> Epoch: 1576, cost 16.25 s 2023-12-18 08:15:03,756 44k INFO ====> Epoch: 1577, cost 16.13 s 2023-12-18 08:15:19,937 44k INFO ====> Epoch: 1578, cost 16.18 s 2023-12-18 08:15:36,233 44k INFO ====> Epoch: 1579, cost 16.30 s 2023-12-18 08:15:52,716 44k INFO ====> Epoch: 1580, cost 16.48 s 2023-12-18 08:16:09,324 44k INFO ====> Epoch: 1581, cost 16.61 s 2023-12-18 08:16:25,562 44k INFO ====> Epoch: 1582, cost 16.24 s 2023-12-18 08:16:41,704 44k INFO ====> Epoch: 1583, cost 16.14 s 2023-12-18 08:16:57,904 44k INFO ====> Epoch: 1584, cost 16.20 s 2023-12-18 08:17:14,181 44k INFO ====> Epoch: 1585, cost 16.28 s 2023-12-18 08:17:30,103 44k INFO Train Epoch: 1586 [93%] 2023-12-18 08:17:30,113 44k INFO Losses: [2.358436346054077, 3.9155044555664062, 16.03541374206543, 26.708087921142578, 0.5396987795829773], step: 23800, lr: 8.20052103842739e-05, reference_loss: 49.5571403503418 2023-12-18 08:17:31,179 44k INFO ====> Epoch: 1586, cost 17.00 s 2023-12-18 
08:17:47,657 44k INFO ====> Epoch: 1587, cost 16.48 s 2023-12-18 08:18:03,996 44k INFO ====> Epoch: 1588, cost 16.34 s 2023-12-18 08:18:20,346 44k INFO ====> Epoch: 1589, cost 16.35 s 2023-12-18 08:18:36,670 44k INFO ====> Epoch: 1590, cost 16.32 s 2023-12-18 08:18:52,833 44k INFO ====> Epoch: 1591, cost 16.16 s 2023-12-18 08:19:09,056 44k INFO ====> Epoch: 1592, cost 16.22 s 2023-12-18 08:19:25,141 44k INFO ====> Epoch: 1593, cost 16.09 s 2023-12-18 08:19:41,349 44k INFO ====> Epoch: 1594, cost 16.21 s 2023-12-18 08:19:57,763 44k INFO ====> Epoch: 1595, cost 16.41 s 2023-12-18 08:20:13,988 44k INFO ====> Epoch: 1596, cost 16.22 s 2023-12-18 08:20:30,214 44k INFO ====> Epoch: 1597, cost 16.23 s 2023-12-18 08:20:46,415 44k INFO ====> Epoch: 1598, cost 16.20 s 2023-12-18 08:21:02,682 44k INFO ====> Epoch: 1599, cost 16.27 s 2023-12-18 08:21:12,251 44k INFO Train Epoch: 1600 [27%] 2023-12-18 08:21:12,251 44k INFO Losses: [2.389068126678467, 2.7547788619995117, 10.195369720458984, 21.84174156188965, 0.5020940899848938], step: 24000, lr: 8.186181780897936e-05, reference_loss: 37.68305206298828 2023-12-18 08:21:17,457 44k INFO Saving model and optimizer state at iteration 1600 to ./logs\44k\G_24000.pth 2023-12-18 08:21:18,648 44k INFO Saving model and optimizer state at iteration 1600 to ./logs\44k\D_24000.pth 2023-12-18 08:21:25,617 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_16000.pth 2023-12-18 08:21:25,617 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_16000.pth 2023-12-18 08:21:32,820 44k INFO ====> Epoch: 1600, cost 30.14 s 2023-12-18 08:21:49,115 44k INFO ====> Epoch: 1601, cost 16.30 s 2023-12-18 08:22:05,445 44k INFO ====> Epoch: 1602, cost 16.33 s 2023-12-18 08:22:21,680 44k INFO ====> Epoch: 1603, cost 16.24 s 2023-12-18 08:22:38,052 44k INFO ====> Epoch: 1604, cost 16.37 s 2023-12-18 08:22:54,404 44k INFO ====> Epoch: 1605, cost 16.35 s 2023-12-18 08:23:10,689 44k INFO ====> Epoch: 1606, cost 16.29 s 2023-12-18 08:23:26,960 44k INFO ====> Epoch: 1607, cost 16.27 s 2023-12-18 08:23:43,452 44k INFO ====> Epoch: 1608, cost 16.49 s 2023-12-18 08:24:03,303 44k INFO ====> Epoch: 1609, cost 19.85 s 2023-12-18 08:24:25,572 44k INFO ====> Epoch: 1610, cost 22.27 s 2023-12-18 08:24:46,328 44k INFO ====> Epoch: 1611, cost 20.76 s 2023-12-18 08:25:06,632 44k INFO ====> Epoch: 1612, cost 20.30 s 2023-12-18 08:25:22,730 44k INFO Train Epoch: 1613 [60%] 2023-12-18 08:25:22,730 44k INFO Losses: [2.1355767250061035, 2.58480167388916, 9.032773971557617, 18.851511001586914, 0.7562994360923767], step: 24200, lr: 8.172889207841696e-05, reference_loss: 33.3609619140625 2023-12-18 08:25:27,582 44k INFO ====> Epoch: 1613, cost 20.95 s 2023-12-18 08:25:48,040 44k INFO ====> Epoch: 1614, cost 20.46 s 2023-12-18 08:26:08,221 44k INFO ====> Epoch: 1615, cost 20.18 s 2023-12-18 08:26:28,237 44k INFO ====> Epoch: 1616, cost 20.02 s 2023-12-18 08:26:48,072 44k INFO ====> Epoch: 1617, cost 19.84 s 2023-12-18 08:27:08,464 44k INFO ====> Epoch: 1618, cost 20.39 s 2023-12-18 08:27:28,545 44k INFO ====> Epoch: 1619, cost 20.08 s 2023-12-18 08:27:48,326 44k INFO ====> Epoch: 1620, cost 19.78 s 2023-12-18 08:28:08,205 44k INFO ====> Epoch: 1621, cost 19.87 s 2023-12-18 08:28:27,678 44k INFO ====> Epoch: 1622, cost 19.48 s 2023-12-18 08:28:47,952 44k INFO ====> Epoch: 1623, cost 20.27 s 2023-12-18 08:29:07,909 44k INFO ====> Epoch: 1624, cost 19.96 s 2023-12-18 08:29:27,643 44k INFO ====> Epoch: 1625, cost 19.73 s 2023-12-18 08:29:47,644 44k INFO Train Epoch: 1626 [93%] 2023-12-18 08:29:47,644 44k 
INFO Losses: [2.1381587982177734, 2.8078479766845703, 12.923676490783691, 19.802343368530273, 0.16612057387828827], step: 24400, lr: 8.159618219023775e-05, reference_loss: 37.83815002441406 2023-12-18 08:29:48,904 44k INFO ====> Epoch: 1626, cost 21.26 s 2023-12-18 08:30:08,878 44k INFO ====> Epoch: 1627, cost 19.97 s 2023-12-18 08:30:28,590 44k INFO ====> Epoch: 1628, cost 19.71 s 2023-12-18 08:30:48,730 44k INFO ====> Epoch: 1629, cost 20.14 s 2023-12-18 08:31:09,076 44k INFO ====> Epoch: 1630, cost 20.35 s 2023-12-18 08:31:29,185 44k INFO ====> Epoch: 1631, cost 20.11 s 2023-12-18 08:31:49,557 44k INFO ====> Epoch: 1632, cost 20.37 s 2023-12-18 08:32:09,743 44k INFO ====> Epoch: 1633, cost 20.19 s 2023-12-18 08:32:28,905 44k INFO ====> Epoch: 1634, cost 19.16 s 2023-12-18 08:32:48,087 44k INFO ====> Epoch: 1635, cost 19.18 s 2023-12-18 08:33:08,235 44k INFO ====> Epoch: 1636, cost 20.15 s 2023-12-18 08:33:28,136 44k INFO ====> Epoch: 1637, cost 19.90 s 2023-12-18 08:33:48,905 44k INFO ====> Epoch: 1638, cost 20.77 s 2023-12-18 08:34:09,289 44k INFO ====> Epoch: 1639, cost 20.38 s 2023-12-18 08:34:21,479 44k INFO Train Epoch: 1640 [27%] 2023-12-18 08:34:21,479 44k INFO Losses: [2.016619920730591, 3.452375888824463, 10.415731430053711, 19.69516372680664, 0.4950009286403656], step: 24600, lr: 8.145350483298648e-05, reference_loss: 36.074893951416016 2023-12-18 08:34:30,211 44k INFO ====> Epoch: 1640, cost 20.92 s 2023-12-18 08:34:50,414 44k INFO ====> Epoch: 1641, cost 20.20 s 2023-12-18 08:35:11,027 44k INFO ====> Epoch: 1642, cost 20.61 s 2023-12-18 08:35:31,372 44k INFO ====> Epoch: 1643, cost 20.34 s 2023-12-18 08:35:51,158 44k INFO ====> Epoch: 1644, cost 19.79 s 2023-12-18 08:36:11,449 44k INFO ====> Epoch: 1645, cost 20.29 s 2023-12-18 08:36:32,191 44k INFO ====> Epoch: 1646, cost 20.74 s 2023-12-18 08:36:52,386 44k INFO ====> Epoch: 1647, cost 20.19 s 2023-12-18 08:37:12,640 44k INFO ====> Epoch: 1648, cost 20.25 s 2023-12-18 08:37:32,939 44k INFO ====> Epoch: 1649, cost 20.30 s 2023-12-18 08:37:53,207 44k INFO ====> Epoch: 1650, cost 20.27 s 2023-12-18 08:38:13,582 44k INFO ====> Epoch: 1651, cost 20.38 s 2023-12-18 08:38:33,990 44k INFO ====> Epoch: 1652, cost 20.41 s 2023-12-18 08:38:50,250 44k INFO Train Epoch: 1653 [60%] 2023-12-18 08:38:50,250 44k INFO Losses: [2.276561737060547, 2.975928544998169, 7.9189229011535645, 18.231260299682617, 0.8706393241882324], step: 24800, lr: 8.132124211360665e-05, reference_loss: 32.273311614990234 2023-12-18 08:38:57,109 44k INFO Saving model and optimizer state at iteration 1653 to ./logs\44k\G_24800.pth 2023-12-18 08:38:58,762 44k INFO Saving model and optimizer state at iteration 1653 to ./logs\44k\D_24800.pth 2023-12-18 08:39:05,165 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_16800.pth 2023-12-18 08:39:05,165 44k INFO .. 
Free up space by deleting ckpt ./logs\44k\D_16800.pth 2023-12-18 08:39:09,695 44k INFO ====> Epoch: 1653, cost 35.71 s 2023-12-18 08:39:30,805 44k INFO ====> Epoch: 1654, cost 21.11 s 2023-12-18 08:39:50,950 44k INFO ====> Epoch: 1655, cost 20.15 s 2023-12-18 08:40:11,781 44k INFO ====> Epoch: 1656, cost 20.83 s 2023-12-18 08:40:31,992 44k INFO ====> Epoch: 1657, cost 20.21 s 2023-12-18 08:40:52,094 44k INFO ====> Epoch: 1658, cost 20.10 s 2023-12-18 08:41:12,214 44k INFO ====> Epoch: 1659, cost 20.12 s 2023-12-18 08:41:32,110 44k INFO ====> Epoch: 1660, cost 19.90 s 2023-12-18 08:41:52,105 44k INFO ====> Epoch: 1661, cost 20.00 s 2023-12-18 08:42:11,652 44k INFO ====> Epoch: 1662, cost 19.55 s 2023-12-18 08:42:32,004 44k INFO ====> Epoch: 1663, cost 20.35 s 2023-12-18 08:42:51,821 44k INFO ====> Epoch: 1664, cost 19.82 s 2023-12-18 08:43:11,802 44k INFO ====> Epoch: 1665, cost 19.98 s 2023-12-18 08:43:31,114 44k INFO Train Epoch: 1666 [93%] 2023-12-18 08:43:31,114 44k INFO Losses: [2.0233466625213623, 3.0446527004241943, 11.105870246887207, 16.779006958007812, 0.054374564439058304], step: 25000, lr: 8.11891941600245e-05, reference_loss: 33.00725173950195 2023-12-18 08:43:32,404 44k INFO ====> Epoch: 1666, cost 20.60 s 2023-12-18 08:43:52,864 44k INFO ====> Epoch: 1667, cost 20.46 s 2023-12-18 08:44:12,834 44k INFO ====> Epoch: 1668, cost 19.97 s 2023-12-18 08:44:32,618 44k INFO ====> Epoch: 1669, cost 19.78 s 2023-12-18 08:44:52,334 44k INFO ====> Epoch: 1670, cost 19.72 s 2023-12-18 08:45:12,440 44k INFO ====> Epoch: 1671, cost 20.11 s 2023-12-18 08:45:32,521 44k INFO ====> Epoch: 1672, cost 20.08 s 2023-12-18 08:45:52,695 44k INFO ====> Epoch: 1673, cost 20.17 s 2023-12-18 08:46:13,118 44k INFO ====> Epoch: 1674, cost 20.42 s 2023-12-18 08:46:32,901 44k INFO ====> Epoch: 1675, cost 19.78 s 2023-12-18 08:46:53,086 44k INFO ====> Epoch: 1676, cost 20.18 s 2023-12-18 08:47:13,272 44k INFO ====> Epoch: 1677, cost 20.19 s 2023-12-18 08:47:33,381 44k INFO ====> Epoch: 1678, cost 20.11 s 2023-12-18 08:47:53,724 44k INFO ====> Epoch: 1679, cost 20.34 s 2023-12-18 08:48:06,015 44k INFO Train Epoch: 1680 [27%] 2023-12-18 08:48:06,015 44k INFO Losses: [2.0101265907287598, 2.779895782470703, 11.294553756713867, 20.434438705444336, 0.6567613482475281], step: 25200, lr: 8.104722845342925e-05, reference_loss: 37.17577362060547 2023-12-18 08:48:14,554 44k INFO ====> Epoch: 1680, cost 20.83 s 2023-12-18 08:48:34,819 44k INFO ====> Epoch: 1681, cost 20.26 s 2023-12-18 08:48:54,733 44k INFO ====> Epoch: 1682, cost 19.91 s 2023-12-18 08:49:14,978 44k INFO ====> Epoch: 1683, cost 20.24 s 2023-12-18 08:49:35,374 44k INFO ====> Epoch: 1684, cost 20.40 s 2023-12-18 08:49:55,753 44k INFO ====> Epoch: 1685, cost 20.38 s 2023-12-18 08:50:16,130 44k INFO ====> Epoch: 1686, cost 20.38 s 2023-12-18 08:50:36,684 44k INFO ====> Epoch: 1687, cost 20.55 s 2023-12-18 08:50:57,514 44k INFO ====> Epoch: 1688, cost 20.83 s 2023-12-18 08:51:17,655 44k INFO ====> Epoch: 1689, cost 20.14 s 2023-12-18 08:51:38,610 44k INFO ====> Epoch: 1690, cost 20.96 s 2023-12-18 08:51:58,856 44k INFO ====> Epoch: 1691, cost 20.25 s 2023-12-18 08:52:18,426 44k INFO ====> Epoch: 1692, cost 19.57 s 2023-12-18 08:52:34,115 44k INFO Train Epoch: 1693 [60%] 2023-12-18 08:52:34,115 44k INFO Losses: [1.9679383039474487, 3.110562562942505, 10.99479866027832, 23.04935073852539, 0.5363203883171082], step: 25400, lr: 8.091562543824374e-05, reference_loss: 39.65896987915039 2023-12-18 08:52:38,895 44k INFO ====> Epoch: 1693, cost 20.47 s 2023-12-18 
08:52:58,574 44k INFO ====> Epoch: 1694, cost 19.68 s 2023-12-18 08:53:18,096 44k INFO ====> Epoch: 1695, cost 19.52 s 2023-12-18 08:53:37,475 44k INFO ====> Epoch: 1696, cost 19.38 s 2023-12-18 08:53:57,085 44k INFO ====> Epoch: 1697, cost 19.61 s 2023-12-18 08:54:16,681 44k INFO ====> Epoch: 1698, cost 19.60 s 2023-12-18 08:54:36,271 44k INFO ====> Epoch: 1699, cost 19.59 s 2023-12-18 08:54:55,673 44k INFO ====> Epoch: 1700, cost 19.40 s 2023-12-18 08:55:15,733 44k INFO ====> Epoch: 1701, cost 20.06 s 2023-12-18 08:55:35,602 44k INFO ====> Epoch: 1702, cost 19.87 s 2023-12-18 08:55:55,543 44k INFO ====> Epoch: 1703, cost 19.94 s 2023-12-18 08:56:15,493 44k INFO ====> Epoch: 1704, cost 19.95 s 2023-12-18 08:56:33,763 44k INFO ====> Epoch: 1705, cost 18.27 s 2023-12-18 08:56:49,634 44k INFO Train Epoch: 1706 [93%] 2023-12-18 08:56:49,644 44k INFO Losses: [1.1918867826461792, 3.8542721271514893, 16.706298828125, 22.00444793701172, 0.22113649547100067], step: 25600, lr: 8.078423611764021e-05, reference_loss: 43.97804260253906 2023-12-18 08:56:54,955 44k INFO Saving model and optimizer state at iteration 1706 to ./logs\44k\G_25600.pth 2023-12-18 08:56:56,212 44k INFO Saving model and optimizer state at iteration 1706 to ./logs\44k\D_25600.pth 2023-12-18 08:56:59,457 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_17600.pth 2023-12-18 08:56:59,457 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_17600.pth 2023-12-18 08:57:00,024 44k INFO ====> Epoch: 1706, cost 26.26 s 2023-12-18 08:57:18,198 44k INFO ====> Epoch: 1707, cost 18.17 s 2023-12-18 08:57:35,104 44k INFO ====> Epoch: 1708, cost 16.91 s 2023-12-18 08:57:51,853 44k INFO ====> Epoch: 1709, cost 16.75 s 2023-12-18 08:58:08,032 44k INFO ====> Epoch: 1710, cost 16.18 s 2023-12-18 08:58:24,192 44k INFO ====> Epoch: 1711, cost 16.16 s 2023-12-18 08:58:40,467 44k INFO ====> Epoch: 1712, cost 16.27 s 2023-12-18 08:58:56,750 44k INFO ====> Epoch: 1713, cost 16.28 s 2023-12-18 08:59:13,015 44k INFO ====> Epoch: 1714, cost 16.26 s 2023-12-18 08:59:29,329 44k INFO ====> Epoch: 1715, cost 16.31 s 2023-12-18 08:59:45,883 44k INFO ====> Epoch: 1716, cost 16.55 s 2023-12-18 09:00:02,132 44k INFO ====> Epoch: 1717, cost 16.25 s 2023-12-18 09:00:18,375 44k INFO ====> Epoch: 1718, cost 16.24 s 2023-12-18 09:00:34,742 44k INFO ====> Epoch: 1719, cost 16.37 s 2023-12-18 09:00:44,551 44k INFO Train Epoch: 1720 [27%] 2023-12-18 09:00:44,551 44k INFO Losses: [1.9788492918014526, 2.8552985191345215, 12.47994327545166, 20.395248413085938, 0.7447962164878845], step: 25800, lr: 8.064297851210724e-05, reference_loss: 38.45413589477539 2023-12-18 09:00:51,677 44k INFO ====> Epoch: 1720, cost 16.94 s 2023-12-18 09:01:08,262 44k INFO ====> Epoch: 1721, cost 16.58 s 2023-12-18 09:01:24,474 44k INFO ====> Epoch: 1722, cost 16.21 s 2023-12-18 09:01:40,986 44k INFO ====> Epoch: 1723, cost 16.51 s 2023-12-18 09:01:57,414 44k INFO ====> Epoch: 1724, cost 16.43 s 2023-12-18 09:02:13,572 44k INFO ====> Epoch: 1725, cost 16.16 s 2023-12-18 09:02:29,784 44k INFO ====> Epoch: 1726, cost 16.21 s 2023-12-18 09:02:46,045 44k INFO ====> Epoch: 1727, cost 16.26 s 2023-12-18 09:03:02,170 44k INFO ====> Epoch: 1728, cost 16.12 s 2023-12-18 09:03:18,432 44k INFO ====> Epoch: 1729, cost 16.26 s 2023-12-18 09:03:34,631 44k INFO ====> Epoch: 1730, cost 16.20 s 2023-12-18 09:03:50,900 44k INFO ====> Epoch: 1731, cost 16.27 s 2023-12-18 09:04:07,065 44k INFO ====> Epoch: 1732, cost 16.16 s 2023-12-18 09:04:19,833 44k INFO Train Epoch: 1733 [60%] 2023-12-18 09:04:19,833 44k 
INFO Losses: [1.8051795959472656, 3.1406521797180176, 12.487703323364258, 23.789897918701172, 1.0320721864700317], step: 26000, lr: 8.051203191062253e-05, reference_loss: 42.25550842285156 2023-12-18 09:04:23,852 44k INFO ====> Epoch: 1733, cost 16.79 s 2023-12-18 09:04:40,081 44k INFO ====> Epoch: 1734, cost 16.23 s 2023-12-18 09:04:56,235 44k INFO ====> Epoch: 1735, cost 16.15 s 2023-12-18 09:05:12,515 44k INFO ====> Epoch: 1736, cost 16.28 s 2023-12-18 09:05:28,706 44k INFO ====> Epoch: 1737, cost 16.19 s 2023-12-18 09:05:44,862 44k INFO ====> Epoch: 1738, cost 16.16 s 2023-12-18 09:06:01,089 44k INFO ====> Epoch: 1739, cost 16.23 s 2023-12-18 09:06:17,410 44k INFO ====> Epoch: 1740, cost 16.32 s 2023-12-18 09:06:33,582 44k INFO ====> Epoch: 1741, cost 16.17 s 2023-12-18 09:06:49,825 44k INFO ====> Epoch: 1742, cost 16.24 s 2023-12-18 09:07:06,193 44k INFO ====> Epoch: 1743, cost 16.37 s 2023-12-18 09:07:22,460 44k INFO ====> Epoch: 1744, cost 16.27 s 2023-12-18 09:07:38,679 44k INFO ====> Epoch: 1745, cost 16.22 s 2023-12-18 09:07:54,350 44k INFO Train Epoch: 1746 [93%] 2023-12-18 09:07:54,350 44k INFO Losses: [2.04575514793396, 2.7847588062286377, 12.424880981445312, 21.24745750427246, 0.8558704853057861], step: 26200, lr: 8.038129793784715e-05, reference_loss: 39.35872268676758 2023-12-18 09:07:55,382 44k INFO ====> Epoch: 1746, cost 16.70 s 2023-12-18 09:08:11,718 44k INFO ====> Epoch: 1747, cost 16.34 s 2023-12-18 09:08:27,804 44k INFO ====> Epoch: 1748, cost 16.09 s 2023-12-18 09:08:44,122 44k INFO ====> Epoch: 1749, cost 16.32 s 2023-12-18 09:09:00,400 44k INFO ====> Epoch: 1750, cost 16.28 s 2023-12-18 09:09:16,681 44k INFO ====> Epoch: 1751, cost 16.28 s 2023-12-18 09:09:32,848 44k INFO ====> Epoch: 1752, cost 16.17 s 2023-12-18 09:09:49,156 44k INFO ====> Epoch: 1753, cost 16.31 s 2023-12-18 09:10:05,593 44k INFO ====> Epoch: 1754, cost 16.44 s 2023-12-18 09:10:21,852 44k INFO ====> Epoch: 1755, cost 16.26 s 2023-12-18 09:10:38,028 44k INFO ====> Epoch: 1756, cost 16.18 s 2023-12-18 09:10:54,273 44k INFO ====> Epoch: 1757, cost 16.25 s 2023-12-18 09:11:10,410 44k INFO ====> Epoch: 1758, cost 16.14 s 2023-12-18 09:11:26,613 44k INFO ====> Epoch: 1759, cost 16.20 s 2023-12-18 09:11:36,358 44k INFO Train Epoch: 1760 [27%] 2023-12-18 09:11:36,358 44k INFO Losses: [2.2471253871917725, 2.9541819095611572, 8.271241188049316, 18.114139556884766, 0.5552789568901062], step: 26400, lr: 8.024074490148745e-05, reference_loss: 32.1419677734375 2023-12-18 09:11:41,614 44k INFO Saving model and optimizer state at iteration 1760 to ./logs\44k\G_26400.pth 2023-12-18 09:11:43,045 44k INFO Saving model and optimizer state at iteration 1760 to ./logs\44k\D_26400.pth 2023-12-18 09:11:46,065 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_18400.pth 2023-12-18 09:11:46,065 44k INFO .. 
Free up space by deleting ckpt ./logs\44k\D_18400.pth 2023-12-18 09:11:53,456 44k INFO ====> Epoch: 1760, cost 26.84 s 2023-12-18 09:12:10,617 44k INFO ====> Epoch: 1761, cost 17.16 s 2023-12-18 09:12:26,838 44k INFO ====> Epoch: 1762, cost 16.22 s 2023-12-18 09:12:43,351 44k INFO ====> Epoch: 1763, cost 16.51 s 2023-12-18 09:12:59,853 44k INFO ====> Epoch: 1764, cost 16.50 s 2023-12-18 09:13:16,079 44k INFO ====> Epoch: 1765, cost 16.23 s 2023-12-18 09:13:32,560 44k INFO ====> Epoch: 1766, cost 16.48 s 2023-12-18 09:13:48,731 44k INFO ====> Epoch: 1767, cost 16.17 s 2023-12-18 09:14:04,998 44k INFO ====> Epoch: 1768, cost 16.27 s 2023-12-18 09:14:21,313 44k INFO ====> Epoch: 1769, cost 16.32 s 2023-12-18 09:14:37,563 44k INFO ====> Epoch: 1770, cost 16.25 s 2023-12-18 09:14:54,008 44k INFO ====> Epoch: 1771, cost 16.45 s 2023-12-18 09:15:10,223 44k INFO ====> Epoch: 1772, cost 16.21 s 2023-12-18 09:15:23,045 44k INFO Train Epoch: 1773 [60%] 2023-12-18 09:15:23,045 44k INFO Losses: [2.2851855754852295, 2.555488109588623, 9.116683006286621, 21.532739639282227, 1.0040562152862549], step: 26600, lr: 8.011045143962237e-05, reference_loss: 36.4941520690918 2023-12-18 09:15:27,070 44k INFO ====> Epoch: 1773, cost 16.85 s 2023-12-18 09:15:43,528 44k INFO ====> Epoch: 1774, cost 16.46 s 2023-12-18 09:15:59,743 44k INFO ====> Epoch: 1775, cost 16.22 s 2023-12-18 09:16:16,141 44k INFO ====> Epoch: 1776, cost 16.40 s 2023-12-18 09:16:32,442 44k INFO ====> Epoch: 1777, cost 16.30 s 2023-12-18 09:16:48,820 44k INFO ====> Epoch: 1778, cost 16.38 s 2023-12-18 09:17:05,164 44k INFO ====> Epoch: 1779, cost 16.34 s 2023-12-18 09:17:21,363 44k INFO ====> Epoch: 1780, cost 16.20 s 2023-12-18 09:17:37,628 44k INFO ====> Epoch: 1781, cost 16.26 s 2023-12-18 09:17:53,924 44k INFO ====> Epoch: 1782, cost 16.30 s 2023-12-18 09:18:10,298 44k INFO ====> Epoch: 1783, cost 16.37 s 2023-12-18 09:18:26,491 44k INFO ====> Epoch: 1784, cost 16.19 s 2023-12-18 09:18:42,821 44k INFO ====> Epoch: 1785, cost 16.33 s 2023-12-18 09:18:58,579 44k INFO Train Epoch: 1786 [93%] 2023-12-18 09:18:58,579 44k INFO Losses: [2.145582914352417, 2.488348960876465, 8.986124038696289, 17.15497398376465, 0.010354717262089252], step: 26800, lr: 7.998036954591042e-05, reference_loss: 30.785385131835938 2023-12-18 09:18:59,652 44k INFO ====> Epoch: 1786, cost 16.83 s 2023-12-18 09:19:16,256 44k INFO ====> Epoch: 1787, cost 16.60 s 2023-12-18 09:19:32,459 44k INFO ====> Epoch: 1788, cost 16.20 s 2023-12-18 09:19:48,760 44k INFO ====> Epoch: 1789, cost 16.30 s 2023-12-18 09:20:05,093 44k INFO ====> Epoch: 1790, cost 16.33 s 2023-12-18 09:20:21,300 44k INFO ====> Epoch: 1791, cost 16.21 s 2023-12-18 09:20:37,398 44k INFO ====> Epoch: 1792, cost 16.10 s 2023-12-18 09:20:53,560 44k INFO ====> Epoch: 1793, cost 16.16 s 2023-12-18 09:21:09,804 44k INFO ====> Epoch: 1794, cost 16.24 s 2023-12-18 09:21:26,036 44k INFO ====> Epoch: 1795, cost 16.23 s 2023-12-18 09:21:42,202 44k INFO ====> Epoch: 1796, cost 16.17 s 2023-12-18 09:21:58,542 44k INFO ====> Epoch: 1797, cost 16.34 s 2023-12-18 09:22:14,856 44k INFO ====> Epoch: 1798, cost 16.31 s 2023-12-18 09:22:31,276 44k INFO ====> Epoch: 1799, cost 16.42 s 2023-12-18 09:22:40,894 44k INFO Train Epoch: 1800 [27%] 2023-12-18 09:22:40,894 44k INFO Losses: [2.0387461185455322, 3.2199182510375977, 10.608101844787598, 19.860836029052734, 0.5945751070976257], step: 27000, lr: 7.984051756445148e-05, reference_loss: 36.322174072265625 2023-12-18 09:22:48,097 44k INFO ====> Epoch: 1800, cost 16.82 s 2023-12-18 
09:23:04,352 44k INFO ====> Epoch: 1801, cost 16.25 s 2023-12-18 09:23:20,560 44k INFO ====> Epoch: 1802, cost 16.21 s 2023-12-18 09:23:36,795 44k INFO ====> Epoch: 1803, cost 16.24 s 2023-12-18 09:23:53,027 44k INFO ====> Epoch: 1804, cost 16.23 s 2023-12-18 09:24:09,455 44k INFO ====> Epoch: 1805, cost 16.43 s 2023-12-18 09:24:25,801 44k INFO ====> Epoch: 1806, cost 16.35 s 2023-12-18 09:24:42,129 44k INFO ====> Epoch: 1807, cost 16.33 s 2023-12-18 09:24:58,405 44k INFO ====> Epoch: 1808, cost 16.28 s 2023-12-18 09:25:14,619 44k INFO ====> Epoch: 1809, cost 16.21 s 2023-12-18 09:25:30,847 44k INFO ====> Epoch: 1810, cost 16.23 s 2023-12-18 09:25:47,071 44k INFO ====> Epoch: 1811, cost 16.22 s 2023-12-18 09:26:03,323 44k INFO ====> Epoch: 1812, cost 16.25 s 2023-12-18 09:26:16,037 44k INFO Train Epoch: 1813 [60%] 2023-12-18 09:26:16,037 44k INFO Losses: [2.354008436203003, 2.5334932804107666, 12.16633415222168, 21.32012367248535, 0.8521097302436829], step: 27200, lr: 7.971087398445551e-05, reference_loss: 39.226070404052734 2023-12-18 09:26:21,173 44k INFO Saving model and optimizer state at iteration 1813 to ./logs\44k\G_27200.pth 2023-12-18 09:26:22,463 44k INFO Saving model and optimizer state at iteration 1813 to ./logs\44k\D_27200.pth 2023-12-18 09:26:25,803 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_19200.pth 2023-12-18 09:26:25,803 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_19200.pth 2023-12-18 09:26:29,925 44k INFO ====> Epoch: 1813, cost 26.60 s 2023-12-18 09:26:46,871 44k INFO ====> Epoch: 1814, cost 16.95 s 2023-12-18 09:27:03,234 44k INFO ====> Epoch: 1815, cost 16.36 s 2023-12-18 09:27:19,616 44k INFO ====> Epoch: 1816, cost 16.38 s 2023-12-18 09:27:36,081 44k INFO ====> Epoch: 1817, cost 16.46 s 2023-12-18 09:27:52,324 44k INFO ====> Epoch: 1818, cost 16.24 s 2023-12-18 09:28:08,494 44k INFO ====> Epoch: 1819, cost 16.17 s 2023-12-18 09:28:24,693 44k INFO ====> Epoch: 1820, cost 16.20 s 2023-12-18 09:28:41,064 44k INFO ====> Epoch: 1821, cost 16.37 s 2023-12-18 09:28:57,292 44k INFO ====> Epoch: 1822, cost 16.23 s 2023-12-18 09:29:13,668 44k INFO ====> Epoch: 1823, cost 16.38 s 2023-12-18 09:29:29,748 44k INFO ====> Epoch: 1824, cost 16.08 s 2023-12-18 09:29:46,169 44k INFO ====> Epoch: 1825, cost 16.42 s 2023-12-18 09:30:01,875 44k INFO Train Epoch: 1826 [93%] 2023-12-18 09:30:01,875 44k INFO Losses: [1.7687098979949951, 3.0839309692382812, 9.799501419067383, 17.285364151000977, 0.27982649207115173], step: 27400, lr: 7.958144091734628e-05, reference_loss: 32.21733474731445 2023-12-18 09:30:02,904 44k INFO ====> Epoch: 1826, cost 16.73 s 2023-12-18 09:30:19,215 44k INFO ====> Epoch: 1827, cost 16.31 s 2023-12-18 09:30:35,460 44k INFO ====> Epoch: 1828, cost 16.24 s 2023-12-18 09:30:51,660 44k INFO ====> Epoch: 1829, cost 16.20 s 2023-12-18 09:31:07,822 44k INFO ====> Epoch: 1830, cost 16.16 s 2023-12-18 09:31:24,191 44k INFO ====> Epoch: 1831, cost 16.37 s 2023-12-18 09:31:40,425 44k INFO ====> Epoch: 1832, cost 16.23 s 2023-12-18 09:31:56,706 44k INFO ====> Epoch: 1833, cost 16.28 s 2023-12-18 09:32:12,870 44k INFO ====> Epoch: 1834, cost 16.16 s 2023-12-18 09:32:29,148 44k INFO ====> Epoch: 1835, cost 16.28 s 2023-12-18 09:32:45,437 44k INFO ====> Epoch: 1836, cost 16.29 s 2023-12-18 09:33:01,792 44k INFO ====> Epoch: 1837, cost 16.36 s 2023-12-18 09:33:18,051 44k INFO ====> Epoch: 1838, cost 16.26 s 2023-12-18 09:33:34,176 44k INFO ====> Epoch: 1839, cost 16.13 s 2023-12-18 09:33:43,759 44k INFO Train Epoch: 1840 [27%] 2023-12-18 09:33:43,769 44k 
INFO Losses: [2.2256288528442383, 3.1922876834869385, 13.984726905822754, 23.24559211730957, 0.48959702253341675], step: 27600, lr: 7.94422864940442e-05, reference_loss: 43.13783264160156
2023-12-18 09:33:50,928 44k INFO ====> Epoch: 1840, cost 16.75 s
2023-12-18 09:34:07,232 44k INFO ====> Epoch: 1841, cost 16.30 s
2023-12-18 09:34:23,544 44k INFO ====> Epoch: 1842, cost 16.31 s
2023-12-18 09:34:39,772 44k INFO ====> Epoch: 1843, cost 16.23 s
2023-12-18 09:34:56,072 44k INFO ====> Epoch: 1844, cost 16.30 s
2023-12-18 09:35:12,360 44k INFO ====> Epoch: 1845, cost 16.29 s
2023-12-18 09:35:28,545 44k INFO ====> Epoch: 1846, cost 16.18 s
2023-12-18 09:35:44,643 44k INFO ====> Epoch: 1847, cost 16.10 s
2023-12-18 09:36:00,930 44k INFO ====> Epoch: 1848, cost 16.29 s
2023-12-18 09:36:17,075 44k INFO ====> Epoch: 1849, cost 16.15 s
2023-12-18 09:36:33,364 44k INFO ====> Epoch: 1850, cost 16.29 s
2023-12-18 09:36:49,593 44k INFO ====> Epoch: 1851, cost 16.23 s
2023-12-18 09:37:05,871 44k INFO ====> Epoch: 1852, cost 16.28 s
2023-12-18 09:37:18,473 44k INFO Train Epoch: 1853 [60%]
2023-12-18 09:37:18,473 44k INFO Losses: [1.747084379196167, 3.2456276416778564, 11.77369499206543, 22.6661434173584, 0.6637884974479675], step: 27800, lr: 7.93132895544159e-05, reference_loss: 40.096336364746094
2023-12-18 09:37:22,490 44k INFO ====> Epoch: 1853, cost 16.62 s
2023-12-18 09:37:38,641 44k INFO ====> Epoch: 1854, cost 16.15 s
2023-12-18 09:37:54,969 44k INFO ====> Epoch: 1855, cost 16.33 s
2023-12-18 09:38:11,162 44k INFO ====> Epoch: 1856, cost 16.19 s
2023-12-18 09:38:27,371 44k INFO ====> Epoch: 1857, cost 16.21 s
2023-12-18 09:38:43,390 44k INFO ====> Epoch: 1858, cost 16.02 s
2023-12-18 09:38:59,598 44k INFO ====> Epoch: 1859, cost 16.21 s
2023-12-18 09:39:15,936 44k INFO ====> Epoch: 1860, cost 16.34 s
2023-12-18 09:39:32,149 44k INFO ====> Epoch: 1861, cost 16.21 s
2023-12-18 09:39:48,318 44k INFO ====> Epoch: 1862, cost 16.17 s
2023-12-18 09:40:04,709 44k INFO ====> Epoch: 1863, cost 16.39 s
2023-12-18 09:40:21,076 44k INFO ====> Epoch: 1864, cost 16.37 s
2023-12-18 09:40:37,277 44k INFO ====> Epoch: 1865, cost 16.20 s
2023-12-18 09:40:52,972 44k INFO Train Epoch: 1866 [93%]
2023-12-18 09:40:52,972 44k INFO Losses: [1.9220346212387085, 2.894807815551758, 14.145669937133789, 16.25343132019043, 0.09651528298854828], step: 28000, lr: 7.918450207767153e-05, reference_loss: 35.31245803833008
2023-12-18 09:40:58,280 44k INFO Saving model and optimizer state at iteration 1866 to ./logs\44k\G_28000.pth
2023-12-18 09:40:59,518 44k INFO Saving model and optimizer state at iteration 1866 to ./logs\44k\D_28000.pth
2023-12-18 09:41:11,538 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_20000.pth
2023-12-18 09:41:11,658 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_20000.pth
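[Editor's note] The save/rotate pattern just above repeats at every later save in this log: G_/D_ checkpoint pairs are written every 800 steps, and each save is immediately followed by deleting the pair from 8000 steps earlier (here G_28000/D_28000 displace G_20000/D_20000), i.e. a rolling window of the ten most recent step-numbered pairs is kept on disk. A minimal sketch of that rotation, assuming a flat checkpoint directory; the function name, signature, and defaults below are illustrative, not the project's actual helper:

```python
import os
import re

def clean_checkpoints(ckpt_dir="./logs/44k", keep=10):
    """Keep only the newest `keep` step-numbered G_/D_ checkpoint pairs.

    Mirrors the pattern visible in the log: with saves every 800 steps
    and keep=10, each new save evicts the pair from 8000 steps earlier
    (G_28000 arrives -> G_20000/D_20000 are removed).
    """
    pat = re.compile(r"^([GD])_(\d+)\.pth$")
    steps = sorted({int(m.group(2)) for f in os.listdir(ckpt_dir)
                    if (m := pat.match(f))})
    for step in steps[:-keep]:  # everything older than the newest `keep` steps
        for prefix in ("G", "D"):
            path = os.path.join(ckpt_dir, f"{prefix}_{step}.pth")
            if os.path.exists(path):
                print(f".. Free up space by deleting ckpt {path}")
                os.remove(path)
```

Selecting victims by sorted step number rather than file modification time keeps the retained window stable even if an old checkpoint is touched or re-copied.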
2023-12-18 09:41:12,203 44k INFO ====> Epoch: 1866, cost 34.93 s
2023-12-18 09:41:28,988 44k INFO ====> Epoch: 1867, cost 16.78 s
2023-12-18 09:41:45,266 44k INFO ====> Epoch: 1868, cost 16.28 s
2023-12-18 09:42:01,434 44k INFO ====> Epoch: 1869, cost 16.17 s
2023-12-18 09:42:17,850 44k INFO ====> Epoch: 1870, cost 16.42 s
2023-12-18 09:42:33,994 44k INFO ====> Epoch: 1871, cost 16.14 s
2023-12-18 09:42:50,037 44k INFO ====> Epoch: 1872, cost 16.04 s
2023-12-18 09:43:06,367 44k INFO ====> Epoch: 1873, cost 16.33 s
2023-12-18 09:43:22,620 44k INFO ====> Epoch: 1874, cost 16.25 s
2023-12-18 09:43:38,830 44k INFO ====> Epoch: 1875, cost 16.21 s
2023-12-18 09:43:54,898 44k INFO ====> Epoch: 1876, cost 16.07 s
2023-12-18 09:44:11,081 44k INFO ====> Epoch: 1877, cost 16.18 s
2023-12-18 09:44:27,247 44k INFO ====> Epoch: 1878, cost 16.17 s
2023-12-18 09:44:43,371 44k INFO ====> Epoch: 1879, cost 16.12 s
2023-12-18 09:44:52,864 44k INFO Train Epoch: 1880 [27%]
2023-12-18 09:44:52,864 44k INFO Losses: [2.123248338699341, 3.180938243865967, 11.131062507629395, 18.673999786376953, 0.7869545817375183], step: 28200, lr: 7.904604173322357e-05, reference_loss: 35.896202087402344
2023-12-18 09:45:00,027 44k INFO ====> Epoch: 1880, cost 16.66 s
2023-12-18 09:45:16,221 44k INFO ====> Epoch: 1881, cost 16.19 s
2023-12-18 09:45:32,558 44k INFO ====> Epoch: 1882, cost 16.34 s
2023-12-18 09:45:48,771 44k INFO ====> Epoch: 1883, cost 16.21 s
2023-12-18 09:46:05,110 44k INFO ====> Epoch: 1884, cost 16.34 s
2023-12-18 09:46:21,498 44k INFO ====> Epoch: 1885, cost 16.39 s
2023-12-18 09:46:37,932 44k INFO ====> Epoch: 1886, cost 16.43 s
2023-12-18 09:46:54,131 44k INFO ====> Epoch: 1887, cost 16.20 s
2023-12-18 09:47:10,495 44k INFO ====> Epoch: 1888, cost 16.36 s
2023-12-18 09:47:26,723 44k INFO ====> Epoch: 1889, cost 16.23 s
2023-12-18 09:47:42,902 44k INFO ====> Epoch: 1890, cost 16.18 s
2023-12-18 09:47:59,234 44k INFO ====> Epoch: 1891, cost 16.33 s
2023-12-18 09:48:15,472 44k INFO ====> Epoch: 1892, cost 16.24 s
2023-12-18 09:48:28,150 44k INFO Train Epoch: 1893 [60%]
2023-12-18 09:48:28,150 44k INFO Losses: [2.1985280513763428, 2.7213735580444336, 8.449751853942871, 18.924358367919922, 0.7978450655937195], step: 28400, lr: 7.891768820862956e-05, reference_loss: 33.09185791015625
2023-12-18 09:48:32,251 44k INFO ====> Epoch: 1893, cost 16.78 s
2023-12-18 09:48:48,558 44k INFO ====> Epoch: 1894, cost 16.31 s
2023-12-18 09:49:04,963 44k INFO ====> Epoch: 1895, cost 16.40 s
2023-12-18 09:49:21,217 44k INFO ====> Epoch: 1896, cost 16.25 s
2023-12-18 09:49:37,378 44k INFO ====> Epoch: 1897, cost 16.16 s
2023-12-18 09:49:53,597 44k INFO ====> Epoch: 1898, cost 16.22 s
2023-12-18 09:50:09,803 44k INFO ====> Epoch: 1899, cost 16.21 s
2023-12-18 09:50:26,115 44k INFO ====> Epoch: 1900, cost 16.31 s
2023-12-18 09:50:42,218 44k INFO ====> Epoch: 1901, cost 16.10 s
2023-12-18 09:50:58,452 44k INFO ====> Epoch: 1902, cost 16.23 s
2023-12-18 09:51:14,793 44k INFO ====> Epoch: 1903, cost 16.34 s
2023-12-18 09:51:30,960 44k INFO ====> Epoch: 1904, cost 16.17 s
2023-12-18 09:51:47,206 44k INFO ====> Epoch: 1905, cost 16.25 s
2023-12-18 09:52:03,018 44k INFO Train Epoch: 1906 [93%]
2023-12-18 09:52:03,018 44k INFO Losses: [1.5832041501998901, 3.6108055114746094, 14.235448837280273, 19.98975944519043, 0.12866343557834625], step: 28600, lr: 7.878954310215385e-05, reference_loss: 39.547882080078125
2023-12-18 09:52:04,030 44k INFO ====> Epoch: 1906, cost 16.82 s
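[Editor's note] Two arithmetic invariants make the Losses lines above easier to read: reference_loss is simply the sum of the five bracketed components, and lr shrinks by a fixed factor once per epoch. A self-contained check, with values copied from the reports above; the 0.999875 decay factor is inferred from the lr ratio, not stated anywhere in this excerpt:

```python
# reference_loss is the plain sum of the five logged components (epoch 1880).
losses_1880 = [2.123248338699341, 3.180938243865967, 11.131062507629395,
               18.673999786376953, 0.7869545817375183]
assert abs(sum(losses_1880) - 35.896202087402344) < 1e-4

# lr decays exponentially per epoch: epochs 1866 -> 1880 are 14 decays apart.
lr_1866, lr_1880 = 7.918450207767153e-05, 7.904604173322357e-05
per_epoch = (lr_1880 / lr_1866) ** (1 / 14)
assert abs(per_epoch - 0.999875) < 1e-6
```

Applying the same two lines of arithmetic to any later pair of reports (e.g. steps 28600 and 28800) recovers the same factor, which is what an ExponentialLR-style per-epoch scheduler would produce.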
2023-12-18 09:52:20,367 44k INFO ====> Epoch: 1907, cost 16.34 s
2023-12-18 09:52:36,708 44k INFO ====> Epoch: 1908, cost 16.34 s
2023-12-18 09:52:52,912 44k INFO ====> Epoch: 1909, cost 16.20 s
2023-12-18 09:53:09,275 44k INFO ====> Epoch: 1910, cost 16.36 s
2023-12-18 09:53:25,685 44k INFO ====> Epoch: 1911, cost 16.41 s
2023-12-18 09:53:42,015 44k INFO ====> Epoch: 1912, cost 16.33 s
2023-12-18 09:53:58,119 44k INFO ====> Epoch: 1913, cost 16.10 s
2023-12-18 09:54:14,442 44k INFO ====> Epoch: 1914, cost 16.32 s
2023-12-18 09:54:30,558 44k INFO ====> Epoch: 1915, cost 16.12 s
2023-12-18 09:54:46,807 44k INFO ====> Epoch: 1916, cost 16.25 s
2023-12-18 09:55:03,216 44k INFO ====> Epoch: 1917, cost 16.41 s
2023-12-18 09:55:19,608 44k INFO ====> Epoch: 1918, cost 16.39 s
2023-12-18 09:55:35,968 44k INFO ====> Epoch: 1919, cost 16.36 s
2023-12-18 09:55:45,461 44k INFO Train Epoch: 1920 [27%]
2023-12-18 09:55:45,461 44k INFO Losses: [1.6066831350326538, 3.2297685146331787, 11.122459411621094, 22.862207412719727, 0.9498076438903809], step: 28800, lr: 7.865177337461142e-05, reference_loss: 39.77092361450195
2023-12-18 09:55:50,800 44k INFO Saving model and optimizer state at iteration 1920 to ./logs\44k\G_28800.pth
2023-12-18 09:55:52,286 44k INFO Saving model and optimizer state at iteration 1920 to ./logs\44k\D_28800.pth
2023-12-18 09:56:04,405 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_20800.pth
2023-12-18 09:56:04,454 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_20800.pth
2023-12-18 09:56:11,498 44k INFO ====> Epoch: 1920, cost 35.53 s
2023-12-18 09:56:28,266 44k INFO ====> Epoch: 1921, cost 16.77 s
2023-12-18 09:56:44,587 44k INFO ====> Epoch: 1922, cost 16.32 s
2023-12-18 09:57:01,024 44k INFO ====> Epoch: 1923, cost 16.44 s
2023-12-18 09:57:17,472 44k INFO ====> Epoch: 1924, cost 16.45 s
2023-12-18 09:57:33,752 44k INFO ====> Epoch: 1925, cost 16.28 s
2023-12-18 09:57:50,083 44k INFO ====> Epoch: 1926, cost 16.33 s
2023-12-18 09:58:06,461 44k INFO ====> Epoch: 1927, cost 16.38 s
2023-12-18 09:58:22,987 44k INFO ====> Epoch: 1928, cost 16.53 s
2023-12-18 09:58:39,226 44k INFO ====> Epoch: 1929, cost 16.24 s
2023-12-18 09:58:55,522 44k INFO ====> Epoch: 1930, cost 16.30 s
2023-12-18 09:59:11,690 44k INFO ====> Epoch: 1931, cost 16.17 s
2023-12-18 09:59:28,028 44k INFO ====> Epoch: 1932, cost 16.34 s
2023-12-18 09:59:40,757 44k INFO Train Epoch: 1933 [60%]
2023-12-18 09:59:40,757 44k INFO Losses: [1.992563009262085, 2.9708750247955322, 9.389686584472656, 18.6323184967041, 0.37348514795303345], step: 29000, lr: 7.852406005580576e-05, reference_loss: 33.35892868041992
2023-12-18 09:59:44,790 44k INFO ====> Epoch: 1933, cost 16.76 s
2023-12-18 10:00:01,170 44k INFO ====> Epoch: 1934, cost 16.38 s
2023-12-18 10:00:17,437 44k INFO ====> Epoch: 1935, cost 16.27 s
2023-12-18 10:00:33,693 44k INFO ====> Epoch: 1936, cost 16.26 s
2023-12-18 10:00:49,891 44k INFO ====> Epoch: 1937, cost 16.20 s
2023-12-18 10:01:06,178 44k INFO ====> Epoch: 1938, cost 16.29 s
2023-12-18 10:01:22,581 44k INFO ====> Epoch: 1939, cost 16.40 s
2023-12-18 10:01:38,847 44k INFO ====> Epoch: 1940, cost 16.27 s
2023-12-18 10:01:55,178 44k INFO ====> Epoch: 1941, cost 16.33 s
2023-12-18 10:02:11,391 44k INFO ====> Epoch: 1942, cost 16.21 s
2023-12-18 10:02:27,637 44k INFO ====> Epoch: 1943, cost 16.25 s
2023-12-18 10:02:44,217 44k INFO ====> Epoch: 1944, cost 16.58 s
2023-12-18 10:03:00,401 44k INFO ====> Epoch: 1945, cost 16.18 s
2023-12-18 10:03:16,122 44k INFO Train Epoch: 1946 [93%]
2023-12-18 10:03:16,122 44k
INFO Losses: [2.1467533111572266, 2.6842918395996094, 11.939395904541016, 18.921878814697266, 0.5926773548126221], step: 29200, lr: 7.839655411556386e-05, reference_loss: 36.28499984741211 2023-12-18 10:03:17,211 44k INFO ====> Epoch: 1946, cost 16.81 s 2023-12-18 10:03:33,385 44k INFO ====> Epoch: 1947, cost 16.17 s 2023-12-18 10:03:49,489 44k INFO ====> Epoch: 1948, cost 16.10 s 2023-12-18 10:04:05,797 44k INFO ====> Epoch: 1949, cost 16.31 s 2023-12-18 10:04:22,115 44k INFO ====> Epoch: 1950, cost 16.32 s 2023-12-18 10:04:38,288 44k INFO ====> Epoch: 1951, cost 16.17 s 2023-12-18 10:04:54,520 44k INFO ====> Epoch: 1952, cost 16.23 s 2023-12-18 10:05:10,773 44k INFO ====> Epoch: 1953, cost 16.25 s 2023-12-18 10:05:26,933 44k INFO ====> Epoch: 1954, cost 16.16 s 2023-12-18 10:05:43,320 44k INFO ====> Epoch: 1955, cost 16.39 s 2023-12-18 10:05:59,623 44k INFO ====> Epoch: 1956, cost 16.30 s 2023-12-18 10:06:15,993 44k INFO ====> Epoch: 1957, cost 16.37 s 2023-12-18 10:06:32,267 44k INFO ====> Epoch: 1958, cost 16.27 s 2023-12-18 10:06:48,515 44k INFO ====> Epoch: 1959, cost 16.25 s 2023-12-18 10:06:58,085 44k INFO Train Epoch: 1960 [27%] 2023-12-18 10:06:58,085 44k INFO Losses: [2.04026460647583, 2.6911511421203613, 10.780580520629883, 19.019798278808594, 0.4454192519187927], step: 29400, lr: 7.825947156024605e-05, reference_loss: 34.97721481323242 2023-12-18 10:07:05,218 44k INFO ====> Epoch: 1960, cost 16.70 s 2023-12-18 10:07:21,391 44k INFO ====> Epoch: 1961, cost 16.17 s 2023-12-18 10:07:37,624 44k INFO ====> Epoch: 1962, cost 16.23 s 2023-12-18 10:07:53,689 44k INFO ====> Epoch: 1963, cost 16.06 s 2023-12-18 10:08:10,039 44k INFO ====> Epoch: 1964, cost 16.35 s 2023-12-18 10:08:26,309 44k INFO ====> Epoch: 1965, cost 16.27 s 2023-12-18 10:08:42,537 44k INFO ====> Epoch: 1966, cost 16.23 s 2023-12-18 10:08:58,697 44k INFO ====> Epoch: 1967, cost 16.16 s 2023-12-18 10:09:15,015 44k INFO ====> Epoch: 1968, cost 16.32 s 2023-12-18 10:09:31,273 44k INFO ====> Epoch: 1969, cost 16.26 s 2023-12-18 10:09:47,540 44k INFO ====> Epoch: 1970, cost 16.27 s 2023-12-18 10:10:04,066 44k INFO ====> Epoch: 1971, cost 16.53 s 2023-12-18 10:10:20,310 44k INFO ====> Epoch: 1972, cost 16.24 s 2023-12-18 10:10:33,147 44k INFO Train Epoch: 1973 [60%] 2023-12-18 10:10:33,147 44k INFO Losses: [1.8937240839004517, 2.663097381591797, 13.549813270568848, 20.61578369140625, 0.872254490852356], step: 29600, lr: 7.813239525398997e-05, reference_loss: 39.59467315673828 2023-12-18 10:10:38,646 44k INFO Saving model and optimizer state at iteration 1973 to ./logs\44k\G_29600.pth 2023-12-18 10:10:40,113 44k INFO Saving model and optimizer state at iteration 1973 to ./logs\44k\D_29600.pth 2023-12-18 10:10:54,057 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_21600.pth 2023-12-18 10:10:54,680 44k INFO .. 
Free up space by deleting ckpt ./logs\44k\D_21600.pth 2023-12-18 10:10:58,688 44k INFO ====> Epoch: 1973, cost 38.38 s 2023-12-18 10:11:15,232 44k INFO ====> Epoch: 1974, cost 16.54 s 2023-12-18 10:11:31,634 44k INFO ====> Epoch: 1975, cost 16.40 s 2023-12-18 10:11:47,944 44k INFO ====> Epoch: 1976, cost 16.31 s 2023-12-18 10:12:04,097 44k INFO ====> Epoch: 1977, cost 16.15 s 2023-12-18 10:12:20,657 44k INFO ====> Epoch: 1978, cost 16.56 s 2023-12-18 10:12:36,966 44k INFO ====> Epoch: 1979, cost 16.31 s 2023-12-18 10:12:53,258 44k INFO ====> Epoch: 1980, cost 16.29 s 2023-12-18 10:13:09,405 44k INFO ====> Epoch: 1981, cost 16.15 s 2023-12-18 10:13:25,552 44k INFO ====> Epoch: 1982, cost 16.15 s 2023-12-18 10:13:41,753 44k INFO ====> Epoch: 1983, cost 16.20 s 2023-12-18 10:13:58,078 44k INFO ====> Epoch: 1984, cost 16.32 s 2023-12-18 10:14:14,672 44k INFO ====> Epoch: 1985, cost 16.58 s 2023-12-18 10:14:30,431 44k INFO Train Epoch: 1986 [93%] 2023-12-18 10:14:30,431 44k INFO Losses: [1.9327971935272217, 3.0963222980499268, 9.649613380432129, 17.66571617126465, 0.2623206079006195], step: 29800, lr: 7.800552529192829e-05, reference_loss: 32.60676956176758 2023-12-18 10:14:31,466 44k INFO ====> Epoch: 1986, cost 16.80 s 2023-12-18 10:14:47,782 44k INFO ====> Epoch: 1987, cost 16.32 s 2023-12-18 10:15:04,054 44k INFO ====> Epoch: 1988, cost 16.27 s 2023-12-18 10:15:20,368 44k INFO ====> Epoch: 1989, cost 16.31 s 2023-12-18 10:15:36,745 44k INFO ====> Epoch: 1990, cost 16.38 s 2023-12-18 10:15:53,101 44k INFO ====> Epoch: 1991, cost 16.36 s 2023-12-18 10:16:09,315 44k INFO ====> Epoch: 1992, cost 16.21 s 2023-12-18 10:16:25,759 44k INFO ====> Epoch: 1993, cost 16.44 s 2023-12-18 10:16:42,312 44k INFO ====> Epoch: 1994, cost 16.55 s 2023-12-18 10:16:58,552 44k INFO ====> Epoch: 1995, cost 16.24 s 2023-12-18 10:17:14,862 44k INFO ====> Epoch: 1996, cost 16.31 s 2023-12-18 10:17:31,144 44k INFO ====> Epoch: 1997, cost 16.28 s 2023-12-18 10:17:47,306 44k INFO ====> Epoch: 1998, cost 16.16 s 2023-12-18 10:18:03,535 44k INFO ====> Epoch: 1999, cost 16.23 s 2023-12-18 10:18:13,171 44k INFO Train Epoch: 2000 [27%] 2023-12-18 10:18:13,171 44k INFO Losses: [1.8719959259033203, 3.1042652130126953, 10.687170028686523, 18.92031478881836, 1.0546483993530273], step: 30000, lr: 7.786912648133565e-05, reference_loss: 35.638397216796875 2023-12-18 10:18:20,416 44k INFO ====> Epoch: 2000, cost 16.88 s 2023-12-18 10:18:36,663 44k INFO ====> Epoch: 2001, cost 16.25 s 2023-12-18 10:18:53,027 44k INFO ====> Epoch: 2002, cost 16.36 s 2023-12-18 10:19:09,311 44k INFO ====> Epoch: 2003, cost 16.28 s 2023-12-18 10:19:25,741 44k INFO ====> Epoch: 2004, cost 16.43 s 2023-12-18 10:19:42,029 44k INFO ====> Epoch: 2005, cost 16.29 s 2023-12-18 10:19:58,304 44k INFO ====> Epoch: 2006, cost 16.28 s 2023-12-18 10:20:14,393 44k INFO ====> Epoch: 2007, cost 16.09 s 2023-12-18 10:20:30,675 44k INFO ====> Epoch: 2008, cost 16.28 s 2023-12-18 10:20:47,057 44k INFO ====> Epoch: 2009, cost 16.38 s 2023-12-18 10:21:03,427 44k INFO ====> Epoch: 2010, cost 16.37 s 2023-12-18 10:21:19,804 44k INFO ====> Epoch: 2011, cost 16.38 s 2023-12-18 10:21:36,216 44k INFO ====> Epoch: 2012, cost 16.41 s 2023-12-18 10:21:48,876 44k INFO Train Epoch: 2013 [60%] 2023-12-18 10:21:48,886 44k INFO Losses: [1.902477502822876, 3.2247676849365234, 12.021904945373535, 19.866422653198242, 0.6179165244102478], step: 30200, lr: 7.774268401031771e-05, reference_loss: 37.633487701416016 2023-12-18 10:21:52,960 44k INFO ====> Epoch: 2013, cost 16.74 s 2023-12-18 
10:22:09,202 44k INFO ====> Epoch: 2014, cost 16.24 s 2023-12-18 10:22:25,430 44k INFO ====> Epoch: 2015, cost 16.23 s 2023-12-18 10:22:41,615 44k INFO ====> Epoch: 2016, cost 16.18 s 2023-12-18 10:22:57,851 44k INFO ====> Epoch: 2017, cost 16.24 s 2023-12-18 10:23:14,196 44k INFO ====> Epoch: 2018, cost 16.35 s 2023-12-18 10:23:30,388 44k INFO ====> Epoch: 2019, cost 16.19 s 2023-12-18 10:23:46,816 44k INFO ====> Epoch: 2020, cost 16.43 s 2023-12-18 10:24:03,059 44k INFO ====> Epoch: 2021, cost 16.24 s 2023-12-18 10:24:19,377 44k INFO ====> Epoch: 2022, cost 16.32 s 2023-12-18 10:24:35,586 44k INFO ====> Epoch: 2023, cost 16.21 s 2023-12-18 10:24:51,897 44k INFO ====> Epoch: 2024, cost 16.31 s 2023-12-18 10:25:08,223 44k INFO ====> Epoch: 2025, cost 16.33 s 2023-12-18 10:25:23,824 44k INFO Train Epoch: 2026 [93%] 2023-12-18 10:25:23,824 44k INFO Losses: [1.6962676048278809, 3.242138385772705, 11.306113243103027, 18.096586227416992, 0.5246453881263733], step: 30400, lr: 7.761644685428404e-05, reference_loss: 34.86574935913086 2023-12-18 10:25:28,999 44k INFO Saving model and optimizer state at iteration 2026 to ./logs\44k\G_30400.pth 2023-12-18 10:25:30,331 44k INFO Saving model and optimizer state at iteration 2026 to ./logs\44k\D_30400.pth 2023-12-18 10:25:35,580 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_22400.pth 2023-12-18 10:25:35,649 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_22400.pth 2023-12-18 10:25:36,209 44k INFO ====> Epoch: 2026, cost 27.99 s 2023-12-18 10:25:53,138 44k INFO ====> Epoch: 2027, cost 16.93 s 2023-12-18 10:26:09,609 44k INFO ====> Epoch: 2028, cost 16.47 s 2023-12-18 10:26:25,850 44k INFO ====> Epoch: 2029, cost 16.24 s 2023-12-18 10:26:42,055 44k INFO ====> Epoch: 2030, cost 16.20 s 2023-12-18 10:26:58,269 44k INFO ====> Epoch: 2031, cost 16.21 s 2023-12-18 10:27:14,692 44k INFO ====> Epoch: 2032, cost 16.42 s 2023-12-18 10:27:31,082 44k INFO ====> Epoch: 2033, cost 16.39 s 2023-12-18 10:27:47,362 44k INFO ====> Epoch: 2034, cost 16.28 s 2023-12-18 10:28:03,678 44k INFO ====> Epoch: 2035, cost 16.32 s 2023-12-18 10:28:19,909 44k INFO ====> Epoch: 2036, cost 16.23 s 2023-12-18 10:28:36,309 44k INFO ====> Epoch: 2037, cost 16.40 s 2023-12-18 10:28:52,784 44k INFO ====> Epoch: 2038, cost 16.48 s 2023-12-18 10:29:09,219 44k INFO ====> Epoch: 2039, cost 16.43 s 2023-12-18 10:29:18,912 44k INFO Train Epoch: 2040 [27%] 2023-12-18 10:29:18,912 44k INFO Losses: [2.412411689758301, 2.6060197353363037, 5.888246536254883, 16.89883804321289, 0.46194395422935486], step: 30600, lr: 7.748072837801289e-05, reference_loss: 28.267459869384766 2023-12-18 10:29:26,103 44k INFO ====> Epoch: 2040, cost 16.88 s 2023-12-18 10:29:42,658 44k INFO ====> Epoch: 2041, cost 16.55 s 2023-12-18 10:29:58,854 44k INFO ====> Epoch: 2042, cost 16.20 s 2023-12-18 10:30:15,057 44k INFO ====> Epoch: 2043, cost 16.20 s 2023-12-18 10:30:31,412 44k INFO ====> Epoch: 2044, cost 16.35 s 2023-12-18 10:30:47,664 44k INFO ====> Epoch: 2045, cost 16.25 s 2023-12-18 10:31:03,967 44k INFO ====> Epoch: 2046, cost 16.30 s 2023-12-18 10:31:20,392 44k INFO ====> Epoch: 2047, cost 16.42 s 2023-12-18 10:31:36,719 44k INFO ====> Epoch: 2048, cost 16.33 s 2023-12-18 10:31:53,025 44k INFO ====> Epoch: 2049, cost 16.31 s 2023-12-18 10:32:09,153 44k INFO ====> Epoch: 2050, cost 16.13 s 2023-12-18 10:32:25,477 44k INFO ====> Epoch: 2051, cost 16.32 s 2023-12-18 10:32:41,888 44k INFO ====> Epoch: 2052, cost 16.41 s 2023-12-18 10:32:54,645 44k INFO Train Epoch: 2053 [60%] 2023-12-18 10:32:54,645 44k 
INFO Losses: [2.101926565170288, 2.5368237495422363, 10.788867950439453, 21.322282791137695, 0.5628498196601868], step: 30800, lr: 7.735491658076955e-05, reference_loss: 37.31275177001953 2023-12-18 10:32:58,765 44k INFO ====> Epoch: 2053, cost 16.88 s 2023-12-18 10:33:14,991 44k INFO ====> Epoch: 2054, cost 16.23 s 2023-12-18 10:33:31,178 44k INFO ====> Epoch: 2055, cost 16.19 s 2023-12-18 10:33:47,452 44k INFO ====> Epoch: 2056, cost 16.27 s 2023-12-18 10:34:03,824 44k INFO ====> Epoch: 2057, cost 16.37 s 2023-12-18 10:34:20,208 44k INFO ====> Epoch: 2058, cost 16.38 s 2023-12-18 10:34:36,455 44k INFO ====> Epoch: 2059, cost 16.25 s 2023-12-18 10:34:52,842 44k INFO ====> Epoch: 2060, cost 16.39 s 2023-12-18 10:35:09,179 44k INFO ====> Epoch: 2061, cost 16.34 s 2023-12-18 10:35:25,407 44k INFO ====> Epoch: 2062, cost 16.23 s 2023-12-18 10:35:41,936 44k INFO ====> Epoch: 2063, cost 16.53 s 2023-12-18 10:35:58,194 44k INFO ====> Epoch: 2064, cost 16.26 s 2023-12-18 10:36:14,418 44k INFO ====> Epoch: 2065, cost 16.22 s 2023-12-18 10:36:30,209 44k INFO Train Epoch: 2066 [93%] 2023-12-18 10:36:30,209 44k INFO Losses: [1.3651131391525269, 3.369978666305542, 16.689699172973633, 18.285642623901367, 0.9313102960586548], step: 31000, lr: 7.722930907443384e-05, reference_loss: 40.64174270629883 2023-12-18 10:36:31,252 44k INFO ====> Epoch: 2066, cost 16.83 s 2023-12-18 10:36:47,588 44k INFO ====> Epoch: 2067, cost 16.34 s 2023-12-18 10:37:03,953 44k INFO ====> Epoch: 2068, cost 16.37 s 2023-12-18 10:37:20,334 44k INFO ====> Epoch: 2069, cost 16.38 s 2023-12-18 10:37:36,614 44k INFO ====> Epoch: 2070, cost 16.28 s 2023-12-18 10:37:52,980 44k INFO ====> Epoch: 2071, cost 16.37 s 2023-12-18 10:38:09,371 44k INFO ====> Epoch: 2072, cost 16.39 s 2023-12-18 10:38:25,457 44k INFO ====> Epoch: 2073, cost 16.09 s 2023-12-18 10:38:41,711 44k INFO ====> Epoch: 2074, cost 16.25 s 2023-12-18 10:38:57,928 44k INFO ====> Epoch: 2075, cost 16.22 s 2023-12-18 10:39:14,121 44k INFO ====> Epoch: 2076, cost 16.19 s 2023-12-18 10:39:30,372 44k INFO ====> Epoch: 2077, cost 16.25 s 2023-12-18 10:39:46,685 44k INFO ====> Epoch: 2078, cost 16.31 s 2023-12-18 10:40:03,199 44k INFO ====> Epoch: 2079, cost 16.51 s 2023-12-18 10:40:12,879 44k INFO Train Epoch: 2080 [27%] 2023-12-18 10:40:12,879 44k INFO Losses: [2.4098665714263916, 3.1288912296295166, 11.169779777526855, 19.77461814880371, 0.7812569737434387], step: 31200, lr: 7.709426753909104e-05, reference_loss: 37.26441192626953 2023-12-18 10:40:18,191 44k INFO Saving model and optimizer state at iteration 2080 to ./logs\44k\G_31200.pth 2023-12-18 10:40:19,461 44k INFO Saving model and optimizer state at iteration 2080 to ./logs\44k\D_31200.pth 2023-12-18 10:40:24,231 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_23200.pth 2023-12-18 10:40:24,251 44k INFO .. 
Free up space by deleting ckpt ./logs\44k\D_23200.pth 2023-12-18 10:40:31,496 44k INFO ====> Epoch: 2080, cost 28.30 s 2023-12-18 10:40:48,406 44k INFO ====> Epoch: 2081, cost 16.91 s 2023-12-18 10:41:04,762 44k INFO ====> Epoch: 2082, cost 16.36 s 2023-12-18 10:41:21,449 44k INFO ====> Epoch: 2083, cost 16.69 s 2023-12-18 10:41:37,917 44k INFO ====> Epoch: 2084, cost 16.47 s 2023-12-18 10:41:54,215 44k INFO ====> Epoch: 2085, cost 16.30 s 2023-12-18 10:42:10,599 44k INFO ====> Epoch: 2086, cost 16.38 s 2023-12-18 10:42:26,978 44k INFO ====> Epoch: 2087, cost 16.38 s 2023-12-18 10:42:43,297 44k INFO ====> Epoch: 2088, cost 16.32 s 2023-12-18 10:42:59,704 44k INFO ====> Epoch: 2089, cost 16.41 s 2023-12-18 10:43:16,128 44k INFO ====> Epoch: 2090, cost 16.42 s 2023-12-18 10:43:32,431 44k INFO ====> Epoch: 2091, cost 16.30 s 2023-12-18 10:43:48,921 44k INFO ====> Epoch: 2092, cost 16.49 s 2023-12-18 10:44:01,721 44k INFO Train Epoch: 2093 [60%] 2023-12-18 10:44:01,721 44k INFO Losses: [2.189920425415039, 2.792755126953125, 12.944823265075684, 24.886192321777344, 0.6519286036491394], step: 31400, lr: 7.696908326992759e-05, reference_loss: 43.46561813354492 2023-12-18 10:44:05,768 44k INFO ====> Epoch: 2093, cost 16.85 s 2023-12-18 10:44:22,030 44k INFO ====> Epoch: 2094, cost 16.26 s 2023-12-18 10:44:38,385 44k INFO ====> Epoch: 2095, cost 16.36 s 2023-12-18 10:44:54,666 44k INFO ====> Epoch: 2096, cost 16.28 s 2023-12-18 10:45:11,117 44k INFO ====> Epoch: 2097, cost 16.45 s 2023-12-18 10:45:27,436 44k INFO ====> Epoch: 2098, cost 16.32 s 2023-12-18 10:45:43,713 44k INFO ====> Epoch: 2099, cost 16.28 s 2023-12-18 10:46:00,051 44k INFO ====> Epoch: 2100, cost 16.34 s 2023-12-18 10:46:16,450 44k INFO ====> Epoch: 2101, cost 16.40 s 2023-12-18 10:46:33,092 44k INFO ====> Epoch: 2102, cost 16.64 s 2023-12-18 10:46:49,435 44k INFO ====> Epoch: 2103, cost 16.34 s 2023-12-18 10:47:05,836 44k INFO ====> Epoch: 2104, cost 16.40 s 2023-12-18 10:47:22,274 44k INFO ====> Epoch: 2105, cost 16.44 s 2023-12-18 10:47:38,069 44k INFO Train Epoch: 2106 [93%] 2023-12-18 10:47:38,069 44k INFO Losses: [1.2304956912994385, 3.7341294288635254, 18.536252975463867, 23.022418975830078, -0.14643454551696777], step: 31600, lr: 7.684410227270316e-05, reference_loss: 46.376861572265625 2023-12-18 10:47:39,078 44k INFO ====> Epoch: 2106, cost 16.80 s 2023-12-18 10:47:55,622 44k INFO ====> Epoch: 2107, cost 16.54 s 2023-12-18 10:48:12,065 44k INFO ====> Epoch: 2108, cost 16.44 s 2023-12-18 10:48:28,395 44k INFO ====> Epoch: 2109, cost 16.33 s 2023-12-18 10:48:44,771 44k INFO ====> Epoch: 2110, cost 16.38 s 2023-12-18 10:49:00,985 44k INFO ====> Epoch: 2111, cost 16.21 s 2023-12-18 10:49:17,394 44k INFO ====> Epoch: 2112, cost 16.41 s 2023-12-18 10:49:33,756 44k INFO ====> Epoch: 2113, cost 16.36 s 2023-12-18 10:49:50,270 44k INFO ====> Epoch: 2114, cost 16.51 s 2023-12-18 10:50:06,526 44k INFO ====> Epoch: 2115, cost 16.26 s 2023-12-18 10:50:22,757 44k INFO ====> Epoch: 2116, cost 16.23 s 2023-12-18 10:50:39,212 44k INFO ====> Epoch: 2117, cost 16.45 s 2023-12-18 10:50:55,579 44k INFO ====> Epoch: 2118, cost 16.37 s 2023-12-18 10:51:12,182 44k INFO ====> Epoch: 2119, cost 16.60 s 2023-12-18 10:51:21,958 44k INFO Train Epoch: 2120 [27%] 2023-12-18 10:51:21,958 44k INFO Losses: [2.4455418586730957, 2.476118803024292, 7.330844402313232, 18.125600814819336, 0.5132375955581665], step: 31800, lr: 7.670973430182125e-05, reference_loss: 30.891342163085938 2023-12-18 10:51:29,225 44k INFO ====> Epoch: 2120, cost 17.04 s 2023-12-18 
10:51:45,522 44k INFO ====> Epoch: 2121, cost 16.30 s 2023-12-18 10:52:01,868 44k INFO ====> Epoch: 2122, cost 16.35 s 2023-12-18 10:52:18,372 44k INFO ====> Epoch: 2123, cost 16.50 s 2023-12-18 10:52:34,607 44k INFO ====> Epoch: 2124, cost 16.23 s 2023-12-18 10:52:50,972 44k INFO ====> Epoch: 2125, cost 16.37 s 2023-12-18 10:53:07,342 44k INFO ====> Epoch: 2126, cost 16.37 s 2023-12-18 10:53:23,631 44k INFO ====> Epoch: 2127, cost 16.29 s 2023-12-18 10:53:39,969 44k INFO ====> Epoch: 2128, cost 16.34 s 2023-12-18 10:53:56,517 44k INFO ====> Epoch: 2129, cost 16.55 s 2023-12-18 10:54:12,756 44k INFO ====> Epoch: 2130, cost 16.24 s 2023-12-18 10:54:29,047 44k INFO ====> Epoch: 2131, cost 16.29 s 2023-12-18 10:54:45,540 44k INFO ====> Epoch: 2132, cost 16.49 s 2023-12-18 10:54:58,370 44k INFO Train Epoch: 2133 [60%] 2023-12-18 10:54:58,370 44k INFO Losses: [1.8762805461883545, 3.041628360748291, 12.796360969543457, 19.20253562927246, 0.7277417182922363], step: 32000, lr: 7.658517443073324e-05, reference_loss: 37.64454650878906 2023-12-18 10:55:03,707 44k INFO Saving model and optimizer state at iteration 2133 to ./logs\44k\G_32000.pth 2023-12-18 10:55:05,014 44k INFO Saving model and optimizer state at iteration 2133 to ./logs\44k\D_32000.pth 2023-12-18 10:55:11,364 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_24000.pth 2023-12-18 10:55:11,374 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_24000.pth 2023-12-18 10:55:15,546 44k INFO ====> Epoch: 2133, cost 30.01 s 2023-12-18 10:55:32,772 44k INFO ====> Epoch: 2134, cost 17.23 s 2023-12-18 10:55:49,142 44k INFO ====> Epoch: 2135, cost 16.37 s 2023-12-18 10:56:05,470 44k INFO ====> Epoch: 2136, cost 16.33 s 2023-12-18 10:56:21,773 44k INFO ====> Epoch: 2137, cost 16.30 s 2023-12-18 10:56:38,231 44k INFO ====> Epoch: 2138, cost 16.46 s 2023-12-18 10:56:54,671 44k INFO ====> Epoch: 2139, cost 16.44 s 2023-12-18 10:57:11,001 44k INFO ====> Epoch: 2140, cost 16.33 s 2023-12-18 10:57:27,209 44k INFO ====> Epoch: 2141, cost 16.21 s 2023-12-18 10:57:43,602 44k INFO ====> Epoch: 2142, cost 16.39 s 2023-12-18 10:57:59,850 44k INFO ====> Epoch: 2143, cost 16.25 s 2023-12-18 10:58:16,253 44k INFO ====> Epoch: 2144, cost 16.39 s 2023-12-18 10:58:32,549 44k INFO ====> Epoch: 2145, cost 16.31 s 2023-12-18 10:58:48,299 44k INFO Train Epoch: 2146 [93%] 2023-12-18 10:58:48,299 44k INFO Losses: [1.207231879234314, 3.543466091156006, 15.431828498840332, 17.778871536254883, 0.055150873959064484], step: 32200, lr: 7.646081681769796e-05, reference_loss: 38.016544342041016 2023-12-18 10:58:49,319 44k INFO ====> Epoch: 2146, cost 16.77 s 2023-12-18 10:59:05,766 44k INFO ====> Epoch: 2147, cost 16.45 s 2023-12-18 10:59:22,493 44k INFO ====> Epoch: 2148, cost 16.73 s 2023-12-18 10:59:38,751 44k INFO ====> Epoch: 2149, cost 16.26 s 2023-12-18 10:59:55,272 44k INFO ====> Epoch: 2150, cost 16.52 s 2023-12-18 11:00:11,790 44k INFO ====> Epoch: 2151, cost 16.52 s 2023-12-18 11:00:28,167 44k INFO ====> Epoch: 2152, cost 16.38 s 2023-12-18 11:00:44,493 44k INFO ====> Epoch: 2153, cost 16.33 s 2023-12-18 11:01:00,751 44k INFO ====> Epoch: 2154, cost 16.25 s 2023-12-18 11:01:17,301 44k INFO ====> Epoch: 2155, cost 16.56 s 2023-12-18 11:01:33,732 44k INFO ====> Epoch: 2156, cost 16.43 s 2023-12-18 11:01:50,176 44k INFO ====> Epoch: 2157, cost 16.44 s 2023-12-18 11:02:06,564 44k INFO ====> Epoch: 2158, cost 16.39 s 2023-12-18 11:02:23,008 44k INFO ====> Epoch: 2159, cost 16.44 s 2023-12-18 11:02:32,685 44k INFO Train Epoch: 2160 [27%] 2023-12-18 11:02:32,695 
44k INFO Losses: [1.5874969959259033, 3.6782279014587402, 12.69244384765625, 23.538314819335938, 0.6592702865600586], step: 32400, lr: 7.632711905165067e-05, reference_loss: 42.15575408935547 2023-12-18 11:02:39,878 44k INFO ====> Epoch: 2160, cost 16.87 s 2023-12-18 11:02:56,259 44k INFO ====> Epoch: 2161, cost 16.38 s 2023-12-18 11:03:12,599 44k INFO ====> Epoch: 2162, cost 16.34 s 2023-12-18 11:03:28,804 44k INFO ====> Epoch: 2163, cost 16.21 s 2023-12-18 11:03:45,028 44k INFO ====> Epoch: 2164, cost 16.22 s 2023-12-18 11:04:01,375 44k INFO ====> Epoch: 2165, cost 16.35 s 2023-12-18 11:04:17,740 44k INFO ====> Epoch: 2166, cost 16.37 s 2023-12-18 11:04:34,223 44k INFO ====> Epoch: 2167, cost 16.48 s 2023-12-18 11:04:50,921 44k INFO ====> Epoch: 2168, cost 16.70 s 2023-12-18 11:05:07,160 44k INFO ====> Epoch: 2169, cost 16.24 s 2023-12-18 11:05:23,665 44k INFO ====> Epoch: 2170, cost 16.50 s 2023-12-18 11:05:39,986 44k INFO ====> Epoch: 2171, cost 16.32 s 2023-12-18 11:05:56,278 44k INFO ====> Epoch: 2172, cost 16.29 s 2023-12-18 11:06:09,071 44k INFO Train Epoch: 2173 [60%] 2023-12-18 11:06:09,071 44k INFO Losses: [1.970259428024292, 2.8494205474853516, 12.478228569030762, 19.962446212768555, 0.7334083318710327], step: 32600, lr: 7.620318046424553e-05, reference_loss: 37.9937629699707 2023-12-18 11:06:13,168 44k INFO ====> Epoch: 2173, cost 16.89 s 2023-12-18 11:06:29,677 44k INFO ====> Epoch: 2174, cost 16.51 s 2023-12-18 11:06:46,207 44k INFO ====> Epoch: 2175, cost 16.53 s 2023-12-18 11:07:02,857 44k INFO ====> Epoch: 2176, cost 16.65 s 2023-12-18 11:07:19,480 44k INFO ====> Epoch: 2177, cost 16.62 s 2023-12-18 11:07:35,828 44k INFO ====> Epoch: 2178, cost 16.35 s 2023-12-18 11:07:52,195 44k INFO ====> Epoch: 2179, cost 16.37 s 2023-12-18 11:08:08,564 44k INFO ====> Epoch: 2180, cost 16.37 s 2023-12-18 11:08:24,978 44k INFO ====> Epoch: 2181, cost 16.41 s 2023-12-18 11:08:41,531 44k INFO ====> Epoch: 2182, cost 16.55 s 2023-12-18 11:08:57,949 44k INFO ====> Epoch: 2183, cost 16.42 s 2023-12-18 11:09:14,229 44k INFO ====> Epoch: 2184, cost 16.28 s 2023-12-18 11:09:30,622 44k INFO ====> Epoch: 2185, cost 16.39 s 2023-12-18 11:09:46,427 44k INFO Train Epoch: 2186 [93%] 2023-12-18 11:09:46,427 44k INFO Losses: [1.1923282146453857, 3.825554609298706, 17.00979995727539, 22.323392868041992, 0.35360243916511536], step: 32800, lr: 7.607944312606395e-05, reference_loss: 44.70467758178711 2023-12-18 11:09:51,876 44k INFO Saving model and optimizer state at iteration 2186 to ./logs\44k\G_32800.pth 2023-12-18 11:09:53,032 44k INFO Saving model and optimizer state at iteration 2186 to ./logs\44k\D_32800.pth 2023-12-18 11:09:55,986 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_24800.pth 2023-12-18 11:09:57,124 44k INFO .. 
Free up space by deleting ckpt ./logs\44k\D_24800.pth 2023-12-18 11:09:57,711 44k INFO ====> Epoch: 2186, cost 27.09 s 2023-12-18 11:10:14,978 44k INFO ====> Epoch: 2187, cost 17.27 s 2023-12-18 11:10:31,624 44k INFO ====> Epoch: 2188, cost 16.65 s 2023-12-18 11:10:48,345 44k INFO ====> Epoch: 2189, cost 16.72 s 2023-12-18 11:11:04,672 44k INFO ====> Epoch: 2190, cost 16.33 s 2023-12-18 11:11:21,154 44k INFO ====> Epoch: 2191, cost 16.48 s 2023-12-18 11:11:37,616 44k INFO ====> Epoch: 2192, cost 16.46 s 2023-12-18 11:11:54,133 44k INFO ====> Epoch: 2193, cost 16.52 s 2023-12-18 11:12:10,543 44k INFO ====> Epoch: 2194, cost 16.41 s 2023-12-18 11:12:26,987 44k INFO ====> Epoch: 2195, cost 16.44 s 2023-12-18 11:12:43,540 44k INFO ====> Epoch: 2196, cost 16.55 s 2023-12-18 11:13:00,076 44k INFO ====> Epoch: 2197, cost 16.54 s 2023-12-18 11:13:16,529 44k INFO ====> Epoch: 2198, cost 16.45 s 2023-12-18 11:13:32,945 44k INFO ====> Epoch: 2199, cost 16.42 s 2023-12-18 11:13:42,597 44k INFO Train Epoch: 2200 [27%] 2023-12-18 11:13:42,597 44k INFO Losses: [1.9381115436553955, 3.556706666946411, 12.643278121948242, 23.30306625366211, 0.5637285709381104], step: 33000, lr: 7.594641222198233e-05, reference_loss: 42.00489044189453 2023-12-18 11:13:49,775 44k INFO ====> Epoch: 2200, cost 16.83 s 2023-12-18 11:14:06,354 44k INFO ====> Epoch: 2201, cost 16.58 s 2023-12-18 11:14:22,684 44k INFO ====> Epoch: 2202, cost 16.33 s 2023-12-18 11:14:39,085 44k INFO ====> Epoch: 2203, cost 16.40 s 2023-12-18 11:14:55,432 44k INFO ====> Epoch: 2204, cost 16.35 s 2023-12-18 11:15:12,010 44k INFO ====> Epoch: 2205, cost 16.57 s 2023-12-18 11:15:28,423 44k INFO ====> Epoch: 2206, cost 16.42 s 2023-12-18 11:15:44,846 44k INFO ====> Epoch: 2207, cost 16.42 s 2023-12-18 11:16:01,298 44k INFO ====> Epoch: 2208, cost 16.45 s 2023-12-18 11:16:17,941 44k INFO ====> Epoch: 2209, cost 16.64 s 2023-12-18 11:16:34,579 44k INFO ====> Epoch: 2210, cost 16.64 s 2023-12-18 11:16:51,167 44k INFO ====> Epoch: 2211, cost 16.59 s 2023-12-18 11:17:07,520 44k INFO ====> Epoch: 2212, cost 16.35 s 2023-12-18 11:17:20,415 44k INFO Train Epoch: 2213 [60%] 2023-12-18 11:17:20,415 44k INFO Losses: [2.140159845352173, 3.1955227851867676, 9.957656860351562, 17.62181854248047, 0.8006429672241211], step: 33200, lr: 7.582309181940152e-05, reference_loss: 33.71580123901367 2023-12-18 11:17:24,513 44k INFO ====> Epoch: 2213, cost 16.99 s 2023-12-18 11:17:40,828 44k INFO ====> Epoch: 2214, cost 16.32 s 2023-12-18 11:17:57,221 44k INFO ====> Epoch: 2215, cost 16.39 s 2023-12-18 11:18:13,659 44k INFO ====> Epoch: 2216, cost 16.44 s 2023-12-18 11:18:29,952 44k INFO ====> Epoch: 2217, cost 16.29 s 2023-12-18 11:18:46,472 44k INFO ====> Epoch: 2218, cost 16.52 s 2023-12-18 11:19:02,913 44k INFO ====> Epoch: 2219, cost 16.44 s 2023-12-18 11:19:19,232 44k INFO ====> Epoch: 2220, cost 16.32 s 2023-12-18 11:19:35,742 44k INFO ====> Epoch: 2221, cost 16.51 s 2023-12-18 11:19:52,087 44k INFO ====> Epoch: 2222, cost 16.35 s 2023-12-18 11:20:08,400 44k INFO ====> Epoch: 2223, cost 16.31 s 2023-12-18 11:20:24,804 44k INFO ====> Epoch: 2224, cost 16.40 s 2023-12-18 11:20:41,157 44k INFO ====> Epoch: 2225, cost 16.35 s 2023-12-18 11:20:57,062 44k INFO Train Epoch: 2226 [93%] 2023-12-18 11:20:57,062 44k INFO Losses: [1.3854377269744873, 3.5431480407714844, 14.917083740234375, 20.00389862060547, 0.6053844094276428], step: 33400, lr: 7.569997166224704e-05, reference_loss: 40.454952239990234 2023-12-18 11:20:58,130 44k INFO ====> Epoch: 2226, cost 16.97 s 2023-12-18 
11:21:14,688 44k INFO ====> Epoch: 2227, cost 16.56 s 2023-12-18 11:21:30,957 44k INFO ====> Epoch: 2228, cost 16.27 s 2023-12-18 11:21:47,314 44k INFO ====> Epoch: 2229, cost 16.36 s 2023-12-18 11:22:03,813 44k INFO ====> Epoch: 2230, cost 16.50 s 2023-12-18 11:22:20,264 44k INFO ====> Epoch: 2231, cost 16.45 s 2023-12-18 11:22:36,784 44k INFO ====> Epoch: 2232, cost 16.52 s 2023-12-18 11:22:53,221 44k INFO ====> Epoch: 2233, cost 16.44 s 2023-12-18 11:23:09,616 44k INFO ====> Epoch: 2234, cost 16.39 s 2023-12-18 11:23:26,054 44k INFO ====> Epoch: 2235, cost 16.44 s 2023-12-18 11:23:42,398 44k INFO ====> Epoch: 2236, cost 16.34 s 2023-12-18 11:23:58,663 44k INFO ====> Epoch: 2237, cost 16.26 s 2023-12-18 11:24:15,003 44k INFO ====> Epoch: 2238, cost 16.34 s 2023-12-18 11:24:31,360 44k INFO ====> Epoch: 2239, cost 16.36 s 2023-12-18 11:24:41,084 44k INFO Train Epoch: 2240 [27%] 2023-12-18 11:24:41,084 44k INFO Losses: [1.9895873069763184, 2.9280593395233154, 9.759982109069824, 17.165315628051758, 0.5962346792221069], step: 33600, lr: 7.55676042939358e-05, reference_loss: 32.439178466796875 2023-12-18 11:24:46,435 44k INFO Saving model and optimizer state at iteration 2240 to ./logs\44k\G_33600.pth 2023-12-18 11:24:47,731 44k INFO Saving model and optimizer state at iteration 2240 to ./logs\44k\D_33600.pth 2023-12-18 11:24:54,088 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_25600.pth 2023-12-18 11:24:54,088 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_25600.pth 2023-12-18 11:25:01,402 44k INFO ====> Epoch: 2240, cost 30.04 s 2023-12-18 11:25:18,364 44k INFO ====> Epoch: 2241, cost 16.96 s 2023-12-18 11:25:34,649 44k INFO ====> Epoch: 2242, cost 16.28 s 2023-12-18 11:25:51,005 44k INFO ====> Epoch: 2243, cost 16.36 s 2023-12-18 11:26:07,343 44k INFO ====> Epoch: 2244, cost 16.34 s 2023-12-18 11:26:23,495 44k INFO ====> Epoch: 2245, cost 16.15 s 2023-12-18 11:26:39,721 44k INFO ====> Epoch: 2246, cost 16.23 s 2023-12-18 11:26:56,100 44k INFO ====> Epoch: 2247, cost 16.38 s 2023-12-18 11:27:12,367 44k INFO ====> Epoch: 2248, cost 16.27 s 2023-12-18 11:27:28,592 44k INFO ====> Epoch: 2249, cost 16.22 s 2023-12-18 11:27:44,933 44k INFO ====> Epoch: 2250, cost 16.34 s 2023-12-18 11:28:01,279 44k INFO ====> Epoch: 2251, cost 16.35 s 2023-12-18 11:28:17,700 44k INFO ====> Epoch: 2252, cost 16.42 s 2023-12-18 11:28:30,417 44k INFO Train Epoch: 2253 [60%] 2023-12-18 11:28:30,417 44k INFO Losses: [2.0059263706207275, 3.1904687881469727, 9.744091987609863, 19.912883758544922, 0.7433456182479858], step: 33800, lr: 7.544489899277746e-05, reference_loss: 35.596717834472656 2023-12-18 11:28:34,426 44k INFO ====> Epoch: 2253, cost 16.73 s 2023-12-18 11:28:50,674 44k INFO ====> Epoch: 2254, cost 16.25 s 2023-12-18 11:29:06,958 44k INFO ====> Epoch: 2255, cost 16.28 s 2023-12-18 11:29:23,325 44k INFO ====> Epoch: 2256, cost 16.37 s 2023-12-18 11:29:39,601 44k INFO ====> Epoch: 2257, cost 16.28 s 2023-12-18 11:29:55,866 44k INFO ====> Epoch: 2258, cost 16.26 s 2023-12-18 11:30:12,299 44k INFO ====> Epoch: 2259, cost 16.43 s 2023-12-18 11:30:28,482 44k INFO ====> Epoch: 2260, cost 16.18 s 2023-12-18 11:30:44,813 44k INFO ====> Epoch: 2261, cost 16.33 s 2023-12-18 11:31:01,011 44k INFO ====> Epoch: 2262, cost 16.20 s 2023-12-18 11:31:17,311 44k INFO ====> Epoch: 2263, cost 16.30 s 2023-12-18 11:31:33,748 44k INFO ====> Epoch: 2264, cost 16.44 s 2023-12-18 11:31:50,044 44k INFO ====> Epoch: 2265, cost 16.30 s 2023-12-18 11:32:05,716 44k INFO Train Epoch: 2266 [93%] 2023-12-18 11:32:05,716 
44k INFO Losses: [1.6762096881866455, 3.251845598220825, 16.287513732910156, 18.89430809020996, 0.3641401529312134], step: 34000, lr: 7.532239293825491e-05, reference_loss: 40.47401809692383 2023-12-18 11:32:06,790 44k INFO ====> Epoch: 2266, cost 16.75 s 2023-12-18 11:32:23,133 44k INFO ====> Epoch: 2267, cost 16.34 s 2023-12-18 11:32:39,426 44k INFO ====> Epoch: 2268, cost 16.29 s 2023-12-18 11:32:56,107 44k INFO ====> Epoch: 2269, cost 16.68 s 2023-12-18 11:33:12,481 44k INFO ====> Epoch: 2270, cost 16.37 s 2023-12-18 11:33:29,017 44k INFO ====> Epoch: 2271, cost 16.54 s 2023-12-18 11:33:45,439 44k INFO ====> Epoch: 2272, cost 16.42 s 2023-12-18 11:34:01,613 44k INFO ====> Epoch: 2273, cost 16.17 s 2023-12-18 11:34:17,764 44k INFO ====> Epoch: 2274, cost 16.15 s 2023-12-18 11:34:33,916 44k INFO ====> Epoch: 2275, cost 16.15 s 2023-12-18 11:34:50,117 44k INFO ====> Epoch: 2276, cost 16.20 s 2023-12-18 11:35:06,461 44k INFO ====> Epoch: 2277, cost 16.34 s 2023-12-18 11:35:22,667 44k INFO ====> Epoch: 2278, cost 16.21 s 2023-12-18 11:35:38,985 44k INFO ====> Epoch: 2279, cost 16.32 s 2023-12-18 11:35:48,553 44k INFO Train Epoch: 2280 [27%] 2023-12-18 11:35:48,553 44k INFO Losses: [2.2889020442962646, 2.7503180503845215, 8.17262077331543, 21.701416015625, 0.6173967123031616], step: 34200, lr: 7.519068579610928e-05, reference_loss: 35.53065490722656 2023-12-18 11:35:55,691 44k INFO ====> Epoch: 2280, cost 16.71 s 2023-12-18 11:36:12,062 44k INFO ====> Epoch: 2281, cost 16.37 s 2023-12-18 11:36:28,287 44k INFO ====> Epoch: 2282, cost 16.23 s 2023-12-18 11:36:44,640 44k INFO ====> Epoch: 2283, cost 16.35 s 2023-12-18 11:37:00,882 44k INFO ====> Epoch: 2284, cost 16.24 s 2023-12-18 11:37:17,242 44k INFO ====> Epoch: 2285, cost 16.36 s 2023-12-18 11:37:33,480 44k INFO ====> Epoch: 2286, cost 16.24 s 2023-12-18 11:37:49,923 44k INFO ====> Epoch: 2287, cost 16.44 s 2023-12-18 11:38:06,192 44k INFO ====> Epoch: 2288, cost 16.27 s 2023-12-18 11:38:22,593 44k INFO ====> Epoch: 2289, cost 16.40 s 2023-12-18 11:38:38,882 44k INFO ====> Epoch: 2290, cost 16.29 s 2023-12-18 11:38:55,152 44k INFO ====> Epoch: 2291, cost 16.27 s 2023-12-18 11:39:11,499 44k INFO ====> Epoch: 2292, cost 16.35 s 2023-12-18 11:39:24,169 44k INFO Train Epoch: 2293 [60%] 2023-12-18 11:39:24,169 44k INFO Losses: [2.2270450592041016, 2.640655040740967, 8.58346939086914, 18.874897003173828, 0.5409868359565735], step: 34400, lr: 7.506859252835094e-05, reference_loss: 32.86705017089844 2023-12-18 11:39:29,567 44k INFO Saving model and optimizer state at iteration 2293 to ./logs\44k\G_34400.pth 2023-12-18 11:39:30,785 44k INFO Saving model and optimizer state at iteration 2293 to ./logs\44k\D_34400.pth 2023-12-18 11:39:33,482 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_26400.pth 2023-12-18 11:39:33,482 44k INFO .. 
Free up space by deleting ckpt ./logs\44k\D_26400.pth 2023-12-18 11:39:37,643 44k INFO ====> Epoch: 2293, cost 26.14 s 2023-12-18 11:39:55,540 44k INFO ====> Epoch: 2294, cost 17.90 s 2023-12-18 11:40:11,958 44k INFO ====> Epoch: 2295, cost 16.42 s 2023-12-18 11:40:28,348 44k INFO ====> Epoch: 2296, cost 16.39 s 2023-12-18 11:40:44,598 44k INFO ====> Epoch: 2297, cost 16.25 s 2023-12-18 11:41:01,007 44k INFO ====> Epoch: 2298, cost 16.41 s 2023-12-18 11:41:17,450 44k INFO ====> Epoch: 2299, cost 16.44 s 2023-12-18 11:41:33,845 44k INFO ====> Epoch: 2300, cost 16.39 s 2023-12-18 11:41:50,221 44k INFO ====> Epoch: 2301, cost 16.38 s 2023-12-18 11:42:06,582 44k INFO ====> Epoch: 2302, cost 16.36 s 2023-12-18 11:42:22,890 44k INFO ====> Epoch: 2303, cost 16.31 s 2023-12-18 11:42:39,294 44k INFO ====> Epoch: 2304, cost 16.40 s 2023-12-18 11:42:55,612 44k INFO ====> Epoch: 2305, cost 16.32 s 2023-12-18 11:43:11,446 44k INFO Train Epoch: 2306 [93%] 2023-12-18 11:43:11,446 44k INFO Losses: [1.8376150131225586, 2.8742828369140625, 9.239768981933594, 17.526214599609375, 0.3302317261695862], step: 34600, lr: 7.494669751341973e-05, reference_loss: 31.808115005493164 2023-12-18 11:43:12,464 44k INFO ====> Epoch: 2306, cost 16.85 s 2023-12-18 11:43:28,791 44k INFO ====> Epoch: 2307, cost 16.33 s 2023-12-18 11:43:45,078 44k INFO ====> Epoch: 2308, cost 16.29 s 2023-12-18 11:44:01,633 44k INFO ====> Epoch: 2309, cost 16.55 s 2023-12-18 11:44:18,032 44k INFO ====> Epoch: 2310, cost 16.40 s 2023-12-18 11:44:34,381 44k INFO ====> Epoch: 2311, cost 16.35 s 2023-12-18 11:44:50,747 44k INFO ====> Epoch: 2312, cost 16.37 s 2023-12-18 11:45:07,067 44k INFO ====> Epoch: 2313, cost 16.32 s 2023-12-18 11:45:23,406 44k INFO ====> Epoch: 2314, cost 16.34 s 2023-12-18 11:45:39,842 44k INFO ====> Epoch: 2315, cost 16.44 s 2023-12-18 11:45:56,088 44k INFO ====> Epoch: 2316, cost 16.25 s 2023-12-18 11:46:12,432 44k INFO ====> Epoch: 2317, cost 16.34 s 2023-12-18 11:46:28,887 44k INFO ====> Epoch: 2318, cost 16.45 s 2023-12-18 11:46:45,428 44k INFO ====> Epoch: 2319, cost 16.54 s 2023-12-18 11:46:55,096 44k INFO Train Epoch: 2320 [27%] 2023-12-18 11:46:55,106 44k INFO Losses: [1.8116226196289062, 3.661442995071411, 12.796745300292969, 22.56110954284668, 0.7366653084754944], step: 34800, lr: 7.481564730434262e-05, reference_loss: 41.56758117675781 2023-12-18 11:47:02,316 44k INFO ====> Epoch: 2320, cost 16.89 s 2023-12-18 11:47:18,660 44k INFO ====> Epoch: 2321, cost 16.34 s 2023-12-18 11:47:35,189 44k INFO ====> Epoch: 2322, cost 16.53 s 2023-12-18 11:47:51,526 44k INFO ====> Epoch: 2323, cost 16.34 s 2023-12-18 11:48:07,914 44k INFO ====> Epoch: 2324, cost 16.39 s 2023-12-18 11:48:24,249 44k INFO ====> Epoch: 2325, cost 16.33 s 2023-12-18 11:48:40,643 44k INFO ====> Epoch: 2326, cost 16.39 s 2023-12-18 11:48:56,936 44k INFO ====> Epoch: 2327, cost 16.29 s 2023-12-18 11:49:13,259 44k INFO ====> Epoch: 2328, cost 16.32 s 2023-12-18 11:49:29,749 44k INFO ====> Epoch: 2329, cost 16.49 s 2023-12-18 11:49:46,083 44k INFO ====> Epoch: 2330, cost 16.33 s 2023-12-18 11:50:02,350 44k INFO ====> Epoch: 2331, cost 16.27 s 2023-12-18 11:50:18,864 44k INFO ====> Epoch: 2332, cost 16.51 s 2023-12-18 11:50:31,685 44k INFO Train Epoch: 2333 [60%] 2023-12-18 11:50:31,685 44k INFO Losses: [1.931926965713501, 3.314007520675659, 9.707188606262207, 18.11225700378418, 0.8085576295852661], step: 35000, lr: 7.469416301726467e-05, reference_loss: 33.873939514160156 2023-12-18 11:50:35,741 44k INFO ====> Epoch: 2333, cost 16.88 s 2023-12-18 
11:50:51,975 44k INFO ====> Epoch: 2334, cost 16.23 s 2023-12-18 11:51:08,319 44k INFO ====> Epoch: 2335, cost 16.34 s 2023-12-18 11:51:24,811 44k INFO ====> Epoch: 2336, cost 16.49 s 2023-12-18 11:51:41,233 44k INFO ====> Epoch: 2337, cost 16.42 s 2023-12-18 11:51:57,484 44k INFO ====> Epoch: 2338, cost 16.25 s 2023-12-18 11:52:13,898 44k INFO ====> Epoch: 2339, cost 16.41 s 2023-12-18 11:52:30,143 44k INFO ====> Epoch: 2340, cost 16.24 s 2023-12-18 11:52:46,558 44k INFO ====> Epoch: 2341, cost 16.42 s 2023-12-18 11:53:02,854 44k INFO ====> Epoch: 2342, cost 16.30 s 2023-12-18 11:53:19,216 44k INFO ====> Epoch: 2343, cost 16.36 s 2023-12-18 11:53:35,461 44k INFO ====> Epoch: 2344, cost 16.24 s 2023-12-18 11:53:51,774 44k INFO ====> Epoch: 2345, cost 16.31 s 2023-12-18 11:54:07,465 44k INFO Train Epoch: 2346 [93%] 2023-12-18 11:54:07,465 44k INFO Losses: [2.522393226623535, 2.108071804046631, 1.6755051612854004, 10.740166664123535, 0.6170383095741272], step: 35200, lr: 7.457287599416209e-05, reference_loss: 17.663175582885742 2023-12-18 11:54:12,797 44k INFO Saving model and optimizer state at iteration 2346 to ./logs\44k\G_35200.pth 2023-12-18 11:54:14,022 44k INFO Saving model and optimizer state at iteration 2346 to ./logs\44k\D_35200.pth 2023-12-18 11:54:17,625 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_27200.pth 2023-12-18 11:54:17,625 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_27200.pth 2023-12-18 11:54:18,226 44k INFO ====> Epoch: 2346, cost 26.45 s 2023-12-18 11:54:36,902 44k INFO ====> Epoch: 2347, cost 18.68 s 2023-12-18 11:54:53,556 44k INFO ====> Epoch: 2348, cost 16.65 s 2023-12-18 11:55:10,024 44k INFO ====> Epoch: 2349, cost 16.47 s 2023-12-18 11:55:26,307 44k INFO ====> Epoch: 2350, cost 16.28 s 2023-12-18 11:55:42,506 44k INFO ====> Epoch: 2351, cost 16.20 s 2023-12-18 11:55:58,784 44k INFO ====> Epoch: 2352, cost 16.28 s 2023-12-18 11:56:15,108 44k INFO ====> Epoch: 2353, cost 16.32 s 2023-12-18 11:56:31,479 44k INFO ====> Epoch: 2354, cost 16.37 s 2023-12-18 11:56:47,997 44k INFO ====> Epoch: 2355, cost 16.52 s 2023-12-18 11:57:04,281 44k INFO ====> Epoch: 2356, cost 16.28 s 2023-12-18 11:57:20,520 44k INFO ====> Epoch: 2357, cost 16.24 s 2023-12-18 11:57:36,856 44k INFO ====> Epoch: 2358, cost 16.34 s 2023-12-18 11:57:53,299 44k INFO ====> Epoch: 2359, cost 16.44 s 2023-12-18 11:58:03,012 44k INFO Train Epoch: 2360 [27%] 2023-12-18 11:58:03,022 44k INFO Losses: [1.839370608329773, 2.921431541442871, 9.693913459777832, 18.799938201904297, 0.44188663363456726], step: 35400, lr: 7.444247944148188e-05, reference_loss: 33.69654083251953 2023-12-18 11:58:10,220 44k INFO ====> Epoch: 2360, cost 16.92 s 2023-12-18 11:58:26,597 44k INFO ====> Epoch: 2361, cost 16.38 s 2023-12-18 11:58:42,693 44k INFO ====> Epoch: 2362, cost 16.10 s 2023-12-18 11:58:58,863 44k INFO ====> Epoch: 2363, cost 16.17 s 2023-12-18 11:59:15,068 44k INFO ====> Epoch: 2364, cost 16.20 s 2023-12-18 11:59:31,466 44k INFO ====> Epoch: 2365, cost 16.40 s 2023-12-18 11:59:47,963 44k INFO ====> Epoch: 2366, cost 16.50 s 2023-12-18 12:00:04,349 44k INFO ====> Epoch: 2367, cost 16.39 s 2023-12-18 12:00:20,849 44k INFO ====> Epoch: 2368, cost 16.50 s 2023-12-18 12:00:37,187 44k INFO ====> Epoch: 2369, cost 16.34 s 2023-12-18 12:00:53,374 44k INFO ====> Epoch: 2370, cost 16.19 s 2023-12-18 12:01:09,630 44k INFO ====> Epoch: 2371, cost 16.26 s 2023-12-18 12:01:25,998 44k INFO ====> Epoch: 2372, cost 16.37 s 2023-12-18 12:01:38,860 44k INFO Train Epoch: 2373 [60%] 2023-12-18 12:01:38,860 44k 
INFO Losses: [1.7788819074630737, 2.987579822540283, 11.123029708862305, 20.930091857910156, 0.6591160297393799], step: 35600, lr: 7.432160109759116e-05, reference_loss: 37.47869873046875 2023-12-18 12:01:42,891 44k INFO ====> Epoch: 2373, cost 16.89 s 2023-12-18 12:01:59,462 44k INFO ====> Epoch: 2374, cost 16.57 s 2023-12-18 12:02:15,783 44k INFO ====> Epoch: 2375, cost 16.32 s 2023-12-18 12:02:32,061 44k INFO ====> Epoch: 2376, cost 16.28 s 2023-12-18 12:02:48,433 44k INFO ====> Epoch: 2377, cost 16.37 s 2023-12-18 12:03:04,716 44k INFO ====> Epoch: 2378, cost 16.28 s 2023-12-18 12:03:20,932 44k INFO ====> Epoch: 2379, cost 16.22 s 2023-12-18 12:03:37,323 44k INFO ====> Epoch: 2380, cost 16.39 s 2023-12-18 12:03:53,551 44k INFO ====> Epoch: 2381, cost 16.23 s 2023-12-18 12:04:09,979 44k INFO ====> Epoch: 2382, cost 16.43 s 2023-12-18 12:04:26,137 44k INFO ====> Epoch: 2383, cost 16.16 s 2023-12-18 12:04:42,531 44k INFO ====> Epoch: 2384, cost 16.39 s 2023-12-18 12:04:58,874 44k INFO ====> Epoch: 2385, cost 16.34 s 2023-12-18 12:05:14,590 44k INFO Train Epoch: 2386 [93%] 2023-12-18 12:05:14,590 44k INFO Losses: [1.3656693696975708, 3.7841708660125732, 12.944469451904297, 19.984739303588867, 0.008771657012403011], step: 35800, lr: 7.420091903375627e-05, reference_loss: 38.08781814575195 2023-12-18 12:05:15,600 44k INFO ====> Epoch: 2386, cost 16.73 s 2023-12-18 12:05:31,820 44k INFO ====> Epoch: 2387, cost 16.22 s 2023-12-18 12:05:47,975 44k INFO ====> Epoch: 2388, cost 16.15 s 2023-12-18 12:06:04,373 44k INFO ====> Epoch: 2389, cost 16.40 s 2023-12-18 12:06:20,506 44k INFO ====> Epoch: 2390, cost 16.13 s 2023-12-18 12:06:36,711 44k INFO ====> Epoch: 2391, cost 16.20 s 2023-12-18 12:06:52,921 44k INFO ====> Epoch: 2392, cost 16.21 s 2023-12-18 12:07:09,352 44k INFO ====> Epoch: 2393, cost 16.43 s 2023-12-18 12:07:25,643 44k INFO ====> Epoch: 2394, cost 16.29 s 2023-12-18 12:07:41,850 44k INFO ====> Epoch: 2395, cost 16.21 s 2023-12-18 12:07:58,219 44k INFO ====> Epoch: 2396, cost 16.37 s 2023-12-18 12:08:14,419 44k INFO ====> Epoch: 2397, cost 16.20 s 2023-12-18 12:08:30,676 44k INFO ====> Epoch: 2398, cost 16.26 s 2023-12-18 12:08:46,957 44k INFO ====> Epoch: 2399, cost 16.28 s 2023-12-18 12:08:56,708 44k INFO Train Epoch: 2400 [27%] 2023-12-18 12:08:56,708 44k INFO Losses: [1.6438357830047607, 3.276431083679199, 13.68618106842041, 22.332918167114258, 0.45133426785469055], step: 36000, lr: 7.407117287714481e-05, reference_loss: 41.39070510864258 2023-12-18 12:09:02,080 44k INFO Saving model and optimizer state at iteration 2400 to ./logs\44k\G_36000.pth 2023-12-18 12:09:03,350 44k INFO Saving model and optimizer state at iteration 2400 to ./logs\44k\D_36000.pth 2023-12-18 12:09:08,927 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_28000.pth 2023-12-18 12:09:08,927 44k INFO .. 
Free up space by deleting ckpt ./logs\44k\D_28000.pth 2023-12-18 12:09:16,259 44k INFO ====> Epoch: 2400, cost 29.30 s 2023-12-18 12:09:32,704 44k INFO ====> Epoch: 2401, cost 16.45 s 2023-12-18 12:09:49,411 44k INFO ====> Epoch: 2402, cost 16.71 s 2023-12-18 12:10:06,062 44k INFO ====> Epoch: 2403, cost 16.65 s 2023-12-18 12:10:22,674 44k INFO ====> Epoch: 2404, cost 16.61 s 2023-12-18 12:10:39,017 44k INFO ====> Epoch: 2405, cost 16.34 s 2023-12-18 12:10:55,166 44k INFO ====> Epoch: 2406, cost 16.15 s 2023-12-18 12:11:11,424 44k INFO ====> Epoch: 2407, cost 16.26 s 2023-12-18 12:11:28,409 44k INFO ====> Epoch: 2408, cost 16.98 s 2023-12-18 12:11:44,779 44k INFO ====> Epoch: 2409, cost 16.37 s 2023-12-18 12:12:01,057 44k INFO ====> Epoch: 2410, cost 16.28 s 2023-12-18 12:12:17,187 44k INFO ====> Epoch: 2411, cost 16.13 s 2023-12-18 12:12:33,466 44k INFO ====> Epoch: 2412, cost 16.28 s 2023-12-18 12:12:46,244 44k INFO Train Epoch: 2413 [60%] 2023-12-18 12:12:46,244 44k INFO Losses: [1.7904788255691528, 3.2023961544036865, 12.153936386108398, 18.299484252929688, 0.5988679528236389], step: 36200, lr: 7.395089745409861e-05, reference_loss: 36.045166015625 2023-12-18 12:12:50,336 44k INFO ====> Epoch: 2413, cost 16.87 s 2023-12-18 12:13:06,781 44k INFO ====> Epoch: 2414, cost 16.45 s 2023-12-18 12:13:22,984 44k INFO ====> Epoch: 2415, cost 16.20 s 2023-12-18 12:13:39,324 44k INFO ====> Epoch: 2416, cost 16.34 s 2023-12-18 12:13:55,593 44k INFO ====> Epoch: 2417, cost 16.27 s 2023-12-18 12:14:11,945 44k INFO ====> Epoch: 2418, cost 16.35 s 2023-12-18 12:14:28,317 44k INFO ====> Epoch: 2419, cost 16.37 s 2023-12-18 12:14:44,546 44k INFO ====> Epoch: 2420, cost 16.23 s 2023-12-18 12:15:00,861 44k INFO ====> Epoch: 2421, cost 16.32 s 2023-12-18 12:15:17,124 44k INFO ====> Epoch: 2422, cost 16.26 s 2023-12-18 12:15:33,331 44k INFO ====> Epoch: 2423, cost 16.21 s 2023-12-18 12:15:49,722 44k INFO ====> Epoch: 2424, cost 16.39 s 2023-12-18 12:16:06,090 44k INFO ====> Epoch: 2425, cost 16.37 s 2023-12-18 12:16:21,905 44k INFO Train Epoch: 2426 [93%] 2023-12-18 12:16:21,905 44k INFO Losses: [1.4369168281555176, 3.7730956077575684, 11.330588340759277, 15.298237800598145, 0.8859033584594727], step: 36400, lr: 7.383081733209632e-05, reference_loss: 32.72473907470703 2023-12-18 12:16:22,946 44k INFO ====> Epoch: 2426, cost 16.86 s 2023-12-18 12:16:39,435 44k INFO ====> Epoch: 2427, cost 16.49 s 2023-12-18 12:16:55,876 44k INFO ====> Epoch: 2428, cost 16.44 s 2023-12-18 12:17:12,154 44k INFO ====> Epoch: 2429, cost 16.28 s 2023-12-18 12:17:28,498 44k INFO ====> Epoch: 2430, cost 16.34 s 2023-12-18 12:17:44,896 44k INFO ====> Epoch: 2431, cost 16.40 s 2023-12-18 12:18:01,273 44k INFO ====> Epoch: 2432, cost 16.38 s 2023-12-18 12:18:17,687 44k INFO ====> Epoch: 2433, cost 16.41 s 2023-12-18 12:18:34,084 44k INFO ====> Epoch: 2434, cost 16.40 s 2023-12-18 12:18:50,330 44k INFO ====> Epoch: 2435, cost 16.25 s 2023-12-18 12:19:06,523 44k INFO ====> Epoch: 2436, cost 16.19 s 2023-12-18 12:19:22,798 44k INFO ====> Epoch: 2437, cost 16.27 s 2023-12-18 12:19:39,437 44k INFO ====> Epoch: 2438, cost 16.64 s 2023-12-18 12:19:55,824 44k INFO ====> Epoch: 2439, cost 16.39 s 2023-12-18 12:20:05,586 44k INFO Train Epoch: 2440 [27%] 2023-12-18 12:20:05,586 44k INFO Losses: [1.6771786212921143, 3.379948616027832, 12.592855453491211, 18.989704132080078, 0.4484730660915375], step: 36600, lr: 7.370171832748744e-05, reference_loss: 37.08816146850586 2023-12-18 12:20:12,740 44k INFO ====> Epoch: 2440, cost 16.92 s 2023-12-18 
12:20:29,009 44k INFO ====> Epoch: 2441, cost 16.27 s 2023-12-18 12:20:45,209 44k INFO ====> Epoch: 2442, cost 16.20 s 2023-12-18 12:21:01,529 44k INFO ====> Epoch: 2443, cost 16.32 s 2023-12-18 12:21:17,896 44k INFO ====> Epoch: 2444, cost 16.37 s 2023-12-18 12:21:34,467 44k INFO ====> Epoch: 2445, cost 16.57 s 2023-12-18 12:21:50,911 44k INFO ====> Epoch: 2446, cost 16.44 s 2023-12-18 12:22:07,230 44k INFO ====> Epoch: 2447, cost 16.32 s 2023-12-18 12:22:23,543 44k INFO ====> Epoch: 2448, cost 16.31 s 2023-12-18 12:22:39,947 44k INFO ====> Epoch: 2449, cost 16.40 s 2023-12-18 12:22:56,215 44k INFO ====> Epoch: 2450, cost 16.27 s 2023-12-18 12:23:12,585 44k INFO ====> Epoch: 2451, cost 16.37 s 2023-12-18 12:23:28,771 44k INFO ====> Epoch: 2452, cost 16.19 s 2023-12-18 12:23:41,522 44k INFO Train Epoch: 2453 [60%] 2023-12-18 12:23:41,522 44k INFO Losses: [1.895605444908142, 3.450462579727173, 11.576433181762695, 19.994056701660156, 0.29080629348754883], step: 36800, lr: 7.358204281801799e-05, reference_loss: 37.20736312866211 2023-12-18 12:23:46,832 44k INFO Saving model and optimizer state at iteration 2453 to ./logs\44k\G_36800.pth 2023-12-18 12:23:48,057 44k INFO Saving model and optimizer state at iteration 2453 to ./logs\44k\D_36800.pth 2023-12-18 12:23:51,284 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_28800.pth 2023-12-18 12:23:51,284 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_28800.pth 2023-12-18 12:23:55,368 44k INFO ====> Epoch: 2453, cost 26.60 s 2023-12-18 12:24:12,627 44k INFO ====> Epoch: 2454, cost 17.26 s 2023-12-18 12:24:28,982 44k INFO ====> Epoch: 2455, cost 16.35 s 2023-12-18 12:24:45,235 44k INFO ====> Epoch: 2456, cost 16.25 s 2023-12-18 12:25:01,607 44k INFO ====> Epoch: 2457, cost 16.37 s 2023-12-18 12:25:17,813 44k INFO ====> Epoch: 2458, cost 16.21 s 2023-12-18 12:25:34,101 44k INFO ====> Epoch: 2459, cost 16.29 s 2023-12-18 12:25:50,277 44k INFO ====> Epoch: 2460, cost 16.18 s 2023-12-18 12:26:06,486 44k INFO ====> Epoch: 2461, cost 16.21 s 2023-12-18 12:26:22,637 44k INFO ====> Epoch: 2462, cost 16.15 s 2023-12-18 12:26:38,848 44k INFO ====> Epoch: 2463, cost 16.21 s 2023-12-18 12:26:55,091 44k INFO ====> Epoch: 2464, cost 16.24 s 2023-12-18 12:27:11,310 44k INFO ====> Epoch: 2465, cost 16.22 s 2023-12-18 12:27:27,128 44k INFO Train Epoch: 2466 [93%] 2023-12-18 12:27:27,128 44k INFO Losses: [2.594360828399658, 2.607551336288452, 10.574933052062988, 18.55804443359375, 0.47160065174102783], step: 37000, lr: 7.346256163546372e-05, reference_loss: 34.806488037109375 2023-12-18 12:27:28,218 44k INFO ====> Epoch: 2466, cost 16.91 s 2023-12-18 12:27:44,446 44k INFO ====> Epoch: 2467, cost 16.23 s 2023-12-18 12:28:00,608 44k INFO ====> Epoch: 2468, cost 16.16 s 2023-12-18 12:28:16,838 44k INFO ====> Epoch: 2469, cost 16.23 s 2023-12-18 12:28:33,190 44k INFO ====> Epoch: 2470, cost 16.35 s 2023-12-18 12:28:49,408 44k INFO ====> Epoch: 2471, cost 16.22 s 2023-12-18 12:29:05,623 44k INFO ====> Epoch: 2472, cost 16.21 s 2023-12-18 12:29:21,978 44k INFO ====> Epoch: 2473, cost 16.36 s 2023-12-18 12:29:38,224 44k INFO ====> Epoch: 2474, cost 16.25 s 2023-12-18 12:29:54,510 44k INFO ====> Epoch: 2475, cost 16.29 s 2023-12-18 12:30:10,618 44k INFO ====> Epoch: 2476, cost 16.11 s 2023-12-18 12:30:27,042 44k INFO ====> Epoch: 2477, cost 16.42 s 2023-12-18 12:30:43,297 44k INFO ====> Epoch: 2478, cost 16.26 s 2023-12-18 12:30:59,528 44k INFO ====> Epoch: 2479, cost 16.23 s 2023-12-18 12:31:09,054 44k INFO Train Epoch: 2480 [27%] 2023-12-18 12:31:09,054 44k 
2023-12-18 12:31:09,054 44k INFO Train Epoch: 2480 [27%]
2023-12-18 12:31:09,054 44k INFO Losses: [1.6485702991485596, 3.6325395107269287, 12.493321418762207, 18.41002655029297, 0.653689980506897], step: 37200, lr: 7.333410655497212e-05, reference_loss: 36.8381462097168
2023-12-18 12:31:16,134 44k INFO ====> Epoch: 2480, cost 16.61 s
2023-12-18 12:31:32,341 44k INFO ====> Epoch: 2481, cost 16.21 s
2023-12-18 12:31:48,674 44k INFO ====> Epoch: 2482, cost 16.33 s
2023-12-18 12:32:05,122 44k INFO ====> Epoch: 2483, cost 16.45 s
2023-12-18 12:32:21,354 44k INFO ====> Epoch: 2484, cost 16.23 s
2023-12-18 12:32:37,574 44k INFO ====> Epoch: 2485, cost 16.22 s
2023-12-18 12:32:53,988 44k INFO ====> Epoch: 2486, cost 16.41 s
2023-12-18 12:33:10,130 44k INFO ====> Epoch: 2487, cost 16.14 s
2023-12-18 12:33:26,310 44k INFO ====> Epoch: 2488, cost 16.18 s
2023-12-18 12:33:42,502 44k INFO ====> Epoch: 2489, cost 16.19 s
2023-12-18 12:33:58,741 44k INFO ====> Epoch: 2490, cost 16.24 s
2023-12-18 12:34:14,858 44k INFO ====> Epoch: 2491, cost 16.12 s
2023-12-18 12:34:30,942 44k INFO ====> Epoch: 2492, cost 16.08 s
2023-12-18 12:34:43,521 44k INFO Train Epoch: 2493 [60%]
2023-12-18 12:34:43,521 44k INFO Losses: [1.9523813724517822, 3.12516713142395, 10.993964195251465, 19.711170196533203, 0.5466232895851135], step: 37400, lr: 7.321502796681144e-05, reference_loss: 36.329307556152344
2023-12-18 12:34:47,567 44k INFO ====> Epoch: 2493, cost 16.63 s
2023-12-18 12:35:03,815 44k INFO ====> Epoch: 2494, cost 16.25 s
2023-12-18 12:35:20,079 44k INFO ====> Epoch: 2495, cost 16.26 s
2023-12-18 12:35:36,691 44k INFO ====> Epoch: 2496, cost 16.61 s
2023-12-18 12:35:53,063 44k INFO ====> Epoch: 2497, cost 16.37 s
2023-12-18 12:36:09,490 44k INFO ====> Epoch: 2498, cost 16.43 s
2023-12-18 12:36:25,653 44k INFO ====> Epoch: 2499, cost 16.16 s
2023-12-18 12:36:41,905 44k INFO ====> Epoch: 2500, cost 16.25 s
2023-12-18 12:36:58,067 44k INFO ====> Epoch: 2501, cost 16.16 s
2023-12-18 12:37:14,471 44k INFO ====> Epoch: 2502, cost 16.40 s
2023-12-18 12:37:30,688 44k INFO ====> Epoch: 2503, cost 16.22 s
2023-12-18 12:37:47,038 44k INFO ====> Epoch: 2504, cost 16.35 s
2023-12-18 12:38:03,328 44k INFO ====> Epoch: 2505, cost 16.29 s
2023-12-18 12:38:18,895 44k INFO Train Epoch: 2506 [93%]
2023-12-18 12:38:18,895 44k INFO Losses: [1.4508090019226074, 3.8603355884552, 15.537680625915527, 21.403926849365234, 0.5179211497306824], step: 37600, lr: 7.309614273629596e-05, reference_loss: 42.77067565917969
2023-12-18 12:38:24,131 44k INFO Saving model and optimizer state at iteration 2506 to ./logs\44k\G_37600.pth
2023-12-18 12:38:25,301 44k INFO Saving model and optimizer state at iteration 2506 to ./logs\44k\D_37600.pth
2023-12-18 12:38:32,097 44k INFO .. Free up space by deleting ckpt ./logs\44k\G_29600.pth
2023-12-18 12:38:32,097 44k INFO .. Free up space by deleting ckpt ./logs\44k\D_29600.pth
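The save/delete pairs above imply a fixed rotation: checkpoints land every 800 steps (36800, 37600, 38400, ...) and each save removes the pair from 8000 steps earlier (G_37600 saved, G_29600 deleted), i.e. ten saves of history are kept on disk. A sketch of that policy; the function name and constants are illustrative, inferred from these log lines rather than taken from so-vits-svc's actual API:

```python
# Sketch of the rotation implied by the save/delete pairs above:
# a checkpoint is written every SAVE_INTERVAL steps and the copy
# KEEP_N saves older is removed (36800 saved -> 28800 deleted, etc.).
import os

SAVE_INTERVAL = 800   # saves land on steps 36800, 37600, 38400, ...
KEEP_N = 10           # 36800 - 28800 == 10 * 800

def rotate_checkpoints(log_dir: str, step: int) -> None:
    """Delete the G/D checkpoints KEEP_N save intervals behind `step`."""
    old_step = step - KEEP_N * SAVE_INTERVAL
    for prefix in ("G", "D"):
        old_path = os.path.join(log_dir, f"{prefix}_{old_step}.pth")
        if old_step > 0 and os.path.exists(old_path):
            os.remove(old_path)

# e.g. rotate_checkpoints("./logs/44k", 38400) would remove G_30400.pth
# and D_30400.pth once G_38400.pth/D_38400.pth are on disk.
```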
2023-12-18 12:38:32,697 44k INFO ====> Epoch: 2506, cost 29.37 s
2023-12-18 12:38:50,187 44k INFO ====> Epoch: 2507, cost 17.49 s
2023-12-18 12:39:06,334 44k INFO ====> Epoch: 2508, cost 16.15 s
2023-12-18 12:39:22,498 44k INFO ====> Epoch: 2509, cost 16.16 s
2023-12-18 12:39:38,786 44k INFO ====> Epoch: 2510, cost 16.29 s
2023-12-18 12:39:54,962 44k INFO ====> Epoch: 2511, cost 16.18 s
2023-12-18 12:40:11,575 44k INFO ====> Epoch: 2512, cost 16.61 s
2023-12-18 12:40:27,896 44k INFO ====> Epoch: 2513, cost 16.32 s
2023-12-18 12:40:44,325 44k INFO ====> Epoch: 2514, cost 16.43 s
2023-12-18 12:41:00,600 44k INFO ====> Epoch: 2515, cost 16.27 s
2023-12-18 12:41:16,760 44k INFO ====> Epoch: 2516, cost 16.16 s
2023-12-18 12:41:33,158 44k INFO ====> Epoch: 2517, cost 16.40 s
2023-12-18 12:41:49,378 44k INFO ====> Epoch: 2518, cost 16.22 s
2023-12-18 12:42:05,775 44k INFO ====> Epoch: 2519, cost 16.40 s
2023-12-18 12:42:15,381 44k INFO Train Epoch: 2520 [27%]
2023-12-18 12:42:15,381 44k INFO Losses: [1.7540324926376343, 3.3329250812530518, 13.909547805786133, 21.25723648071289, 0.3658987879753113], step: 37800, lr: 7.296832836813642e-05, reference_loss: 40.6196403503418
2023-12-18 12:42:22,544 44k INFO ====> Epoch: 2520, cost 16.77 s
2023-12-18 12:42:38,682 44k INFO ====> Epoch: 2521, cost 16.14 s
2023-12-18 12:42:54,852 44k INFO ====> Epoch: 2522, cost 16.17 s
2023-12-18 12:43:11,127 44k INFO ====> Epoch: 2523, cost 16.27 s
2023-12-18 12:43:27,511 44k INFO ====> Epoch: 2524, cost 16.38 s
2023-12-18 12:43:43,703 44k INFO ====> Epoch: 2525, cost 16.19 s
2023-12-18 12:43:59,961 44k INFO ====> Epoch: 2526, cost 16.26 s
2023-12-18 12:44:16,398 44k INFO ====> Epoch: 2527, cost 16.44 s
2023-12-18 12:44:32,551 44k INFO ====> Epoch: 2528, cost 16.15 s
2023-12-18 12:44:48,877 44k INFO ====> Epoch: 2529, cost 16.33 s
2023-12-18 12:45:05,082 44k INFO ====> Epoch: 2530, cost 16.21 s
2023-12-18 12:45:21,328 44k INFO ====> Epoch: 2531, cost 16.25 s
2023-12-18 12:45:37,525 44k INFO ====> Epoch: 2532, cost 16.20 s
2023-12-18 12:45:50,447 44k INFO Train Epoch: 2533 [60%]
2023-12-18 12:45:50,447 44k INFO Losses: [1.615832805633545, 3.475477457046509, 12.555639266967773, 20.304550170898438, 0.32248374819755554], step: 38000, lr: 7.284984372394143e-05, reference_loss: 38.273983001708984
2023-12-18 12:45:54,482 44k INFO ====> Epoch: 2533, cost 16.96 s
2023-12-18 12:46:10,838 44k INFO ====> Epoch: 2534, cost 16.36 s
2023-12-18 12:46:27,091 44k INFO ====> Epoch: 2535, cost 16.25 s
2023-12-18 12:46:43,827 44k INFO ====> Epoch: 2536, cost 16.74 s
2023-12-18 12:47:00,121 44k INFO ====> Epoch: 2537, cost 16.29 s
2023-12-18 12:47:16,351 44k INFO ====> Epoch: 2538, cost 16.23 s
2023-12-18 12:47:32,592 44k INFO ====> Epoch: 2539, cost 16.24 s
2023-12-18 12:47:48,887 44k INFO ====> Epoch: 2540, cost 16.30 s
2023-12-18 12:48:05,131 44k INFO ====> Epoch: 2541, cost 16.24 s
2023-12-18 12:48:21,327 44k INFO ====> Epoch: 2542, cost 16.20 s
2023-12-18 12:48:37,610 44k INFO ====> Epoch: 2543, cost 16.28 s
2023-12-18 12:48:53,884 44k INFO ====> Epoch: 2544, cost 16.27 s
2023-12-18 12:49:10,367 44k INFO ====> Epoch: 2545, cost 16.48 s
2023-12-18 12:49:26,031 44k INFO Train Epoch: 2546 [93%]
2023-12-18 12:49:26,041 44k INFO Losses: [1.3140060901641846, 3.946552276611328, 11.630853652954102, 17.980533599853516, 0.9035536646842957], step: 38200, lr: 7.273155147295627e-05, reference_loss: 35.77549743652344
2023-12-18 12:49:27,137 44k INFO ====> Epoch: 2546, cost 16.77 s
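The reference_loss in each Losses line is simply the sum of the five bracketed terms: for the epoch 2546 entry above the terms sum to 35.775499..., matching the logged 35.77549743652344 to float32 precision. A quick check; the per-term labels are the usual so-vits-svc ordering (disc, gen, fm, mel, kl) and should be treated as an assumption, since the log itself does not name them:

```python
# Sketch: verify that reference_loss is the sum of the five logged
# loss terms. Labels (disc, gen, fm, mel, kl) are assumed, not stated
# anywhere in this log.
losses = [1.3140060901641846, 3.946552276611328, 11.630853652954102,
          17.980533599853516, 0.9035536646842957]  # Train Epoch 2546
print(sum(losses))  # ~35.7755, vs logged reference_loss 35.77549743652344
# The tiny discrepancy comes from float32 accumulation in the trainer.
```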
2023-12-18 12:49:43,398 44k INFO ====> Epoch: 2547, cost 16.26 s
2023-12-18 12:49:59,728 44k INFO ====> Epoch: 2548, cost 16.33 s
2023-12-18 12:50:15,991 44k INFO ====> Epoch: 2549, cost 16.26 s
2023-12-18 12:50:32,231 44k INFO ====> Epoch: 2550, cost 16.24 s
2023-12-18 12:50:48,649 44k INFO ====> Epoch: 2551, cost 16.42 s
2023-12-18 12:51:04,875 44k INFO ====> Epoch: 2552, cost 16.23 s
2023-12-18 12:51:21,114 44k INFO ====> Epoch: 2553, cost 16.24 s
2023-12-18 12:51:37,482 44k INFO ====> Epoch: 2554, cost 16.37 s
2023-12-18 12:51:53,859 44k INFO ====> Epoch: 2555, cost 16.38 s
2023-12-18 12:52:10,174 44k INFO ====> Epoch: 2556, cost 16.32 s
2023-12-18 12:52:26,552 44k INFO ====> Epoch: 2557, cost 16.38 s
2023-12-18 12:52:42,954 44k INFO ====> Epoch: 2558, cost 16.40 s
2023-12-18 12:52:59,128 44k INFO ====> Epoch: 2559, cost 16.17 s
2023-12-18 12:53:08,805 44k INFO Train Epoch: 2560 [27%]
2023-12-18 12:53:08,805 44k INFO Losses: [1.6487252712249756, 3.7455790042877197, 12.684045791625977, 21.380022048950195, 0.347027450799942], step: 38400, lr: 7.260437462136348e-05, reference_loss: 39.80540084838867
2023-12-18 12:53:14,033 44k INFO Saving model and optimizer state at iteration 2560 to ./logs\44k\G_38400.pth
2023-12-18 12:53:15,253 44k INFO Saving model and optimizer state at iteration 2560 to ./logs\44k\D_38400.pth
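With one entry per line, the Losses rows are easy to extract for monitoring: across steps 36800 to 38400 the reference_loss oscillates between roughly 34.8 and 42.8 with no clear downward trend. A hypothetical parsing helper for tabulating those values; the file name and function are illustrative, and it assumes the one-entry-per-line layout used above:

```python
# Sketch: pull (step, lr, reference_loss) out of the training log so the
# trend over steps can be tabulated or plotted.
import re

PATTERN = re.compile(
    r"step: (\d+), lr: ([\d.e-]+), reference_loss: ([\d.]+)")

def parse_train_log(path: str):
    """Yield (step, lr, reference_loss) from each 'Losses:' line."""
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            m = PATTERN.search(line)
            if m:
                yield int(m.group(1)), float(m.group(2)), float(m.group(3))

# Usage (hypothetical file name):
# for step, lr, ref in parse_train_log("train.log"):
#     print(step, lr, ref)  # 36800 7.3582e-05 37.207..., 37000 ..., etc.
```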