Parappanon: Uploaded Kenma model (commit 47bd6f4)
2023-05-08 23:16:29,434 kenma_eng INFO {'train': {'log_interval': 200, 'seed': 1234, 'epochs': 20000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 4, 'fp16_run': True, 'lr_decay': 0.999875, 'segment_size': 12800, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0}, 'data': {'max_wav_value': 32768.0, 'sampling_rate': 40000, 'filter_length': 2048, 'hop_length': 400, 'win_length': 2048, 'n_mel_channels': 125, 'mel_fmin': 0.0, 'mel_fmax': None, 'training_files': './logs/kenma_eng/filelist.txt'}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [10, 10, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4], 'use_spectral_norm': False, 'gin_channels': 256, 'spk_embed_dim': 109}, 'model_dir': './logs/kenma_eng', 'experiment_dir': './logs/kenma_eng', 'save_every_epoch': 5, 'name': 'kenma_eng', 'total_epoch': 300, 'pretrainG': 'pretrained/G40k.pth', 'pretrainD': 'pretrained/D40k.pth', 'gpus': '0', 'sample_rate': '40k', 'if_f0': 0, 'if_latest': 1, 'if_cache_data_in_gpu': 1}
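The config above sets `learning_rate: 0.0001` and `lr_decay: 0.999875`, and the bracketed `[step, lr]` lines later in the log (e.g. `9.993751562304699e-05` at epoch 6) are consistent with one multiplicative decay per epoch. A minimal sketch of that schedule, assuming PyTorch `ExponentialLR`-style semantics (an assumption; the exact scheduler is not shown in this log):

```python
# Sketch of the LR schedule implied by the config above.
# Assumption: one multiplicative decay per epoch (ExponentialLR-style).
LR0 = 1e-4        # 'learning_rate' from the config
DECAY = 0.999875  # 'lr_decay' from the config

def lr_at_epoch(n: int) -> float:
    """Learning rate after n completed decay steps."""
    return LR0 * DECAY ** n

# Epoch 6's log line reports the rate after five decays,
# approximately 9.9937516e-05:
print(f"{lr_at_epoch(5):.15e}")
```

At this decay rate the learning rate falls only about 1.2% over the first 100 epochs, which matches the near-constant rates printed throughout the log.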
2023-05-08 23:16:29,434 kenma_eng WARNING /home/parappa/Retrieval-based-Voice-Conversion-WebUI/train is not a git repository, therefore hash value comparison will be ignored.
2023-05-08 23:16:30,257 kenma_eng INFO loaded pretrained pretrained/G40k.pth pretrained/D40k.pth
2023-05-08 23:16:33,875 kenma_eng INFO Train Epoch: 1 [0%]
2023-05-08 23:16:33,875 kenma_eng INFO [0, 0.0001]
2023-05-08 23:16:33,875 kenma_eng INFO loss_disc=2.865, loss_gen=2.661, loss_fm=12.986,loss_mel=27.710, loss_kl=5.000
2023-05-08 23:16:43,850 kenma_eng INFO ====> Epoch: 1
2023-05-08 23:16:51,519 kenma_eng INFO ====> Epoch: 2
2023-05-08 23:16:59,170 kenma_eng INFO ====> Epoch: 3
2023-05-08 23:17:06,842 kenma_eng INFO ====> Epoch: 4
2023-05-08 23:17:14,520 kenma_eng INFO Saving model and optimizer state at epoch 5 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:17:14,928 kenma_eng INFO Saving model and optimizer state at epoch 5 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:17:15,398 kenma_eng INFO ====> Epoch: 5
2023-05-08 23:17:16,579 kenma_eng INFO Train Epoch: 6 [64%]
2023-05-08 23:17:16,581 kenma_eng INFO [200, 9.993751562304699e-05]
2023-05-08 23:17:16,581 kenma_eng INFO loss_disc=3.256, loss_gen=2.900, loss_fm=14.750,loss_mel=22.623, loss_kl=2.727
2023-05-08 23:17:23,363 kenma_eng INFO ====> Epoch: 6
2023-05-08 23:17:31,038 kenma_eng INFO ====> Epoch: 7
2023-05-08 23:17:38,720 kenma_eng INFO ====> Epoch: 8
2023-05-08 23:17:46,403 kenma_eng INFO ====> Epoch: 9
2023-05-08 23:17:54,083 kenma_eng INFO Saving model and optimizer state at epoch 10 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:17:54,588 kenma_eng INFO Saving model and optimizer state at epoch 10 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:17:55,221 kenma_eng INFO ====> Epoch: 10
2023-05-08 23:17:57,412 kenma_eng INFO Train Epoch: 11 [87%]
2023-05-08 23:17:57,414 kenma_eng INFO [400, 9.987507028906759e-05]
2023-05-08 23:17:57,414 kenma_eng INFO loss_disc=2.973, loss_gen=2.690, loss_fm=9.204,loss_mel=17.433, loss_kl=1.308
2023-05-08 23:18:03,121 kenma_eng INFO ====> Epoch: 11
2023-05-08 23:18:10,833 kenma_eng INFO ====> Epoch: 12
2023-05-08 23:18:18,570 kenma_eng INFO ====> Epoch: 13
2023-05-08 23:18:26,316 kenma_eng INFO ====> Epoch: 14
2023-05-08 23:18:34,038 kenma_eng INFO Saving model and optimizer state at epoch 15 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:18:34,479 kenma_eng INFO Saving model and optimizer state at epoch 15 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:18:35,055 kenma_eng INFO ====> Epoch: 15
2023-05-08 23:18:38,225 kenma_eng INFO Train Epoch: 16 [64%]
2023-05-08 23:18:38,227 kenma_eng INFO [600, 9.981266397366609e-05]
2023-05-08 23:18:38,233 kenma_eng INFO loss_disc=2.721, loss_gen=3.351, loss_fm=12.137,loss_mel=21.595, loss_kl=2.537
2023-05-08 23:18:43,061 kenma_eng INFO ====> Epoch: 16
2023-05-08 23:18:50,809 kenma_eng INFO ====> Epoch: 17
2023-05-08 23:18:58,553 kenma_eng INFO ====> Epoch: 18
2023-05-08 23:19:06,305 kenma_eng INFO ====> Epoch: 19
2023-05-08 23:19:14,071 kenma_eng INFO Saving model and optimizer state at epoch 20 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:19:14,509 kenma_eng INFO Saving model and optimizer state at epoch 20 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:19:15,088 kenma_eng INFO ====> Epoch: 20
2023-05-08 23:19:19,247 kenma_eng INFO Train Epoch: 21 [3%]
2023-05-08 23:19:19,249 kenma_eng INFO [800, 9.975029665246193e-05]
2023-05-08 23:19:19,255 kenma_eng INFO loss_disc=2.381, loss_gen=3.654, loss_fm=16.410,loss_mel=22.015, loss_kl=1.045
2023-05-08 23:19:23,079 kenma_eng INFO ====> Epoch: 21
2023-05-08 23:19:30,836 kenma_eng INFO ====> Epoch: 22
2023-05-08 23:19:38,610 kenma_eng INFO ====> Epoch: 23
2023-05-08 23:19:46,392 kenma_eng INFO ====> Epoch: 24
2023-05-08 23:19:54,174 kenma_eng INFO Saving model and optimizer state at epoch 25 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:19:54,687 kenma_eng INFO Saving model and optimizer state at epoch 25 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:19:55,263 kenma_eng INFO ====> Epoch: 25
2023-05-08 23:20:00,421 kenma_eng INFO Train Epoch: 26 [79%]
2023-05-08 23:20:00,423 kenma_eng INFO [1000, 9.968796830108985e-05]
2023-05-08 23:20:00,424 kenma_eng INFO loss_disc=2.815, loss_gen=2.932, loss_fm=13.485,loss_mel=22.573, loss_kl=2.084
2023-05-08 23:20:03,231 kenma_eng INFO ====> Epoch: 26
2023-05-08 23:20:11,057 kenma_eng INFO ====> Epoch: 27
2023-05-08 23:20:18,894 kenma_eng INFO ====> Epoch: 28
2023-05-08 23:20:26,705 kenma_eng INFO ====> Epoch: 29
2023-05-08 23:20:34,555 kenma_eng INFO Saving model and optimizer state at epoch 30 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:20:34,998 kenma_eng INFO Saving model and optimizer state at epoch 30 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:20:35,579 kenma_eng INFO ====> Epoch: 30
2023-05-08 23:20:41,757 kenma_eng INFO Train Epoch: 31 [95%]
2023-05-08 23:20:41,759 kenma_eng INFO [1200, 9.962567889519979e-05]
2023-05-08 23:20:41,759 kenma_eng INFO loss_disc=2.354, loss_gen=3.439, loss_fm=14.556,loss_mel=21.892, loss_kl=2.104
2023-05-08 23:20:43,557 kenma_eng INFO ====> Epoch: 31
2023-05-08 23:20:51,393 kenma_eng INFO ====> Epoch: 32
2023-05-08 23:20:59,229 kenma_eng INFO ====> Epoch: 33
2023-05-08 23:21:07,063 kenma_eng INFO ====> Epoch: 34
2023-05-08 23:21:14,902 kenma_eng INFO Saving model and optimizer state at epoch 35 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:21:15,345 kenma_eng INFO Saving model and optimizer state at epoch 35 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:21:16,004 kenma_eng INFO ====> Epoch: 35
2023-05-08 23:21:23,238 kenma_eng INFO Train Epoch: 36 [90%]
2023-05-08 23:21:23,240 kenma_eng INFO [1400, 9.956342841045691e-05]
2023-05-08 23:21:23,240 kenma_eng INFO loss_disc=2.838, loss_gen=3.338, loss_fm=11.191,loss_mel=22.750, loss_kl=1.916
2023-05-08 23:21:24,048 kenma_eng INFO ====> Epoch: 36
2023-05-08 23:21:31,856 kenma_eng INFO ====> Epoch: 37
2023-05-08 23:21:39,717 kenma_eng INFO ====> Epoch: 38
2023-05-08 23:21:47,559 kenma_eng INFO ====> Epoch: 39
2023-05-08 23:21:55,380 kenma_eng INFO Saving model and optimizer state at epoch 40 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:21:55,826 kenma_eng INFO Saving model and optimizer state at epoch 40 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:21:56,409 kenma_eng INFO ====> Epoch: 40
2023-05-08 23:22:04,246 kenma_eng INFO ====> Epoch: 41
2023-05-08 23:22:04,648 kenma_eng INFO Train Epoch: 42 [77%]
2023-05-08 23:22:04,650 kenma_eng INFO [1600, 9.948877917043875e-05]
2023-05-08 23:22:04,650 kenma_eng INFO loss_disc=2.622, loss_gen=3.326, loss_fm=13.962,loss_mel=21.530, loss_kl=1.679
2023-05-08 23:22:12,307 kenma_eng INFO ====> Epoch: 42
2023-05-08 23:22:20,153 kenma_eng INFO ====> Epoch: 43
2023-05-08 23:22:27,975 kenma_eng INFO ====> Epoch: 44
2023-05-08 23:22:35,867 kenma_eng INFO Saving model and optimizer state at epoch 45 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:22:36,326 kenma_eng INFO Saving model and optimizer state at epoch 45 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:22:36,907 kenma_eng INFO ====> Epoch: 45
2023-05-08 23:22:44,804 kenma_eng INFO ====> Epoch: 46
2023-05-08 23:22:46,210 kenma_eng INFO Train Epoch: 47 [41%]
2023-05-08 23:22:46,212 kenma_eng INFO [1800, 9.942661422663591e-05]
2023-05-08 23:22:46,212 kenma_eng INFO loss_disc=2.595, loss_gen=3.967, loss_fm=11.861,loss_mel=20.913, loss_kl=1.444
2023-05-08 23:22:52,914 kenma_eng INFO ====> Epoch: 47
2023-05-08 23:23:00,743 kenma_eng INFO ====> Epoch: 48
2023-05-08 23:23:08,617 kenma_eng INFO ====> Epoch: 49
2023-05-08 23:23:16,504 kenma_eng INFO Saving model and optimizer state at epoch 50 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:23:16,945 kenma_eng INFO Saving model and optimizer state at epoch 50 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:23:17,525 kenma_eng INFO ====> Epoch: 50
2023-05-08 23:23:25,389 kenma_eng INFO ====> Epoch: 51
2023-05-08 23:23:27,806 kenma_eng INFO Train Epoch: 52 [41%]
2023-05-08 23:23:27,809 kenma_eng INFO [2000, 9.936448812621091e-05]
2023-05-08 23:23:27,809 kenma_eng INFO loss_disc=2.828, loss_gen=3.041, loss_fm=12.945,loss_mel=22.283, loss_kl=2.018
2023-05-08 23:23:33,516 kenma_eng INFO ====> Epoch: 52
2023-05-08 23:23:41,487 kenma_eng INFO ====> Epoch: 53
2023-05-08 23:23:49,359 kenma_eng INFO ====> Epoch: 54
2023-05-08 23:23:57,267 kenma_eng INFO Saving model and optimizer state at epoch 55 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:23:57,714 kenma_eng INFO Saving model and optimizer state at epoch 55 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:23:58,294 kenma_eng INFO ====> Epoch: 55
2023-05-08 23:24:06,121 kenma_eng INFO ====> Epoch: 56
2023-05-08 23:24:09,605 kenma_eng INFO Train Epoch: 57 [38%]
2023-05-08 23:24:09,607 kenma_eng INFO [2200, 9.930240084489267e-05]
2023-05-08 23:24:09,607 kenma_eng INFO loss_disc=2.341, loss_gen=3.577, loss_fm=14.804,loss_mel=21.295, loss_kl=0.870
2023-05-08 23:24:14,264 kenma_eng INFO ====> Epoch: 57
2023-05-08 23:24:22,253 kenma_eng INFO ====> Epoch: 58
2023-05-08 23:24:30,207 kenma_eng INFO ====> Epoch: 59
2023-05-08 23:24:38,030 kenma_eng INFO Saving model and optimizer state at epoch 60 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:24:38,477 kenma_eng INFO Saving model and optimizer state at epoch 60 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:24:39,064 kenma_eng INFO ====> Epoch: 60
2023-05-08 23:24:47,032 kenma_eng INFO ====> Epoch: 61
2023-05-08 23:24:51,555 kenma_eng INFO Train Epoch: 62 [33%]
2023-05-08 23:24:51,557 kenma_eng INFO [2400, 9.924035235842533e-05]
2023-05-08 23:24:51,557 kenma_eng INFO loss_disc=2.740, loss_gen=3.562, loss_fm=12.825,loss_mel=20.778, loss_kl=1.276
2023-05-08 23:24:55,228 kenma_eng INFO ====> Epoch: 62
2023-05-08 23:25:03,207 kenma_eng INFO ====> Epoch: 63
2023-05-08 23:25:11,196 kenma_eng INFO ====> Epoch: 64
2023-05-08 23:25:19,180 kenma_eng INFO Saving model and optimizer state at epoch 65 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:25:19,694 kenma_eng INFO Saving model and optimizer state at epoch 65 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:25:20,280 kenma_eng INFO ====> Epoch: 65
2023-05-08 23:25:28,192 kenma_eng INFO ====> Epoch: 66
2023-05-08 23:25:33,727 kenma_eng INFO Train Epoch: 67 [0%]
2023-05-08 23:25:33,729 kenma_eng INFO [2600, 9.917834264256819e-05]
2023-05-08 23:25:33,729 kenma_eng INFO loss_disc=2.601, loss_gen=3.035, loss_fm=10.245,loss_mel=21.210, loss_kl=2.084
2023-05-08 23:25:36,402 kenma_eng INFO ====> Epoch: 67
2023-05-08 23:25:44,389 kenma_eng INFO ====> Epoch: 68
2023-05-08 23:25:52,373 kenma_eng INFO ====> Epoch: 69
2023-05-08 23:26:00,358 kenma_eng INFO Saving model and optimizer state at epoch 70 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:26:00,805 kenma_eng INFO Saving model and optimizer state at epoch 70 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:26:01,399 kenma_eng INFO ====> Epoch: 70
2023-05-08 23:26:09,359 kenma_eng INFO ====> Epoch: 71
2023-05-08 23:26:15,915 kenma_eng INFO Train Epoch: 72 [8%]
2023-05-08 23:26:15,917 kenma_eng INFO [2800, 9.911637167309565e-05]
2023-05-08 23:26:15,917 kenma_eng INFO loss_disc=2.281, loss_gen=3.312, loss_fm=16.278,loss_mel=20.006, loss_kl=0.428
2023-05-08 23:26:17,543 kenma_eng INFO ====> Epoch: 72
2023-05-08 23:26:25,536 kenma_eng INFO ====> Epoch: 73
2023-05-08 23:26:33,529 kenma_eng INFO ====> Epoch: 74
2023-05-08 23:26:41,518 kenma_eng INFO Saving model and optimizer state at epoch 75 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:26:42,031 kenma_eng INFO Saving model and optimizer state at epoch 75 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:26:42,634 kenma_eng INFO ====> Epoch: 75
2023-05-08 23:26:50,526 kenma_eng INFO ====> Epoch: 76
2023-05-08 23:26:58,105 kenma_eng INFO Train Epoch: 77 [56%]
2023-05-08 23:26:58,107 kenma_eng INFO [3000, 9.905443942579728e-05]
2023-05-08 23:26:58,107 kenma_eng INFO loss_disc=2.649, loss_gen=3.208, loss_fm=13.554,loss_mel=20.792, loss_kl=1.335
2023-05-08 23:26:58,734 kenma_eng INFO ====> Epoch: 77
2023-05-08 23:27:06,700 kenma_eng INFO ====> Epoch: 78
2023-05-08 23:27:14,694 kenma_eng INFO ====> Epoch: 79
2023-05-08 23:27:22,675 kenma_eng INFO Saving model and optimizer state at epoch 80 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:27:23,121 kenma_eng INFO Saving model and optimizer state at epoch 80 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:27:23,707 kenma_eng INFO ====> Epoch: 80
2023-05-08 23:27:31,694 kenma_eng INFO ====> Epoch: 81
2023-05-08 23:27:39,679 kenma_eng INFO ====> Epoch: 82
2023-05-08 23:27:40,296 kenma_eng INFO Train Epoch: 83 [90%]
2023-05-08 23:27:40,297 kenma_eng INFO [3200, 9.89801718082432e-05]
2023-05-08 23:27:40,298 kenma_eng INFO loss_disc=2.784, loss_gen=3.327, loss_fm=13.070,loss_mel=20.826, loss_kl=1.661
2023-05-08 23:27:47,887 kenma_eng INFO ====> Epoch: 83
2023-05-08 23:27:55,860 kenma_eng INFO ====> Epoch: 84
2023-05-08 23:28:03,843 kenma_eng INFO Saving model and optimizer state at epoch 85 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:28:04,357 kenma_eng INFO Saving model and optimizer state at epoch 85 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:28:04,947 kenma_eng INFO ====> Epoch: 85
2023-05-08 23:28:12,858 kenma_eng INFO ====> Epoch: 86
2023-05-08 23:28:20,842 kenma_eng INFO ====> Epoch: 87
2023-05-08 23:28:22,477 kenma_eng INFO Train Epoch: 88 [69%]
2023-05-08 23:28:22,478 kenma_eng INFO [3400, 9.891832466458178e-05]
2023-05-08 23:28:22,478 kenma_eng INFO loss_disc=2.293, loss_gen=3.493, loss_fm=13.095,loss_mel=20.620, loss_kl=1.292
2023-05-08 23:28:29,030 kenma_eng INFO ====> Epoch: 88
2023-05-08 23:28:37,020 kenma_eng INFO ====> Epoch: 89
2023-05-08 23:28:45,012 kenma_eng INFO Saving model and optimizer state at epoch 90 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:28:45,460 kenma_eng INFO Saving model and optimizer state at epoch 90 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:28:46,058 kenma_eng INFO ====> Epoch: 90
2023-05-08 23:28:54,021 kenma_eng INFO ====> Epoch: 91
2023-05-08 23:29:02,004 kenma_eng INFO ====> Epoch: 92
2023-05-08 23:29:04,663 kenma_eng INFO Train Epoch: 93 [54%]
2023-05-08 23:29:04,665 kenma_eng INFO [3600, 9.885651616572276e-05]
2023-05-08 23:29:04,665 kenma_eng INFO loss_disc=2.701, loss_gen=3.218, loss_fm=14.823,loss_mel=21.348, loss_kl=1.524
2023-05-08 23:29:10,207 kenma_eng INFO ====> Epoch: 93
2023-05-08 23:29:18,185 kenma_eng INFO ====> Epoch: 94
2023-05-08 23:29:26,168 kenma_eng INFO Saving model and optimizer state at epoch 95 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:29:26,687 kenma_eng INFO Saving model and optimizer state at epoch 95 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:29:27,286 kenma_eng INFO ====> Epoch: 95
2023-05-08 23:29:35,168 kenma_eng INFO ====> Epoch: 96
2023-05-08 23:29:43,162 kenma_eng INFO ====> Epoch: 97
2023-05-08 23:29:46,855 kenma_eng INFO Train Epoch: 98 [77%]
2023-05-08 23:29:46,857 kenma_eng INFO [3800, 9.879474628751914e-05]
2023-05-08 23:29:46,857 kenma_eng INFO loss_disc=2.413, loss_gen=3.459, loss_fm=11.920,loss_mel=20.858, loss_kl=1.785
2023-05-08 23:29:51,362 kenma_eng INFO ====> Epoch: 98
2023-05-08 23:29:59,345 kenma_eng INFO ====> Epoch: 99
2023-05-08 23:30:07,336 kenma_eng INFO Saving model and optimizer state at epoch 100 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:30:07,786 kenma_eng INFO Saving model and optimizer state at epoch 100 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:30:08,374 kenma_eng INFO ====> Epoch: 100
2023-05-08 23:30:16,340 kenma_eng INFO ====> Epoch: 101
2023-05-08 23:30:24,332 kenma_eng INFO ====> Epoch: 102
2023-05-08 23:30:29,044 kenma_eng INFO Train Epoch: 103 [36%]
2023-05-08 23:30:29,046 kenma_eng INFO [4000, 9.873301500583906e-05]
2023-05-08 23:30:29,046 kenma_eng INFO loss_disc=2.663, loss_gen=3.403, loss_fm=14.288,loss_mel=20.431, loss_kl=1.184
2023-05-08 23:30:32,543 kenma_eng INFO ====> Epoch: 103
2023-05-08 23:30:40,508 kenma_eng INFO ====> Epoch: 104
2023-05-08 23:30:48,502 kenma_eng INFO Saving model and optimizer state at epoch 105 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:30:49,017 kenma_eng INFO Saving model and optimizer state at epoch 105 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:30:49,617 kenma_eng INFO ====> Epoch: 105
2023-05-08 23:30:57,507 kenma_eng INFO ====> Epoch: 106
2023-05-08 23:31:05,520 kenma_eng INFO ====> Epoch: 107
2023-05-08 23:31:11,235 kenma_eng INFO Train Epoch: 108 [3%]
2023-05-08 23:31:11,237 kenma_eng INFO [4200, 9.867132229656573e-05]
2023-05-08 23:31:11,237 kenma_eng INFO loss_disc=1.916, loss_gen=3.703, loss_fm=16.316,loss_mel=18.907, loss_kl=1.553
2023-05-08 23:31:13,688 kenma_eng INFO ====> Epoch: 108
2023-05-08 23:31:21,677 kenma_eng INFO ====> Epoch: 109
2023-05-08 23:31:29,658 kenma_eng INFO Saving model and optimizer state at epoch 110 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:31:30,174 kenma_eng INFO Saving model and optimizer state at epoch 110 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:31:30,771 kenma_eng INFO ====> Epoch: 110
2023-05-08 23:31:38,672 kenma_eng INFO ====> Epoch: 111
2023-05-08 23:31:46,661 kenma_eng INFO ====> Epoch: 112
2023-05-08 23:31:53,418 kenma_eng INFO Train Epoch: 113 [10%]
2023-05-08 23:31:53,420 kenma_eng INFO [4400, 9.86096681355974e-05]
2023-05-08 23:31:53,420 kenma_eng INFO loss_disc=1.807, loss_gen=4.457, loss_fm=15.467,loss_mel=19.595, loss_kl=1.601
2023-05-08 23:31:54,848 kenma_eng INFO ====> Epoch: 113
2023-05-08 23:32:02,841 kenma_eng INFO ====> Epoch: 114
2023-05-08 23:32:10,823 kenma_eng INFO Saving model and optimizer state at epoch 115 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:32:11,338 kenma_eng INFO Saving model and optimizer state at epoch 115 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:32:11,933 kenma_eng INFO ====> Epoch: 115
2023-05-08 23:32:19,838 kenma_eng INFO ====> Epoch: 116
2023-05-08 23:32:27,835 kenma_eng INFO ====> Epoch: 117
2023-05-08 23:32:35,603 kenma_eng INFO Train Epoch: 118 [46%]
2023-05-08 23:32:35,605 kenma_eng INFO [4600, 9.854805249884741e-05]
2023-05-08 23:32:35,605 kenma_eng INFO loss_disc=1.976, loss_gen=4.144, loss_fm=14.899,loss_mel=20.415, loss_kl=0.905
2023-05-08 23:32:36,017 kenma_eng INFO ====> Epoch: 118
2023-05-08 23:32:44,002 kenma_eng INFO ====> Epoch: 119
2023-05-08 23:32:51,993 kenma_eng INFO Saving model and optimizer state at epoch 120 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:32:52,439 kenma_eng INFO Saving model and optimizer state at epoch 120 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:32:53,035 kenma_eng INFO ====> Epoch: 120
2023-05-08 23:33:01,001 kenma_eng INFO ====> Epoch: 121
2023-05-08 23:33:08,988 kenma_eng INFO ====> Epoch: 122
2023-05-08 23:33:16,979 kenma_eng INFO ====> Epoch: 123
2023-05-08 23:33:17,795 kenma_eng INFO Train Epoch: 124 [15%]
2023-05-08 23:33:17,797 kenma_eng INFO [4800, 9.847416455282387e-05]
2023-05-08 23:33:17,797 kenma_eng INFO loss_disc=2.059, loss_gen=3.385, loss_fm=13.991,loss_mel=20.368, loss_kl=1.581
2023-05-08 23:33:25,166 kenma_eng INFO ====> Epoch: 124
2023-05-08 23:33:33,154 kenma_eng INFO Saving model and optimizer state at epoch 125 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:33:33,602 kenma_eng INFO Saving model and optimizer state at epoch 125 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:33:34,191 kenma_eng INFO ====> Epoch: 125
2023-05-08 23:33:42,168 kenma_eng INFO ====> Epoch: 126
2023-05-08 23:33:50,149 kenma_eng INFO ====> Epoch: 127
2023-05-08 23:33:58,139 kenma_eng INFO ====> Epoch: 128
2023-05-08 23:33:59,983 kenma_eng INFO Train Epoch: 129 [21%]
2023-05-08 23:33:59,985 kenma_eng INFO [5000, 9.841263358464336e-05]
2023-05-08 23:33:59,985 kenma_eng INFO loss_disc=2.190, loss_gen=4.590, loss_fm=14.053,loss_mel=20.084, loss_kl=1.679
2023-05-08 23:34:06,332 kenma_eng INFO ====> Epoch: 129
2023-05-08 23:34:14,316 kenma_eng INFO Saving model and optimizer state at epoch 130 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:34:14,754 kenma_eng INFO Saving model and optimizer state at epoch 130 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:34:15,340 kenma_eng INFO ====> Epoch: 130
2023-05-08 23:34:23,386 kenma_eng INFO ====> Epoch: 131
2023-05-08 23:34:31,310 kenma_eng INFO ====> Epoch: 132
2023-05-08 23:34:39,298 kenma_eng INFO ====> Epoch: 133
2023-05-08 23:34:42,164 kenma_eng INFO Train Epoch: 134 [23%]
2023-05-08 23:34:42,166 kenma_eng INFO [5200, 9.835114106370493e-05]
2023-05-08 23:34:42,166 kenma_eng INFO loss_disc=2.070, loss_gen=3.735, loss_fm=13.641,loss_mel=19.528, loss_kl=1.520
2023-05-08 23:34:47,491 kenma_eng INFO ====> Epoch: 134
2023-05-08 23:34:55,475 kenma_eng INFO Saving model and optimizer state at epoch 135 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:34:55,917 kenma_eng INFO Saving model and optimizer state at epoch 135 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:34:56,507 kenma_eng INFO ====> Epoch: 135
2023-05-08 23:35:04,564 kenma_eng INFO ====> Epoch: 136
2023-05-08 23:35:12,480 kenma_eng INFO ====> Epoch: 137
2023-05-08 23:35:20,467 kenma_eng INFO ====> Epoch: 138
2023-05-08 23:35:24,388 kenma_eng INFO Train Epoch: 139 [23%]
2023-05-08 23:35:24,390 kenma_eng INFO [5400, 9.828968696598508e-05]
2023-05-08 23:35:24,390 kenma_eng INFO loss_disc=2.168, loss_gen=3.409, loss_fm=14.586,loss_mel=18.876, loss_kl=0.873
2023-05-08 23:35:28,651 kenma_eng INFO ====> Epoch: 139
2023-05-08 23:35:36,641 kenma_eng INFO Saving model and optimizer state at epoch 140 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:35:37,082 kenma_eng INFO Saving model and optimizer state at epoch 140 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:35:37,672 kenma_eng INFO ====> Epoch: 140
2023-05-08 23:35:45,661 kenma_eng INFO ====> Epoch: 141
2023-05-08 23:35:53,641 kenma_eng INFO ====> Epoch: 142
2023-05-08 23:36:01,637 kenma_eng INFO ====> Epoch: 143
2023-05-08 23:36:06,547 kenma_eng INFO Train Epoch: 144 [31%]
2023-05-08 23:36:06,549 kenma_eng INFO [5600, 9.822827126747529e-05]
2023-05-08 23:36:06,549 kenma_eng INFO loss_disc=1.980, loss_gen=3.789, loss_fm=14.169,loss_mel=19.720, loss_kl=0.746
2023-05-08 23:36:09,814 kenma_eng INFO ====> Epoch: 144
2023-05-08 23:36:17,808 kenma_eng INFO Saving model and optimizer state at epoch 145 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:36:18,249 kenma_eng INFO Saving model and optimizer state at epoch 145 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:36:18,847 kenma_eng INFO ====> Epoch: 145
2023-05-08 23:36:26,822 kenma_eng INFO ====> Epoch: 146
2023-05-08 23:36:34,812 kenma_eng INFO ====> Epoch: 147
2023-05-08 23:36:42,795 kenma_eng INFO ====> Epoch: 148
2023-05-08 23:36:48,738 kenma_eng INFO Train Epoch: 149 [28%]
2023-05-08 23:36:48,740 kenma_eng INFO [5800, 9.816689394418209e-05]
2023-05-08 23:36:48,740 kenma_eng INFO loss_disc=1.871, loss_gen=4.337, loss_fm=16.396,loss_mel=21.222, loss_kl=1.508
2023-05-08 23:36:50,973 kenma_eng INFO ====> Epoch: 149
2023-05-08 23:36:58,976 kenma_eng INFO Saving model and optimizer state at epoch 150 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:36:59,418 kenma_eng INFO Saving model and optimizer state at epoch 150 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:37:00,082 kenma_eng INFO ====> Epoch: 150
2023-05-08 23:37:07,986 kenma_eng INFO ====> Epoch: 151
2023-05-08 23:37:15,972 kenma_eng INFO ====> Epoch: 152
2023-05-08 23:37:23,958 kenma_eng INFO ====> Epoch: 153
2023-05-08 23:37:30,918 kenma_eng INFO Train Epoch: 154 [15%]
2023-05-08 23:37:30,920 kenma_eng INFO [6000, 9.810555497212693e-05]
2023-05-08 23:37:30,920 kenma_eng INFO loss_disc=2.035, loss_gen=3.760, loss_fm=14.298,loss_mel=19.946, loss_kl=0.859
2023-05-08 23:37:32,146 kenma_eng INFO ====> Epoch: 154
2023-05-08 23:37:40,134 kenma_eng INFO Saving model and optimizer state at epoch 155 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:37:40,581 kenma_eng INFO Saving model and optimizer state at epoch 155 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:37:41,176 kenma_eng INFO ====> Epoch: 155
2023-05-08 23:37:49,137 kenma_eng INFO ====> Epoch: 156
2023-05-08 23:37:57,140 kenma_eng INFO ====> Epoch: 157
2023-05-08 23:38:05,114 kenma_eng INFO ====> Epoch: 158
2023-05-08 23:38:13,105 kenma_eng INFO Train Epoch: 159 [74%]
2023-05-08 23:38:13,107 kenma_eng INFO [6200, 9.804425432734629e-05]
2023-05-08 23:38:13,107 kenma_eng INFO loss_disc=2.096, loss_gen=3.140, loss_fm=12.766,loss_mel=18.966, loss_kl=1.000
2023-05-08 23:38:13,311 kenma_eng INFO ====> Epoch: 159
2023-05-08 23:38:21,298 kenma_eng INFO Saving model and optimizer state at epoch 160 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:38:21,736 kenma_eng INFO Saving model and optimizer state at epoch 160 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:38:22,323 kenma_eng INFO ====> Epoch: 160
2023-05-08 23:38:30,331 kenma_eng INFO ====> Epoch: 161
2023-05-08 23:38:38,303 kenma_eng INFO ====> Epoch: 162
2023-05-08 23:38:46,283 kenma_eng INFO ====> Epoch: 163
2023-05-08 23:38:54,272 kenma_eng INFO ====> Epoch: 164
2023-05-08 23:38:55,298 kenma_eng INFO Train Epoch: 165 [85%]
2023-05-08 23:38:55,300 kenma_eng INFO [6400, 9.797074411189339e-05]
2023-05-08 23:38:55,300 kenma_eng INFO loss_disc=2.262, loss_gen=3.843, loss_fm=14.559,loss_mel=18.460, loss_kl=2.523
2023-05-08 23:39:02,463 kenma_eng INFO Saving model and optimizer state at epoch 165 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:39:02,906 kenma_eng INFO Saving model and optimizer state at epoch 165 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:39:03,495 kenma_eng INFO ====> Epoch: 165
2023-05-08 23:39:11,468 kenma_eng INFO ====> Epoch: 166
2023-05-08 23:39:19,459 kenma_eng INFO ====> Epoch: 167
2023-05-08 23:39:27,505 kenma_eng INFO ====> Epoch: 168
2023-05-08 23:39:35,435 kenma_eng INFO ====> Epoch: 169
2023-05-08 23:39:37,485 kenma_eng INFO Train Epoch: 170 [8%]
2023-05-08 23:39:37,487 kenma_eng INFO [6600, 9.790952770283884e-05]
2023-05-08 23:39:37,487 kenma_eng INFO loss_disc=1.409, loss_gen=5.128, loss_fm=18.067,loss_mel=19.149, loss_kl=1.303
2023-05-08 23:39:43,615 kenma_eng INFO Saving model and optimizer state at epoch 170 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:39:44,138 kenma_eng INFO Saving model and optimizer state at epoch 170 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:39:44,729 kenma_eng INFO ====> Epoch: 170
2023-05-08 23:39:52,633 kenma_eng INFO ====> Epoch: 171
2023-05-08 23:40:00,623 kenma_eng INFO ====> Epoch: 172
2023-05-08 23:40:08,648 kenma_eng INFO ====> Epoch: 173
2023-05-08 23:40:16,595 kenma_eng INFO ====> Epoch: 174
2023-05-08 23:40:19,755 kenma_eng INFO Train Epoch: 175 [54%]
2023-05-08 23:40:19,756 kenma_eng INFO [6800, 9.784834954447608e-05]
2023-05-08 23:40:19,757 kenma_eng INFO loss_disc=2.381, loss_gen=3.203, loss_fm=11.890,loss_mel=21.385, loss_kl=0.766
2023-05-08 23:40:24,789 kenma_eng INFO Saving model and optimizer state at epoch 175 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:40:25,234 kenma_eng INFO Saving model and optimizer state at epoch 175 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:40:25,825 kenma_eng INFO ====> Epoch: 175
2023-05-08 23:40:33,850 kenma_eng INFO ====> Epoch: 176
2023-05-08 23:40:41,790 kenma_eng INFO ====> Epoch: 177
2023-05-08 23:40:49,786 kenma_eng INFO ====> Epoch: 178
2023-05-08 23:40:57,761 kenma_eng INFO ====> Epoch: 179
2023-05-08 23:41:01,856 kenma_eng INFO Train Epoch: 180 [77%]
2023-05-08 23:41:01,858 kenma_eng INFO [7000, 9.778720961290439e-05]
2023-05-08 23:41:01,858 kenma_eng INFO loss_disc=2.326, loss_gen=3.396, loss_fm=11.155,loss_mel=19.969, loss_kl=0.927
2023-05-08 23:41:05,949 kenma_eng INFO Saving model and optimizer state at epoch 180 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:41:06,388 kenma_eng INFO Saving model and optimizer state at epoch 180 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:41:07,021 kenma_eng INFO ====> Epoch: 180
2023-05-08 23:41:14,966 kenma_eng INFO ====> Epoch: 181
2023-05-08 23:41:22,944 kenma_eng INFO ====> Epoch: 182
2023-05-08 23:41:30,931 kenma_eng INFO ====> Epoch: 183
2023-05-08 23:41:38,913 kenma_eng INFO ====> Epoch: 184
2023-05-08 23:41:44,053 kenma_eng INFO Train Epoch: 185 [0%]
2023-05-08 23:41:44,055 kenma_eng INFO [7200, 9.772610788423802e-05]
2023-05-08 23:41:44,055 kenma_eng INFO loss_disc=2.721, loss_gen=3.359, loss_fm=11.969,loss_mel=20.092, loss_kl=1.441
2023-05-08 23:41:47,110 kenma_eng INFO Saving model and optimizer state at epoch 185 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:41:47,552 kenma_eng INFO Saving model and optimizer state at epoch 185 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:41:48,181 kenma_eng INFO ====> Epoch: 185
2023-05-08 23:41:56,123 kenma_eng INFO ====> Epoch: 186
2023-05-08 23:42:04,106 kenma_eng INFO ====> Epoch: 187
2023-05-08 23:42:12,098 kenma_eng INFO ====> Epoch: 188
2023-05-08 23:42:20,079 kenma_eng INFO ====> Epoch: 189
2023-05-08 23:42:26,233 kenma_eng INFO Train Epoch: 190 [82%]
2023-05-08 23:42:26,235 kenma_eng INFO [7400, 9.766504433460612e-05]
2023-05-08 23:42:26,235 kenma_eng INFO loss_disc=2.463, loss_gen=3.253, loss_fm=12.856,loss_mel=20.189, loss_kl=1.383
2023-05-08 23:42:28,279 kenma_eng INFO Saving model and optimizer state at epoch 190 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:42:28,720 kenma_eng INFO Saving model and optimizer state at epoch 190 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:42:29,312 kenma_eng INFO ====> Epoch: 190
2023-05-08 23:42:37,292 kenma_eng INFO ====> Epoch: 191
2023-05-08 23:42:45,275 kenma_eng INFO ====> Epoch: 192
2023-05-08 23:42:53,260 kenma_eng INFO ====> Epoch: 193
2023-05-08 23:43:01,244 kenma_eng INFO ====> Epoch: 194
2023-05-08 23:43:08,421 kenma_eng INFO Train Epoch: 195 [79%]
2023-05-08 23:43:08,423 kenma_eng INFO [7600, 9.760401894015275e-05]
2023-05-08 23:43:08,423 kenma_eng INFO loss_disc=2.311, loss_gen=3.909, loss_fm=15.471,loss_mel=20.243, loss_kl=0.945
2023-05-08 23:43:09,440 kenma_eng INFO Saving model and optimizer state at epoch 195 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:43:09,885 kenma_eng INFO Saving model and optimizer state at epoch 195 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:43:10,475 kenma_eng INFO ====> Epoch: 195
2023-05-08 23:43:18,448 kenma_eng INFO ====> Epoch: 196
2023-05-08 23:43:26,436 kenma_eng INFO ====> Epoch: 197
2023-05-08 23:43:34,429 kenma_eng INFO ====> Epoch: 198
2023-05-08 23:43:42,411 kenma_eng INFO ====> Epoch: 199
2023-05-08 23:43:50,397 kenma_eng INFO Saving model and optimizer state at epoch 200 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:43:50,842 kenma_eng INFO Saving model and optimizer state at epoch 200 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:43:51,432 kenma_eng INFO ====> Epoch: 200
2023-05-08 23:43:51,633 kenma_eng INFO Train Epoch: 201 [8%]
2023-05-08 23:43:51,635 kenma_eng INFO [7800, 9.753083879807726e-05]
2023-05-08 23:43:51,635 kenma_eng INFO loss_disc=1.920, loss_gen=3.953, loss_fm=12.891,loss_mel=18.567, loss_kl=0.669
2023-05-08 23:43:59,614 kenma_eng INFO ====> Epoch: 201
2023-05-08 23:44:07,601 kenma_eng INFO ====> Epoch: 202
2023-05-08 23:44:15,593 kenma_eng INFO ====> Epoch: 203
2023-05-08 23:44:23,573 kenma_eng INFO ====> Epoch: 204
2023-05-08 23:44:31,560 kenma_eng INFO Saving model and optimizer state at epoch 205 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:44:32,007 kenma_eng INFO Saving model and optimizer state at epoch 205 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:44:32,598 kenma_eng INFO ====> Epoch: 205
2023-05-08 23:44:33,807 kenma_eng INFO Train Epoch: 206 [74%]
2023-05-08 23:44:33,809 kenma_eng INFO [8000, 9.746989726111722e-05]
2023-05-08 23:44:33,809 kenma_eng INFO loss_disc=2.143, loss_gen=3.504, loss_fm=13.130, loss_mel=20.380, loss_kl=0.445
2023-05-08 23:44:40,776 kenma_eng INFO ====> Epoch: 206
2023-05-08 23:44:48,772 kenma_eng INFO ====> Epoch: 207
2023-05-08 23:44:56,761 kenma_eng INFO ====> Epoch: 208
2023-05-08 23:45:04,741 kenma_eng INFO ====> Epoch: 209
2023-05-08 23:45:12,711 kenma_eng INFO Saving model and optimizer state at epoch 210 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:45:13,161 kenma_eng INFO Saving model and optimizer state at epoch 210 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:45:13,755 kenma_eng INFO ====> Epoch: 210
2023-05-08 23:45:15,997 kenma_eng INFO Train Epoch: 211 [56%]
2023-05-08 23:45:15,999 kenma_eng INFO [8200, 9.740899380309685e-05]
2023-05-08 23:45:16,000 kenma_eng INFO loss_disc=2.647, loss_gen=3.253, loss_fm=12.424, loss_mel=19.522, loss_kl=1.341
2023-05-08 23:45:21,938 kenma_eng INFO ====> Epoch: 211
2023-05-08 23:45:29,968 kenma_eng INFO ====> Epoch: 212
2023-05-08 23:45:37,921 kenma_eng INFO ====> Epoch: 213
2023-05-08 23:45:45,913 kenma_eng INFO ====> Epoch: 214
2023-05-08 23:45:53,892 kenma_eng INFO Saving model and optimizer state at epoch 215 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:45:54,339 kenma_eng INFO Saving model and optimizer state at epoch 215 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:45:55,005 kenma_eng INFO ====> Epoch: 215
2023-05-08 23:45:58,212 kenma_eng INFO Train Epoch: 216 [10%]
2023-05-08 23:45:58,213 kenma_eng INFO [8400, 9.734812840022278e-05]
2023-05-08 23:45:58,213 kenma_eng INFO loss_disc=1.651, loss_gen=4.591, loss_fm=16.276, loss_mel=18.822, loss_kl=1.662
2023-05-08 23:46:03,103 kenma_eng INFO ====> Epoch: 216
2023-05-08 23:46:11,152 kenma_eng INFO ====> Epoch: 217
2023-05-08 23:46:19,078 kenma_eng INFO ====> Epoch: 218
2023-05-08 23:46:27,069 kenma_eng INFO ====> Epoch: 219
2023-05-08 23:46:35,047 kenma_eng INFO Saving model and optimizer state at epoch 220 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:46:35,573 kenma_eng INFO Saving model and optimizer state at epoch 220 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:46:36,161 kenma_eng INFO ====> Epoch: 220
2023-05-08 23:46:40,377 kenma_eng INFO Train Epoch: 221 [21%]
2023-05-08 23:46:40,379 kenma_eng INFO [8600, 9.728730102871649e-05]
2023-05-08 23:46:40,379 kenma_eng INFO loss_disc=2.058, loss_gen=3.767, loss_fm=15.166, loss_mel=19.089, loss_kl=1.259
2023-05-08 23:46:44,263 kenma_eng INFO ====> Epoch: 221
2023-05-08 23:46:52,259 kenma_eng INFO ====> Epoch: 222
2023-05-08 23:47:00,242 kenma_eng INFO ====> Epoch: 223
2023-05-08 23:47:08,233 kenma_eng INFO ====> Epoch: 224
2023-05-08 23:47:16,214 kenma_eng INFO Saving model and optimizer state at epoch 225 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:47:16,656 kenma_eng INFO Saving model and optimizer state at epoch 225 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:47:17,319 kenma_eng INFO ====> Epoch: 225
2023-05-08 23:47:22,570 kenma_eng INFO Train Epoch: 226 [79%]
2023-05-08 23:47:22,572 kenma_eng INFO [8800, 9.722651166481428e-05]
2023-05-08 23:47:22,572 kenma_eng INFO loss_disc=2.134, loss_gen=4.268, loss_fm=14.685, loss_mel=20.430, loss_kl=0.855
2023-05-08 23:47:25,435 kenma_eng INFO ====> Epoch: 226
2023-05-08 23:47:33,430 kenma_eng INFO ====> Epoch: 227
2023-05-08 23:47:41,401 kenma_eng INFO ====> Epoch: 228
2023-05-08 23:47:49,399 kenma_eng INFO ====> Epoch: 229
2023-05-08 23:47:57,379 kenma_eng INFO Saving model and optimizer state at epoch 230 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:47:57,900 kenma_eng INFO Saving model and optimizer state at epoch 230 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:47:58,488 kenma_eng INFO ====> Epoch: 230
2023-05-08 23:48:04,754 kenma_eng INFO Train Epoch: 231 [46%]
2023-05-08 23:48:04,756 kenma_eng INFO [9000, 9.716576028476738e-05]
2023-05-08 23:48:04,756 kenma_eng INFO loss_disc=2.041, loss_gen=3.526, loss_fm=14.019, loss_mel=19.166, loss_kl=0.711
2023-05-08 23:48:06,597 kenma_eng INFO ====> Epoch: 231
2023-05-08 23:48:14,585 kenma_eng INFO ====> Epoch: 232
2023-05-08 23:48:22,576 kenma_eng INFO ====> Epoch: 233
2023-05-08 23:48:30,553 kenma_eng INFO ====> Epoch: 234
2023-05-08 23:48:38,542 kenma_eng INFO Saving model and optimizer state at epoch 235 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:48:38,992 kenma_eng INFO Saving model and optimizer state at epoch 235 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:48:39,660 kenma_eng INFO ====> Epoch: 235
2023-05-08 23:48:46,940 kenma_eng INFO Train Epoch: 236 [87%]
2023-05-08 23:48:46,942 kenma_eng INFO [9200, 9.710504686484176e-05]
2023-05-08 23:48:46,942 kenma_eng INFO loss_disc=2.383, loss_gen=3.182, loss_fm=11.174, loss_mel=15.138, loss_kl=0.837
2023-05-08 23:48:47,751 kenma_eng INFO ====> Epoch: 236
2023-05-08 23:48:55,741 kenma_eng INFO ====> Epoch: 237
2023-05-08 23:49:03,728 kenma_eng INFO ====> Epoch: 238
2023-05-08 23:49:11,714 kenma_eng INFO ====> Epoch: 239
2023-05-08 23:49:19,703 kenma_eng INFO Saving model and optimizer state at epoch 240 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:49:20,149 kenma_eng INFO Saving model and optimizer state at epoch 240 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:49:20,742 kenma_eng INFO ====> Epoch: 240
2023-05-08 23:49:28,723 kenma_eng INFO ====> Epoch: 241
2023-05-08 23:49:29,133 kenma_eng INFO Train Epoch: 242 [0%]
2023-05-08 23:49:29,135 kenma_eng INFO [9400, 9.703224083489565e-05]
2023-05-08 23:49:29,136 kenma_eng INFO loss_disc=2.283, loss_gen=3.906, loss_fm=14.977, loss_mel=21.196, loss_kl=1.113
2023-05-08 23:49:36,904 kenma_eng INFO ====> Epoch: 242
2023-05-08 23:49:44,893 kenma_eng INFO ====> Epoch: 243
2023-05-08 23:49:52,895 kenma_eng INFO ====> Epoch: 244
2023-05-08 23:50:00,868 kenma_eng INFO Saving model and optimizer state at epoch 245 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:50:01,309 kenma_eng INFO Saving model and optimizer state at epoch 245 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:50:01,892 kenma_eng INFO ====> Epoch: 245
2023-05-08 23:50:09,880 kenma_eng INFO ====> Epoch: 246
2023-05-08 23:50:11,310 kenma_eng INFO Train Epoch: 247 [41%]
2023-05-08 23:50:11,312 kenma_eng INFO [9600, 9.69716108437664e-05]
2023-05-08 23:50:11,319 kenma_eng INFO loss_disc=2.562, loss_gen=3.062, loss_fm=13.378, loss_mel=19.740, loss_kl=1.235
2023-05-08 23:50:18,069 kenma_eng INFO ====> Epoch: 247
2023-05-08 23:50:26,064 kenma_eng INFO ====> Epoch: 248
2023-05-08 23:50:34,048 kenma_eng INFO ====> Epoch: 249
2023-05-08 23:50:42,027 kenma_eng INFO Saving model and optimizer state at epoch 250 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:50:42,471 kenma_eng INFO Saving model and optimizer state at epoch 250 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:50:43,131 kenma_eng INFO ====> Epoch: 250
2023-05-08 23:50:51,039 kenma_eng INFO ====> Epoch: 251
2023-05-08 23:50:53,497 kenma_eng INFO Train Epoch: 252 [95%]
2023-05-08 23:50:53,499 kenma_eng INFO [9800, 9.691101873690936e-05]
2023-05-08 23:50:53,499 kenma_eng INFO loss_disc=2.281, loss_gen=4.156, loss_fm=13.992, loss_mel=18.462, loss_kl=0.613
2023-05-08 23:50:59,238 kenma_eng INFO ====> Epoch: 252
2023-05-08 23:51:07,244 kenma_eng INFO ====> Epoch: 253
2023-05-08 23:51:15,210 kenma_eng INFO ====> Epoch: 254
2023-05-08 23:51:23,192 kenma_eng INFO Saving model and optimizer state at epoch 255 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:51:23,634 kenma_eng INFO Saving model and optimizer state at epoch 255 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:51:24,218 kenma_eng INFO ====> Epoch: 255
2023-05-08 23:51:32,206 kenma_eng INFO ====> Epoch: 256
2023-05-08 23:51:35,692 kenma_eng INFO Train Epoch: 257 [36%]
2023-05-08 23:51:35,694 kenma_eng INFO [10000, 9.685046449065278e-05]
2023-05-08 23:51:35,694 kenma_eng INFO loss_disc=2.226, loss_gen=3.927, loss_fm=11.863, loss_mel=18.904, loss_kl=1.791
2023-05-08 23:51:40,405 kenma_eng INFO ====> Epoch: 257
2023-05-08 23:51:48,408 kenma_eng INFO ====> Epoch: 258
2023-05-08 23:51:56,374 kenma_eng INFO ====> Epoch: 259
2023-05-08 23:52:04,354 kenma_eng INFO Saving model and optimizer state at epoch 260 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:52:04,801 kenma_eng INFO Saving model and optimizer state at epoch 260 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:52:05,389 kenma_eng INFO ====> Epoch: 260
2023-05-08 23:52:13,367 kenma_eng INFO ====> Epoch: 261
2023-05-08 23:52:17,873 kenma_eng INFO Train Epoch: 262 [51%]
2023-05-08 23:52:17,875 kenma_eng INFO [10200, 9.678994808133967e-05]
2023-05-08 23:52:17,875 kenma_eng INFO loss_disc=2.367, loss_gen=3.552, loss_fm=12.217, loss_mel=14.802, loss_kl=0.540
2023-05-08 23:52:21,594 kenma_eng INFO ====> Epoch: 262
2023-05-08 23:52:29,549 kenma_eng INFO ====> Epoch: 263
2023-05-08 23:52:37,537 kenma_eng INFO ====> Epoch: 264
2023-05-08 23:52:45,520 kenma_eng INFO Saving model and optimizer state at epoch 265 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:52:45,968 kenma_eng INFO Saving model and optimizer state at epoch 265 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:52:46,558 kenma_eng INFO ====> Epoch: 265
2023-05-08 23:52:54,545 kenma_eng INFO ====> Epoch: 266
2023-05-08 23:53:00,067 kenma_eng INFO Train Epoch: 267 [28%]
2023-05-08 23:53:00,069 kenma_eng INFO [10400, 9.67294694853279e-05]
2023-05-08 23:53:00,069 kenma_eng INFO loss_disc=2.086, loss_gen=3.871, loss_fm=16.090, loss_mel=20.210, loss_kl=0.717
2023-05-08 23:53:02,746 kenma_eng INFO ====> Epoch: 267
2023-05-08 23:53:10,712 kenma_eng INFO ====> Epoch: 268
2023-05-08 23:53:18,698 kenma_eng INFO ====> Epoch: 269
2023-05-08 23:53:26,714 kenma_eng INFO Saving model and optimizer state at epoch 270 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:53:27,163 kenma_eng INFO Saving model and optimizer state at epoch 270 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:53:27,755 kenma_eng INFO ====> Epoch: 270
2023-05-08 23:53:35,701 kenma_eng INFO ====> Epoch: 271
2023-05-08 23:53:42,249 kenma_eng INFO Train Epoch: 272 [92%]
2023-05-08 23:53:42,252 kenma_eng INFO [10600, 9.666902867899003e-05]
2023-05-08 23:53:42,252 kenma_eng INFO loss_disc=2.008, loss_gen=3.914, loss_fm=15.602, loss_mel=20.115, loss_kl=1.201
2023-05-08 23:53:43,926 kenma_eng INFO ====> Epoch: 272
2023-05-08 23:53:51,876 kenma_eng INFO ====> Epoch: 273
2023-05-08 23:53:59,874 kenma_eng INFO ====> Epoch: 274
2023-05-08 23:54:07,851 kenma_eng INFO Saving model and optimizer state at epoch 275 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:54:08,295 kenma_eng INFO Saving model and optimizer state at epoch 275 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:54:08,887 kenma_eng INFO ====> Epoch: 275
2023-05-08 23:54:16,861 kenma_eng INFO ====> Epoch: 276
2023-05-08 23:54:24,437 kenma_eng INFO Train Epoch: 277 [64%]
2023-05-08 23:54:24,439 kenma_eng INFO [10800, 9.660862563871342e-05]
2023-05-08 23:54:24,439 kenma_eng INFO loss_disc=2.096, loss_gen=4.270, loss_fm=14.102, loss_mel=18.081, loss_kl=1.553
2023-05-08 23:54:25,127 kenma_eng INFO ====> Epoch: 277
2023-05-08 23:54:33,046 kenma_eng INFO ====> Epoch: 278
2023-05-08 23:54:41,030 kenma_eng INFO ====> Epoch: 279
2023-05-08 23:54:49,015 kenma_eng INFO Saving model and optimizer state at epoch 280 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:54:49,461 kenma_eng INFO Saving model and optimizer state at epoch 280 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:54:50,051 kenma_eng INFO ====> Epoch: 280
2023-05-08 23:54:58,029 kenma_eng INFO ====> Epoch: 281
2023-05-08 23:55:06,018 kenma_eng INFO ====> Epoch: 282
2023-05-08 23:55:06,622 kenma_eng INFO Train Epoch: 283 [64%]
2023-05-08 23:55:06,624 kenma_eng INFO [11000, 9.653619180835758e-05]
2023-05-08 23:55:06,624 kenma_eng INFO loss_disc=1.838, loss_gen=3.801, loss_fm=16.227, loss_mel=18.351, loss_kl=0.419
2023-05-08 23:55:14,199 kenma_eng INFO ====> Epoch: 283
2023-05-08 23:55:22,185 kenma_eng INFO ====> Epoch: 284
2023-05-08 23:55:30,180 kenma_eng INFO Saving model and optimizer state at epoch 285 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:55:30,624 kenma_eng INFO Saving model and optimizer state at epoch 285 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:55:31,216 kenma_eng INFO ====> Epoch: 285
2023-05-08 23:55:39,205 kenma_eng INFO ====> Epoch: 286
2023-05-08 23:55:47,177 kenma_eng INFO ====> Epoch: 287
2023-05-08 23:55:48,811 kenma_eng INFO Train Epoch: 288 [36%]
2023-05-08 23:55:48,813 kenma_eng INFO [11200, 9.647587177037196e-05]
2023-05-08 23:55:48,813 kenma_eng INFO loss_disc=2.106, loss_gen=3.536, loss_fm=15.607, loss_mel=18.624, loss_kl=0.811
2023-05-08 23:55:55,373 kenma_eng INFO ====> Epoch: 288
2023-05-08 23:56:03,350 kenma_eng INFO ====> Epoch: 289
2023-05-08 23:56:11,351 kenma_eng INFO Saving model and optimizer state at epoch 290 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:56:11,802 kenma_eng INFO Saving model and optimizer state at epoch 290 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:56:12,395 kenma_eng INFO ====> Epoch: 290
2023-05-08 23:56:20,350 kenma_eng INFO ====> Epoch: 291
2023-05-08 23:56:28,342 kenma_eng INFO ====> Epoch: 292
2023-05-08 23:56:30,998 kenma_eng INFO Train Epoch: 293 [59%]
2023-05-08 23:56:31,000 kenma_eng INFO [11400, 9.641558942298625e-05]
2023-05-08 23:56:31,000 kenma_eng INFO loss_disc=2.425, loss_gen=3.517, loss_fm=10.334, loss_mel=19.468, loss_kl=0.564
2023-05-08 23:56:36,531 kenma_eng INFO ====> Epoch: 293
2023-05-08 23:56:44,531 kenma_eng INFO ====> Epoch: 294
2023-05-08 23:56:52,503 kenma_eng INFO Saving model and optimizer state at epoch 295 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:56:52,954 kenma_eng INFO Saving model and optimizer state at epoch 295 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:56:53,544 kenma_eng INFO ====> Epoch: 295
2023-05-08 23:57:01,683 kenma_eng INFO ====> Epoch: 296
2023-05-08 23:57:10,165 kenma_eng INFO ====> Epoch: 297
2023-05-08 23:57:14,053 kenma_eng INFO Train Epoch: 298 [18%]
2023-05-08 23:57:14,057 kenma_eng INFO [11600, 9.635534474264972e-05]
2023-05-08 23:57:14,057 kenma_eng INFO loss_disc=2.473, loss_gen=3.393, loss_fm=10.933, loss_mel=18.952, loss_kl=1.169
2023-05-08 23:57:18,589 kenma_eng INFO ====> Epoch: 298
2023-05-08 23:57:26,500 kenma_eng INFO ====> Epoch: 299
2023-05-08 23:57:34,487 kenma_eng INFO Saving model and optimizer state at epoch 300 to ./logs/kenma_eng/G_2333333.pth
2023-05-08 23:57:34,943 kenma_eng INFO Saving model and optimizer state at epoch 300 to ./logs/kenma_eng/D_2333333.pth
2023-05-08 23:57:35,537 kenma_eng INFO ====> Epoch: 300
2023-05-08 23:57:35,538 kenma_eng INFO Training is done. The program will now exit.
2023-05-08 23:57:35,596 kenma_eng INFO Saving final ckpt: Success.