[2024-07-09 15:38:01,843][__main__][INFO] - [rank: 0] Instantiating datamodule
[2024-07-09 15:38:02,439][datasets][INFO] - PyTorch version 2.3.1 available.
[2024-07-09 15:38:02,753][__main__][INFO] - [rank: 0] Instantiating model
[2024-07-09 15:38:02,781][fish_speech.models.text2semantic.llama][INFO] - [rank: 0] Override max_seq_len to 1024
[2024-07-09 15:38:02,813][fish_speech.models.text2semantic.llama][INFO] - [rank: 0] Loading model from checkpoints/mix_v1, config: DualARModelArgs(model_type='dual_ar', vocab_size=32000, n_layer=24, n_head=16, dim=1024, intermediate_size=4096, n_local_heads=2, head_dim=64, rope_base=1000000.0, norm_eps=1e-06, max_seq_len=1024, dropout=0.1, tie_word_embeddings=False, attention_qkv_bias=False, codebook_size=1024, num_codebooks=4, use_gradient_checkpointing=True, initializer_range=0.02, n_fast_layer=4)
[2024-07-09 15:38:08,926][fish_speech.models.text2semantic.llama][INFO] - [rank: 0] LoRA setup: LoraConfig(r=8, lora_alpha=16, lora_dropout=0.01)
[2024-07-09 15:38:08,990][fish_speech.models.text2semantic.llama][INFO] - [rank: 0] Loaded weights with error: _IncompatibleKeys(missing_keys=['embeddings.lora_A', 'embeddings.lora_B', 'codebook_embeddings.lora_A', 'codebook_embeddings.lora_B', 'layers.0.attention.wqkv.lora_A', 'layers.0.attention.wqkv.lora_B', 'layers.0.attention.wo.lora_A', 'layers.0.attention.wo.lora_B', 'layers.0.feed_forward.w1.lora_A', 'layers.0.feed_forward.w1.lora_B', 'layers.0.feed_forward.w3.lora_A', 'layers.0.feed_forward.w3.lora_B', 'layers.0.feed_forward.w2.lora_A', 'layers.0.feed_forward.w2.lora_B', 'layers.1.attention.wqkv.lora_A', 'layers.1.attention.wqkv.lora_B', 'layers.1.attention.wo.lora_A', 'layers.1.attention.wo.lora_B', 'layers.1.feed_forward.w1.lora_A', 'layers.1.feed_forward.w1.lora_B', 'layers.1.feed_forward.w3.lora_A', 'layers.1.feed_forward.w3.lora_B', 'layers.1.feed_forward.w2.lora_A', 'layers.1.feed_forward.w2.lora_B', 'layers.2.attention.wqkv.lora_A', 'layers.2.attention.wqkv.lora_B', 
'layers.2.attention.wo.lora_A', 'layers.2.attention.wo.lora_B', 'layers.2.feed_forward.w1.lora_A', 'layers.2.feed_forward.w1.lora_B', 'layers.2.feed_forward.w3.lora_A', 'layers.2.feed_forward.w3.lora_B', 'layers.2.feed_forward.w2.lora_A', 'layers.2.feed_forward.w2.lora_B', 'layers.3.attention.wqkv.lora_A', 'layers.3.attention.wqkv.lora_B', 'layers.3.attention.wo.lora_A', 'layers.3.attention.wo.lora_B', 'layers.3.feed_forward.w1.lora_A', 'layers.3.feed_forward.w1.lora_B', 'layers.3.feed_forward.w3.lora_A', 'layers.3.feed_forward.w3.lora_B', 'layers.3.feed_forward.w2.lora_A', 'layers.3.feed_forward.w2.lora_B', 'layers.4.attention.wqkv.lora_A', 'layers.4.attention.wqkv.lora_B', 'layers.4.attention.wo.lora_A', 'layers.4.attention.wo.lora_B', 'layers.4.feed_forward.w1.lora_A', 'layers.4.feed_forward.w1.lora_B', 'layers.4.feed_forward.w3.lora_A', 'layers.4.feed_forward.w3.lora_B', 'layers.4.feed_forward.w2.lora_A', 'layers.4.feed_forward.w2.lora_B', 'layers.5.attention.wqkv.lora_A', 'layers.5.attention.wqkv.lora_B', 'layers.5.attention.wo.lora_A', 'layers.5.attention.wo.lora_B', 'layers.5.feed_forward.w1.lora_A', 'layers.5.feed_forward.w1.lora_B', 'layers.5.feed_forward.w3.lora_A', 'layers.5.feed_forward.w3.lora_B', 'layers.5.feed_forward.w2.lora_A', 'layers.5.feed_forward.w2.lora_B', 'layers.6.attention.wqkv.lora_A', 'layers.6.attention.wqkv.lora_B', 'layers.6.attention.wo.lora_A', 'layers.6.attention.wo.lora_B', 'layers.6.feed_forward.w1.lora_A', 'layers.6.feed_forward.w1.lora_B', 'layers.6.feed_forward.w3.lora_A', 'layers.6.feed_forward.w3.lora_B', 'layers.6.feed_forward.w2.lora_A', 'layers.6.feed_forward.w2.lora_B', 'layers.7.attention.wqkv.lora_A', 'layers.7.attention.wqkv.lora_B', 'layers.7.attention.wo.lora_A', 'layers.7.attention.wo.lora_B', 'layers.7.feed_forward.w1.lora_A', 'layers.7.feed_forward.w1.lora_B', 'layers.7.feed_forward.w3.lora_A', 'layers.7.feed_forward.w3.lora_B', 'layers.7.feed_forward.w2.lora_A', 'layers.7.feed_forward.w2.lora_B', 
'layers.8.attention.wqkv.lora_A', 'layers.8.attention.wqkv.lora_B', 'layers.8.attention.wo.lora_A', 'layers.8.attention.wo.lora_B', 'layers.8.feed_forward.w1.lora_A', 'layers.8.feed_forward.w1.lora_B', 'layers.8.feed_forward.w3.lora_A', 'layers.8.feed_forward.w3.lora_B', 'layers.8.feed_forward.w2.lora_A', 'layers.8.feed_forward.w2.lora_B', 'layers.9.attention.wqkv.lora_A', 'layers.9.attention.wqkv.lora_B', 'layers.9.attention.wo.lora_A', 'layers.9.attention.wo.lora_B', 'layers.9.feed_forward.w1.lora_A', 'layers.9.feed_forward.w1.lora_B', 'layers.9.feed_forward.w3.lora_A', 'layers.9.feed_forward.w3.lora_B', 'layers.9.feed_forward.w2.lora_A', 'layers.9.feed_forward.w2.lora_B', 'layers.10.attention.wqkv.lora_A', 'layers.10.attention.wqkv.lora_B', 'layers.10.attention.wo.lora_A', 'layers.10.attention.wo.lora_B', 'layers.10.feed_forward.w1.lora_A', 'layers.10.feed_forward.w1.lora_B', 'layers.10.feed_forward.w3.lora_A', 'layers.10.feed_forward.w3.lora_B', 'layers.10.feed_forward.w2.lora_A', 'layers.10.feed_forward.w2.lora_B', 'layers.11.attention.wqkv.lora_A', 'layers.11.attention.wqkv.lora_B', 'layers.11.attention.wo.lora_A', 'layers.11.attention.wo.lora_B', 'layers.11.feed_forward.w1.lora_A', 'layers.11.feed_forward.w1.lora_B', 'layers.11.feed_forward.w3.lora_A', 'layers.11.feed_forward.w3.lora_B', 'layers.11.feed_forward.w2.lora_A', 'layers.11.feed_forward.w2.lora_B', 'layers.12.attention.wqkv.lora_A', 'layers.12.attention.wqkv.lora_B', 'layers.12.attention.wo.lora_A', 'layers.12.attention.wo.lora_B', 'layers.12.feed_forward.w1.lora_A', 'layers.12.feed_forward.w1.lora_B', 'layers.12.feed_forward.w3.lora_A', 'layers.12.feed_forward.w3.lora_B', 'layers.12.feed_forward.w2.lora_A', 'layers.12.feed_forward.w2.lora_B', 'layers.13.attention.wqkv.lora_A', 'layers.13.attention.wqkv.lora_B', 'layers.13.attention.wo.lora_A', 'layers.13.attention.wo.lora_B', 'layers.13.feed_forward.w1.lora_A', 'layers.13.feed_forward.w1.lora_B', 'layers.13.feed_forward.w3.lora_A', 
'layers.13.feed_forward.w3.lora_B', 'layers.13.feed_forward.w2.lora_A', 'layers.13.feed_forward.w2.lora_B', 'layers.14.attention.wqkv.lora_A', 'layers.14.attention.wqkv.lora_B', 'layers.14.attention.wo.lora_A', 'layers.14.attention.wo.lora_B', 'layers.14.feed_forward.w1.lora_A', 'layers.14.feed_forward.w1.lora_B', 'layers.14.feed_forward.w3.lora_A', 'layers.14.feed_forward.w3.lora_B', 'layers.14.feed_forward.w2.lora_A', 'layers.14.feed_forward.w2.lora_B', 'layers.15.attention.wqkv.lora_A', 'layers.15.attention.wqkv.lora_B', 'layers.15.attention.wo.lora_A', 'layers.15.attention.wo.lora_B', 'layers.15.feed_forward.w1.lora_A', 'layers.15.feed_forward.w1.lora_B', 'layers.15.feed_forward.w3.lora_A', 'layers.15.feed_forward.w3.lora_B', 'layers.15.feed_forward.w2.lora_A', 'layers.15.feed_forward.w2.lora_B', 'layers.16.attention.wqkv.lora_A', 'layers.16.attention.wqkv.lora_B', 'layers.16.attention.wo.lora_A', 'layers.16.attention.wo.lora_B', 'layers.16.feed_forward.w1.lora_A', 'layers.16.feed_forward.w1.lora_B', 'layers.16.feed_forward.w3.lora_A', 'layers.16.feed_forward.w3.lora_B', 'layers.16.feed_forward.w2.lora_A', 'layers.16.feed_forward.w2.lora_B', 'layers.17.attention.wqkv.lora_A', 'layers.17.attention.wqkv.lora_B', 'layers.17.attention.wo.lora_A', 'layers.17.attention.wo.lora_B', 'layers.17.feed_forward.w1.lora_A', 'layers.17.feed_forward.w1.lora_B', 'layers.17.feed_forward.w3.lora_A', 'layers.17.feed_forward.w3.lora_B', 'layers.17.feed_forward.w2.lora_A', 'layers.17.feed_forward.w2.lora_B', 'layers.18.attention.wqkv.lora_A', 'layers.18.attention.wqkv.lora_B', 'layers.18.attention.wo.lora_A', 'layers.18.attention.wo.lora_B', 'layers.18.feed_forward.w1.lora_A', 'layers.18.feed_forward.w1.lora_B', 'layers.18.feed_forward.w3.lora_A', 'layers.18.feed_forward.w3.lora_B', 'layers.18.feed_forward.w2.lora_A', 'layers.18.feed_forward.w2.lora_B', 'layers.19.attention.wqkv.lora_A', 'layers.19.attention.wqkv.lora_B', 'layers.19.attention.wo.lora_A', 
'layers.19.attention.wo.lora_B', 'layers.19.feed_forward.w1.lora_A', 'layers.19.feed_forward.w1.lora_B', 'layers.19.feed_forward.w3.lora_A', 'layers.19.feed_forward.w3.lora_B', 'layers.19.feed_forward.w2.lora_A', 'layers.19.feed_forward.w2.lora_B', 'layers.20.attention.wqkv.lora_A', 'layers.20.attention.wqkv.lora_B', 'layers.20.attention.wo.lora_A', 'layers.20.attention.wo.lora_B', 'layers.20.feed_forward.w1.lora_A', 'layers.20.feed_forward.w1.lora_B', 'layers.20.feed_forward.w3.lora_A', 'layers.20.feed_forward.w3.lora_B', 'layers.20.feed_forward.w2.lora_A', 'layers.20.feed_forward.w2.lora_B', 'layers.21.attention.wqkv.lora_A', 'layers.21.attention.wqkv.lora_B', 'layers.21.attention.wo.lora_A', 'layers.21.attention.wo.lora_B', 'layers.21.feed_forward.w1.lora_A', 'layers.21.feed_forward.w1.lora_B', 'layers.21.feed_forward.w3.lora_A', 'layers.21.feed_forward.w3.lora_B', 'layers.21.feed_forward.w2.lora_A', 'layers.21.feed_forward.w2.lora_B', 'layers.22.attention.wqkv.lora_A', 'layers.22.attention.wqkv.lora_B', 'layers.22.attention.wo.lora_A', 'layers.22.attention.wo.lora_B', 'layers.22.feed_forward.w1.lora_A', 'layers.22.feed_forward.w1.lora_B', 'layers.22.feed_forward.w3.lora_A', 'layers.22.feed_forward.w3.lora_B', 'layers.22.feed_forward.w2.lora_A', 'layers.22.feed_forward.w2.lora_B', 'layers.23.attention.wqkv.lora_A', 'layers.23.attention.wqkv.lora_B', 'layers.23.attention.wo.lora_A', 'layers.23.attention.wo.lora_B', 'layers.23.feed_forward.w1.lora_A', 'layers.23.feed_forward.w1.lora_B', 'layers.23.feed_forward.w3.lora_A', 'layers.23.feed_forward.w3.lora_B', 'layers.23.feed_forward.w2.lora_A', 'layers.23.feed_forward.w2.lora_B', 'output.lora_A', 'output.lora_B', 'fast_embeddings.lora_A', 'fast_embeddings.lora_B', 'fast_layers.0.attention.wqkv.lora_A', 'fast_layers.0.attention.wqkv.lora_B', 'fast_layers.0.attention.wo.lora_A', 'fast_layers.0.attention.wo.lora_B', 'fast_layers.0.feed_forward.w1.lora_A', 'fast_layers.0.feed_forward.w1.lora_B', 
'fast_layers.0.feed_forward.w3.lora_A', 'fast_layers.0.feed_forward.w3.lora_B', 'fast_layers.0.feed_forward.w2.lora_A', 'fast_layers.0.feed_forward.w2.lora_B', 'fast_layers.1.attention.wqkv.lora_A', 'fast_layers.1.attention.wqkv.lora_B', 'fast_layers.1.attention.wo.lora_A', 'fast_layers.1.attention.wo.lora_B', 'fast_layers.1.feed_forward.w1.lora_A', 'fast_layers.1.feed_forward.w1.lora_B', 'fast_layers.1.feed_forward.w3.lora_A', 'fast_layers.1.feed_forward.w3.lora_B', 'fast_layers.1.feed_forward.w2.lora_A', 'fast_layers.1.feed_forward.w2.lora_B', 'fast_layers.2.attention.wqkv.lora_A', 'fast_layers.2.attention.wqkv.lora_B', 'fast_layers.2.attention.wo.lora_A', 'fast_layers.2.attention.wo.lora_B', 'fast_layers.2.feed_forward.w1.lora_A', 'fast_layers.2.feed_forward.w1.lora_B', 'fast_layers.2.feed_forward.w3.lora_A', 'fast_layers.2.feed_forward.w3.lora_B', 'fast_layers.2.feed_forward.w2.lora_A', 'fast_layers.2.feed_forward.w2.lora_B', 'fast_layers.3.attention.wqkv.lora_A', 'fast_layers.3.attention.wqkv.lora_B', 'fast_layers.3.attention.wo.lora_A', 'fast_layers.3.attention.wo.lora_B', 'fast_layers.3.feed_forward.w1.lora_A', 'fast_layers.3.feed_forward.w1.lora_B', 'fast_layers.3.feed_forward.w3.lora_A', 'fast_layers.3.feed_forward.w3.lora_B', 'fast_layers.3.feed_forward.w2.lora_A', 'fast_layers.3.feed_forward.w2.lora_B', 'fast_output.lora_A', 'fast_output.lora_B'], unexpected_keys=[])
[2024-07-09 15:38:08,994][__main__][INFO] - [rank: 0] Instantiating callbacks...
[2024-07-09 15:38:08,994][fish_speech.utils.instantiators][INFO] - [rank: 0] Instantiating callback
[2024-07-09 15:38:08,997][fish_speech.utils.instantiators][INFO] - [rank: 0] Instantiating callback
[2024-07-09 15:38:08,997][fish_speech.utils.instantiators][INFO] - [rank: 0] Instantiating callback
[2024-07-09 15:38:08,997][fish_speech.utils.instantiators][INFO] - [rank: 0] Instantiating callback
[2024-07-09 15:38:09,006][__main__][INFO] - [rank: 0] Instantiating loggers...
[2024-07-09 15:38:09,006][fish_speech.utils.instantiators][INFO] - [rank: 0] Instantiating logger
[2024-07-09 15:38:09,018][__main__][INFO] - [rank: 0] Instantiating trainer
[2024-07-09 15:38:09,064][__main__][INFO] - [rank: 0] Logging hyperparameters!
[2024-07-09 15:38:09,135][__main__][INFO] - [rank: 0] Starting training!
[2024-07-09 15:38:37,548][fish_speech.models.text2semantic.lit_module][INFO] - [rank: 0] Set weight decay: 0.005 for 432 parameters
[2024-07-09 15:38:37,548][fish_speech.models.text2semantic.lit_module][INFO] - [rank: 0] Set weight decay: 0.0 for 61 parameters
[2024-07-09 15:38:37,885][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 2 / 3 files
[2024-07-09 15:38:37,885][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 1 / 3 files
[2024-07-09 15:38:37,885][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 1 / 3 files
[2024-07-09 15:38:37,885][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 2 / 3 files
[2024-07-09 15:38:37,978][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 2 groups of data
[2024-07-09 15:38:37,980][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 1 groups of data
[2024-07-09 15:38:38,053][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 5 groups of data
[2024-07-09 15:38:38,069][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 4 groups of data
[2024-07-09 15:38:39,388][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 2 / 3 files
[2024-07-09 15:38:39,388][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 2 / 3 files
[2024-07-09 15:38:39,388][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 1 / 3 files
[2024-07-09 15:38:39,389][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 1 / 3 files
[2024-07-09 15:38:39,477][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 1 groups of data
[2024-07-09 15:38:39,482][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 2 groups of data
[2024-07-09 15:38:39,555][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 5 groups of data
[2024-07-09 15:38:39,557][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 4 groups of data
[2024-07-09 15:43:03,202][__main__][INFO] - [rank: 0] Instantiating datamodule
[2024-07-09 15:43:03,439][datasets][INFO] - PyTorch version 2.3.1 available.
[2024-07-09 15:43:03,659][__main__][INFO] - [rank: 0] Instantiating model
[2024-07-09 15:43:03,688][fish_speech.models.text2semantic.llama][INFO] - [rank: 0] Override max_seq_len to 1024
[2024-07-09 15:43:03,723][fish_speech.models.text2semantic.llama][INFO] - [rank: 0] Loading model from checkpoints/mix_v1, config: DualARModelArgs(model_type='dual_ar', vocab_size=32000, n_layer=24, n_head=16, dim=1024, intermediate_size=4096, n_local_heads=2, head_dim=64, rope_base=1000000.0, norm_eps=1e-06, max_seq_len=1024, dropout=0.1, tie_word_embeddings=False, attention_qkv_bias=False, codebook_size=1024, num_codebooks=4, use_gradient_checkpointing=True, initializer_range=0.02, n_fast_layer=4)
[2024-07-09 15:43:09,799][fish_speech.models.text2semantic.llama][INFO] - [rank: 0] LoRA setup: LoraConfig(r=8, lora_alpha=16, lora_dropout=0.01)
[2024-07-09 15:43:09,830][fish_speech.models.text2semantic.llama][INFO] - [rank: 0] Loaded weights with error: _IncompatibleKeys(missing_keys=['embeddings.lora_A', 'embeddings.lora_B', 'codebook_embeddings.lora_A', 'codebook_embeddings.lora_B', 'layers.0.attention.wqkv.lora_A', 'layers.0.attention.wqkv.lora_B', 'layers.0.attention.wo.lora_A', 'layers.0.attention.wo.lora_B', 'layers.0.feed_forward.w1.lora_A', 'layers.0.feed_forward.w1.lora_B', 'layers.0.feed_forward.w3.lora_A', 'layers.0.feed_forward.w3.lora_B', 'layers.0.feed_forward.w2.lora_A', 'layers.0.feed_forward.w2.lora_B', 'layers.1.attention.wqkv.lora_A', 'layers.1.attention.wqkv.lora_B', 'layers.1.attention.wo.lora_A', 'layers.1.attention.wo.lora_B', 'layers.1.feed_forward.w1.lora_A', 'layers.1.feed_forward.w1.lora_B', 
'layers.1.feed_forward.w3.lora_A', 'layers.1.feed_forward.w3.lora_B', 'layers.1.feed_forward.w2.lora_A', 'layers.1.feed_forward.w2.lora_B', 'layers.2.attention.wqkv.lora_A', 'layers.2.attention.wqkv.lora_B', 'layers.2.attention.wo.lora_A', 'layers.2.attention.wo.lora_B', 'layers.2.feed_forward.w1.lora_A', 'layers.2.feed_forward.w1.lora_B', 'layers.2.feed_forward.w3.lora_A', 'layers.2.feed_forward.w3.lora_B', 'layers.2.feed_forward.w2.lora_A', 'layers.2.feed_forward.w2.lora_B', 'layers.3.attention.wqkv.lora_A', 'layers.3.attention.wqkv.lora_B', 'layers.3.attention.wo.lora_A', 'layers.3.attention.wo.lora_B', 'layers.3.feed_forward.w1.lora_A', 'layers.3.feed_forward.w1.lora_B', 'layers.3.feed_forward.w3.lora_A', 'layers.3.feed_forward.w3.lora_B', 'layers.3.feed_forward.w2.lora_A', 'layers.3.feed_forward.w2.lora_B', 'layers.4.attention.wqkv.lora_A', 'layers.4.attention.wqkv.lora_B', 'layers.4.attention.wo.lora_A', 'layers.4.attention.wo.lora_B', 'layers.4.feed_forward.w1.lora_A', 'layers.4.feed_forward.w1.lora_B', 'layers.4.feed_forward.w3.lora_A', 'layers.4.feed_forward.w3.lora_B', 'layers.4.feed_forward.w2.lora_A', 'layers.4.feed_forward.w2.lora_B', 'layers.5.attention.wqkv.lora_A', 'layers.5.attention.wqkv.lora_B', 'layers.5.attention.wo.lora_A', 'layers.5.attention.wo.lora_B', 'layers.5.feed_forward.w1.lora_A', 'layers.5.feed_forward.w1.lora_B', 'layers.5.feed_forward.w3.lora_A', 'layers.5.feed_forward.w3.lora_B', 'layers.5.feed_forward.w2.lora_A', 'layers.5.feed_forward.w2.lora_B', 'layers.6.attention.wqkv.lora_A', 'layers.6.attention.wqkv.lora_B', 'layers.6.attention.wo.lora_A', 'layers.6.attention.wo.lora_B', 'layers.6.feed_forward.w1.lora_A', 'layers.6.feed_forward.w1.lora_B', 'layers.6.feed_forward.w3.lora_A', 'layers.6.feed_forward.w3.lora_B', 'layers.6.feed_forward.w2.lora_A', 'layers.6.feed_forward.w2.lora_B', 'layers.7.attention.wqkv.lora_A', 'layers.7.attention.wqkv.lora_B', 'layers.7.attention.wo.lora_A', 'layers.7.attention.wo.lora_B', 
'layers.7.feed_forward.w1.lora_A', 'layers.7.feed_forward.w1.lora_B', 'layers.7.feed_forward.w3.lora_A', 'layers.7.feed_forward.w3.lora_B', 'layers.7.feed_forward.w2.lora_A', 'layers.7.feed_forward.w2.lora_B', 'layers.8.attention.wqkv.lora_A', 'layers.8.attention.wqkv.lora_B', 'layers.8.attention.wo.lora_A', 'layers.8.attention.wo.lora_B', 'layers.8.feed_forward.w1.lora_A', 'layers.8.feed_forward.w1.lora_B', 'layers.8.feed_forward.w3.lora_A', 'layers.8.feed_forward.w3.lora_B', 'layers.8.feed_forward.w2.lora_A', 'layers.8.feed_forward.w2.lora_B', 'layers.9.attention.wqkv.lora_A', 'layers.9.attention.wqkv.lora_B', 'layers.9.attention.wo.lora_A', 'layers.9.attention.wo.lora_B', 'layers.9.feed_forward.w1.lora_A', 'layers.9.feed_forward.w1.lora_B', 'layers.9.feed_forward.w3.lora_A', 'layers.9.feed_forward.w3.lora_B', 'layers.9.feed_forward.w2.lora_A', 'layers.9.feed_forward.w2.lora_B', 'layers.10.attention.wqkv.lora_A', 'layers.10.attention.wqkv.lora_B', 'layers.10.attention.wo.lora_A', 'layers.10.attention.wo.lora_B', 'layers.10.feed_forward.w1.lora_A', 'layers.10.feed_forward.w1.lora_B', 'layers.10.feed_forward.w3.lora_A', 'layers.10.feed_forward.w3.lora_B', 'layers.10.feed_forward.w2.lora_A', 'layers.10.feed_forward.w2.lora_B', 'layers.11.attention.wqkv.lora_A', 'layers.11.attention.wqkv.lora_B', 'layers.11.attention.wo.lora_A', 'layers.11.attention.wo.lora_B', 'layers.11.feed_forward.w1.lora_A', 'layers.11.feed_forward.w1.lora_B', 'layers.11.feed_forward.w3.lora_A', 'layers.11.feed_forward.w3.lora_B', 'layers.11.feed_forward.w2.lora_A', 'layers.11.feed_forward.w2.lora_B', 'layers.12.attention.wqkv.lora_A', 'layers.12.attention.wqkv.lora_B', 'layers.12.attention.wo.lora_A', 'layers.12.attention.wo.lora_B', 'layers.12.feed_forward.w1.lora_A', 'layers.12.feed_forward.w1.lora_B', 'layers.12.feed_forward.w3.lora_A', 'layers.12.feed_forward.w3.lora_B', 'layers.12.feed_forward.w2.lora_A', 'layers.12.feed_forward.w2.lora_B', 'layers.13.attention.wqkv.lora_A', 
'layers.13.attention.wqkv.lora_B', 'layers.13.attention.wo.lora_A', 'layers.13.attention.wo.lora_B', 'layers.13.feed_forward.w1.lora_A', 'layers.13.feed_forward.w1.lora_B', 'layers.13.feed_forward.w3.lora_A', 'layers.13.feed_forward.w3.lora_B', 'layers.13.feed_forward.w2.lora_A', 'layers.13.feed_forward.w2.lora_B', 'layers.14.attention.wqkv.lora_A', 'layers.14.attention.wqkv.lora_B', 'layers.14.attention.wo.lora_A', 'layers.14.attention.wo.lora_B', 'layers.14.feed_forward.w1.lora_A', 'layers.14.feed_forward.w1.lora_B', 'layers.14.feed_forward.w3.lora_A', 'layers.14.feed_forward.w3.lora_B', 'layers.14.feed_forward.w2.lora_A', 'layers.14.feed_forward.w2.lora_B', 'layers.15.attention.wqkv.lora_A', 'layers.15.attention.wqkv.lora_B', 'layers.15.attention.wo.lora_A', 'layers.15.attention.wo.lora_B', 'layers.15.feed_forward.w1.lora_A', 'layers.15.feed_forward.w1.lora_B', 'layers.15.feed_forward.w3.lora_A', 'layers.15.feed_forward.w3.lora_B', 'layers.15.feed_forward.w2.lora_A', 'layers.15.feed_forward.w2.lora_B', 'layers.16.attention.wqkv.lora_A', 'layers.16.attention.wqkv.lora_B', 'layers.16.attention.wo.lora_A', 'layers.16.attention.wo.lora_B', 'layers.16.feed_forward.w1.lora_A', 'layers.16.feed_forward.w1.lora_B', 'layers.16.feed_forward.w3.lora_A', 'layers.16.feed_forward.w3.lora_B', 'layers.16.feed_forward.w2.lora_A', 'layers.16.feed_forward.w2.lora_B', 'layers.17.attention.wqkv.lora_A', 'layers.17.attention.wqkv.lora_B', 'layers.17.attention.wo.lora_A', 'layers.17.attention.wo.lora_B', 'layers.17.feed_forward.w1.lora_A', 'layers.17.feed_forward.w1.lora_B', 'layers.17.feed_forward.w3.lora_A', 'layers.17.feed_forward.w3.lora_B', 'layers.17.feed_forward.w2.lora_A', 'layers.17.feed_forward.w2.lora_B', 'layers.18.attention.wqkv.lora_A', 'layers.18.attention.wqkv.lora_B', 'layers.18.attention.wo.lora_A', 'layers.18.attention.wo.lora_B', 'layers.18.feed_forward.w1.lora_A', 'layers.18.feed_forward.w1.lora_B', 'layers.18.feed_forward.w3.lora_A', 
'layers.18.feed_forward.w3.lora_B', 'layers.18.feed_forward.w2.lora_A', 'layers.18.feed_forward.w2.lora_B', 'layers.19.attention.wqkv.lora_A', 'layers.19.attention.wqkv.lora_B', 'layers.19.attention.wo.lora_A', 'layers.19.attention.wo.lora_B', 'layers.19.feed_forward.w1.lora_A', 'layers.19.feed_forward.w1.lora_B', 'layers.19.feed_forward.w3.lora_A', 'layers.19.feed_forward.w3.lora_B', 'layers.19.feed_forward.w2.lora_A', 'layers.19.feed_forward.w2.lora_B', 'layers.20.attention.wqkv.lora_A', 'layers.20.attention.wqkv.lora_B', 'layers.20.attention.wo.lora_A', 'layers.20.attention.wo.lora_B', 'layers.20.feed_forward.w1.lora_A', 'layers.20.feed_forward.w1.lora_B', 'layers.20.feed_forward.w3.lora_A', 'layers.20.feed_forward.w3.lora_B', 'layers.20.feed_forward.w2.lora_A', 'layers.20.feed_forward.w2.lora_B', 'layers.21.attention.wqkv.lora_A', 'layers.21.attention.wqkv.lora_B', 'layers.21.attention.wo.lora_A', 'layers.21.attention.wo.lora_B', 'layers.21.feed_forward.w1.lora_A', 'layers.21.feed_forward.w1.lora_B', 'layers.21.feed_forward.w3.lora_A', 'layers.21.feed_forward.w3.lora_B', 'layers.21.feed_forward.w2.lora_A', 'layers.21.feed_forward.w2.lora_B', 'layers.22.attention.wqkv.lora_A', 'layers.22.attention.wqkv.lora_B', 'layers.22.attention.wo.lora_A', 'layers.22.attention.wo.lora_B', 'layers.22.feed_forward.w1.lora_A', 'layers.22.feed_forward.w1.lora_B', 'layers.22.feed_forward.w3.lora_A', 'layers.22.feed_forward.w3.lora_B', 'layers.22.feed_forward.w2.lora_A', 'layers.22.feed_forward.w2.lora_B', 'layers.23.attention.wqkv.lora_A', 'layers.23.attention.wqkv.lora_B', 'layers.23.attention.wo.lora_A', 'layers.23.attention.wo.lora_B', 'layers.23.feed_forward.w1.lora_A', 'layers.23.feed_forward.w1.lora_B', 'layers.23.feed_forward.w3.lora_A', 'layers.23.feed_forward.w3.lora_B', 'layers.23.feed_forward.w2.lora_A', 'layers.23.feed_forward.w2.lora_B', 'output.lora_A', 'output.lora_B', 'fast_embeddings.lora_A', 'fast_embeddings.lora_B', 'fast_layers.0.attention.wqkv.lora_A', 
'fast_layers.0.attention.wqkv.lora_B', 'fast_layers.0.attention.wo.lora_A', 'fast_layers.0.attention.wo.lora_B', 'fast_layers.0.feed_forward.w1.lora_A', 'fast_layers.0.feed_forward.w1.lora_B', 'fast_layers.0.feed_forward.w3.lora_A', 'fast_layers.0.feed_forward.w3.lora_B', 'fast_layers.0.feed_forward.w2.lora_A', 'fast_layers.0.feed_forward.w2.lora_B', 'fast_layers.1.attention.wqkv.lora_A', 'fast_layers.1.attention.wqkv.lora_B', 'fast_layers.1.attention.wo.lora_A', 'fast_layers.1.attention.wo.lora_B', 'fast_layers.1.feed_forward.w1.lora_A', 'fast_layers.1.feed_forward.w1.lora_B', 'fast_layers.1.feed_forward.w3.lora_A', 'fast_layers.1.feed_forward.w3.lora_B', 'fast_layers.1.feed_forward.w2.lora_A', 'fast_layers.1.feed_forward.w2.lora_B', 'fast_layers.2.attention.wqkv.lora_A', 'fast_layers.2.attention.wqkv.lora_B', 'fast_layers.2.attention.wo.lora_A', 'fast_layers.2.attention.wo.lora_B', 'fast_layers.2.feed_forward.w1.lora_A', 'fast_layers.2.feed_forward.w1.lora_B', 'fast_layers.2.feed_forward.w3.lora_A', 'fast_layers.2.feed_forward.w3.lora_B', 'fast_layers.2.feed_forward.w2.lora_A', 'fast_layers.2.feed_forward.w2.lora_B', 'fast_layers.3.attention.wqkv.lora_A', 'fast_layers.3.attention.wqkv.lora_B', 'fast_layers.3.attention.wo.lora_A', 'fast_layers.3.attention.wo.lora_B', 'fast_layers.3.feed_forward.w1.lora_A', 'fast_layers.3.feed_forward.w1.lora_B', 'fast_layers.3.feed_forward.w3.lora_A', 'fast_layers.3.feed_forward.w3.lora_B', 'fast_layers.3.feed_forward.w2.lora_A', 'fast_layers.3.feed_forward.w2.lora_B', 'fast_output.lora_A', 'fast_output.lora_B'], unexpected_keys=[])
[2024-07-09 15:43:09,834][__main__][INFO] - [rank: 0] Instantiating callbacks...
[2024-07-09 15:43:09,834][fish_speech.utils.instantiators][INFO] - [rank: 0] Instantiating callback
[2024-07-09 15:43:09,837][fish_speech.utils.instantiators][INFO] - [rank: 0] Instantiating callback
[2024-07-09 15:43:09,837][fish_speech.utils.instantiators][INFO] - [rank: 0] Instantiating callback
[2024-07-09 15:43:09,838][fish_speech.utils.instantiators][INFO] - [rank: 0] Instantiating callback
[2024-07-09 15:43:09,846][__main__][INFO] - [rank: 0] Instantiating loggers...
[2024-07-09 15:43:09,846][fish_speech.utils.instantiators][INFO] - [rank: 0] Instantiating logger
[2024-07-09 15:43:09,851][__main__][INFO] - [rank: 0] Instantiating trainer
[2024-07-09 15:43:09,895][__main__][INFO] - [rank: 0] Logging hyperparameters!
[2024-07-09 15:43:09,926][__main__][INFO] - [rank: 0] Starting training!
[2024-07-09 15:43:24,580][fish_speech.models.text2semantic.lit_module][INFO] - [rank: 0] Set weight decay: 0.005 for 432 parameters
[2024-07-09 15:43:24,581][fish_speech.models.text2semantic.lit_module][INFO] - [rank: 0] Set weight decay: 0.0 for 61 parameters
[2024-07-09 15:43:24,890][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 2 / 3 files
[2024-07-09 15:43:24,890][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 1 / 3 files
[2024-07-09 15:43:24,890][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 1 / 3 files
[2024-07-09 15:43:24,890][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 2 / 3 files
[2024-07-09 15:43:24,986][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 2 groups of data
[2024-07-09 15:43:25,009][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 1 groups of data
[2024-07-09 15:43:25,048][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 5 groups of data
[2024-07-09 15:43:25,063][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 4 groups of data
[2024-07-09 15:43:25,922][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 2 / 3 files
[2024-07-09 15:43:25,937][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 1 / 3 files
[2024-07-09 15:43:25,937][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 2 / 3 files
[2024-07-09 15:43:25,937][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 1 / 3 files
[2024-07-09 15:43:26,020][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 1 groups of data
[2024-07-09 15:43:26,024][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 2 groups of data
[2024-07-09 15:43:26,073][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 5 groups of data
[2024-07-09 15:43:26,098][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 4 groups of data
[2024-07-09 16:20:42,530][__main__][INFO] - [rank: 0] Instantiating datamodule
[2024-07-09 16:20:43,123][datasets][INFO] - PyTorch version 2.3.1 available.
[2024-07-09 16:20:43,419][__main__][INFO] - [rank: 0] Instantiating model
[2024-07-09 16:20:43,452][fish_speech.models.text2semantic.llama][INFO] - [rank: 0] Override max_seq_len to 1024
[2024-07-09 16:20:43,483][fish_speech.models.text2semantic.llama][INFO] - [rank: 0] Loading model from checkpoints/fish-speech-1.2, config: DualARModelArgs(model_type='dual_ar', vocab_size=32000, n_layer=24, n_head=16, dim=1024, intermediate_size=4096, n_local_heads=2, head_dim=64, rope_base=1000000.0, norm_eps=1e-06, max_seq_len=1024, dropout=0.1, tie_word_embeddings=False, attention_qkv_bias=False, codebook_size=1024, num_codebooks=4, use_gradient_checkpointing=True, initializer_range=0.02, n_fast_layer=4)
[2024-07-09 16:20:49,542][fish_speech.models.text2semantic.llama][INFO] - [rank: 0] LoRA setup: LoraConfig(r=8, lora_alpha=16, lora_dropout=0.01)
[2024-07-09 16:20:49,572][fish_speech.models.text2semantic.llama][INFO] - [rank: 0] Loaded weights with error: _IncompatibleKeys(missing_keys=['embeddings.lora_A', 'embeddings.lora_B', 'codebook_embeddings.lora_A', 'codebook_embeddings.lora_B', 'layers.0.attention.wqkv.lora_A', 
'layers.0.attention.wqkv.lora_B', 'layers.0.attention.wo.lora_A', 'layers.0.attention.wo.lora_B', 'layers.0.feed_forward.w1.lora_A', 'layers.0.feed_forward.w1.lora_B', 'layers.0.feed_forward.w3.lora_A', 'layers.0.feed_forward.w3.lora_B', 'layers.0.feed_forward.w2.lora_A', 'layers.0.feed_forward.w2.lora_B', 'layers.1.attention.wqkv.lora_A', 'layers.1.attention.wqkv.lora_B', 'layers.1.attention.wo.lora_A', 'layers.1.attention.wo.lora_B', 'layers.1.feed_forward.w1.lora_A', 'layers.1.feed_forward.w1.lora_B', 'layers.1.feed_forward.w3.lora_A', 'layers.1.feed_forward.w3.lora_B', 'layers.1.feed_forward.w2.lora_A', 'layers.1.feed_forward.w2.lora_B', 'layers.2.attention.wqkv.lora_A', 'layers.2.attention.wqkv.lora_B', 'layers.2.attention.wo.lora_A', 'layers.2.attention.wo.lora_B', 'layers.2.feed_forward.w1.lora_A', 'layers.2.feed_forward.w1.lora_B', 'layers.2.feed_forward.w3.lora_A', 'layers.2.feed_forward.w3.lora_B', 'layers.2.feed_forward.w2.lora_A', 'layers.2.feed_forward.w2.lora_B', 'layers.3.attention.wqkv.lora_A', 'layers.3.attention.wqkv.lora_B', 'layers.3.attention.wo.lora_A', 'layers.3.attention.wo.lora_B', 'layers.3.feed_forward.w1.lora_A', 'layers.3.feed_forward.w1.lora_B', 'layers.3.feed_forward.w3.lora_A', 'layers.3.feed_forward.w3.lora_B', 'layers.3.feed_forward.w2.lora_A', 'layers.3.feed_forward.w2.lora_B', 'layers.4.attention.wqkv.lora_A', 'layers.4.attention.wqkv.lora_B', 'layers.4.attention.wo.lora_A', 'layers.4.attention.wo.lora_B', 'layers.4.feed_forward.w1.lora_A', 'layers.4.feed_forward.w1.lora_B', 'layers.4.feed_forward.w3.lora_A', 'layers.4.feed_forward.w3.lora_B', 'layers.4.feed_forward.w2.lora_A', 'layers.4.feed_forward.w2.lora_B', 'layers.5.attention.wqkv.lora_A', 'layers.5.attention.wqkv.lora_B', 'layers.5.attention.wo.lora_A', 'layers.5.attention.wo.lora_B', 'layers.5.feed_forward.w1.lora_A', 'layers.5.feed_forward.w1.lora_B', 'layers.5.feed_forward.w3.lora_A', 'layers.5.feed_forward.w3.lora_B', 'layers.5.feed_forward.w2.lora_A', 
'layers.8.attention.wqkv.lora_A', … (the missing-key list continues with the same lora_A/lora_B pairs — attention.wqkv, attention.wo, feed_forward.w1, feed_forward.w3, feed_forward.w2 — for every layer 8 through 23, then 'output', 'fast_embeddings', 'fast_layers.0' through 'fast_layers.3' with the same sub-modules, and 'fast_output') …, 'fast_output.lora_A', 'fast_output.lora_B'], unexpected_keys=[])
[2024-07-09 16:20:49,581][__main__][INFO] - [rank: 0] Instantiating callbacks...
[2024-07-09 16:20:49,582][fish_speech.utils.instantiators][INFO] - [rank: 0] Instantiating callback
[2024-07-09 16:20:49,584][fish_speech.utils.instantiators][INFO] - [rank: 0] Instantiating callback
[2024-07-09 16:20:49,585][fish_speech.utils.instantiators][INFO] - [rank: 0] Instantiating callback
[2024-07-09 16:20:49,585][fish_speech.utils.instantiators][INFO] - [rank: 0] Instantiating callback
[2024-07-09 16:20:49,603][__main__][INFO] - [rank: 0] Instantiating loggers...
[2024-07-09 16:20:49,603][fish_speech.utils.instantiators][INFO] - [rank: 0] Instantiating logger
[2024-07-09 16:20:49,610][__main__][INFO] - [rank: 0] Instantiating trainer
[2024-07-09 16:20:49,655][__main__][INFO] - [rank: 0] Logging hyperparameters!
[2024-07-09 16:20:49,720][__main__][INFO] - [rank: 0] Starting training!
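The `_IncompatibleKeys` report above is expected, not a failure: the base checkpoint predates the LoRA wrapping, so it contains no adapter tensors, and every `lora_A`/`lora_B` parameter is reported as missing when the state dict is loaded non-strictly (`unexpected_keys` is empty, confirming nothing in the checkpoint was ignored). A minimal sketch of the pattern, using a hypothetical `LoRALinear` module rather than fish_speech's actual implementation:

```python
import torch
from torch import nn


class LoRALinear(nn.Module):
    """Linear layer with a low-rank adapter (illustrative, not fish_speech's)."""

    def __init__(self, in_features: int, out_features: int, r: int = 8, alpha: int = 16):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        nn.init.normal_(self.weight, std=0.02)
        # Adapter factors: absent from the base checkpoint, hence "missing".
        self.lora_A = nn.Parameter(torch.zeros(r, in_features))
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        base = x @ self.weight.T
        # lora_B is zero-initialized, so the adapter starts as a no-op.
        return base + (x @ self.lora_A.T @ self.lora_B.T) * self.scale


# A "base" checkpoint saved before LoRA wrapping holds only `weight`.
base_state = {"weight": torch.randn(4, 4)}
layer = LoRALinear(4, 4, r=2)

# strict=False returns _IncompatibleKeys instead of raising.
result = layer.load_state_dict(base_state, strict=False)
print(result.missing_keys)     # ['lora_A', 'lora_B'] — expected for fresh adapters
print(result.unexpected_keys)  # []
```

This mirrors the log: only `lora_A`/`lora_B` names are missing, every base weight loaded cleanly, and the freshly initialized adapters are what the fine-tune will train.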
[2024-07-09 16:21:04,418][fish_speech.models.text2semantic.lit_module][INFO] - [rank: 0] Set weight decay: 0.01 for 432 parameters
[2024-07-09 16:21:04,418][fish_speech.models.text2semantic.lit_module][INFO] - [rank: 0] Set weight decay: 0.0 for 61 parameters
[2024-07-09 16:21:04,720][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 2 / 3 files
[2024-07-09 16:21:04,721][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 1 / 3 files
[2024-07-09 16:21:04,720][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 2 / 3 files
[2024-07-09 16:21:04,720][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 1 / 3 files
[2024-07-09 16:21:04,812][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 2 groups of data
[2024-07-09 16:21:04,843][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 1 groups of data
[2024-07-09 16:21:04,883][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 5 groups of data
[2024-07-09 16:21:04,891][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 4 groups of data
[2024-07-09 16:21:05,948][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 1 / 3 files
[2024-07-09 16:21:05,948][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 2 / 3 files
[2024-07-09 16:21:05,948][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 2 / 3 files
[2024-07-09 16:21:05,948][fish_speech.datasets.semantic][INFO] - [rank: 0] Reading 1 / 3 files
[2024-07-09 16:21:06,037][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 1 groups of data
[2024-07-09 16:21:06,040][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 2 groups of data
[2024-07-09 16:21:06,105][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 5 groups of data
[2024-07-09 16:21:06,110][fish_speech.datasets.semantic][INFO] - [rank: 0] Read total 4 groups of data
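The two "Set weight decay" lines reflect the usual transformer-training convention of splitting parameters into a decayed group (matrix weights) and an undecayed group (biases and norm scales). A minimal sketch of that split, assuming the common dimensionality rule; the function name and the rule itself are illustrative, not necessarily fish_speech's exact logic:

```python
import torch
from torch import nn


def build_param_groups(model: nn.Module, weight_decay: float = 0.01):
    """Split trainable parameters into two optimizer groups:
    >=2-D tensors (weight matrices) get weight decay; 1-D tensors
    (biases, norm scales) get none — a common transformer convention."""
    decay, no_decay = [], []
    for _, param in model.named_parameters():
        if not param.requires_grad:
            continue
        (decay if param.dim() >= 2 else no_decay).append(param)
    return [
        {"params": decay, "weight_decay": weight_decay},
        {"params": no_decay, "weight_decay": 0.0},
    ]


model = nn.Sequential(nn.Linear(8, 8), nn.LayerNorm(8))
groups = build_param_groups(model, weight_decay=0.01)
optimizer = torch.optim.AdamW(groups, lr=1e-4)

# Linear.weight is 2-D -> decayed; Linear.bias and LayerNorm.{weight, bias}
# are 1-D -> undecayed.
print(len(groups[0]["params"]), len(groups[1]["params"]))  # 1 3
```

Under this convention, the logged counts (432 parameters at decay 0.01, 61 at 0.0) would correspond to the model's weight matrices versus its biases and norm parameters.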