icefall-asr-zipformer-wenetspeech-20230615/logs/modified_beam_search/log-decode-epoch-11-avg-3-modified_beam_search-beam-size-4-blank-penalty-2.0-use-averaged-model-2023-06-15-14-09-16
2023-06-15 14:09:16,281 INFO [decode.py:639] Decoding started
2023-06-15 14:09:16,281 INFO [decode.py:645] Device: cuda:0
2023-06-15 14:09:17,860 INFO [lexicon.py:168] Loading pre-compiled data/lang_char/Linv.pt
2023-06-15 14:09:18,104 INFO [decode.py:656] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 50, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'warm_step': 2000, 'env_info': {'k2-version': '1.24.1', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'c51a0b9684442a88ee37f3ce0af686a04b66855b', 'k2-git-date': 'Mon May 1 21:38:03 2023', 'lhotse-version': '1.14.0.dev+git.0f812851.dirty', 'torch-version': '1.10.0+cu102', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'zipformer_wenetspeech', 'icefall-git-sha1': '28d3f6d-dirty', 'icefall-git-date': 'Thu Jun 15 10:30:34 2023', 'icefall-path': '/star-kw/kangwei/code/icefall_wenetspeech', 'k2-path': '/ceph-hw/kangwei/code/k2_release/k2/k2/python/k2/__init__.py', 'lhotse-path': '/ceph-hw/kangwei/dev_tools/anaconda3/envs/rnnt2/lib/python3.8/site-packages/lhotse-1.14.0.dev0+git.0f812851.dirty-py3.8.egg/lhotse/__init__.py', 'hostname': 'de-74279-k2-train-10-0221105906-5745685d6b-t8zzx', 'IP address': '10.177.57.19'}, 'epoch': 11, 'iter': 0, 'avg': 3, 'use_averaged_model': True, 'exp_dir': PosixPath('zipformer/exp_L_context_2'), 'lang_dir': PosixPath('data/lang_char'), 'decoding_method': 'modified_beam_search', 'beam_size': 4, 'beam': 20.0, 'ngram_lm_scale': 0.01, 'max_contexts': 8, 'max_states': 64, 'context_size': 2, 'max_sym_per_frame': 1, 'num_paths': 200, 'nbest_scale': 0.5, 'blank_penalty': 2.0, 'num_encoder_layers': '2,2,3,4,3,2', 'downsampling_factor': '1,2,4,8,4,2', 'feedforward_dim': '512,768,1024,1536,1024,768', 'num_heads': '4,4,4,8,4,4', 'encoder_dim': '192,256,384,512,384,256', 'query_head_dim': '32', 'value_head_dim': '12', 'pos_head_dim': '4', 'pos_dim': 48, 'encoder_unmasked_dim': '192,192,256,256,256,192', 'cnn_module_kernel': '31,31,15,15,15,31', 'decoder_dim': 512, 'joiner_dim': 512, 'causal': False, 'chunk_size': '16,32,64,-1', 'left_context_frames': '64,128,256,-1', 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 1000, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'training_subset': 'L', 'res_dir': PosixPath('zipformer/exp_L_context_2/modified_beam_search'), 'suffix': 'epoch-11-avg-3-modified_beam_search-beam-size-4-blank-penalty-2.0-use-averaged-model', 'blank_id': 0, 'vocab_size': 5537}
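The configuration above decodes with modified_beam_search, beam_size 4 and blank_penalty 2.0. A minimal sketch of how such a blank penalty is typically applied, assuming the penalty is subtracted from the blank logit before the log-softmax at each beam-search step (function and tensor names are illustrative, not the exact icefall code):

    import torch

    def penalized_log_probs(logits: torch.Tensor, blank_penalty: float, blank_id: int = 0) -> torch.Tensor:
        # Subtract a constant from the blank logit so the search emits blank
        # less often, trading a few insertions for fewer deletions.
        if blank_penalty != 0.0:
            logits = logits.clone()
            logits[..., blank_id] -= blank_penalty
        return logits.log_softmax(dim=-1)

    # e.g. one joiner output per active hypothesis, vocab_size 5537, blank_id 0
    log_probs = penalized_log_probs(torch.randn(4, 5537), blank_penalty=2.0)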
2023-06-15 14:09:18,104 INFO [decode.py:658] About to create model
2023-06-15 14:09:18,761 INFO [decode.py:725] Calculating the averaged model over epoch range from 8 (excluded) to 11
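A simplified sketch of averaging checkpoints over the epoch range (8, 11], i.e. epochs 9-11. The real --use-averaged-model path in icefall averages the running model_avg states stored with each checkpoint rather than the raw epoch weights; the file names below and the "model" key layout are assumptions for illustration:

    import torch

    def average_checkpoints(filenames, device="cpu"):
        # Element-wise mean of the model weights stored in each checkpoint.
        avg = None
        for f in filenames:
            state = torch.load(f, map_location=device)["model"]
            if avg is None:
                avg = {k: v.detach().clone().float() for k, v in state.items()}
            else:
                for k in avg:
                    avg[k] += state[k].float()
        return {k: v / len(filenames) for k, v in avg.items()}

    averaged = average_checkpoints(
        [f"zipformer/exp_L_context_2/epoch-{e}.pt" for e in (9, 10, 11)]
    )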
2023-06-15 14:09:29,627 INFO [decode.py:756] Number of model parameters: 75879898
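The parameter count logged above is the usual sum of element counts over all parameter tensors; a self-contained sketch (the toy module stands in for the zipformer transducer):

    import torch.nn as nn

    def count_parameters(model: nn.Module) -> int:
        # Total number of elements across every parameter tensor.
        return sum(p.numel() for p in model.parameters())

    # For the model in this run the same computation reports 75879898.
    print(count_parameters(nn.Linear(512, 5537)))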
2023-06-15 14:09:29,627 INFO [asr_datamodule.py:398] About to get dev cuts
2023-06-15 14:09:29,644 INFO [asr_datamodule.py:336] About to create dev dataset
2023-06-15 14:09:30,256 INFO [asr_datamodule.py:354] About to create dev dataloader
2023-06-15 14:09:30,257 INFO [asr_datamodule.py:403] About to get TEST_NET cuts
2023-06-15 14:09:30,260 INFO [asr_datamodule.py:367] About to create test dataset
2023-06-15 14:09:30,310 WARNING [decode.py:765] Exclude cut with ID TEST_NET_Y0000000004_0ub4ZzdHzBc_S00023 from decoding, num_frames : 8.
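The warning above comes from dropping utterances that would be empty after the encoder frontend's roughly 4x convolutional subsampling. A sketch of such a filter, assuming lhotse cuts and the frame arithmetic of a stride-2 + stride-2 Conv2d frontend (with num_frames = 8 the result is 0, hence the exclusion):

    import logging

    def remove_short_utt(c) -> bool:
        # Frames remaining after two stride-2 convolutions; non-positive
        # means the cut cannot be decoded and is filtered out.
        T = ((c.num_frames - 7) // 2 + 1) // 2
        if T <= 0:
            logging.warning(
                f"Exclude cut with ID {c.id} from decoding, num_frames : {c.num_frames}."
            )
        return T > 0

    # usage sketch: test_net_cuts = test_net_cuts.filter(remove_short_utt)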
2023-06-15 14:09:30,821 INFO [asr_datamodule.py:408] About to get TEST_MEETING cuts
2023-06-15 14:09:30,829 INFO [asr_datamodule.py:367] About to create test dataset
2023-06-15 14:09:40,763 INFO [decode.py:536] batch 0/?, cuts processed until now is 130
2023-06-15 14:12:14,675 INFO [decode.py:536] batch 20/?, cuts processed until now is 3192
2023-06-15 14:14:25,592 INFO [zipformer.py:1728] name=None, attn_weights_entropy = tensor([3.6815, 2.3170, 2.0954, 2.3418], device='cuda:0')
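The attn_weights_entropy lines are periodic diagnostics from the zipformer's self-attention modules. One illustrative way to compute a per-head entropy from attention weights (not necessarily the exact reduction zipformer.py uses):

    import torch

    def attention_entropy(attn_weights: torch.Tensor, eps: float = 1e-20) -> torch.Tensor:
        # attn_weights: (num_heads, query_len, key_len), each row sums to 1.
        ent = -(attn_weights * (attn_weights + eps).log()).sum(dim=-1)
        return ent.mean(dim=-1)  # one entropy value (in nats) per head

    # 4 heads attending uniformly over 32 positions -> entropy ~= log(32) ~ 3.47
    print(attention_entropy(torch.full((4, 16, 32), 1.0 / 32)))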
2023-06-15 14:14:50,834 INFO [decode.py:536] batch 40/?, cuts processed until now is 6421
2023-06-15 14:17:21,000 INFO [decode.py:536] batch 60/?, cuts processed until now is 10176
2023-06-15 14:17:51,032 INFO [zipformer.py:1728] name=None, attn_weights_entropy = tensor([1.9955, 2.2462, 2.1568, 2.1010, 2.2957, 2.2060, 2.2030, 2.4361], device='cuda:0')
2023-06-15 14:17:57,401 INFO [zipformer.py:1728] name=None, attn_weights_entropy = tensor([2.3086, 3.5782, 2.9290, 4.4266], device='cuda:0')
2023-06-15 14:19:10,038 INFO [zipformer.py:1728] name=None, attn_weights_entropy = tensor([2.4956, 3.0278, 2.7592, 2.7097, 2.9157, 2.7470, 2.9556, 3.1833], device='cuda:0')
2023-06-15 14:19:19,907 INFO [decode.py:536] batch 80/?, cuts processed until now is 13727
2023-06-15 14:19:26,978 INFO [decode.py:552] The transcripts are stored in zipformer/exp_L_context_2/modified_beam_search/recogs-DEV-beam_size_4_blank_penalty_2.0-epoch-11-avg-3-modified_beam_search-beam-size-4-blank-penalty-2.0-use-averaged-model.txt
2023-06-15 14:19:27,444 INFO [utils.py:562] [DEV-beam_size_4_blank_penalty_2.0] %WER 7.31% [24146 / 330498, 2919 ins, 9793 del, 11434 sub ]
2023-06-15 14:19:28,459 INFO [decode.py:565] Wrote detailed error stats to zipformer/exp_L_context_2/modified_beam_search/errs-DEV-beam_size_4_blank_penalty_2.0-epoch-11-avg-3-modified_beam_search-beam-size-4-blank-penalty-2.0-use-averaged-model.txt
2023-06-15 14:19:28,462 INFO [decode.py:581]
For DEV, WER of different settings are: | |
beam_size_4_blank_penalty_2.0 7.31 best for DEV | |
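The %WER figure is (insertions + deletions + substitutions) / reference length, with the reference length here effectively counted in characters given data/lang_char; the DEV line above can be reproduced directly:

    ins, dels, subs, ref_len = 2919, 9793, 11434, 330498
    errors = ins + dels + subs               # 24146
    wer = 100.0 * errors / ref_len           # 7.3059... -> reported as 7.31
    print(f"%WER {wer:.2f}% [{errors} / {ref_len}, {ins} ins, {dels} del, {subs} sub ]")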
2023-06-15 14:19:28,725 WARNING [decode.py:765] Exclude cut with ID TEST_NET_Y0000000004_0ub4ZzdHzBc_S00023 from decoding, num_frames : 8.
2023-06-15 14:19:37,901 INFO [decode.py:536] batch 0/?, cuts processed until now is 146
2023-06-15 14:20:06,515 INFO [zipformer.py:1728] name=None, attn_weights_entropy = tensor([1.8901, 4.2283, 2.1833, 1.8579], device='cuda:0')
2023-06-15 14:21:58,178 INFO [decode.py:536] batch 20/?, cuts processed until now is 4116
2023-06-15 14:24:24,030 INFO [decode.py:536] batch 40/?, cuts processed until now is 8601
2023-06-15 14:24:37,389 INFO [zipformer.py:1728] name=None, attn_weights_entropy = tensor([2.1414, 2.8933, 2.8732, 2.6244, 3.1577, 2.0096, 3.2865, 3.0481], device='cuda:0')
2023-06-15 14:26:46,498 INFO [decode.py:536] batch 60/?, cuts processed until now is 14082
2023-06-15 14:28:26,171 INFO [zipformer.py:1728] name=None, attn_weights_entropy = tensor([2.0439, 2.5210, 2.1938, 2.3059, 2.4439, 2.2264, 2.4304, 2.6964], device='cuda:0')
2023-06-15 14:29:10,704 INFO [decode.py:536] batch 80/?, cuts processed until now is 18750
2023-06-15 14:30:57,741 INFO [decode.py:536] batch 100/?, cuts processed until now is 24487
2023-06-15 14:31:12,687 INFO [decode.py:552] The transcripts are stored in zipformer/exp_L_context_2/modified_beam_search/recogs-TEST_NET-beam_size_4_blank_penalty_2.0-epoch-11-avg-3-modified_beam_search-beam-size-4-blank-penalty-2.0-use-averaged-model.txt
2023-06-15 14:31:13,199 INFO [utils.py:562] [TEST_NET-beam_size_4_blank_penalty_2.0] %WER 7.63% [31728 / 415746, 4091 ins, 7427 del, 20210 sub ]
2023-06-15 14:31:14,543 INFO [decode.py:565] Wrote detailed error stats to zipformer/exp_L_context_2/modified_beam_search/errs-TEST_NET-beam_size_4_blank_penalty_2.0-epoch-11-avg-3-modified_beam_search-beam-size-4-blank-penalty-2.0-use-averaged-model.txt
2023-06-15 14:31:14,546 INFO [decode.py:581]
For TEST_NET, WER of different settings are: | |
beam_size_4_blank_penalty_2.0 7.63 best for TEST_NET | |
2023-06-15 14:31:24,002 INFO [decode.py:536] batch 0/?, cuts processed until now is 93
2023-06-15 14:33:52,545 INFO [decode.py:536] batch 20/?, cuts processed until now is 2345
2023-06-15 14:35:14,358 INFO [zipformer.py:1728] name=None, attn_weights_entropy = tensor([2.1572, 4.3534, 2.5844, 2.1116], device='cuda:0')
2023-06-15 14:36:29,837 INFO [decode.py:536] batch 40/?, cuts processed until now is 4929
2023-06-15 14:38:14,598 INFO [zipformer.py:1728] name=None, attn_weights_entropy = tensor([2.2305, 2.8672, 3.5681, 3.5647], device='cuda:0')
2023-06-15 14:38:31,696 INFO [decode.py:536] batch 60/?, cuts processed until now is 7955
2023-06-15 14:38:54,580 INFO [zipformer.py:1728] name=None, attn_weights_entropy = tensor([1.8283, 3.3644, 2.8940, 4.2328], device='cuda:0')
2023-06-15 14:39:01,539 INFO [decode.py:552] The transcripts are stored in zipformer/exp_L_context_2/modified_beam_search/recogs-TEST_MEETING-beam_size_4_blank_penalty_2.0-epoch-11-avg-3-modified_beam_search-beam-size-4-blank-penalty-2.0-use-averaged-model.txt
2023-06-15 14:39:01,797 INFO [utils.py:562] [TEST_MEETING-beam_size_4_blank_penalty_2.0] %WER 12.43% [27395 / 220385, 3243 ins, 12585 del, 11567 sub ]
2023-06-15 14:39:02,486 INFO [decode.py:565] Wrote detailed error stats to zipformer/exp_L_context_2/modified_beam_search/errs-TEST_MEETING-beam_size_4_blank_penalty_2.0-epoch-11-avg-3-modified_beam_search-beam-size-4-blank-penalty-2.0-use-averaged-model.txt
2023-06-15 14:39:02,489 INFO [decode.py:581]
For TEST_MEETING, WER of different settings are: | |
beam_size_4_blank_penalty_2.0 12.43 best for TEST_MEETING | |
2023-06-15 14:39:02,489 INFO [decode.py:801] Done!