wgb14 committed on
Commit
2d4dd7e
1 parent: 6f158b8

Upload log/fast_beam_search/log-decode-iter-3488000-avg-20-beam-4-max-contexts-4-max-states-8-2022-05-14-14-18-23

log/fast_beam_search/log-decode-iter-3488000-avg-20-beam-4-max-contexts-4-max-states-8-2022-05-14-14-18-23 ADDED
@@ -0,0 +1,27 @@
+ 2022-05-14 14:18:23,183 INFO [decode_test.py:489] Decoding started
+ 2022-05-14 14:18:23,184 INFO [decode_test.py:495] Device: cuda:0
+ 2022-05-14 14:18:23,186 INFO [decode_test.py:505] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 500, 'reset_interval': 2000, 'valid_interval': 20000, 'feature_dim': 80, 'subsampling_factor': 4, 'encoder_dim': 512, 'nhead': 8, 'dim_feedforward': 2048, 'num_encoder_layers': 12, 'decoder_dim': 512, 'joiner_dim': 512, 'model_warm_step': 20000, 'env_info': {'k2-version': '1.15.1', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'ecfe7bd6d9189964bf3ff043038918d889a43185', 'k2-git-date': 'Tue May 10 10:57:55 2022', 'lhotse-version': '1.2.0.dev+git.a3d7b8e.clean', 'torch-version': '1.10.0', 'torch-cuda-available': True, 'torch-cuda-version': '11.1', 'python-version': '3.7', 'icefall-git-branch': 'master', 'icefall-git-sha1': 'e30e042-dirty', 'icefall-git-date': 'Fri May 13 13:03:16 2022', 'icefall-path': '/userhome/user/guanbo/icefall_master', 'k2-path': '/opt/conda/lib/python3.7/site-packages/k2-1.15.1.dev20220514+cuda11.1.torch1.10.0-py3.7-linux-x86_64.egg/k2/__init__.py', 'lhotse-path': '/userhome/user/guanbo/lhotse/lhotse/__init__.py', 'hostname': 'd5e575e00d344011ec09cda0e7275cb175f4-chenx8564-0', 'IP address': '10.206.33.89'}, 'epoch': 29, 'iter': 3488000, 'avg': 20, 'exp_dir': PosixPath('pruned_transducer_stateless2/exp'), 'bpe_model': 'data/lang_bpe_500/bpe.model', 'decoding_method': 'fast_beam_search', 'beam_size': 4, 'beam': 4, 'max_contexts': 4, 'max_states': 8, 'context_size': 2, 'max_sym_per_frame': 1, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 600, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'return_cuts': True, 'num_workers': 4, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'subset': 'XL', 'small_dev': False, 'res_dir': PosixPath('pruned_transducer_stateless2/exp/fast_beam_search'), 'suffix': 'iter-3488000-avg-20-beam-4-max-contexts-4-max-states-8', 'blank_id': 0, 'unk_id': 2, 'vocab_size': 500}
+ 2022-05-14 14:18:23,186 INFO [decode_test.py:507] About to create model
+ 2022-05-14 14:18:23,564 INFO [decode_test.py:524] averaging ['pruned_transducer_stateless2/exp/checkpoint-3488000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3480000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3472000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3464000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3456000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3448000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3440000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3432000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3424000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3416000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3408000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3400000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3392000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3384000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3376000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3368000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3360000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3352000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3344000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3336000.pt']
+ 2022-05-14 14:18:42,241 INFO [decode_test.py:549] Number of model parameters: 78648040
+ 2022-05-14 14:18:42,243 INFO [asr_datamodule.py:415] About to get test cuts
+ 2022-05-14 14:18:45,477 INFO [decode_test.py:401] batch 0/?, cuts processed until now is 118
+ 2022-05-14 14:19:01,829 INFO [decode_test.py:401] batch 20/?, cuts processed until now is 1859
+ 2022-05-14 14:19:17,730 INFO [decode_test.py:401] batch 40/?, cuts processed until now is 3579
+ 2022-05-14 14:19:33,751 INFO [decode_test.py:401] batch 60/?, cuts processed until now is 5872
+ 2022-05-14 14:19:48,834 INFO [decode_test.py:401] batch 80/?, cuts processed until now is 8436
+ 2022-05-14 14:20:04,170 INFO [decode_test.py:401] batch 100/?, cuts processed until now is 10038
+ 2022-05-14 14:20:19,157 INFO [decode_test.py:401] batch 120/?, cuts processed until now is 11949
+ 2022-05-14 14:20:34,041 INFO [decode_test.py:401] batch 140/?, cuts processed until now is 14046
+ 2022-05-14 14:20:48,688 INFO [decode_test.py:401] batch 160/?, cuts processed until now is 16010
+ 2022-05-14 14:21:03,607 INFO [decode_test.py:401] batch 180/?, cuts processed until now is 17567
+ 2022-05-14 14:21:18,195 INFO [decode_test.py:401] batch 200/?, cuts processed until now is 18926
+ 2022-05-14 14:21:33,733 INFO [decode_test.py:401] batch 220/?, cuts processed until now is 19923
+ 2022-05-14 14:21:35,201 INFO [decode_test.py:418] The transcripts are stored in pruned_transducer_stateless2/exp/fast_beam_search/recogs-test-beam_4_max_contexts_4_max_states_8-iter-3488000-avg-20-beam-4-max-contexts-4-max-states-8.txt
+ 2022-05-14 14:21:35,614 INFO [utils.py:406] [test-beam_4_max_contexts_4_max_states_8] %WER 10.69% [41753 / 390744, 6857 ins, 11265 del, 23631 sub ]
+ 2022-05-14 14:21:36,564 INFO [decode_test.py:431] Wrote detailed error stats to pruned_transducer_stateless2/exp/fast_beam_search/errs-test-beam_4_max_contexts_4_max_states_8-iter-3488000-avg-20-beam-4-max-contexts-4-max-states-8.txt
+ 2022-05-14 14:21:36,566 INFO [decode_test.py:452]
+ For test, WER of different settings are:
+ beam_4_max_contexts_4_max_states_8 10.69 best for test
+
+ 2022-05-14 14:21:36,589 INFO [decode_test.py:577] Done!
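
As a sanity check, the `%WER 10.69%` line in the log can be reproduced from its own error counts. This is a minimal illustrative sketch, not code from icefall; the variable names are made up, and only the numbers come from the log:

```python
# Counts taken from the log line:
# %WER 10.69% [41753 / 390744, 6857 ins, 11265 del, 23631 sub]
ins, dels, subs = 6857, 11265, 23631  # insertions, deletions, substitutions
ref_words = 390744                    # total words in the reference transcripts

errors = ins + dels + subs            # total word-level errors (41753 in the log)
wer = 100.0 * errors / ref_words      # word error rate, percent

print(f"%WER {wer:.2f}% [{errors} / {ref_words}]")
```

WER counts every insertion, deletion, and substitution against the number of reference words, which is why it can exceed 100% on badly misrecognized utterances.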