wgb14 committed on
Commit
6f158b8
1 Parent(s): c76f298

Upload log/fast_beam_search/log-decode-iter-3488000-avg-20-beam-4-max-contexts-4-max-states-8-2022-05-14-12-01-52

log/fast_beam_search/log-decode-iter-3488000-avg-20-beam-4-max-contexts-4-max-states-8-2022-05-14-12-01-52 ADDED
@@ -0,0 +1,19 @@
+ 2022-05-14 12:01:52,182 INFO [decode.py:489] Decoding started
+ 2022-05-14 12:01:52,183 INFO [decode.py:495] Device: cuda:0
+ 2022-05-14 12:01:52,186 INFO [decode.py:505] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 500, 'reset_interval': 2000, 'valid_interval': 20000, 'feature_dim': 80, 'subsampling_factor': 4, 'encoder_dim': 512, 'nhead': 8, 'dim_feedforward': 2048, 'num_encoder_layers': 12, 'decoder_dim': 512, 'joiner_dim': 512, 'model_warm_step': 20000, 'env_info': {'k2-version': '1.15.1', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'ecfe7bd6d9189964bf3ff043038918d889a43185', 'k2-git-date': 'Tue May 10 10:57:55 2022', 'lhotse-version': '1.2.0.dev+git.a3d7b8e.clean', 'torch-version': '1.10.0', 'torch-cuda-available': True, 'torch-cuda-version': '11.1', 'python-version': '3.7', 'icefall-git-branch': 'master', 'icefall-git-sha1': 'e30e042-dirty', 'icefall-git-date': 'Fri May 13 13:03:16 2022', 'icefall-path': '/userhome/user/guanbo/icefall_master', 'k2-path': '/opt/conda/lib/python3.7/site-packages/k2-1.15.1.dev20220514+cuda11.1.torch1.10.0-py3.7-linux-x86_64.egg/k2/__init__.py', 'lhotse-path': '/userhome/user/guanbo/lhotse/lhotse/__init__.py', 'hostname': 'ebbb46e00d32c011ec09cda0e7275cb175f4-chenx8564-0', 'IP address': '10.206.33.71'}, 'epoch': 29, 'iter': 3488000, 'avg': 20, 'exp_dir': PosixPath('pruned_transducer_stateless2/exp'), 'bpe_model': 'data/lang_bpe_500/bpe.model', 'decoding_method': 'fast_beam_search', 'beam_size': 4, 'beam': 4, 'max_contexts': 4, 'max_states': 8, 'context_size': 2, 'max_sym_per_frame': 1, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 600, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'return_cuts': True, 'num_workers': 4, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'subset': 'XL', 'small_dev': False, 'res_dir': PosixPath('pruned_transducer_stateless2/exp/fast_beam_search'), 'suffix': 'iter-3488000-avg-20-beam-4-max-contexts-4-max-states-8', 'blank_id': 0, 'unk_id': 2, 'vocab_size': 500}
+ 2022-05-14 12:01:52,186 INFO [decode.py:507] About to create model
+ 2022-05-14 12:01:52,561 INFO [decode.py:524] averaging ['pruned_transducer_stateless2/exp/checkpoint-3488000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3480000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3472000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3464000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3456000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3448000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3440000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3432000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3424000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3416000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3408000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3400000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3392000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3384000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3376000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3368000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3360000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3352000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3344000.pt', 'pruned_transducer_stateless2/exp/checkpoint-3336000.pt']
+ 2022-05-14 12:02:11,183 INFO [decode.py:549] Number of model parameters: 78648040
+ 2022-05-14 12:02:11,184 INFO [asr_datamodule.py:406] About to get dev cuts
+ 2022-05-14 12:02:13,478 INFO [decode.py:401] batch 0/?, cuts processed until now is 99
+ 2022-05-14 12:02:29,739 INFO [decode.py:401] batch 20/?, cuts processed until now is 1626
+ 2022-05-14 12:02:45,580 INFO [decode.py:401] batch 40/?, cuts processed until now is 3172
+ 2022-05-14 12:03:00,035 INFO [decode.py:401] batch 60/?, cuts processed until now is 4929
+ 2022-05-14 12:03:10,122 INFO [decode.py:418] The transcripts are stored in pruned_transducer_stateless2/exp/fast_beam_search/recogs-dev-beam_4_max_contexts_4_max_states_8-iter-3488000-avg-20-beam-4-max-contexts-4-max-states-8.txt
+ 2022-05-14 12:03:10,247 INFO [utils.py:406] [dev-beam_4_max_contexts_4_max_states_8] %WER 10.50% [13415 / 127790, 2992 ins, 3504 del, 6919 sub ]
+ 2022-05-14 12:03:10,558 INFO [decode.py:431] Wrote detailed error stats to pruned_transducer_stateless2/exp/fast_beam_search/errs-dev-beam_4_max_contexts_4_max_states_8-iter-3488000-avg-20-beam-4-max-contexts-4-max-states-8.txt
+ 2022-05-14 12:03:10,560 INFO [decode.py:452]
+ For dev, WER of different settings are:
+ beam_4_max_contexts_4_max_states_8 10.5 best for dev
+
+ 2022-05-14 12:03:10,566 INFO [decode.py:577] Done!
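As a sanity check on the dev-set result logged above, the reported 10.50% WER follows directly from the raw counts in the utils.py line: total errors are insertions + deletions + substitutions, divided by the number of reference words. A minimal Python sketch of that arithmetic (variable names are illustrative, not taken from icefall):

# Reproduce the dev WER from the counts logged above:
# %WER 10.50% [13415 / 127790, 2992 ins, 3504 del, 6919 sub]
insertions = 2992
deletions = 3504
substitutions = 6919
reference_words = 127790

total_errors = insertions + deletions + substitutions  # 13415
wer = 100.0 * total_errors / reference_words           # 10.4977...

print(f"total errors: {total_errors}")  # total errors: 13415
print(f"WER: {wer:.2f}%")               # WER: 10.50%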