Yuekai committed
Commit: e2b3804
1 Parent(s): 156be29

update results

Files changed (34)
  1. .gitattributes +1 -0
  2. data/lang_char/3-gram.unpruned.arpa +3 -0
  3. data/lang_char/L.pt +2 -2
  4. data/lang_char/LG.pt +3 -0
  5. data/lang_char/L_disambig.pt +2 -2
  6. data/lang_char/Linv.pt +2 -2
  7. data/lang_char/words.txt +3 -0
  8. exp/fast_beam_search_nbest/errs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt +0 -0
  9. exp/fast_beam_search_nbest/errs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt +0 -0
  10. exp/fast_beam_search_nbest/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-2022-07-13-11-47-42 +29 -0
  11. exp/fast_beam_search_nbest/recogs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt +0 -0
  12. exp/fast_beam_search_nbest/recogs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt +0 -0
  13. exp/fast_beam_search_nbest/wer-summary-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt +2 -0
  14. exp/fast_beam_search_nbest/wer-summary-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt +2 -0
  15. exp/fast_beam_search_nbest_LG/errs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt +0 -0
  16. exp/fast_beam_search_nbest_LG/errs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt +0 -0
  17. exp/fast_beam_search_nbest_LG/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01-2022-07-13-11-50-57 +29 -0
  18. exp/fast_beam_search_nbest_LG/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01-2022-07-13-13-16-58 +31 -0
  19. exp/fast_beam_search_nbest_LG/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01-2022-07-13-13-24-43 +22 -0
  20. exp/fast_beam_search_nbest_LG/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01-2022-07-13-13-31-36 +31 -0
  21. exp/fast_beam_search_nbest_LG/recogs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt +0 -0
  22. exp/fast_beam_search_nbest_LG/recogs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt +0 -0
  23. exp/fast_beam_search_nbest_LG/wer-summary-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt +2 -0
  24. exp/fast_beam_search_nbest_LG/wer-summary-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt +2 -0
  25. exp/fast_beam_search_nbest_oracle/errs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt +0 -0
  26. exp/fast_beam_search_nbest_oracle/errs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt +0 -0
  27. exp/fast_beam_search_nbest_oracle/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-2022-07-13-11-50-42 +11 -0
  28. exp/fast_beam_search_nbest_oracle/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-2022-07-13-12-55-59 +11 -0
  29. exp/fast_beam_search_nbest_oracle/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-2022-07-13-12-58-10 +11 -0
  30. exp/fast_beam_search_nbest_oracle/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-2022-07-13-13-06-03 +29 -0
  31. exp/fast_beam_search_nbest_oracle/recogs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt +0 -0
  32. exp/fast_beam_search_nbest_oracle/recogs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt +0 -0
  33. exp/fast_beam_search_nbest_oracle/wer-summary-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt +2 -0
  34. exp/fast_beam_search_nbest_oracle/wer-summary-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt +2 -0
.gitattributes CHANGED
@@ -25,3 +25,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zstandard filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ data/lang_char/3-gram.unpruned.arpa filter=lfs diff=lfs merge=lfs -text
data/lang_char/3-gram.unpruned.arpa ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:78600f81833bd982677af1bfb458f740bcc7f4793b62fa894c7d84e667ec5ccb
+ size 152720598
data/lang_char/L.pt CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:95cd9898f926a475a9196dd3681cb88d6a3d65cff1b4d3f31c0062e5fb179c4e
- size 20741301
+ oid sha256:0b62ee0eaf7d32d01d3420ed60e7644c14b82afcb8f8f41d699e9ef98b42e1b0
+ size 20741223
data/lang_char/LG.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0cf2d2164aa980ae102ddb4cf1a773e799d26a9c936a423455a425ec89892970
+ size 281134819
data/lang_char/L_disambig.pt CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:c5a47e478ec01744df9dcb275d7b7d71ff17e2c79b531241c7d278784842eedb
- size 21491829
+ oid sha256:5cda172d28f3ccd19ca15d6d0942b74e9214b3eff5852fe5ba4f6c67edf6eb67
+ size 21491751
data/lang_char/Linv.pt CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:1a9518b34fd91b901d83445a90eda4fdf0fc62850e3442bc5b2ca291a974d0cd
- size 20741303
+ oid sha256:55dc05bddf36629dc86b02ee7e29a7e9ffe069d305a63a47938d3303fbd1cc75
+ size 20741223
data/lang_char/words.txt CHANGED
@@ -230947,3 +230947,6 @@ zx 9724
  龟鱼 230946
  龟鱼酒店 230947
  A 230948
+ #0 230949
+ <s> 230950
+ </s> 230951
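
The three symbols added to words.txt (#0, <s>, </s>) are the disambiguation and sentence-boundary entries the word table needs before an LG decoding graph can be compiled from L_disambig.pt and the new 3-gram.unpruned.arpa, which is why LG.pt also appears in this commit. Below is a minimal sketch of that compilation with k2; the G_3_gram.fst.txt input (an OpenFst text FST assumed to have been converted from the ARPA, e.g. with kaldilm) and all paths are illustrative assumptions, not the exact icefall script, which additionally strips the disambiguation symbols after determinization.

```python
# Hedged sketch: compose the lexicon (L) with the 3-gram LM (G) into LG.pt.
# Assumes G_3_gram.fst.txt was already produced from 3-gram.unpruned.arpa.
import torch
import k2

L = k2.Fsa.from_dict(torch.load("data/lang_char/L_disambig.pt"))
with open("data/lang_char/G_3_gram.fst.txt") as f:
    G = k2.Fsa.from_openfst(f.read(), acceptor=False)

L = k2.arc_sort(L)
G = k2.arc_sort(G)

LG = k2.compose(L, G)    # L o G
LG = k2.connect(LG)      # drop unreachable states
LG = k2.determinize(LG)  # deterministic graph for fast_beam_search_nbest_LG
LG = k2.connect(LG)
LG = k2.arc_sort(LG)

torch.save(LG.as_dict(), "data/lang_char/LG.pt")
```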
exp/fast_beam_search_nbest/errs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt ADDED
The diff for this file is too large to render. See raw diff
 
exp/fast_beam_search_nbest/errs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt ADDED
The diff for this file is too large to render. See raw diff
 
exp/fast_beam_search_nbest/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-2022-07-13-11-47-42 ADDED
@@ -0,0 +1,29 @@
+ 2022-07-13 11:47:42,782 INFO [decode.py:632] Decoding started
+ 2022-07-13 11:47:42,782 INFO [decode.py:638] Device: cuda:0
+ 2022-07-13 11:47:43,401 INFO [lexicon.py:176] Loading pre-compiled data/lang_char/Linv.pt
+ 2022-07-13 11:47:43,472 INFO [decode.py:645] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 50, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'model_warm_step': 3000, 'env_info': {'k2-version': '1.17', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'f910452c594ac57b9a6117ad9b353555f95d45d8', 'k2-git-date': 'Mon Jul 4 14:26:28 2022', 'lhotse-version': '1.5.0.dev+git.99bbfcb.clean', 'torch-version': '1.11.0+cu113', 'torch-cuda-available': True, 'torch-cuda-version': '11.3', 'python-version': '3.8', 'icefall-git-branch': 'aishell2', 'icefall-git-sha1': 'dc40220-dirty', 'icefall-git-date': 'Fri Jul 8 03:13:09 2022', 'icefall-path': '/workspace/icefall', 'k2-path': '/usr/lib/python3.8/site-packages/k2-1.17.dev20220707+cuda11.3.torch1.11.0-py3.8-linux-x86_64.egg/k2/__init__.py', 'lhotse-path': '/usr/local/lib/python3.8/dist-packages/lhotse/__init__.py', 'hostname': '3125630', 'IP address': '0.47.177.126'}, 'epoch': 999, 'iter': 0, 'avg': 1, 'use_averaged_model': False, 'exp_dir': PosixPath('/result/context_size_2'), 'lang_dir': PosixPath('data/lang_char'), 'decoding_method': 'fast_beam_search_nbest', 'beam_size': 4, 'beam': 20.0, 'ngram_lm_scale': 0.01, 'max_contexts': 8, 'max_states': 64, 'context_size': 2, 'max_sym_per_frame': 1, 'num_paths': 200, 'nbest_scale': 0.5, 'num_encoder_layers': 24, 'dim_feedforward': 1536, 'nhead': 8, 'encoder_dim': 384, 'decoder_dim': 512, 'joiner_dim': 512, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 600, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures', 'res_dir': PosixPath('/result/context_size_2/fast_beam_search_nbest'), 'suffix': 'epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200', 'blank_id': 0, 'unk_id': 2, 'vocab_size': 5237}
+ 2022-07-13 11:47:43,473 INFO [decode.py:647] About to create model
+ 2022-07-13 11:47:44,189 INFO [checkpoint.py:112] Loading checkpoint from /result/context_size_2/epoch-999.pt
+ 2022-07-13 11:47:50,769 INFO [decode.py:736] Number of model parameters: 96910451
+ 2022-07-13 11:47:50,770 INFO [asr_datamodule.py:408] About to gen cuts from aishell2_cuts_dev.jsonl.gz
+ 2022-07-13 11:47:50,774 INFO [asr_datamodule.py:415] About to gen cuts from aishell2_cuts_test.jsonl.gz
+ 2022-07-13 11:47:50,775 INFO [asr_datamodule.py:347] About to create dev dataset
+ 2022-07-13 11:47:50,978 INFO [asr_datamodule.py:366] About to create dev dataloader
+ 2022-07-13 11:47:56,644 INFO [decode.py:535] batch 0/?, cuts processed until now is 162
+ 2022-07-13 11:48:46,862 INFO [decode.py:552] The transcripts are stored in /result/context_size_2/fast_beam_search_nbest/recogs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt
+ 2022-07-13 11:48:46,920 INFO [utils.py:420] [dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5] %WER 5.37% [1331 / 24802, 39 ins, 63 del, 1229 sub ]
+ 2022-07-13 11:48:47,081 INFO [decode.py:565] Wrote detailed error stats to /result/context_size_2/fast_beam_search_nbest/errs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt
+ 2022-07-13 11:48:47,082 INFO [decode.py:582]
+ For dev, WER of different settings are:
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5 5.37 best for dev
+
+ 2022-07-13 11:48:54,609 INFO [decode.py:535] batch 0/?, cuts processed until now is 167
+ 2022-07-13 11:50:14,675 INFO [decode.py:535] batch 20/?, cuts processed until now is 3914
+ 2022-07-13 11:50:37,933 INFO [decode.py:552] The transcripts are stored in /result/context_size_2/fast_beam_search_nbest/recogs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt
+ 2022-07-13 11:50:38,061 INFO [utils.py:420] [test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5] %WER 5.60% [2772 / 49534, 74 ins, 129 del, 2569 sub ]
+ 2022-07-13 11:50:38,379 INFO [decode.py:565] Wrote detailed error stats to /result/context_size_2/fast_beam_search_nbest/errs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt
+ 2022-07-13 11:50:38,387 INFO [decode.py:582]
+ For test, WER of different settings are:
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5 5.6 best for test
+
+ 2022-07-13 11:50:38,387 INFO [decode.py:765] Done!
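
The %WER lines in this log are simply (insertions + deletions + substitutions) divided by the number of reference characters; e.g. 39 + 63 + 1229 = 1331 errors over 24802 dev characters gives 5.37%. A quick check with the numbers taken from the log above:

```python
# Sanity-check the WER figures reported in the decoding log.
def wer_percent(ins: int, dels: int, subs: int, ref_len: int) -> float:
    return 100.0 * (ins + dels + subs) / ref_len

print(round(wer_percent(39, 63, 1229, 24802), 2))   # 5.37 (dev)
print(round(wer_percent(74, 129, 2569, 49534), 2))  # 5.6  (test)
```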
exp/fast_beam_search_nbest/recogs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt ADDED
The diff for this file is too large to render. See raw diff
 
exp/fast_beam_search_nbest/recogs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt ADDED
The diff for this file is too large to render. See raw diff
 
exp/fast_beam_search_nbest/wer-summary-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt ADDED
@@ -0,0 +1,2 @@
+ settings WER
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5 5.37
exp/fast_beam_search_nbest/wer-summary-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt ADDED
@@ -0,0 +1,2 @@
+ settings WER
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5 5.6
exp/fast_beam_search_nbest_LG/errs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt ADDED
The diff for this file is too large to render. See raw diff
 
exp/fast_beam_search_nbest_LG/errs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt ADDED
The diff for this file is too large to render. See raw diff
 
exp/fast_beam_search_nbest_LG/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01-2022-07-13-11-50-57 ADDED
@@ -0,0 +1,29 @@
+ 2022-07-13 11:50:57,761 INFO [decode.py:632] Decoding started
+ 2022-07-13 11:50:57,761 INFO [decode.py:638] Device: cuda:0
+ 2022-07-13 11:50:58,379 INFO [lexicon.py:176] Loading pre-compiled data/lang_char/Linv.pt
+ 2022-07-13 11:50:58,451 INFO [decode.py:645] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 50, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'model_warm_step': 3000, 'env_info': {'k2-version': '1.17', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'f910452c594ac57b9a6117ad9b353555f95d45d8', 'k2-git-date': 'Mon Jul 4 14:26:28 2022', 'lhotse-version': '1.5.0.dev+git.99bbfcb.clean', 'torch-version': '1.11.0+cu113', 'torch-cuda-available': True, 'torch-cuda-version': '11.3', 'python-version': '3.8', 'icefall-git-branch': 'aishell2', 'icefall-git-sha1': 'dc40220-dirty', 'icefall-git-date': 'Fri Jul 8 03:13:09 2022', 'icefall-path': '/workspace/icefall', 'k2-path': '/usr/lib/python3.8/site-packages/k2-1.17.dev20220707+cuda11.3.torch1.11.0-py3.8-linux-x86_64.egg/k2/__init__.py', 'lhotse-path': '/usr/local/lib/python3.8/dist-packages/lhotse/__init__.py', 'hostname': '3125630', 'IP address': '0.47.177.126'}, 'epoch': 999, 'iter': 0, 'avg': 1, 'use_averaged_model': False, 'exp_dir': PosixPath('/result/context_size_2'), 'lang_dir': PosixPath('data/lang_char'), 'decoding_method': 'fast_beam_search_nbest_LG', 'beam_size': 4, 'beam': 20.0, 'ngram_lm_scale': 0.01, 'max_contexts': 8, 'max_states': 64, 'context_size': 2, 'max_sym_per_frame': 1, 'num_paths': 200, 'nbest_scale': 0.5, 'num_encoder_layers': 24, 'dim_feedforward': 1536, 'nhead': 8, 'encoder_dim': 384, 'decoder_dim': 512, 'joiner_dim': 512, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 600, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures', 'res_dir': PosixPath('/result/context_size_2/fast_beam_search_nbest_LG'), 'suffix': 'epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01', 'blank_id': 0, 'unk_id': 2, 'vocab_size': 5237}
+ 2022-07-13 11:50:58,452 INFO [decode.py:647] About to create model
+ 2022-07-13 11:50:59,170 INFO [checkpoint.py:112] Loading checkpoint from /result/context_size_2/epoch-999.pt
+ 2022-07-13 11:51:06,804 INFO [decode.py:736] Number of model parameters: 96910451
+ 2022-07-13 11:51:06,804 INFO [asr_datamodule.py:408] About to gen cuts from aishell2_cuts_dev.jsonl.gz
+ 2022-07-13 11:51:06,808 INFO [asr_datamodule.py:415] About to gen cuts from aishell2_cuts_test.jsonl.gz
+ 2022-07-13 11:51:06,810 INFO [asr_datamodule.py:347] About to create dev dataset
+ 2022-07-13 11:51:07,015 INFO [asr_datamodule.py:366] About to create dev dataloader
+ 2022-07-13 11:51:09,776 INFO [decode.py:535] batch 0/?, cuts processed until now is 162
+ 2022-07-13 11:51:28,274 INFO [decode.py:552] The transcripts are stored in /result/context_size_2/fast_beam_search_nbest_LG/recogs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt
+ 2022-07-13 11:51:28,368 INFO [utils.py:420] [dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01] %WER 100.19% [24850 / 24802, 48 ins, 55 del, 24747 sub ]
+ 2022-07-13 11:51:28,576 INFO [decode.py:565] Wrote detailed error stats to /result/context_size_2/fast_beam_search_nbest_LG/errs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt
+ 2022-07-13 11:51:28,578 INFO [decode.py:582]
+ For dev, WER of different settings are:
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01 100.19 best for dev
+
+ 2022-07-13 11:51:30,894 INFO [decode.py:535] batch 0/?, cuts processed until now is 167
+ 2022-07-13 11:51:57,049 INFO [decode.py:535] batch 20/?, cuts processed until now is 3914
+ 2022-07-13 11:52:05,658 INFO [decode.py:552] The transcripts are stored in /result/context_size_2/fast_beam_search_nbest_LG/recogs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt
+ 2022-07-13 11:52:05,825 INFO [utils.py:420] [test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01] %WER 100.18% [49622 / 49534, 88 ins, 101 del, 49433 sub ]
+ 2022-07-13 11:52:06,217 INFO [decode.py:565] Wrote detailed error stats to /result/context_size_2/fast_beam_search_nbest_LG/errs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt
+ 2022-07-13 11:52:06,219 INFO [decode.py:582]
+ For test, WER of different settings are:
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01 100.18 best for test
+
+ 2022-07-13 11:52:06,219 INFO [decode.py:765] Done!
exp/fast_beam_search_nbest_LG/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01-2022-07-13-13-16-58 ADDED
@@ -0,0 +1,31 @@
+ 2022-07-13 13:16:58,674 INFO [decode.py:636] Decoding started
+ 2022-07-13 13:16:58,675 INFO [decode.py:642] Device: cuda:0
+ 2022-07-13 13:16:59,297 INFO [lexicon.py:176] Loading pre-compiled data/lang_char/Linv.pt
+ 2022-07-13 13:16:59,369 INFO [decode.py:654] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 50, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'model_warm_step': 3000, 'env_info': {'k2-version': '1.17', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'f910452c594ac57b9a6117ad9b353555f95d45d8', 'k2-git-date': 'Mon Jul 4 14:26:28 2022', 'lhotse-version': '1.5.0.dev+git.99bbfcb.clean', 'torch-version': '1.11.0+cu113', 'torch-cuda-available': True, 'torch-cuda-version': '11.3', 'python-version': '3.8', 'icefall-git-branch': 'aishell2', 'icefall-git-sha1': 'dc40220-dirty', 'icefall-git-date': 'Fri Jul 8 03:13:09 2022', 'icefall-path': '/workspace/icefall', 'k2-path': '/usr/lib/python3.8/site-packages/k2-1.17.dev20220707+cuda11.3.torch1.11.0-py3.8-linux-x86_64.egg/k2/__init__.py', 'lhotse-path': '/usr/local/lib/python3.8/dist-packages/lhotse/__init__.py', 'hostname': '3125630', 'IP address': '0.47.177.126'}, 'epoch': 999, 'iter': 0, 'avg': 1, 'use_averaged_model': False, 'exp_dir': PosixPath('/result/context_size_2'), 'lang_dir': PosixPath('data/lang_char'), 'decoding_method': 'fast_beam_search_nbest_LG', 'beam_size': 4, 'beam': 20.0, 'ngram_lm_scale': 0.01, 'max_contexts': 8, 'max_states': 64, 'context_size': 2, 'max_sym_per_frame': 1, 'num_paths': 200, 'nbest_scale': 0.5, 'num_encoder_layers': 24, 'dim_feedforward': 1536, 'nhead': 8, 'encoder_dim': 384, 'decoder_dim': 512, 'joiner_dim': 512, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 600, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures', 'res_dir': PosixPath('/result/context_size_2/fast_beam_search_nbest_LG'), 'suffix': 'epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01', 'blank_id': 0, 'unk_id': 2, 'vocab_size': 5237}
+ 2022-07-13 13:16:59,370 INFO [decode.py:656] About to create model
+ 2022-07-13 13:17:00,086 INFO [checkpoint.py:112] Loading checkpoint from /result/context_size_2/epoch-999.pt
+ 2022-07-13 13:17:07,172 INFO [lexicon.py:176] Loading pre-compiled data/lang_char/Linv.pt
+ 2022-07-13 13:17:07,249 INFO [decode.py:743] Loading data/lang_char/LG.pt
+ 2022-07-13 13:17:12,536 INFO [decode.py:756] Number of model parameters: 96910451
+ 2022-07-13 13:17:12,537 INFO [asr_datamodule.py:408] About to gen cuts from aishell2_cuts_dev.jsonl.gz
+ 2022-07-13 13:17:12,540 INFO [asr_datamodule.py:415] About to gen cuts from aishell2_cuts_test.jsonl.gz
+ 2022-07-13 13:17:12,541 INFO [asr_datamodule.py:347] About to create dev dataset
+ 2022-07-13 13:17:12,747 INFO [asr_datamodule.py:366] About to create dev dataloader
+ 2022-07-13 13:17:15,883 INFO [decode.py:539] batch 0/?, cuts processed until now is 162
+ 2022-07-13 13:17:39,064 INFO [decode.py:556] The transcripts are stored in /result/context_size_2/fast_beam_search_nbest_LG/recogs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt
+ 2022-07-13 13:17:39,156 INFO [utils.py:420] [dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01] %WER 60.31% [14957 / 24802, 15 ins, 7379 del, 7563 sub ]
+ 2022-07-13 13:17:39,362 INFO [decode.py:569] Wrote detailed error stats to /result/context_size_2/fast_beam_search_nbest_LG/errs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt
+ 2022-07-13 13:17:39,377 INFO [decode.py:586]
+ For dev, WER of different settings are:
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01 60.31 best for dev
+
+ 2022-07-13 13:17:42,138 INFO [decode.py:539] batch 0/?, cuts processed until now is 167
+ 2022-07-13 13:18:15,591 INFO [decode.py:539] batch 20/?, cuts processed until now is 3914
+ 2022-07-13 13:18:26,263 INFO [decode.py:556] The transcripts are stored in /result/context_size_2/fast_beam_search_nbest_LG/recogs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt
+ 2022-07-13 13:18:26,438 INFO [utils.py:420] [test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01] %WER 59.26% [29353 / 49534, 28 ins, 14460 del, 14865 sub ]
+ 2022-07-13 13:18:26,838 INFO [decode.py:569] Wrote detailed error stats to /result/context_size_2/fast_beam_search_nbest_LG/errs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt
+ 2022-07-13 13:18:27,411 INFO [decode.py:586]
+ For test, WER of different settings are:
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01 59.26 best for test
+
+ 2022-07-13 13:18:27,412 INFO [decode.py:786] Done!
exp/fast_beam_search_nbest_LG/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01-2022-07-13-13-24-43 ADDED
@@ -0,0 +1,22 @@
+ 2022-07-13 13:24:43,179 INFO [decode.py:639] Decoding started
+ 2022-07-13 13:24:43,179 INFO [decode.py:645] Device: cuda:0
+ 2022-07-13 13:24:43,828 INFO [lexicon.py:176] Loading pre-compiled data/lang_char/Linv.pt
+ 2022-07-13 13:24:43,899 INFO [decode.py:657] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 50, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'model_warm_step': 3000, 'env_info': {'k2-version': '1.17', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'f910452c594ac57b9a6117ad9b353555f95d45d8', 'k2-git-date': 'Mon Jul 4 14:26:28 2022', 'lhotse-version': '1.5.0.dev+git.99bbfcb.clean', 'torch-version': '1.11.0+cu113', 'torch-cuda-available': True, 'torch-cuda-version': '11.3', 'python-version': '3.8', 'icefall-git-branch': 'aishell2', 'icefall-git-sha1': 'dc40220-dirty', 'icefall-git-date': 'Fri Jul 8 03:13:09 2022', 'icefall-path': '/workspace/icefall', 'k2-path': '/usr/lib/python3.8/site-packages/k2-1.17.dev20220707+cuda11.3.torch1.11.0-py3.8-linux-x86_64.egg/k2/__init__.py', 'lhotse-path': '/usr/local/lib/python3.8/dist-packages/lhotse/__init__.py', 'hostname': '3125630', 'IP address': '0.47.177.126'}, 'epoch': 999, 'iter': 0, 'avg': 1, 'use_averaged_model': False, 'exp_dir': PosixPath('/result/context_size_2'), 'lang_dir': PosixPath('data/lang_char'), 'decoding_method': 'fast_beam_search_nbest_LG', 'beam_size': 4, 'beam': 20.0, 'ngram_lm_scale': 0.01, 'max_contexts': 8, 'max_states': 64, 'context_size': 2, 'max_sym_per_frame': 1, 'num_paths': 200, 'nbest_scale': 0.5, 'num_encoder_layers': 24, 'dim_feedforward': 1536, 'nhead': 8, 'encoder_dim': 384, 'decoder_dim': 512, 'joiner_dim': 512, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 600, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures', 'res_dir': PosixPath('/result/context_size_2/fast_beam_search_nbest_LG'), 'suffix': 'epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01', 'blank_id': 0, 'unk_id': 2, 'vocab_size': 5237}
+ 2022-07-13 13:24:43,900 INFO [decode.py:659] About to create model
+ 2022-07-13 13:24:44,614 INFO [checkpoint.py:112] Loading checkpoint from /result/context_size_2/epoch-999.pt
+ 2022-07-13 13:24:51,676 INFO [lexicon.py:176] Loading pre-compiled data/lang_char/Linv.pt
+ 2022-07-13 13:24:51,751 INFO [decode.py:746] Loading data/lang_char/LG.pt
+ 2022-07-13 13:24:51,963 INFO [decode.py:759] Number of model parameters: 96910451
+ 2022-07-13 13:24:51,963 INFO [asr_datamodule.py:408] About to gen cuts from aishell2_cuts_dev.jsonl.gz
+ 2022-07-13 13:24:51,966 INFO [asr_datamodule.py:415] About to gen cuts from aishell2_cuts_test.jsonl.gz
+ 2022-07-13 13:24:51,968 INFO [asr_datamodule.py:347] About to create dev dataset
+ 2022-07-13 13:24:52,172 INFO [asr_datamodule.py:366] About to create dev dataloader
+ 2022-07-13 13:24:55,331 INFO [decode.py:542] batch 0/?, cuts processed until now is 162
+ 2022-07-13 13:25:19,732 INFO [decode.py:559] The transcripts are stored in /result/context_size_2/fast_beam_search_nbest_LG/recogs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt
+ 2022-07-13 13:25:19,794 INFO [utils.py:420] [dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01] %WER 100.00% [24801 / 24802, 0 ins, 22302 del, 2499 sub ]
+ 2022-07-13 13:25:19,994 INFO [decode.py:572] Wrote detailed error stats to /result/context_size_2/fast_beam_search_nbest_LG/errs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt
+ 2022-07-13 13:25:19,997 INFO [decode.py:589]
+ For dev, WER of different settings are:
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01 100.0 best for dev
+
+ 2022-07-13 13:25:22,787 INFO [decode.py:542] batch 0/?, cuts processed until now is 167
exp/fast_beam_search_nbest_LG/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01-2022-07-13-13-31-36 ADDED
@@ -0,0 +1,31 @@
+ 2022-07-13 13:31:36,014 INFO [decode.py:637] Decoding started
+ 2022-07-13 13:31:36,015 INFO [decode.py:643] Device: cuda:0
+ 2022-07-13 13:31:36,622 INFO [lexicon.py:176] Loading pre-compiled data/lang_char/Linv.pt
+ 2022-07-13 13:31:36,693 INFO [decode.py:655] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 50, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'model_warm_step': 3000, 'env_info': {'k2-version': '1.17', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'f910452c594ac57b9a6117ad9b353555f95d45d8', 'k2-git-date': 'Mon Jul 4 14:26:28 2022', 'lhotse-version': '1.5.0.dev+git.99bbfcb.clean', 'torch-version': '1.11.0+cu113', 'torch-cuda-available': True, 'torch-cuda-version': '11.3', 'python-version': '3.8', 'icefall-git-branch': 'aishell2', 'icefall-git-sha1': 'dc40220-dirty', 'icefall-git-date': 'Fri Jul 8 03:13:09 2022', 'icefall-path': '/workspace/icefall', 'k2-path': '/usr/lib/python3.8/site-packages/k2-1.17.dev20220707+cuda11.3.torch1.11.0-py3.8-linux-x86_64.egg/k2/__init__.py', 'lhotse-path': '/usr/local/lib/python3.8/dist-packages/lhotse/__init__.py', 'hostname': '3125630', 'IP address': '0.47.177.126'}, 'epoch': 999, 'iter': 0, 'avg': 1, 'use_averaged_model': False, 'exp_dir': PosixPath('/result/context_size_2'), 'lang_dir': PosixPath('data/lang_char'), 'decoding_method': 'fast_beam_search_nbest_LG', 'beam_size': 4, 'beam': 20.0, 'ngram_lm_scale': 0.01, 'max_contexts': 8, 'max_states': 64, 'context_size': 2, 'max_sym_per_frame': 1, 'num_paths': 200, 'nbest_scale': 0.5, 'num_encoder_layers': 24, 'dim_feedforward': 1536, 'nhead': 8, 'encoder_dim': 384, 'decoder_dim': 512, 'joiner_dim': 512, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 600, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures', 'res_dir': PosixPath('/result/context_size_2/fast_beam_search_nbest_LG'), 'suffix': 'epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01', 'blank_id': 0, 'unk_id': 2, 'vocab_size': 5237}
+ 2022-07-13 13:31:36,694 INFO [decode.py:657] About to create model
+ 2022-07-13 13:31:37,417 INFO [checkpoint.py:112] Loading checkpoint from /result/context_size_2/epoch-999.pt
+ 2022-07-13 13:31:44,805 INFO [lexicon.py:176] Loading pre-compiled data/lang_char/Linv.pt
+ 2022-07-13 13:31:44,879 INFO [decode.py:744] Loading data/lang_char/LG.pt
+ 2022-07-13 13:31:45,090 INFO [decode.py:757] Number of model parameters: 96910451
+ 2022-07-13 13:31:45,090 INFO [asr_datamodule.py:408] About to gen cuts from aishell2_cuts_dev.jsonl.gz
+ 2022-07-13 13:31:45,094 INFO [asr_datamodule.py:415] About to gen cuts from aishell2_cuts_test.jsonl.gz
+ 2022-07-13 13:31:45,095 INFO [asr_datamodule.py:347] About to create dev dataset
+ 2022-07-13 13:31:45,313 INFO [asr_datamodule.py:366] About to create dev dataloader
+ 2022-07-13 13:31:48,609 INFO [decode.py:540] batch 0/?, cuts processed until now is 162
+ 2022-07-13 13:32:11,655 INFO [decode.py:557] The transcripts are stored in /result/context_size_2/fast_beam_search_nbest_LG/recogs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt
+ 2022-07-13 13:32:11,716 INFO [utils.py:420] [dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01] %WER 5.59% [1387 / 24802, 65 ins, 97 del, 1225 sub ]
+ 2022-07-13 13:32:11,874 INFO [decode.py:570] Wrote detailed error stats to /result/context_size_2/fast_beam_search_nbest_LG/errs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt
+ 2022-07-13 13:32:11,878 INFO [decode.py:587]
+ For dev, WER of different settings are:
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01 5.59 best for dev
+
+ 2022-07-13 13:32:14,583 INFO [decode.py:540] batch 0/?, cuts processed until now is 167
+ 2022-07-13 13:32:49,213 INFO [decode.py:540] batch 20/?, cuts processed until now is 3914
+ 2022-07-13 13:33:00,211 INFO [decode.py:557] The transcripts are stored in /result/context_size_2/fast_beam_search_nbest_LG/recogs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt
+ 2022-07-13 13:33:00,344 INFO [utils.py:420] [test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01] %WER 5.82% [2882 / 49534, 109 ins, 207 del, 2566 sub ]
+ 2022-07-13 13:33:00,653 INFO [decode.py:570] Wrote detailed error stats to /result/context_size_2/fast_beam_search_nbest_LG/errs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt
+ 2022-07-13 13:33:00,656 INFO [decode.py:587]
+ For test, WER of different settings are:
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01 5.82 best for test
+
+ 2022-07-13 13:33:00,656 INFO [decode.py:787] Done!
exp/fast_beam_search_nbest_LG/recogs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt ADDED
The diff for this file is too large to render. See raw diff
 
exp/fast_beam_search_nbest_LG/recogs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt ADDED
The diff for this file is too large to render. See raw diff
 
exp/fast_beam_search_nbest_LG/wer-summary-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt ADDED
@@ -0,0 +1,2 @@
+ settings WER
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01 5.59
exp/fast_beam_search_nbest_LG/wer-summary-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-ngram-lm-scale-0.01.txt ADDED
@@ -0,0 +1,2 @@
+ settings WER
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5_ngram_lm_scale_0.01 5.82
exp/fast_beam_search_nbest_oracle/errs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt ADDED
The diff for this file is too large to render. See raw diff
 
exp/fast_beam_search_nbest_oracle/errs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt ADDED
The diff for this file is too large to render. See raw diff
 
exp/fast_beam_search_nbest_oracle/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-2022-07-13-11-50-42 ADDED
@@ -0,0 +1,11 @@
+ 2022-07-13 11:50:42,224 INFO [decode.py:632] Decoding started
+ 2022-07-13 11:50:42,225 INFO [decode.py:638] Device: cuda:0
+ 2022-07-13 11:50:42,843 INFO [lexicon.py:176] Loading pre-compiled data/lang_char/Linv.pt
+ 2022-07-13 11:50:42,915 INFO [decode.py:645] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 50, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'model_warm_step': 3000, 'env_info': {'k2-version': '1.17', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'f910452c594ac57b9a6117ad9b353555f95d45d8', 'k2-git-date': 'Mon Jul 4 14:26:28 2022', 'lhotse-version': '1.5.0.dev+git.99bbfcb.clean', 'torch-version': '1.11.0+cu113', 'torch-cuda-available': True, 'torch-cuda-version': '11.3', 'python-version': '3.8', 'icefall-git-branch': 'aishell2', 'icefall-git-sha1': 'dc40220-dirty', 'icefall-git-date': 'Fri Jul 8 03:13:09 2022', 'icefall-path': '/workspace/icefall', 'k2-path': '/usr/lib/python3.8/site-packages/k2-1.17.dev20220707+cuda11.3.torch1.11.0-py3.8-linux-x86_64.egg/k2/__init__.py', 'lhotse-path': '/usr/local/lib/python3.8/dist-packages/lhotse/__init__.py', 'hostname': '3125630', 'IP address': '0.47.177.126'}, 'epoch': 999, 'iter': 0, 'avg': 1, 'use_averaged_model': False, 'exp_dir': PosixPath('/result/context_size_2'), 'lang_dir': PosixPath('data/lang_char'), 'decoding_method': 'fast_beam_search_nbest_oracle', 'beam_size': 4, 'beam': 20.0, 'ngram_lm_scale': 0.01, 'max_contexts': 8, 'max_states': 64, 'context_size': 2, 'max_sym_per_frame': 1, 'num_paths': 200, 'nbest_scale': 0.5, 'num_encoder_layers': 24, 'dim_feedforward': 1536, 'nhead': 8, 'encoder_dim': 384, 'decoder_dim': 512, 'joiner_dim': 512, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 600, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures', 'res_dir': PosixPath('/result/context_size_2/fast_beam_search_nbest_oracle'), 'suffix': 'epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200', 'blank_id': 0, 'unk_id': 2, 'vocab_size': 5237}
+ 2022-07-13 11:50:42,915 INFO [decode.py:647] About to create model
+ 2022-07-13 11:50:43,638 INFO [checkpoint.py:112] Loading checkpoint from /result/context_size_2/epoch-999.pt
+ 2022-07-13 11:50:51,193 INFO [decode.py:736] Number of model parameters: 96910451
+ 2022-07-13 11:50:51,194 INFO [asr_datamodule.py:408] About to gen cuts from aishell2_cuts_dev.jsonl.gz
+ 2022-07-13 11:50:51,197 INFO [asr_datamodule.py:415] About to gen cuts from aishell2_cuts_test.jsonl.gz
+ 2022-07-13 11:50:51,198 INFO [asr_datamodule.py:347] About to create dev dataset
+ 2022-07-13 11:50:51,397 INFO [asr_datamodule.py:366] About to create dev dataloader
exp/fast_beam_search_nbest_oracle/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-2022-07-13-12-55-59 ADDED
@@ -0,0 +1,11 @@
+ 2022-07-13 12:55:59,291 INFO [decode.py:636] Decoding started
+ 2022-07-13 12:55:59,292 INFO [decode.py:642] Device: cuda:0
+ 2022-07-13 12:55:59,906 INFO [lexicon.py:176] Loading pre-compiled data/lang_char/Linv.pt
+ 2022-07-13 12:55:59,978 INFO [decode.py:654] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 50, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'model_warm_step': 3000, 'env_info': {'k2-version': '1.17', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'f910452c594ac57b9a6117ad9b353555f95d45d8', 'k2-git-date': 'Mon Jul 4 14:26:28 2022', 'lhotse-version': '1.5.0.dev+git.99bbfcb.clean', 'torch-version': '1.11.0+cu113', 'torch-cuda-available': True, 'torch-cuda-version': '11.3', 'python-version': '3.8', 'icefall-git-branch': 'aishell2', 'icefall-git-sha1': 'dc40220-dirty', 'icefall-git-date': 'Fri Jul 8 03:13:09 2022', 'icefall-path': '/workspace/icefall', 'k2-path': '/usr/lib/python3.8/site-packages/k2-1.17.dev20220707+cuda11.3.torch1.11.0-py3.8-linux-x86_64.egg/k2/__init__.py', 'lhotse-path': '/usr/local/lib/python3.8/dist-packages/lhotse/__init__.py', 'hostname': '3125630', 'IP address': '0.47.177.126'}, 'epoch': 999, 'iter': 0, 'avg': 1, 'use_averaged_model': False, 'exp_dir': PosixPath('/result/context_size_2'), 'lang_dir': PosixPath('data/lang_char'), 'decoding_method': 'fast_beam_search_nbest_oracle', 'beam_size': 4, 'beam': 20.0, 'ngram_lm_scale': 0.01, 'max_contexts': 8, 'max_states': 64, 'context_size': 2, 'max_sym_per_frame': 1, 'num_paths': 200, 'nbest_scale': 0.5, 'num_encoder_layers': 24, 'dim_feedforward': 1536, 'nhead': 8, 'encoder_dim': 384, 'decoder_dim': 512, 'joiner_dim': 512, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 600, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures', 'res_dir': PosixPath('/result/context_size_2/fast_beam_search_nbest_oracle'), 'suffix': 'epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200', 'blank_id': 0, 'unk_id': 2, 'vocab_size': 5237}
+ 2022-07-13 12:55:59,978 INFO [decode.py:656] About to create model
+ 2022-07-13 12:56:00,699 INFO [checkpoint.py:112] Loading checkpoint from /result/context_size_2/epoch-999.pt
+ 2022-07-13 12:56:07,437 INFO [decode.py:745] Number of model parameters: 96910451
+ 2022-07-13 12:56:07,437 INFO [asr_datamodule.py:408] About to gen cuts from aishell2_cuts_dev.jsonl.gz
+ 2022-07-13 12:56:07,440 INFO [asr_datamodule.py:415] About to gen cuts from aishell2_cuts_test.jsonl.gz
+ 2022-07-13 12:56:07,442 INFO [asr_datamodule.py:347] About to create dev dataset
+ 2022-07-13 12:56:07,662 INFO [asr_datamodule.py:366] About to create dev dataloader
exp/fast_beam_search_nbest_oracle/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-2022-07-13-12-58-10 ADDED
@@ -0,0 +1,11 @@
+ 2022-07-13 12:58:10,602 INFO [decode.py:636] Decoding started
+ 2022-07-13 12:58:10,603 INFO [decode.py:642] Device: cuda:0
+ 2022-07-13 12:58:11,214 INFO [lexicon.py:176] Loading pre-compiled data/lang_char/Linv.pt
+ 2022-07-13 12:58:11,286 INFO [decode.py:654] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 50, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'model_warm_step': 3000, 'env_info': {'k2-version': '1.17', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'f910452c594ac57b9a6117ad9b353555f95d45d8', 'k2-git-date': 'Mon Jul 4 14:26:28 2022', 'lhotse-version': '1.5.0.dev+git.99bbfcb.clean', 'torch-version': '1.11.0+cu113', 'torch-cuda-available': True, 'torch-cuda-version': '11.3', 'python-version': '3.8', 'icefall-git-branch': 'aishell2', 'icefall-git-sha1': 'dc40220-dirty', 'icefall-git-date': 'Fri Jul 8 03:13:09 2022', 'icefall-path': '/workspace/icefall', 'k2-path': '/usr/lib/python3.8/site-packages/k2-1.17.dev20220707+cuda11.3.torch1.11.0-py3.8-linux-x86_64.egg/k2/__init__.py', 'lhotse-path': '/usr/local/lib/python3.8/dist-packages/lhotse/__init__.py', 'hostname': '3125630', 'IP address': '0.47.177.126'}, 'epoch': 999, 'iter': 0, 'avg': 1, 'use_averaged_model': False, 'exp_dir': PosixPath('/result/context_size_2'), 'lang_dir': PosixPath('data/lang_char'), 'decoding_method': 'fast_beam_search_nbest_oracle', 'beam_size': 4, 'beam': 20.0, 'ngram_lm_scale': 0.01, 'max_contexts': 8, 'max_states': 64, 'context_size': 2, 'max_sym_per_frame': 1, 'num_paths': 200, 'nbest_scale': 0.5, 'num_encoder_layers': 24, 'dim_feedforward': 1536, 'nhead': 8, 'encoder_dim': 384, 'decoder_dim': 512, 'joiner_dim': 512, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 600, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures', 'res_dir': PosixPath('/result/context_size_2/fast_beam_search_nbest_oracle'), 'suffix': 'epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200', 'blank_id': 0, 'unk_id': 2, 'vocab_size': 5237}
+ 2022-07-13 12:58:11,286 INFO [decode.py:656] About to create model
+ 2022-07-13 12:58:11,994 INFO [checkpoint.py:112] Loading checkpoint from /result/context_size_2/epoch-999.pt
+ 2022-07-13 12:58:18,479 INFO [decode.py:745] Number of model parameters: 96910451
+ 2022-07-13 12:58:18,480 INFO [asr_datamodule.py:408] About to gen cuts from aishell2_cuts_dev.jsonl.gz
+ 2022-07-13 12:58:18,483 INFO [asr_datamodule.py:415] About to gen cuts from aishell2_cuts_test.jsonl.gz
+ 2022-07-13 12:58:18,485 INFO [asr_datamodule.py:347] About to create dev dataset
+ 2022-07-13 12:58:18,690 INFO [asr_datamodule.py:366] About to create dev dataloader
exp/fast_beam_search_nbest_oracle/log-decode-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200-2022-07-13-13-06-03 ADDED
@@ -0,0 +1,29 @@
+ 2022-07-13 13:06:03,230 INFO [decode.py:637] Decoding started
+ 2022-07-13 13:06:03,231 INFO [decode.py:643] Device: cuda:0
+ 2022-07-13 13:06:03,845 INFO [lexicon.py:176] Loading pre-compiled data/lang_char/Linv.pt
+ 2022-07-13 13:06:03,918 INFO [decode.py:655] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 50, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'model_warm_step': 3000, 'env_info': {'k2-version': '1.17', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'f910452c594ac57b9a6117ad9b353555f95d45d8', 'k2-git-date': 'Mon Jul 4 14:26:28 2022', 'lhotse-version': '1.5.0.dev+git.99bbfcb.clean', 'torch-version': '1.11.0+cu113', 'torch-cuda-available': True, 'torch-cuda-version': '11.3', 'python-version': '3.8', 'icefall-git-branch': 'aishell2', 'icefall-git-sha1': 'dc40220-dirty', 'icefall-git-date': 'Fri Jul 8 03:13:09 2022', 'icefall-path': '/workspace/icefall', 'k2-path': '/usr/lib/python3.8/site-packages/k2-1.17.dev20220707+cuda11.3.torch1.11.0-py3.8-linux-x86_64.egg/k2/__init__.py', 'lhotse-path': '/usr/local/lib/python3.8/dist-packages/lhotse/__init__.py', 'hostname': '3125630', 'IP address': '0.47.177.126'}, 'epoch': 999, 'iter': 0, 'avg': 1, 'use_averaged_model': False, 'exp_dir': PosixPath('/result/context_size_2'), 'lang_dir': PosixPath('data/lang_char'), 'decoding_method': 'fast_beam_search_nbest_oracle', 'beam_size': 4, 'beam': 20.0, 'ngram_lm_scale': 0.01, 'max_contexts': 8, 'max_states': 64, 'context_size': 2, 'max_sym_per_frame': 1, 'num_paths': 200, 'nbest_scale': 0.5, 'num_encoder_layers': 24, 'dim_feedforward': 1536, 'nhead': 8, 'encoder_dim': 384, 'decoder_dim': 512, 'joiner_dim': 512, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 600, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures', 'res_dir': PosixPath('/result/context_size_2/fast_beam_search_nbest_oracle'), 'suffix': 'epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200', 'blank_id': 0, 'unk_id': 2, 'vocab_size': 5237}
+ 2022-07-13 13:06:03,919 INFO [decode.py:657] About to create model
+ 2022-07-13 13:06:04,627 INFO [checkpoint.py:112] Loading checkpoint from /result/context_size_2/epoch-999.pt
+ 2022-07-13 13:06:11,068 INFO [decode.py:746] Number of model parameters: 96910451
+ 2022-07-13 13:06:11,068 INFO [asr_datamodule.py:408] About to gen cuts from aishell2_cuts_dev.jsonl.gz
+ 2022-07-13 13:06:11,072 INFO [asr_datamodule.py:415] About to gen cuts from aishell2_cuts_test.jsonl.gz
+ 2022-07-13 13:06:11,073 INFO [asr_datamodule.py:347] About to create dev dataset
+ 2022-07-13 13:06:11,272 INFO [asr_datamodule.py:366] About to create dev dataloader
+ 2022-07-13 13:06:13,829 INFO [decode.py:540] batch 0/?, cuts processed until now is 162
+ 2022-07-13 13:06:31,421 INFO [decode.py:557] The transcripts are stored in /result/context_size_2/fast_beam_search_nbest_oracle/recogs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt
+ 2022-07-13 13:06:31,477 INFO [utils.py:420] [dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5] %WER 2.04% [506 / 24802, 27 ins, 39 del, 440 sub ]
+ 2022-07-13 13:06:31,633 INFO [decode.py:570] Wrote detailed error stats to /result/context_size_2/fast_beam_search_nbest_oracle/errs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt
+ 2022-07-13 13:06:32,087 INFO [decode.py:587]
+ For dev, WER of different settings are:
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5 2.04 best for dev
+
+ 2022-07-13 13:06:34,325 INFO [decode.py:540] batch 0/?, cuts processed until now is 167
+ 2022-07-13 13:06:59,317 INFO [decode.py:540] batch 20/?, cuts processed until now is 3914
+ 2022-07-13 13:07:07,346 INFO [decode.py:557] The transcripts are stored in /result/context_size_2/fast_beam_search_nbest_oracle/recogs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt
+ 2022-07-13 13:07:07,457 INFO [utils.py:420] [test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5] %WER 2.20% [1088 / 49534, 80 ins, 74 del, 934 sub ]
+ 2022-07-13 13:07:07,753 INFO [decode.py:570] Wrote detailed error stats to /result/context_size_2/fast_beam_search_nbest_oracle/errs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt
+ 2022-07-13 13:07:07,754 INFO [decode.py:587]
+ For test, WER of different settings are:
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5 2.2 best for test
+
+ 2022-07-13 13:07:07,754 INFO [decode.py:776] Done!
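
The oracle numbers above (2.04% dev, 2.20% test) do not score a single decoded hypothesis: for each utterance, the 200 paths extracted from the fast_beam_search lattice are compared against the reference, and the path with the fewest edits is kept, so these figures lower-bound what rescoring the same n-best lists could achieve. A minimal sketch of that selection, assuming hypotheses and references are plain character sequences (the actual icefall code does this on k2 lattices):

```python
# Hedged sketch of the oracle selection: keep the n-best hypothesis with the
# minimum edit distance to the reference transcript.
from typing import List

def edit_distance(hyp: List[str], ref: List[str]) -> int:
    # Standard Levenshtein distance over character sequences.
    prev = list(range(len(ref) + 1))
    for i, h in enumerate(hyp, 1):
        cur = [i]
        for j, r in enumerate(ref, 1):
            cur.append(min(prev[j] + 1,              # extra token in the hypothesis
                           cur[j - 1] + 1,           # token missing from the hypothesis
                           prev[j - 1] + (h != r)))  # substitution (free if equal)
        prev = cur
    return prev[-1]

def oracle_hyp(nbest: List[List[str]], ref: List[str]) -> List[str]:
    return min(nbest, key=lambda hyp: edit_distance(hyp, ref))

# Toy example: the second hypothesis matches the reference exactly (0 edits vs. 2).
nbest = [list("龟鱼酒店在那"), list("龟鱼酒店在哪里")]
ref = list("龟鱼酒店在哪里")
print("".join(oracle_hyp(nbest, ref)))  # prints 龟鱼酒店在哪里
```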
exp/fast_beam_search_nbest_oracle/recogs-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt ADDED
The diff for this file is too large to render. See raw diff
 
exp/fast_beam_search_nbest_oracle/recogs-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt ADDED
The diff for this file is too large to render. See raw diff
 
exp/fast_beam_search_nbest_oracle/wer-summary-dev-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt ADDED
@@ -0,0 +1,2 @@
+ settings WER
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5 2.04
exp/fast_beam_search_nbest_oracle/wer-summary-test-beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5-epoch-999-avg-1-beam-20.0-max-contexts-8-max-states-64-nbest-scale-0.5-num-paths-200.txt ADDED
@@ -0,0 +1,2 @@
+ settings WER
+ beam_20.0_max_contexts_8_max_states_64_num_paths_200_nbest_scale_0.5 2.2