QuandongWang committed
Commit
7708e74
1 Parent(s): 109efbc

upload the decoding log

log/log-1best/log-decode-2022-07-20-21-02-14 ADDED
@@ -0,0 +1,27 @@
+ 2022-07-20 21:02:14,517 INFO [decode.py:653] Decoding started
+ 2022-07-20 21:02:14,517 INFO [decode.py:654] {'subsampling_factor': 4, 'feature_dim': 80, 'nhead': 8, 'dim_feedforward': 2048, 'encoder_dim': 512, 'num_encoder_layers': 12, 'search_beam': 20, 'output_beam': 8, 'min_active_states': 30, 'max_active_states': 10000, 'use_double_scores': True, 'env_info': {'k2-version': '1.15.1', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '350e449fe5d7b4231f77eb0add764782eed9f5d2', 'k2-git-date': 'Thu May 26 22:56:24 2022', 'lhotse-version': '1.3.0.dev+git.232f3eb.dirty', 'torch-version': '1.10.0+cu102', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'conformer_ctc2', 'icefall-git-sha1': 'd812f82-dirty', 'icefall-git-date': 'Wed Jul 20 19:47:36 2022', 'icefall-path': '/home/storage23/wangquandong/experiment/open_source/icefall', 'k2-path': '/home/storage23/wangquandong/tools/k2/k2/python/k2/__init__.py', 'lhotse-path': '/home/storage23/wangquandong/tools/lhotse/lhotse/__init__.py', 'hostname': 'tj1-asr-train-v100-30.kscn', 'IP address': '10.38.22.183'}, 'epoch': 30, 'iter': 0, 'avg': 8, 'method': '1best', 'use_averaged_model': True, 'num_decoder_layers': 6, 'num_paths': 100, 'nbest_scale': 0.5, 'exp_dir': PosixPath('pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp'), 'lang_dir': PosixPath('data/lang_bpe_500'), 'lm_dir': PosixPath('data/lm'), 'full_libri': True, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 200, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures'}
+ 2022-07-20 21:02:14,924 INFO [lexicon.py:176] Loading pre-compiled data/lang_bpe_500/Linv.pt
+ 2022-07-20 21:02:14,985 INFO [decode.py:664] device: cuda:0
+ 2022-07-20 21:02:21,902 INFO [decode.py:821] Calculating the averaged model over epoch range from 22 (excluded) to 30
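
A note on the line above: with --epoch 30 --avg 8 --use-averaged-model, the decoding weights are an average over the eight epochs in (22, 30]. A minimal sketch of the idea, assuming each checkpoint stores a cumulative running mean of the weights under a key like "model_avg" (the key name and the epoch-based weighting are simplifications; icefall weights by update count in practice):

```python
# Sketch only: recover the average of epochs (n_start, n_end] from just two
# checkpoints, given cumulative running means of the per-tensor weights.
def average_over_range(cum_avg_start, cum_avg_end, n_start, n_end):
    """Mean of the weights over epochs n_start+1 .. n_end."""
    return {
        name: (cum_avg_end[name] * n_end - cum_avg_start[name] * n_start)
              / (n_end - n_start)
        for name in cum_avg_end
    }

# With --epoch 30 --avg 8: n_start = 30 - 8 = 22, so the result averages
# epochs 23 through 30, matching "from 22 (excluded) to 30" in the log:
# averaged = average_over_range(ckpt22["model_avg"], ckpt30["model_avg"], 22, 30)
```
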
+ 2022-07-20 21:02:24,825 INFO [decode.py:837] Number of model parameters: 103071035
+ 2022-07-20 21:02:24,825 INFO [asr_datamodule.py:444] About to get test-clean cuts
+ 2022-07-20 21:02:24,840 INFO [asr_datamodule.py:451] About to get test-other cuts
+ 2022-07-20 21:02:26,027 INFO [decode.py:588] batch 0/?, cuts processed until now is 15
+ 2022-07-20 21:02:52,011 INFO [decode.py:588] batch 100/?, cuts processed until now is 2578
+ 2022-07-20 21:02:52,790 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-no_rescore.txt
+ 2022-07-20 21:02:52,871 INFO [utils.py:416] [test-clean-no_rescore] %WER 2.93% [1542 / 52576, 148 ins, 217 del, 1177 sub ]
+ 2022-07-20 21:02:53,105 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-no_rescore.txt
+ 2022-07-20 21:02:53,106 INFO [decode.py:637]
+ For test-clean, WER of different settings are:
+ no_rescore 2.93 best for test-clean
+
+ 2022-07-20 21:02:54,150 INFO [decode.py:588] batch 0/?, cuts processed until now is 19
+ 2022-07-20 21:03:19,172 INFO [decode.py:588] batch 100/?, cuts processed until now is 2888
+ 2022-07-20 21:03:19,539 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-no_rescore.txt
+ 2022-07-20 21:03:19,620 INFO [utils.py:416] [test-other-no_rescore] %WER 6.37% [3332 / 52343, 258 ins, 592 del, 2482 sub ]
+ 2022-07-20 21:03:19,850 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-no_rescore.txt
+ 2022-07-20 21:03:19,851 INFO [decode.py:637]
+ For test-other, WER of different settings are:
+ no_rescore 6.37 best for test-other
+
+ 2022-07-20 21:03:19,851 INFO [decode.py:868] Done!
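
For reference, the "%WER x% [err / N, ins, del, sub]" lines above come from a minimum-edit-distance alignment of each hypothesis against its reference transcript. A self-contained sketch of that computation (not icefall's actual utils.py, which also writes per-word alignment details to the errs-* files):

```python
from typing import List, Tuple

def wer_counts(ref: List[str], hyp: List[str]) -> Tuple[int, int, int]:
    """Return (insertions, deletions, substitutions) of a minimum-error alignment."""
    # dp[i][j] = (total_errors, ins, dels, subs) aligning ref[:i] to hyp[:j]
    dp = [[(0, 0, 0, 0)] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(1, len(ref) + 1):
        dp[i][0] = (i, 0, i, 0)  # i unmatched reference words = i deletions
    for j in range(1, len(hyp) + 1):
        dp[0][j] = (j, j, 0, 0)  # j unmatched hypothesis words = j insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            if ref[i - 1] == hyp[j - 1]:
                dp[i][j] = dp[i - 1][j - 1]  # exact match, no new error
                continue
            dele, ins, sub = dp[i - 1][j], dp[i][j - 1], dp[i - 1][j - 1]
            if dele[0] <= ins[0] and dele[0] <= sub[0]:
                dp[i][j] = (dele[0] + 1, dele[1], dele[2] + 1, dele[3])
            elif ins[0] <= sub[0]:
                dp[i][j] = (ins[0] + 1, ins[1] + 1, ins[2], ins[3])
            else:
                dp[i][j] = (sub[0] + 1, sub[1], sub[2], sub[3] + 1)
    return dp[-1][-1][1:]
```

Summing these counts over all test-clean utterances reproduces the line above: (148 + 217 + 1177) / 52576 = 1542 / 52576, i.e. %WER 2.93.
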
log/log-attention-decoder/log-decode-2022-07-20-21-02-14 ADDED
@@ -0,0 +1,1199 @@
+ 2022-07-20 21:02:14,406 INFO [decode.py:653] Decoding started
+ 2022-07-20 21:02:14,406 INFO [decode.py:654] {'subsampling_factor': 4, 'feature_dim': 80, 'nhead': 8, 'dim_feedforward': 2048, 'encoder_dim': 512, 'num_encoder_layers': 12, 'search_beam': 20, 'output_beam': 8, 'min_active_states': 30, 'max_active_states': 10000, 'use_double_scores': True, 'env_info': {'k2-version': '1.15.1', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '350e449fe5d7b4231f77eb0add764782eed9f5d2', 'k2-git-date': 'Thu May 26 22:56:24 2022', 'lhotse-version': '1.3.0.dev+git.232f3eb.dirty', 'torch-version': '1.10.0+cu102', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'conformer_ctc2', 'icefall-git-sha1': 'd812f82-dirty', 'icefall-git-date': 'Wed Jul 20 19:47:36 2022', 'icefall-path': '/home/storage23/wangquandong/experiment/open_source/icefall', 'k2-path': '/home/storage23/wangquandong/tools/k2/k2/python/k2/__init__.py', 'lhotse-path': '/home/storage23/wangquandong/tools/lhotse/lhotse/__init__.py', 'hostname': 'tj1-asr-train-v100-30.kscn', 'IP address': '10.38.22.183'}, 'epoch': 30, 'iter': 0, 'avg': 8, 'method': 'attention-decoder', 'use_averaged_model': True, 'num_decoder_layers': 6, 'num_paths': 100, 'nbest_scale': 0.5, 'exp_dir': PosixPath('pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp'), 'lang_dir': PosixPath('data/lang_bpe_500'), 'lm_dir': PosixPath('data/lm'), 'full_libri': True, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 20, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures'}
+ 2022-07-20 21:02:14,821 INFO [lexicon.py:176] Loading pre-compiled data/lang_bpe_500/Linv.pt
+ 2022-07-20 21:02:14,886 INFO [decode.py:664] device: cuda:0
+ 2022-07-20 21:02:21,301 INFO [decode.py:730] Loading pre-compiled G_4_gram.pt
+ 2022-07-20 21:02:26,829 INFO [decode.py:821] Calculating the averaged model over epoch range from 22 (excluded) to 30
+ 2022-07-20 21:02:29,635 INFO [decode.py:837] Number of model parameters: 103071035
+ 2022-07-20 21:02:29,635 INFO [asr_datamodule.py:444] About to get test-clean cuts
+ 2022-07-20 21:02:29,639 INFO [asr_datamodule.py:451] About to get test-other cuts
+ 2022-07-20 21:02:31,600 INFO [decode.py:588] batch 0/?, cuts processed until now is 1
+ 2022-07-20 21:04:08,163 INFO [decode.py:588] batch 100/?, cuts processed until now is 212
+ 2022-07-20 21:05:41,871 INFO [decode.py:588] batch 200/?, cuts processed until now is 459
+ 2022-07-20 21:07:13,153 INFO [decode.py:588] batch 300/?, cuts processed until now is 717
+ 2022-07-20 21:08:43,103 INFO [decode.py:588] batch 400/?, cuts processed until now is 949
+ 2022-07-20 21:10:19,640 INFO [decode.py:588] batch 500/?, cuts processed until now is 1163
+ 2022-07-20 21:11:51,100 INFO [decode.py:588] batch 600/?, cuts processed until now is 1411
+ 2022-07-20 21:13:20,028 INFO [decode.py:588] batch 700/?, cuts processed until now is 1652
+ 2022-07-20 21:14:53,913 INFO [decode.py:588] batch 800/?, cuts processed until now is 1867
+ 2022-07-20 21:16:20,862 INFO [decode.py:588] batch 900/?, cuts processed until now is 2122
+ 2022-07-20 21:17:51,185 INFO [decode.py:588] batch 1000/?, cuts processed until now is 2321
+ 2022-07-20 21:19:26,869 INFO [decode.py:588] batch 1100/?, cuts processed until now is 2468
+ 2022-07-20 21:20:57,079 INFO [decode.py:588] batch 1200/?, cuts processed until now is 2601
+ 2022-07-20 21:24:52,558 INFO [decode.py:637]
+ For test-clean, WER of different settings are:
+ ngram_lm_scale_0.6_attention_scale_0.5 2.59 best for test-clean
+ ngram_lm_scale_0.7_attention_scale_1.1 2.59
+ ngram_lm_scale_0.7_attention_scale_1.2 2.59
+ ngram_lm_scale_0.7_attention_scale_1.3 2.59
+ ngram_lm_scale_0.9_attention_scale_0.5 2.59
+ ngram_lm_scale_0.9_attention_scale_0.6 2.59
+ ngram_lm_scale_0.9_attention_scale_1.3 2.59
+ ngram_lm_scale_1.0_attention_scale_0.6 2.59
+ ngram_lm_scale_1.0_attention_scale_0.7 2.59
+ ngram_lm_scale_1.0_attention_scale_1.5 2.59
+ ngram_lm_scale_1.1_attention_scale_0.6 2.59
+ ngram_lm_scale_1.1_attention_scale_0.9 2.59
+ ngram_lm_scale_1.2_attention_scale_1.1 2.59
+ ngram_lm_scale_1.5_attention_scale_1.5 2.59
+ ngram_lm_scale_0.6_attention_scale_0.9 2.6
+ ngram_lm_scale_0.6_attention_scale_1.0 2.6
+ ngram_lm_scale_0.6_attention_scale_1.1 2.6
+ ngram_lm_scale_0.6_attention_scale_1.2 2.6
+ ngram_lm_scale_0.6_attention_scale_1.3 2.6
+ ngram_lm_scale_0.6_attention_scale_1.9 2.6
+ ngram_lm_scale_0.6_attention_scale_2.0 2.6
+ ngram_lm_scale_0.7_attention_scale_0.5 2.6
+ ngram_lm_scale_0.7_attention_scale_0.6 2.6
+ ngram_lm_scale_0.7_attention_scale_0.7 2.6
+ ngram_lm_scale_0.7_attention_scale_0.9 2.6
+ ngram_lm_scale_0.7_attention_scale_1.0 2.6
+ ngram_lm_scale_0.7_attention_scale_2.3 2.6
+ ngram_lm_scale_0.9_attention_scale_0.7 2.6
+ ngram_lm_scale_0.9_attention_scale_0.9 2.6
+ ngram_lm_scale_0.9_attention_scale_1.0 2.6
+ ngram_lm_scale_0.9_attention_scale_1.1 2.6
+ ngram_lm_scale_0.9_attention_scale_1.2 2.6
+ ngram_lm_scale_0.9_attention_scale_1.5 2.6
+ ngram_lm_scale_0.9_attention_scale_1.7 2.6
+ ngram_lm_scale_0.9_attention_scale_1.9 2.6
+ ngram_lm_scale_1.0_attention_scale_0.9 2.6
+ ngram_lm_scale_1.0_attention_scale_1.0 2.6
+ ngram_lm_scale_1.0_attention_scale_1.1 2.6
+ ngram_lm_scale_1.0_attention_scale_1.7 2.6
+ ngram_lm_scale_1.1_attention_scale_0.7 2.6
+ ngram_lm_scale_1.1_attention_scale_1.0 2.6
+ ngram_lm_scale_1.1_attention_scale_1.7 2.6
+ ngram_lm_scale_1.1_attention_scale_1.9 2.6
+ ngram_lm_scale_1.1_attention_scale_2.0 2.6
+ ngram_lm_scale_1.1_attention_scale_2.1 2.6
+ ngram_lm_scale_1.1_attention_scale_2.2 2.6
+ ngram_lm_scale_1.2_attention_scale_0.9 2.6
+ ngram_lm_scale_1.2_attention_scale_1.0 2.6
+ ngram_lm_scale_1.2_attention_scale_2.2 2.6
+ ngram_lm_scale_1.2_attention_scale_2.3 2.6
+ ngram_lm_scale_1.3_attention_scale_1.0 2.6
+ ngram_lm_scale_1.3_attention_scale_1.1 2.6
+ ngram_lm_scale_1.3_attention_scale_1.2 2.6
+ ngram_lm_scale_1.3_attention_scale_1.3 2.6
+ ngram_lm_scale_1.5_attention_scale_1.7 2.6
+ ngram_lm_scale_1.5_attention_scale_2.1 2.6
+ ngram_lm_scale_1.7_attention_scale_1.9 2.6
+ ngram_lm_scale_0.5_attention_scale_1.2 2.61
+ ngram_lm_scale_0.5_attention_scale_1.3 2.61
+ ngram_lm_scale_0.5_attention_scale_1.5 2.61
+ ngram_lm_scale_0.5_attention_scale_1.7 2.61
+ ngram_lm_scale_0.5_attention_scale_1.9 2.61
+ ngram_lm_scale_0.5_attention_scale_2.0 2.61
+ ngram_lm_scale_0.5_attention_scale_2.2 2.61
+ ngram_lm_scale_0.6_attention_scale_0.6 2.61
+ ngram_lm_scale_0.6_attention_scale_0.7 2.61
+ ngram_lm_scale_0.6_attention_scale_1.5 2.61
+ ngram_lm_scale_0.6_attention_scale_1.7 2.61
+ ngram_lm_scale_0.6_attention_scale_2.1 2.61
+ ngram_lm_scale_0.6_attention_scale_2.2 2.61
+ ngram_lm_scale_0.6_attention_scale_2.3 2.61
+ ngram_lm_scale_0.6_attention_scale_2.5 2.61
+ ngram_lm_scale_0.7_attention_scale_0.3 2.61
+ ngram_lm_scale_0.7_attention_scale_1.5 2.61
+ ngram_lm_scale_0.7_attention_scale_1.7 2.61
+ ngram_lm_scale_0.7_attention_scale_1.9 2.61
+ ngram_lm_scale_0.7_attention_scale_2.2 2.61
+ ngram_lm_scale_0.7_attention_scale_2.5 2.61
+ ngram_lm_scale_0.9_attention_scale_2.0 2.61
+ ngram_lm_scale_0.9_attention_scale_2.1 2.61
+ ngram_lm_scale_0.9_attention_scale_2.2 2.61
+ ngram_lm_scale_0.9_attention_scale_2.3 2.61
+ ngram_lm_scale_0.9_attention_scale_2.5 2.61
+ ngram_lm_scale_0.9_attention_scale_3.0 2.61
+ ngram_lm_scale_1.0_attention_scale_0.5 2.61
+ ngram_lm_scale_1.0_attention_scale_1.2 2.61
+ ngram_lm_scale_1.0_attention_scale_1.3 2.61
+ ngram_lm_scale_1.0_attention_scale_1.9 2.61
+ ngram_lm_scale_1.0_attention_scale_2.2 2.61
+ ngram_lm_scale_1.0_attention_scale_2.3 2.61
+ ngram_lm_scale_1.0_attention_scale_2.5 2.61
+ ngram_lm_scale_1.0_attention_scale_3.0 2.61
+ ngram_lm_scale_1.1_attention_scale_1.1 2.61
+ ngram_lm_scale_1.1_attention_scale_1.2 2.61
+ ngram_lm_scale_1.1_attention_scale_1.3 2.61
+ ngram_lm_scale_1.1_attention_scale_1.5 2.61
+ ngram_lm_scale_1.1_attention_scale_2.3 2.61
+ ngram_lm_scale_1.2_attention_scale_0.7 2.61
+ ngram_lm_scale_1.2_attention_scale_1.2 2.61
+ ngram_lm_scale_1.2_attention_scale_1.3 2.61
+ ngram_lm_scale_1.2_attention_scale_1.7 2.61
+ ngram_lm_scale_1.2_attention_scale_1.9 2.61
+ ngram_lm_scale_1.2_attention_scale_2.0 2.61
+ ngram_lm_scale_1.2_attention_scale_2.1 2.61
+ ngram_lm_scale_1.2_attention_scale_2.5 2.61
+ ngram_lm_scale_1.2_attention_scale_5.0 2.61
+ ngram_lm_scale_1.3_attention_scale_0.9 2.61
+ ngram_lm_scale_1.3_attention_scale_1.5 2.61
+ ngram_lm_scale_1.3_attention_scale_1.7 2.61
+ ngram_lm_scale_1.3_attention_scale_1.9 2.61
+ ngram_lm_scale_1.3_attention_scale_2.2 2.61
+ ngram_lm_scale_1.3_attention_scale_2.3 2.61
+ ngram_lm_scale_1.3_attention_scale_2.5 2.61
+ ngram_lm_scale_1.3_attention_scale_3.0 2.61
+ ngram_lm_scale_1.3_attention_scale_5.0 2.61
+ ngram_lm_scale_1.5_attention_scale_1.3 2.61
+ ngram_lm_scale_1.5_attention_scale_1.9 2.61
+ ngram_lm_scale_1.5_attention_scale_2.0 2.61
+ ngram_lm_scale_1.5_attention_scale_2.2 2.61
+ ngram_lm_scale_1.5_attention_scale_2.3 2.61
+ ngram_lm_scale_1.5_attention_scale_2.5 2.61
+ ngram_lm_scale_1.5_attention_scale_3.0 2.61
+ ngram_lm_scale_1.5_attention_scale_5.0 2.61
+ ngram_lm_scale_1.7_attention_scale_2.0 2.61
+ ngram_lm_scale_1.7_attention_scale_2.1 2.61
+ ngram_lm_scale_1.7_attention_scale_2.2 2.61
+ ngram_lm_scale_1.7_attention_scale_2.3 2.61
+ ngram_lm_scale_1.7_attention_scale_2.5 2.61
+ ngram_lm_scale_1.7_attention_scale_5.0 2.61
+ ngram_lm_scale_2.0_attention_scale_2.5 2.61
+ ngram_lm_scale_2.3_attention_scale_4.0 2.61
+ ngram_lm_scale_2.5_attention_scale_5.0 2.61
+ ngram_lm_scale_0.5_attention_scale_0.3 2.62
+ ngram_lm_scale_0.5_attention_scale_0.6 2.62
+ ngram_lm_scale_0.5_attention_scale_0.7 2.62
+ ngram_lm_scale_0.5_attention_scale_0.9 2.62
+ ngram_lm_scale_0.5_attention_scale_1.0 2.62
+ ngram_lm_scale_0.5_attention_scale_1.1 2.62
+ ngram_lm_scale_0.5_attention_scale_2.1 2.62
+ ngram_lm_scale_0.5_attention_scale_2.3 2.62
+ ngram_lm_scale_0.5_attention_scale_2.5 2.62
+ ngram_lm_scale_0.5_attention_scale_3.0 2.62
+ ngram_lm_scale_0.6_attention_scale_0.3 2.62
+ ngram_lm_scale_0.6_attention_scale_3.0 2.62
+ ngram_lm_scale_0.7_attention_scale_2.0 2.62
+ ngram_lm_scale_0.7_attention_scale_2.1 2.62
+ ngram_lm_scale_0.7_attention_scale_3.0 2.62
+ ngram_lm_scale_0.9_attention_scale_4.0 2.62
+ ngram_lm_scale_1.0_attention_scale_2.0 2.62
+ ngram_lm_scale_1.0_attention_scale_2.1 2.62
+ ngram_lm_scale_1.0_attention_scale_4.0 2.62
+ ngram_lm_scale_1.0_attention_scale_5.0 2.62
+ ngram_lm_scale_1.1_attention_scale_0.5 2.62
+ ngram_lm_scale_1.1_attention_scale_2.5 2.62
+ ngram_lm_scale_1.1_attention_scale_3.0 2.62
+ ngram_lm_scale_1.1_attention_scale_4.0 2.62
+ ngram_lm_scale_1.1_attention_scale_5.0 2.62
+ ngram_lm_scale_1.2_attention_scale_1.5 2.62
+ ngram_lm_scale_1.2_attention_scale_3.0 2.62
+ ngram_lm_scale_1.2_attention_scale_4.0 2.62
+ ngram_lm_scale_1.3_attention_scale_2.0 2.62
+ ngram_lm_scale_1.3_attention_scale_2.1 2.62
+ ngram_lm_scale_1.3_attention_scale_4.0 2.62
+ ngram_lm_scale_1.5_attention_scale_4.0 2.62
+ ngram_lm_scale_1.7_attention_scale_1.7 2.62
+ ngram_lm_scale_1.7_attention_scale_3.0 2.62
+ ngram_lm_scale_1.7_attention_scale_4.0 2.62
+ ngram_lm_scale_1.9_attention_scale_1.9 2.62
+ ngram_lm_scale_1.9_attention_scale_2.0 2.62
+ ngram_lm_scale_1.9_attention_scale_2.1 2.62
+ ngram_lm_scale_1.9_attention_scale_2.2 2.62
+ ngram_lm_scale_1.9_attention_scale_2.3 2.62
+ ngram_lm_scale_1.9_attention_scale_2.5 2.62
+ ngram_lm_scale_1.9_attention_scale_3.0 2.62
+ ngram_lm_scale_1.9_attention_scale_4.0 2.62
+ ngram_lm_scale_1.9_attention_scale_5.0 2.62
+ ngram_lm_scale_2.0_attention_scale_2.0 2.62
+ ngram_lm_scale_2.0_attention_scale_2.1 2.62
+ ngram_lm_scale_2.0_attention_scale_2.2 2.62
+ ngram_lm_scale_2.0_attention_scale_2.3 2.62
+ ngram_lm_scale_2.0_attention_scale_3.0 2.62
+ ngram_lm_scale_2.0_attention_scale_4.0 2.62
+ ngram_lm_scale_2.0_attention_scale_5.0 2.62
+ ngram_lm_scale_2.1_attention_scale_4.0 2.62
+ ngram_lm_scale_2.1_attention_scale_5.0 2.62
+ ngram_lm_scale_2.2_attention_scale_3.0 2.62
+ ngram_lm_scale_2.2_attention_scale_4.0 2.62
+ ngram_lm_scale_2.2_attention_scale_5.0 2.62
+ ngram_lm_scale_2.3_attention_scale_3.0 2.62
+ ngram_lm_scale_2.3_attention_scale_5.0 2.62
+ ngram_lm_scale_0.3_attention_scale_1.5 2.63
+ ngram_lm_scale_0.5_attention_scale_0.5 2.63
+ ngram_lm_scale_0.6_attention_scale_4.0 2.63
+ ngram_lm_scale_0.7_attention_scale_4.0 2.63
+ ngram_lm_scale_0.7_attention_scale_5.0 2.63
+ ngram_lm_scale_0.9_attention_scale_5.0 2.63
+ ngram_lm_scale_1.2_attention_scale_0.6 2.63
+ ngram_lm_scale_1.5_attention_scale_1.2 2.63
+ ngram_lm_scale_1.7_attention_scale_1.5 2.63
+ ngram_lm_scale_2.0_attention_scale_1.9 2.63
+ ngram_lm_scale_2.1_attention_scale_2.1 2.63
+ ngram_lm_scale_2.1_attention_scale_2.2 2.63
+ ngram_lm_scale_2.1_attention_scale_2.3 2.63
+ ngram_lm_scale_2.1_attention_scale_2.5 2.63
+ ngram_lm_scale_2.1_attention_scale_3.0 2.63
+ ngram_lm_scale_2.2_attention_scale_2.3 2.63
+ ngram_lm_scale_2.3_attention_scale_2.5 2.63
+ ngram_lm_scale_3.0_attention_scale_4.0 2.63
+ ngram_lm_scale_3.0_attention_scale_5.0 2.63
+ ngram_lm_scale_0.3_attention_scale_0.5 2.64
+ ngram_lm_scale_0.3_attention_scale_0.6 2.64
+ ngram_lm_scale_0.3_attention_scale_0.9 2.64
+ ngram_lm_scale_0.3_attention_scale_1.2 2.64
+ ngram_lm_scale_0.3_attention_scale_1.3 2.64
+ ngram_lm_scale_0.3_attention_scale_1.7 2.64
+ ngram_lm_scale_0.3_attention_scale_1.9 2.64
+ ngram_lm_scale_0.5_attention_scale_0.08 2.64
+ ngram_lm_scale_0.5_attention_scale_0.1 2.64
+ ngram_lm_scale_0.5_attention_scale_4.0 2.64
+ ngram_lm_scale_0.9_attention_scale_0.3 2.64
+ ngram_lm_scale_1.2_attention_scale_0.5 2.64
+ ngram_lm_scale_1.3_attention_scale_0.7 2.64
+ ngram_lm_scale_1.5_attention_scale_1.1 2.64
+ ngram_lm_scale_1.9_attention_scale_1.7 2.64
+ ngram_lm_scale_2.2_attention_scale_2.2 2.64
+ ngram_lm_scale_2.2_attention_scale_2.5 2.64
+ ngram_lm_scale_2.3_attention_scale_2.3 2.64
+ ngram_lm_scale_2.5_attention_scale_2.5 2.64
+ ngram_lm_scale_2.5_attention_scale_3.0 2.64
+ ngram_lm_scale_2.5_attention_scale_4.0 2.64
+ ngram_lm_scale_0.3_attention_scale_0.3 2.65
+ ngram_lm_scale_0.3_attention_scale_0.7 2.65
+ ngram_lm_scale_0.3_attention_scale_1.0 2.65
+ ngram_lm_scale_0.3_attention_scale_1.1 2.65
+ ngram_lm_scale_0.3_attention_scale_2.0 2.65
+ ngram_lm_scale_0.3_attention_scale_2.1 2.65
+ ngram_lm_scale_0.3_attention_scale_2.2 2.65
+ ngram_lm_scale_0.3_attention_scale_2.3 2.65
+ ngram_lm_scale_0.3_attention_scale_2.5 2.65
+ ngram_lm_scale_0.3_attention_scale_3.0 2.65
+ ngram_lm_scale_0.5_attention_scale_0.05 2.65
+ ngram_lm_scale_0.6_attention_scale_0.05 2.65
+ ngram_lm_scale_0.6_attention_scale_0.08 2.65
+ ngram_lm_scale_0.6_attention_scale_0.1 2.65
+ ngram_lm_scale_0.6_attention_scale_5.0 2.65
+ ngram_lm_scale_1.0_attention_scale_0.3 2.65
+ ngram_lm_scale_1.5_attention_scale_1.0 2.65
+ ngram_lm_scale_1.9_attention_scale_1.5 2.65
+ ngram_lm_scale_2.0_attention_scale_1.7 2.65
+ ngram_lm_scale_2.1_attention_scale_2.0 2.65
+ ngram_lm_scale_2.2_attention_scale_2.0 2.65
+ ngram_lm_scale_2.2_attention_scale_2.1 2.65
+ ngram_lm_scale_2.3_attention_scale_2.2 2.65
+ ngram_lm_scale_4.0_attention_scale_5.0 2.65
+ ngram_lm_scale_0.08_attention_scale_1.0 2.66
+ ngram_lm_scale_0.1_attention_scale_1.0 2.66
+ ngram_lm_scale_0.1_attention_scale_1.1 2.66
+ ngram_lm_scale_0.3_attention_scale_4.0 2.66
+ ngram_lm_scale_0.5_attention_scale_0.01 2.66
+ ngram_lm_scale_0.5_attention_scale_5.0 2.66
+ ngram_lm_scale_0.7_attention_scale_0.05 2.66
+ ngram_lm_scale_0.7_attention_scale_0.08 2.66
+ ngram_lm_scale_0.7_attention_scale_0.1 2.66
+ ngram_lm_scale_1.3_attention_scale_0.6 2.66
+ ngram_lm_scale_1.5_attention_scale_0.9 2.66
+ ngram_lm_scale_1.7_attention_scale_1.1 2.66
+ ngram_lm_scale_1.7_attention_scale_1.2 2.66
+ ngram_lm_scale_1.7_attention_scale_1.3 2.66
+ ngram_lm_scale_2.1_attention_scale_1.7 2.66
+ ngram_lm_scale_2.1_attention_scale_1.9 2.66
+ ngram_lm_scale_2.3_attention_scale_2.1 2.66
+ ngram_lm_scale_0.05_attention_scale_1.0 2.67
+ ngram_lm_scale_0.05_attention_scale_1.1 2.67
+ ngram_lm_scale_0.05_attention_scale_1.2 2.67
+ ngram_lm_scale_0.08_attention_scale_1.1 2.67
+ ngram_lm_scale_0.08_attention_scale_1.2 2.67
+ ngram_lm_scale_0.08_attention_scale_1.3 2.67
+ ngram_lm_scale_0.08_attention_scale_1.5 2.67
+ ngram_lm_scale_0.08_attention_scale_2.5 2.67
+ ngram_lm_scale_0.1_attention_scale_1.2 2.67
+ ngram_lm_scale_0.1_attention_scale_1.3 2.67
+ ngram_lm_scale_0.1_attention_scale_1.5 2.67
+ ngram_lm_scale_0.1_attention_scale_2.5 2.67
+ ngram_lm_scale_0.3_attention_scale_0.05 2.67
+ ngram_lm_scale_0.3_attention_scale_0.1 2.67
+ ngram_lm_scale_0.6_attention_scale_0.01 2.67
+ ngram_lm_scale_2.0_attention_scale_1.5 2.67
+ ngram_lm_scale_2.2_attention_scale_1.9 2.67
+ ngram_lm_scale_2.3_attention_scale_1.9 2.67
+ ngram_lm_scale_2.3_attention_scale_2.0 2.67
+ ngram_lm_scale_2.5_attention_scale_2.3 2.67
+ ngram_lm_scale_3.0_attention_scale_3.0 2.67
+ ngram_lm_scale_0.01_attention_scale_1.3 2.68
+ ngram_lm_scale_0.05_attention_scale_1.3 2.68
+ ngram_lm_scale_0.05_attention_scale_1.5 2.68
+ ngram_lm_scale_0.05_attention_scale_4.0 2.68
+ ngram_lm_scale_0.08_attention_scale_1.7 2.68
+ ngram_lm_scale_0.08_attention_scale_2.3 2.68
+ ngram_lm_scale_0.08_attention_scale_3.0 2.68
+ ngram_lm_scale_0.08_attention_scale_4.0 2.68
+ ngram_lm_scale_0.1_attention_scale_0.9 2.68
+ ngram_lm_scale_0.1_attention_scale_1.7 2.68
+ ngram_lm_scale_0.1_attention_scale_1.9 2.68
+ ngram_lm_scale_0.1_attention_scale_2.0 2.68
+ ngram_lm_scale_0.1_attention_scale_2.1 2.68
+ ngram_lm_scale_0.1_attention_scale_2.2 2.68
+ ngram_lm_scale_0.1_attention_scale_2.3 2.68
+ ngram_lm_scale_0.1_attention_scale_3.0 2.68
+ ngram_lm_scale_0.1_attention_scale_4.0 2.68
+ ngram_lm_scale_0.3_attention_scale_0.01 2.68
+ ngram_lm_scale_0.3_attention_scale_0.08 2.68
+ ngram_lm_scale_0.3_attention_scale_5.0 2.68
+ ngram_lm_scale_0.7_attention_scale_0.01 2.68
+ ngram_lm_scale_1.1_attention_scale_0.3 2.68
+ ngram_lm_scale_1.9_attention_scale_1.3 2.68
+ ngram_lm_scale_2.5_attention_scale_2.2 2.68
+ ngram_lm_scale_0.01_attention_scale_0.9 2.69
+ ngram_lm_scale_0.01_attention_scale_1.0 2.69
+ ngram_lm_scale_0.01_attention_scale_1.1 2.69
+ ngram_lm_scale_0.01_attention_scale_1.2 2.69
+ ngram_lm_scale_0.01_attention_scale_1.5 2.69
+ ngram_lm_scale_0.01_attention_scale_2.0 2.69
+ ngram_lm_scale_0.01_attention_scale_4.0 2.69
+ ngram_lm_scale_0.01_attention_scale_5.0 2.69
+ ngram_lm_scale_0.05_attention_scale_0.9 2.69
+ ngram_lm_scale_0.05_attention_scale_1.7 2.69
+ ngram_lm_scale_0.05_attention_scale_1.9 2.69
+ ngram_lm_scale_0.05_attention_scale_2.0 2.69
+ ngram_lm_scale_0.05_attention_scale_2.1 2.69
+ ngram_lm_scale_0.05_attention_scale_2.2 2.69
+ ngram_lm_scale_0.05_attention_scale_2.3 2.69
+ ngram_lm_scale_0.05_attention_scale_2.5 2.69
+ ngram_lm_scale_0.05_attention_scale_3.0 2.69
+ ngram_lm_scale_0.05_attention_scale_5.0 2.69
+ ngram_lm_scale_0.08_attention_scale_0.7 2.69
+ ngram_lm_scale_0.08_attention_scale_0.9 2.69
+ ngram_lm_scale_0.08_attention_scale_1.9 2.69
+ ngram_lm_scale_0.08_attention_scale_2.0 2.69
+ ngram_lm_scale_0.08_attention_scale_2.1 2.69
+ ngram_lm_scale_0.08_attention_scale_2.2 2.69
+ ngram_lm_scale_0.08_attention_scale_5.0 2.69
+ ngram_lm_scale_0.1_attention_scale_0.3 2.69
+ ngram_lm_scale_0.1_attention_scale_0.7 2.69
+ ngram_lm_scale_0.1_attention_scale_5.0 2.69
+ ngram_lm_scale_0.9_attention_scale_0.1 2.69
+ ngram_lm_scale_1.3_attention_scale_0.5 2.69
+ ngram_lm_scale_2.2_attention_scale_1.7 2.69
+ ngram_lm_scale_0.01_attention_scale_1.7 2.7
+ ngram_lm_scale_0.01_attention_scale_1.9 2.7
+ ngram_lm_scale_0.01_attention_scale_2.1 2.7
+ ngram_lm_scale_0.01_attention_scale_2.2 2.7
+ ngram_lm_scale_0.01_attention_scale_2.3 2.7
+ ngram_lm_scale_0.01_attention_scale_2.5 2.7
+ ngram_lm_scale_0.01_attention_scale_3.0 2.7
+ ngram_lm_scale_0.05_attention_scale_0.6 2.7
+ ngram_lm_scale_0.05_attention_scale_0.7 2.7
+ ngram_lm_scale_0.08_attention_scale_0.1 2.7
+ ngram_lm_scale_0.08_attention_scale_0.3 2.7
+ ngram_lm_scale_0.08_attention_scale_0.5 2.7
+ ngram_lm_scale_0.08_attention_scale_0.6 2.7
+ ngram_lm_scale_0.1_attention_scale_0.1 2.7
+ ngram_lm_scale_0.1_attention_scale_0.5 2.7
+ ngram_lm_scale_0.1_attention_scale_0.6 2.7
+ ngram_lm_scale_0.9_attention_scale_0.08 2.7
+ ngram_lm_scale_1.7_attention_scale_1.0 2.7
+ ngram_lm_scale_1.9_attention_scale_1.2 2.7
+ ngram_lm_scale_2.1_attention_scale_1.5 2.7
+ ngram_lm_scale_2.5_attention_scale_2.0 2.7
+ ngram_lm_scale_2.5_attention_scale_2.1 2.7
+ ngram_lm_scale_4.0_attention_scale_4.0 2.7
+ ngram_lm_scale_0.01_attention_scale_0.3 2.71
+ ngram_lm_scale_0.01_attention_scale_0.7 2.71
+ ngram_lm_scale_0.05_attention_scale_0.3 2.71
+ ngram_lm_scale_0.05_attention_scale_0.5 2.71
+ ngram_lm_scale_0.08_attention_scale_0.08 2.71
+ ngram_lm_scale_0.1_attention_scale_0.05 2.71
+ ngram_lm_scale_0.1_attention_scale_0.08 2.71
+ ngram_lm_scale_1.5_attention_scale_0.7 2.71
+ ngram_lm_scale_2.0_attention_scale_1.3 2.71
+ ngram_lm_scale_2.3_attention_scale_1.7 2.71
+ ngram_lm_scale_0.01_attention_scale_0.5 2.72
+ ngram_lm_scale_0.01_attention_scale_0.6 2.72
+ ngram_lm_scale_0.05_attention_scale_0.08 2.72
+ ngram_lm_scale_0.05_attention_scale_0.1 2.72
+ ngram_lm_scale_0.08_attention_scale_0.05 2.72
+ ngram_lm_scale_0.1_attention_scale_0.01 2.72
+ ngram_lm_scale_0.9_attention_scale_0.05 2.72
+ ngram_lm_scale_3.0_attention_scale_2.5 2.72
+ ngram_lm_scale_0.05_attention_scale_0.05 2.73
+ ngram_lm_scale_0.08_attention_scale_0.01 2.73
+ ngram_lm_scale_0.9_attention_scale_0.01 2.73
+ ngram_lm_scale_1.7_attention_scale_0.9 2.73
+ ngram_lm_scale_2.5_attention_scale_1.9 2.73
+ ngram_lm_scale_0.01_attention_scale_0.08 2.74
+ ngram_lm_scale_0.01_attention_scale_0.1 2.74
+ ngram_lm_scale_0.05_attention_scale_0.01 2.74
+ ngram_lm_scale_1.9_attention_scale_1.1 2.74
+ ngram_lm_scale_2.2_attention_scale_1.5 2.74
+ ngram_lm_scale_0.01_attention_scale_0.01 2.75
+ ngram_lm_scale_0.01_attention_scale_0.05 2.75
+ ngram_lm_scale_1.0_attention_scale_0.1 2.75
+ ngram_lm_scale_1.2_attention_scale_0.3 2.75
+ ngram_lm_scale_5.0_attention_scale_5.0 2.75
+ ngram_lm_scale_1.5_attention_scale_0.6 2.76
+ ngram_lm_scale_2.0_attention_scale_1.2 2.76
+ ngram_lm_scale_1.0_attention_scale_0.08 2.77
+ ngram_lm_scale_2.1_attention_scale_1.3 2.77
+ ngram_lm_scale_2.3_attention_scale_1.5 2.78
+ ngram_lm_scale_1.9_attention_scale_1.0 2.79
+ ngram_lm_scale_2.5_attention_scale_1.7 2.79
+ ngram_lm_scale_1.0_attention_scale_0.05 2.8
+ ngram_lm_scale_2.0_attention_scale_1.1 2.8
+ ngram_lm_scale_2.1_attention_scale_1.2 2.81
+ ngram_lm_scale_3.0_attention_scale_2.3 2.81
+ ngram_lm_scale_1.0_attention_scale_0.01 2.82
+ ngram_lm_scale_1.1_attention_scale_0.1 2.82
+ ngram_lm_scale_1.3_attention_scale_0.3 2.82
+ ngram_lm_scale_2.2_attention_scale_1.3 2.82
+ ngram_lm_scale_1.7_attention_scale_0.7 2.83
+ ngram_lm_scale_3.0_attention_scale_2.2 2.83
+ ngram_lm_scale_1.1_attention_scale_0.08 2.84
+ ngram_lm_scale_1.5_attention_scale_0.5 2.84
+ ngram_lm_scale_3.0_attention_scale_2.1 2.85
+ ngram_lm_scale_1.1_attention_scale_0.05 2.86
+ ngram_lm_scale_1.9_attention_scale_0.9 2.86
+ ngram_lm_scale_2.0_attention_scale_1.0 2.86
+ ngram_lm_scale_2.1_attention_scale_1.1 2.86
+ ngram_lm_scale_2.2_attention_scale_1.2 2.87
+ ngram_lm_scale_2.3_attention_scale_1.3 2.87
+ ngram_lm_scale_1.1_attention_scale_0.01 2.89
+ ngram_lm_scale_2.5_attention_scale_1.5 2.89
+ ngram_lm_scale_3.0_attention_scale_2.0 2.9
+ ngram_lm_scale_1.2_attention_scale_0.1 2.92
+ ngram_lm_scale_2.0_attention_scale_0.9 2.92
+ ngram_lm_scale_1.2_attention_scale_0.08 2.93
+ ngram_lm_scale_1.7_attention_scale_0.6 2.93
+ ngram_lm_scale_2.1_attention_scale_1.0 2.93
+ ngram_lm_scale_2.2_attention_scale_1.1 2.95
+ ngram_lm_scale_2.3_attention_scale_1.2 2.95
+ ngram_lm_scale_1.2_attention_scale_0.05 2.97
+ ngram_lm_scale_4.0_attention_scale_3.0 2.97
+ ngram_lm_scale_3.0_attention_scale_1.9 2.99
+ ngram_lm_scale_5.0_attention_scale_4.0 2.99
+ ngram_lm_scale_1.3_attention_scale_0.1 3.0
+ ngram_lm_scale_1.5_attention_scale_0.3 3.0
+ ngram_lm_scale_1.7_attention_scale_0.5 3.0
+ ngram_lm_scale_1.2_attention_scale_0.01 3.01
+ ngram_lm_scale_1.9_attention_scale_0.7 3.01
+ ngram_lm_scale_2.1_attention_scale_0.9 3.01
+ ngram_lm_scale_2.2_attention_scale_1.0 3.01
+ ngram_lm_scale_1.3_attention_scale_0.08 3.02
+ ngram_lm_scale_2.3_attention_scale_1.1 3.02
+ ngram_lm_scale_2.5_attention_scale_1.3 3.02
+ ngram_lm_scale_1.3_attention_scale_0.05 3.09
+ ngram_lm_scale_1.9_attention_scale_0.6 3.1
+ ngram_lm_scale_2.0_attention_scale_0.7 3.1
+ ngram_lm_scale_2.2_attention_scale_0.9 3.1
+ ngram_lm_scale_2.3_attention_scale_1.0 3.1
+ ngram_lm_scale_2.5_attention_scale_1.2 3.1
+ ngram_lm_scale_3.0_attention_scale_1.7 3.1
+ ngram_lm_scale_1.3_attention_scale_0.01 3.15
+ ngram_lm_scale_2.5_attention_scale_1.1 3.18
+ ngram_lm_scale_2.3_attention_scale_0.9 3.19
+ ngram_lm_scale_4.0_attention_scale_2.5 3.19
+ ngram_lm_scale_1.9_attention_scale_0.5 3.21
+ ngram_lm_scale_2.1_attention_scale_0.7 3.21
+ ngram_lm_scale_2.0_attention_scale_0.6 3.22
+ ngram_lm_scale_1.7_attention_scale_0.3 3.23
+ ngram_lm_scale_3.0_attention_scale_1.5 3.24
+ ngram_lm_scale_2.5_attention_scale_1.0 3.27
+ ngram_lm_scale_4.0_attention_scale_2.3 3.27
+ ngram_lm_scale_1.5_attention_scale_0.1 3.29
+ ngram_lm_scale_2.2_attention_scale_0.7 3.29
+ ngram_lm_scale_2.1_attention_scale_0.6 3.3
+ ngram_lm_scale_2.0_attention_scale_0.5 3.31
+ ngram_lm_scale_1.5_attention_scale_0.08 3.32
+ ngram_lm_scale_4.0_attention_scale_2.2 3.32
+ ngram_lm_scale_2.5_attention_scale_0.9 3.34
+ ngram_lm_scale_3.0_attention_scale_1.3 3.34
+ ngram_lm_scale_2.3_attention_scale_0.7 3.35
+ ngram_lm_scale_2.2_attention_scale_0.6 3.36
+ ngram_lm_scale_5.0_attention_scale_3.0 3.36
+ ngram_lm_scale_2.1_attention_scale_0.5 3.37
+ ngram_lm_scale_1.5_attention_scale_0.05 3.38
+ ngram_lm_scale_4.0_attention_scale_2.1 3.38
+ ngram_lm_scale_3.0_attention_scale_1.2 3.42
+ ngram_lm_scale_4.0_attention_scale_2.0 3.42
+ ngram_lm_scale_1.5_attention_scale_0.01 3.45
+ ngram_lm_scale_1.9_attention_scale_0.3 3.45
+ ngram_lm_scale_2.3_attention_scale_0.6 3.46
+ ngram_lm_scale_4.0_attention_scale_1.9 3.48
+ ngram_lm_scale_2.2_attention_scale_0.5 3.51
+ ngram_lm_scale_3.0_attention_scale_1.1 3.53
+ ngram_lm_scale_1.7_attention_scale_0.1 3.54
+ ngram_lm_scale_2.5_attention_scale_0.7 3.54
+ ngram_lm_scale_1.7_attention_scale_0.08 3.58
+ ngram_lm_scale_2.0_attention_scale_0.3 3.58
+ ngram_lm_scale_5.0_attention_scale_2.5 3.59
+ ngram_lm_scale_3.0_attention_scale_1.0 3.61
+ ngram_lm_scale_2.3_attention_scale_0.5 3.62
+ ngram_lm_scale_4.0_attention_scale_1.7 3.64
+ ngram_lm_scale_1.7_attention_scale_0.05 3.65
+ ngram_lm_scale_2.1_attention_scale_0.3 3.67
+ ngram_lm_scale_2.5_attention_scale_0.6 3.67
+ ngram_lm_scale_5.0_attention_scale_2.3 3.71
+ ngram_lm_scale_1.7_attention_scale_0.01 3.72
+ ngram_lm_scale_3.0_attention_scale_0.9 3.72
+ ngram_lm_scale_4.0_attention_scale_1.5 3.76
+ ngram_lm_scale_5.0_attention_scale_2.2 3.76
+ ngram_lm_scale_2.2_attention_scale_0.3 3.78
+ ngram_lm_scale_1.9_attention_scale_0.1 3.79
+ ngram_lm_scale_2.5_attention_scale_0.5 3.81
+ ngram_lm_scale_5.0_attention_scale_2.1 3.81
+ ngram_lm_scale_1.9_attention_scale_0.08 3.83
+ ngram_lm_scale_5.0_attention_scale_2.0 3.87
+ ngram_lm_scale_1.9_attention_scale_0.05 3.88
+ ngram_lm_scale_2.3_attention_scale_0.3 3.88
+ ngram_lm_scale_2.0_attention_scale_0.1 3.9
+ ngram_lm_scale_3.0_attention_scale_0.7 3.92
+ ngram_lm_scale_5.0_attention_scale_1.9 3.93
+ ngram_lm_scale_4.0_attention_scale_1.3 3.94
+ ngram_lm_scale_2.0_attention_scale_0.08 3.95
+ ngram_lm_scale_1.9_attention_scale_0.01 3.96
+ ngram_lm_scale_4.0_attention_scale_1.2 3.99
+ ngram_lm_scale_2.0_attention_scale_0.05 4.0
+ ngram_lm_scale_2.1_attention_scale_0.1 4.01
+ ngram_lm_scale_3.0_attention_scale_0.6 4.02
+ ngram_lm_scale_2.1_attention_scale_0.08 4.05
+ ngram_lm_scale_2.5_attention_scale_0.3 4.05
+ ngram_lm_scale_5.0_attention_scale_1.7 4.06
+ ngram_lm_scale_2.0_attention_scale_0.01 4.1
+ ngram_lm_scale_4.0_attention_scale_1.1 4.1
+ ngram_lm_scale_2.1_attention_scale_0.05 4.11
+ ngram_lm_scale_2.2_attention_scale_0.1 4.12
+ ngram_lm_scale_2.2_attention_scale_0.08 4.15
+ ngram_lm_scale_3.0_attention_scale_0.5 4.16
+ ngram_lm_scale_2.1_attention_scale_0.01 4.18
+ ngram_lm_scale_2.2_attention_scale_0.05 4.19
+ ngram_lm_scale_4.0_attention_scale_1.0 4.19
+ ngram_lm_scale_5.0_attention_scale_1.5 4.19
+ ngram_lm_scale_2.3_attention_scale_0.1 4.2
+ ngram_lm_scale_2.3_attention_scale_0.08 4.23
+ ngram_lm_scale_2.2_attention_scale_0.01 4.27
+ ngram_lm_scale_2.3_attention_scale_0.05 4.28
+ ngram_lm_scale_4.0_attention_scale_0.9 4.28
+ ngram_lm_scale_2.3_attention_scale_0.01 4.34
+ ngram_lm_scale_2.5_attention_scale_0.1 4.35
+ ngram_lm_scale_5.0_attention_scale_1.3 4.35
+ ngram_lm_scale_2.5_attention_scale_0.08 4.39
+ ngram_lm_scale_3.0_attention_scale_0.3 4.4
+ ngram_lm_scale_5.0_attention_scale_1.2 4.41
+ ngram_lm_scale_2.5_attention_scale_0.05 4.43
+ ngram_lm_scale_4.0_attention_scale_0.7 4.45
+ ngram_lm_scale_5.0_attention_scale_1.1 4.46
+ ngram_lm_scale_2.5_attention_scale_0.01 4.48
+ ngram_lm_scale_4.0_attention_scale_0.6 4.54
+ ngram_lm_scale_5.0_attention_scale_1.0 4.54
+ ngram_lm_scale_5.0_attention_scale_0.9 4.62
+ ngram_lm_scale_3.0_attention_scale_0.1 4.64
+ ngram_lm_scale_4.0_attention_scale_0.5 4.65
+ ngram_lm_scale_3.0_attention_scale_0.08 4.67
+ ngram_lm_scale_3.0_attention_scale_0.05 4.7
+ ngram_lm_scale_3.0_attention_scale_0.01 4.73
+ ngram_lm_scale_5.0_attention_scale_0.7 4.76
+ ngram_lm_scale_4.0_attention_scale_0.3 4.8
+ ngram_lm_scale_5.0_attention_scale_0.6 4.85
+ ngram_lm_scale_5.0_attention_scale_0.5 4.91
+ ngram_lm_scale_4.0_attention_scale_0.1 4.99
+ ngram_lm_scale_4.0_attention_scale_0.08 5.01
+ ngram_lm_scale_4.0_attention_scale_0.05 5.02
+ ngram_lm_scale_5.0_attention_scale_0.3 5.06
+ ngram_lm_scale_4.0_attention_scale_0.01 5.07
+ ngram_lm_scale_5.0_attention_scale_0.1 5.22
+ ngram_lm_scale_5.0_attention_scale_0.08 5.24
+ ngram_lm_scale_5.0_attention_scale_0.05 5.25
+ ngram_lm_scale_5.0_attention_scale_0.01 5.27
+
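
The table above is a grid search over rescoring weights: every candidate path is rescored with total = am_score + ngram_lm_scale * ngram_lm_score + attention_scale * attention_score, once per scale pair, and the best path per utterance under each setting is scored to produce one WER row. A hedged sketch with precomputed per-path scores (icefall's attention-decoder rescoring operates on k2 lattices; the Path fields and toy numbers here are illustrative, only the scale grid is read off the table):

```python
import itertools
from collections import namedtuple

Path = namedtuple("Path", "hyp am_score ngram_lm_score attn_score")

# Toy n-best lists keyed by utterance id; all scores are made up.
nbest = {
    "utt-1": [Path("a b c", -10.2, -4.1, -3.0),
              Path("a b",   -10.0, -3.6, -3.4)],
}

# The grid, as it appears in the table above.
scales = [0.01, 0.05, 0.08, 0.1, 0.3, 0.5, 0.6, 0.7, 0.9, 1.0, 1.1, 1.2,
          1.3, 1.5, 1.7, 1.9, 2.0, 2.1, 2.2, 2.3, 2.5, 3.0, 4.0, 5.0]

results = {}
for lm_s, att_s in itertools.product(scales, scales):
    key = f"ngram_lm_scale_{lm_s}_attention_scale_{att_s}"
    results[key] = {  # best path per utterance under this scale pair
        utt: max(paths,
                 key=lambda p: p.am_score
                               + lm_s * p.ngram_lm_score
                               + att_s * p.attn_score).hyp
        for utt, paths in nbest.items()
    }
# Scoring each results[key] against the references yields one
# "ngram_lm_scale_X_attention_scale_Y  WER" row, sorted best-first above.
```
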
+ 2022-07-20 21:24:54,113 INFO [decode.py:588] batch 0/?, cuts processed until now is 2
+ 2022-07-20 21:26:25,190 INFO [decode.py:588] batch 100/?, cuts processed until now is 249
+ 2022-07-20 21:27:58,029 INFO [decode.py:588] batch 200/?, cuts processed until now is 540
+ 2022-07-20 21:29:23,092 INFO [decode.py:588] batch 300/?, cuts processed until now is 834
+ 2022-07-20 21:30:51,288 INFO [decode.py:588] batch 400/?, cuts processed until now is 1104
+ 2022-07-20 21:32:21,736 INFO [decode.py:588] batch 500/?, cuts processed until now is 1350
+ 2022-07-20 21:33:54,991 INFO [decode.py:588] batch 600/?, cuts processed until now is 1640
+ 2022-07-20 21:35:18,198 INFO [decode.py:588] batch 700/?, cuts processed until now is 1933
+ 2022-07-20 21:36:49,206 INFO [decode.py:588] batch 800/?, cuts processed until now is 2175
+ 2022-07-20 21:38:16,845 INFO [decode.py:588] batch 900/?, cuts processed until now is 2440
+ 2022-07-20 21:38:31,538 INFO [decode.py:783] Caught exception:
+ CUDA out of memory. Tried to allocate 740.00 MiB (GPU 0; 31.75 GiB total capacity; 28.28 GiB already allocated; 447.50 MiB free; 29.92 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-07-20 21:38:31,538 INFO [decode.py:789] num_arcs before pruning: 349313
+ 2022-07-20 21:38:31,538 INFO [decode.py:792] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-07-20 21:38:31,557 INFO [decode.py:803] num_arcs after pruning: 8796
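
The lines above show decode.py's fallback for rescoring OOMs: catch the allocation failure, prune the lattice (here from 349313 down to 8796 arcs), and retry instead of aborting. A minimal sketch of that catch-and-retry pattern (`lattice`, `rescore`, and `prune` are hypothetical stand-ins, not k2 APIs):

```python
import torch

def rescore_with_oom_fallback(lattice, rescore, prune, nbest_scale=0.5):
    """Rescore a lattice; on CUDA OOM, prune it and retry once."""
    try:
        return rescore(lattice)
    except RuntimeError as e:  # PyTorch 1.10 signals CUDA OOM via RuntimeError
        if "CUDA out of memory" not in str(e):
            raise
        print(f"Caught exception:\n{e}\n")
        print(f"num_arcs before pruning: {lattice.num_arcs}")
        torch.cuda.empty_cache()  # release the failed attempt's cached blocks
        lattice = prune(lattice, nbest_scale)  # keep only the best-scoring arcs
        print(f"num_arcs after pruning: {lattice.num_arcs}")
        return rescore(lattice)
```
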
+ 2022-07-20 21:39:44,765 INFO [decode.py:588] batch 1000/?, cuts processed until now is 2656
+ 2022-07-20 21:41:20,435 INFO [decode.py:588] batch 1100/?, cuts processed until now is 2839
+ 2022-07-20 21:46:22,023 INFO [decode.py:637]
+ For test-other, WER of different settings are:
+ ngram_lm_scale_1.1_attention_scale_0.9 5.54 best for test-other
+ ngram_lm_scale_1.5_attention_scale_1.5 5.54
+ ngram_lm_scale_1.5_attention_scale_1.7 5.54
+ ngram_lm_scale_1.7_attention_scale_2.0 5.54
+ ngram_lm_scale_1.7_attention_scale_2.1 5.54
+ ngram_lm_scale_1.2_attention_scale_1.1 5.55
+ ngram_lm_scale_1.3_attention_scale_0.9 5.55
+ ngram_lm_scale_1.3_attention_scale_1.1 5.55
+ ngram_lm_scale_1.3_attention_scale_1.2 5.55
+ ngram_lm_scale_1.7_attention_scale_1.9 5.55
+ ngram_lm_scale_1.0_attention_scale_0.7 5.56
+ ngram_lm_scale_1.2_attention_scale_0.6 5.56
+ ngram_lm_scale_1.2_attention_scale_1.0 5.56
+ ngram_lm_scale_1.2_attention_scale_1.2 5.56
+ ngram_lm_scale_1.3_attention_scale_1.0 5.56
+ ngram_lm_scale_1.5_attention_scale_1.3 5.56
+ ngram_lm_scale_1.5_attention_scale_1.9 5.56
+ ngram_lm_scale_1.5_attention_scale_2.0 5.56
+ ngram_lm_scale_1.7_attention_scale_2.2 5.56
+ ngram_lm_scale_1.9_attention_scale_2.2 5.56
+ ngram_lm_scale_1.9_attention_scale_2.3 5.56
+ ngram_lm_scale_1.9_attention_scale_2.5 5.56
+ ngram_lm_scale_2.0_attention_scale_2.5 5.56
+ ngram_lm_scale_2.0_attention_scale_3.0 5.56
+ ngram_lm_scale_2.2_attention_scale_3.0 5.56
+ ngram_lm_scale_2.5_attention_scale_4.0 5.56
+ ngram_lm_scale_3.0_attention_scale_5.0 5.56
+ ngram_lm_scale_0.9_attention_scale_0.6 5.57
+ ngram_lm_scale_1.0_attention_scale_0.9 5.57
+ ngram_lm_scale_1.1_attention_scale_1.0 5.57
+ ngram_lm_scale_1.1_attention_scale_1.1 5.57
+ ngram_lm_scale_1.2_attention_scale_0.7 5.57
+ ngram_lm_scale_1.2_attention_scale_0.9 5.57
+ ngram_lm_scale_1.2_attention_scale_1.3 5.57
+ ngram_lm_scale_1.2_attention_scale_1.5 5.57
+ ngram_lm_scale_1.3_attention_scale_0.7 5.57
+ ngram_lm_scale_1.3_attention_scale_1.3 5.57
+ ngram_lm_scale_1.5_attention_scale_2.1 5.57
+ ngram_lm_scale_1.5_attention_scale_2.2 5.57
+ ngram_lm_scale_1.5_attention_scale_2.3 5.57
+ ngram_lm_scale_1.5_attention_scale_2.5 5.57
+ ngram_lm_scale_1.7_attention_scale_1.7 5.57
+ ngram_lm_scale_1.7_attention_scale_2.3 5.57
+ ngram_lm_scale_1.7_attention_scale_2.5 5.57
+ ngram_lm_scale_1.7_attention_scale_3.0 5.57
+ ngram_lm_scale_1.9_attention_scale_2.0 5.57
+ ngram_lm_scale_1.9_attention_scale_2.1 5.57
+ ngram_lm_scale_1.9_attention_scale_3.0 5.57
+ ngram_lm_scale_2.0_attention_scale_2.3 5.57
+ ngram_lm_scale_2.0_attention_scale_4.0 5.57
+ ngram_lm_scale_2.1_attention_scale_3.0 5.57
+ ngram_lm_scale_2.1_attention_scale_4.0 5.57
+ ngram_lm_scale_2.3_attention_scale_3.0 5.57
+ ngram_lm_scale_0.9_attention_scale_0.7 5.58
+ ngram_lm_scale_1.0_attention_scale_1.0 5.58
+ ngram_lm_scale_1.0_attention_scale_1.1 5.58
+ ngram_lm_scale_1.1_attention_scale_0.5 5.58
+ ngram_lm_scale_1.1_attention_scale_0.6 5.58
+ ngram_lm_scale_1.1_attention_scale_1.2 5.58
+ ngram_lm_scale_1.1_attention_scale_1.5 5.58
+ ngram_lm_scale_1.3_attention_scale_1.5 5.58
+ ngram_lm_scale_1.3_attention_scale_1.7 5.58
+ ngram_lm_scale_1.3_attention_scale_1.9 5.58
+ ngram_lm_scale_1.3_attention_scale_2.0 5.58
+ ngram_lm_scale_1.3_attention_scale_2.1 5.58
+ ngram_lm_scale_1.3_attention_scale_2.2 5.58
+ ngram_lm_scale_2.0_attention_scale_2.2 5.58
+ ngram_lm_scale_2.2_attention_scale_4.0 5.58
+ ngram_lm_scale_2.3_attention_scale_4.0 5.58
+ ngram_lm_scale_0.9_attention_scale_0.9 5.59
+ ngram_lm_scale_1.0_attention_scale_1.2 5.59
+ ngram_lm_scale_1.1_attention_scale_0.7 5.59
+ ngram_lm_scale_1.1_attention_scale_1.3 5.59
+ ngram_lm_scale_1.2_attention_scale_1.7 5.59
+ ngram_lm_scale_1.2_attention_scale_1.9 5.59
+ ngram_lm_scale_1.2_attention_scale_2.1 5.59
+ ngram_lm_scale_1.3_attention_scale_2.3 5.59
+ ngram_lm_scale_1.5_attention_scale_1.0 5.59
+ ngram_lm_scale_1.5_attention_scale_1.1 5.59
+ ngram_lm_scale_1.5_attention_scale_1.2 5.59
+ ngram_lm_scale_1.7_attention_scale_1.5 5.59
+ ngram_lm_scale_1.9_attention_scale_1.9 5.59
+ ngram_lm_scale_1.9_attention_scale_4.0 5.59
+ ngram_lm_scale_2.0_attention_scale_2.1 5.59
+ ngram_lm_scale_2.1_attention_scale_2.3 5.59
+ ngram_lm_scale_2.1_attention_scale_2.5 5.59
+ ngram_lm_scale_2.5_attention_scale_5.0 5.59
+ ngram_lm_scale_0.9_attention_scale_1.0 5.6
+ ngram_lm_scale_1.0_attention_scale_0.6 5.6
+ ngram_lm_scale_1.0_attention_scale_1.3 5.6
+ ngram_lm_scale_1.1_attention_scale_1.7 5.6
+ ngram_lm_scale_1.1_attention_scale_1.9 5.6
+ ngram_lm_scale_1.2_attention_scale_2.0 5.6
+ ngram_lm_scale_1.7_attention_scale_1.3 5.6
+ ngram_lm_scale_2.0_attention_scale_2.0 5.6
+ ngram_lm_scale_2.1_attention_scale_2.2 5.6
+ ngram_lm_scale_2.2_attention_scale_5.0 5.6
+ ngram_lm_scale_2.3_attention_scale_2.5 5.6
+ ngram_lm_scale_2.3_attention_scale_5.0 5.6
+ ngram_lm_scale_3.0_attention_scale_4.0 5.6
+ ngram_lm_scale_0.9_attention_scale_0.5 5.61
+ ngram_lm_scale_1.0_attention_scale_0.5 5.61
+ ngram_lm_scale_1.0_attention_scale_1.5 5.61
+ ngram_lm_scale_1.1_attention_scale_2.0 5.61
+ ngram_lm_scale_1.2_attention_scale_0.5 5.61
+ ngram_lm_scale_1.2_attention_scale_2.2 5.61
+ ngram_lm_scale_1.3_attention_scale_2.5 5.61
+ ngram_lm_scale_1.5_attention_scale_0.9 5.61
+ ngram_lm_scale_1.5_attention_scale_3.0 5.61
+ ngram_lm_scale_1.7_attention_scale_4.0 5.61
+ ngram_lm_scale_1.9_attention_scale_1.7 5.61
+ ngram_lm_scale_2.0_attention_scale_1.7 5.61
+ ngram_lm_scale_2.0_attention_scale_1.9 5.61
+ ngram_lm_scale_2.0_attention_scale_5.0 5.61
+ ngram_lm_scale_2.1_attention_scale_5.0 5.61
+ ngram_lm_scale_2.2_attention_scale_2.5 5.61
+ ngram_lm_scale_2.5_attention_scale_3.0 5.61
+ ngram_lm_scale_0.9_attention_scale_1.1 5.62
+ ngram_lm_scale_1.0_attention_scale_1.7 5.62
+ ngram_lm_scale_1.1_attention_scale_2.1 5.62
+ ngram_lm_scale_1.2_attention_scale_2.3 5.62
+ ngram_lm_scale_1.7_attention_scale_1.2 5.62
+ ngram_lm_scale_1.9_attention_scale_1.5 5.62
+ ngram_lm_scale_2.1_attention_scale_1.9 5.62
+ ngram_lm_scale_2.1_attention_scale_2.0 5.62
+ ngram_lm_scale_2.1_attention_scale_2.1 5.62
+ ngram_lm_scale_2.2_attention_scale_2.2 5.62
+ ngram_lm_scale_2.2_attention_scale_2.3 5.62
+ ngram_lm_scale_0.7_attention_scale_0.7 5.63
+ ngram_lm_scale_0.9_attention_scale_0.3 5.63
+ ngram_lm_scale_0.9_attention_scale_1.2 5.63
+ ngram_lm_scale_0.9_attention_scale_1.3 5.63
+ ngram_lm_scale_1.0_attention_scale_0.3 5.63
+ ngram_lm_scale_1.0_attention_scale_1.9 5.63
+ ngram_lm_scale_1.1_attention_scale_2.2 5.63
+ ngram_lm_scale_1.3_attention_scale_0.6 5.63
+ ngram_lm_scale_1.9_attention_scale_5.0 5.63
+ ngram_lm_scale_2.2_attention_scale_2.0 5.63
+ ngram_lm_scale_2.2_attention_scale_2.1 5.63
+ ngram_lm_scale_2.3_attention_scale_2.1 5.63
+ ngram_lm_scale_2.3_attention_scale_2.3 5.63
+ ngram_lm_scale_0.7_attention_scale_0.3 5.64
+ ngram_lm_scale_0.7_attention_scale_0.5 5.64
+ ngram_lm_scale_0.7_attention_scale_0.6 5.64
+ ngram_lm_scale_0.7_attention_scale_0.9 5.64
+ ngram_lm_scale_0.7_attention_scale_1.0 5.64
+ ngram_lm_scale_0.9_attention_scale_1.5 5.64
+ ngram_lm_scale_1.0_attention_scale_2.0 5.64
+ ngram_lm_scale_1.1_attention_scale_2.3 5.64
+ ngram_lm_scale_1.2_attention_scale_2.5 5.64
+ ngram_lm_scale_1.5_attention_scale_5.0 5.64
+ ngram_lm_scale_2.2_attention_scale_1.9 5.64
+ ngram_lm_scale_2.3_attention_scale_2.2 5.64
+ ngram_lm_scale_0.7_attention_scale_1.1 5.65
+ ngram_lm_scale_0.9_attention_scale_1.7 5.65
+ ngram_lm_scale_0.9_attention_scale_1.9 5.65
+ ngram_lm_scale_1.0_attention_scale_2.1 5.65
+ ngram_lm_scale_1.0_attention_scale_2.2 5.65
+ ngram_lm_scale_1.0_attention_scale_2.3 5.65
+ ngram_lm_scale_1.1_attention_scale_2.5 5.65
+ ngram_lm_scale_1.3_attention_scale_3.0 5.65
+ ngram_lm_scale_1.7_attention_scale_1.1 5.65
+ ngram_lm_scale_1.7_attention_scale_5.0 5.65
+ ngram_lm_scale_2.5_attention_scale_2.5 5.65
+ ngram_lm_scale_0.6_attention_scale_0.5 5.66
+ ngram_lm_scale_0.6_attention_scale_0.6 5.66
+ ngram_lm_scale_0.6_attention_scale_0.9 5.66
+ ngram_lm_scale_0.7_attention_scale_1.2 5.66
+ ngram_lm_scale_0.9_attention_scale_2.3 5.66
+ ngram_lm_scale_1.0_attention_scale_2.5 5.66
+ ngram_lm_scale_1.2_attention_scale_3.0 5.66
+ ngram_lm_scale_1.5_attention_scale_4.0 5.66
+ ngram_lm_scale_2.3_attention_scale_2.0 5.66
+ ngram_lm_scale_0.5_attention_scale_0.6 5.67
+ ngram_lm_scale_0.5_attention_scale_0.7 5.67
+ ngram_lm_scale_0.5_attention_scale_0.9 5.67
+ ngram_lm_scale_0.6_attention_scale_0.7 5.67
+ ngram_lm_scale_0.6_attention_scale_1.0 5.67
+ ngram_lm_scale_0.6_attention_scale_1.1 5.67
+ ngram_lm_scale_0.7_attention_scale_1.3 5.67
+ ngram_lm_scale_0.9_attention_scale_2.0 5.67
+ ngram_lm_scale_0.9_attention_scale_2.1 5.67
+ ngram_lm_scale_0.9_attention_scale_2.2 5.67
+ ngram_lm_scale_0.9_attention_scale_2.5 5.67
+ ngram_lm_scale_1.1_attention_scale_3.0 5.67
+ ngram_lm_scale_1.3_attention_scale_4.0 5.67
+ ngram_lm_scale_2.0_attention_scale_1.5 5.67
+ ngram_lm_scale_2.1_attention_scale_1.7 5.67
+ ngram_lm_scale_2.5_attention_scale_2.3 5.67
+ ngram_lm_scale_4.0_attention_scale_5.0 5.67
+ ngram_lm_scale_0.5_attention_scale_1.0 5.68
+ ngram_lm_scale_0.6_attention_scale_0.3 5.68
+ ngram_lm_scale_1.0_attention_scale_3.0 5.68
+ ngram_lm_scale_1.2_attention_scale_4.0 5.68
+ ngram_lm_scale_2.3_attention_scale_1.9 5.68
+ ngram_lm_scale_2.5_attention_scale_2.2 5.68
+ ngram_lm_scale_3.0_attention_scale_3.0 5.68
+ ngram_lm_scale_0.5_attention_scale_0.5 5.69
+ ngram_lm_scale_0.5_attention_scale_1.1 5.69
+ ngram_lm_scale_0.6_attention_scale_1.2 5.69
+ ngram_lm_scale_1.1_attention_scale_0.3 5.69
+ ngram_lm_scale_1.3_attention_scale_0.5 5.69
+ ngram_lm_scale_1.7_attention_scale_1.0 5.69
+ ngram_lm_scale_1.9_attention_scale_1.3 5.69
+ ngram_lm_scale_0.5_attention_scale_1.2 5.7
+ ngram_lm_scale_0.6_attention_scale_1.3 5.7
+ ngram_lm_scale_0.6_attention_scale_1.5 5.7
+ ngram_lm_scale_0.7_attention_scale_1.5 5.7
+ ngram_lm_scale_0.7_attention_scale_1.7 5.7
+ ngram_lm_scale_0.7_attention_scale_1.9 5.7
+ ngram_lm_scale_0.7_attention_scale_2.0 5.7
+ ngram_lm_scale_0.9_attention_scale_3.0 5.7
+ ngram_lm_scale_2.1_attention_scale_1.5 5.7
+ ngram_lm_scale_2.2_attention_scale_1.7 5.7
+ ngram_lm_scale_0.5_attention_scale_1.3 5.71
+ ngram_lm_scale_0.6_attention_scale_1.7 5.71
+ ngram_lm_scale_0.7_attention_scale_2.1 5.71
+ ngram_lm_scale_0.7_attention_scale_2.2 5.71
+ ngram_lm_scale_1.1_attention_scale_4.0 5.71
+ ngram_lm_scale_1.3_attention_scale_5.0 5.71
+ ngram_lm_scale_2.5_attention_scale_2.1 5.71
+ ngram_lm_scale_0.5_attention_scale_0.3 5.72
+ ngram_lm_scale_0.6_attention_scale_1.9 5.72
+ ngram_lm_scale_1.0_attention_scale_4.0 5.72
+ ngram_lm_scale_1.5_attention_scale_0.7 5.72
+ ngram_lm_scale_1.9_attention_scale_1.2 5.72
+ ngram_lm_scale_2.3_attention_scale_1.7 5.72
+ ngram_lm_scale_2.5_attention_scale_2.0 5.72
+ ngram_lm_scale_0.5_attention_scale_1.5 5.73
+ ngram_lm_scale_0.6_attention_scale_0.1 5.73
+ ngram_lm_scale_0.7_attention_scale_2.3 5.73
+ ngram_lm_scale_0.9_attention_scale_4.0 5.73
+ ngram_lm_scale_1.2_attention_scale_5.0 5.73
+ ngram_lm_scale_2.0_attention_scale_1.3 5.73
+ ngram_lm_scale_0.5_attention_scale_1.7 5.74
+ ngram_lm_scale_0.5_attention_scale_1.9 5.74
+ ngram_lm_scale_0.5_attention_scale_2.0 5.74
+ ngram_lm_scale_0.5_attention_scale_2.2 5.74
+ ngram_lm_scale_0.5_attention_scale_2.3 5.74
+ ngram_lm_scale_0.6_attention_scale_0.08 5.74
+ ngram_lm_scale_0.6_attention_scale_2.0 5.74
+ ngram_lm_scale_0.6_attention_scale_2.1 5.74
+ ngram_lm_scale_0.6_attention_scale_2.2 5.74
+ ngram_lm_scale_0.6_attention_scale_2.5 5.74
+ ngram_lm_scale_0.7_attention_scale_0.1 5.74
+ ngram_lm_scale_0.7_attention_scale_2.5 5.74
+ ngram_lm_scale_0.7_attention_scale_3.0 5.74
+ ngram_lm_scale_1.1_attention_scale_5.0 5.74
+ ngram_lm_scale_1.7_attention_scale_0.9 5.74
+ ngram_lm_scale_4.0_attention_scale_4.0 5.74
+ ngram_lm_scale_0.3_attention_scale_1.1 5.75
+ ngram_lm_scale_0.3_attention_scale_1.2 5.75
+ ngram_lm_scale_0.5_attention_scale_2.1 5.75
+ ngram_lm_scale_0.5_attention_scale_2.5 5.75
+ ngram_lm_scale_0.6_attention_scale_0.05 5.75
+ ngram_lm_scale_0.6_attention_scale_2.3 5.75
+ ngram_lm_scale_0.6_attention_scale_3.0 5.75
+ ngram_lm_scale_0.7_attention_scale_0.05 5.75
+ ngram_lm_scale_0.7_attention_scale_0.08 5.75
+ ngram_lm_scale_0.9_attention_scale_5.0 5.75
+ ngram_lm_scale_1.0_attention_scale_5.0 5.75
+ ngram_lm_scale_2.2_attention_scale_1.5 5.75
+ ngram_lm_scale_2.5_attention_scale_1.9 5.75
+ ngram_lm_scale_0.3_attention_scale_0.9 5.76
+ ngram_lm_scale_0.3_attention_scale_1.0 5.76
+ ngram_lm_scale_0.3_attention_scale_1.3 5.76
+ ngram_lm_scale_0.5_attention_scale_0.1 5.76
+ ngram_lm_scale_0.7_attention_scale_4.0 5.76
+ ngram_lm_scale_2.0_attention_scale_1.2 5.76
+ ngram_lm_scale_0.5_attention_scale_0.08 5.77
+ ngram_lm_scale_0.5_attention_scale_3.0 5.77
+ ngram_lm_scale_0.6_attention_scale_4.0 5.77
+ ngram_lm_scale_0.7_attention_scale_5.0 5.77
+ ngram_lm_scale_1.9_attention_scale_1.1 5.77
+ ngram_lm_scale_3.0_attention_scale_2.5 5.77
+ ngram_lm_scale_0.3_attention_scale_0.6 5.78
+ ngram_lm_scale_0.3_attention_scale_0.7 5.78
+ ngram_lm_scale_0.3_attention_scale_1.5 5.78
+ ngram_lm_scale_0.5_attention_scale_0.01 5.78
+ ngram_lm_scale_0.5_attention_scale_0.05 5.78
+ ngram_lm_scale_0.6_attention_scale_0.01 5.78
+ ngram_lm_scale_0.9_attention_scale_0.1 5.78
+ ngram_lm_scale_2.1_attention_scale_1.3 5.78
+ ngram_lm_scale_0.3_attention_scale_0.5 5.79
+ ngram_lm_scale_0.3_attention_scale_1.7 5.79
+ ngram_lm_scale_0.7_attention_scale_0.01 5.79
+ ngram_lm_scale_0.3_attention_scale_2.0 5.8
+ ngram_lm_scale_0.3_attention_scale_2.1 5.8
+ ngram_lm_scale_0.3_attention_scale_2.2 5.8
+ ngram_lm_scale_0.5_attention_scale_4.0 5.8
+ ngram_lm_scale_0.6_attention_scale_5.0 5.8
+ ngram_lm_scale_0.9_attention_scale_0.08 5.8
+ ngram_lm_scale_1.2_attention_scale_0.3 5.8
+ ngram_lm_scale_0.3_attention_scale_1.9 5.81
+ ngram_lm_scale_0.3_attention_scale_2.3 5.81
+ ngram_lm_scale_0.5_attention_scale_5.0 5.81
+ ngram_lm_scale_1.5_attention_scale_0.6 5.81
+ ngram_lm_scale_0.3_attention_scale_2.5 5.82
+ ngram_lm_scale_2.3_attention_scale_1.5 5.82
+ ngram_lm_scale_0.3_attention_scale_3.0 5.83
+ ngram_lm_scale_0.3_attention_scale_4.0 5.83
+ ngram_lm_scale_0.3_attention_scale_5.0 5.83
+ ngram_lm_scale_1.0_attention_scale_0.1 5.83
+ ngram_lm_scale_2.0_attention_scale_1.1 5.83
+ ngram_lm_scale_2.1_attention_scale_1.2 5.83
+ ngram_lm_scale_5.0_attention_scale_5.0 5.83
+ ngram_lm_scale_0.08_attention_scale_1.3 5.84
+ ngram_lm_scale_0.08_attention_scale_1.7 5.84
+ ngram_lm_scale_0.1_attention_scale_1.3 5.84
+ ngram_lm_scale_0.9_attention_scale_0.05 5.84
+ ngram_lm_scale_1.9_attention_scale_1.0 5.84
+ ngram_lm_scale_3.0_attention_scale_2.3 5.84
+ ngram_lm_scale_0.08_attention_scale_1.2 5.85
+ ngram_lm_scale_0.08_attention_scale_1.5 5.85
+ ngram_lm_scale_0.08_attention_scale_1.9 5.85
+ ngram_lm_scale_0.1_attention_scale_1.0 5.85
+ ngram_lm_scale_0.1_attention_scale_1.1 5.85
+ ngram_lm_scale_0.1_attention_scale_1.2 5.85
+ ngram_lm_scale_0.1_attention_scale_1.5 5.85
+ ngram_lm_scale_0.1_attention_scale_1.7 5.85
+ ngram_lm_scale_0.1_attention_scale_1.9 5.85
+ ngram_lm_scale_0.1_attention_scale_2.0 5.85
+ ngram_lm_scale_0.1_attention_scale_2.1 5.85
+ ngram_lm_scale_0.1_attention_scale_2.2 5.85
+ ngram_lm_scale_0.3_attention_scale_0.3 5.85
+ ngram_lm_scale_2.5_attention_scale_1.7 5.85
+ ngram_lm_scale_0.05_attention_scale_1.3 5.86
+ ngram_lm_scale_0.08_attention_scale_1.0 5.86
+ ngram_lm_scale_0.08_attention_scale_2.0 5.86
+ ngram_lm_scale_0.08_attention_scale_2.1 5.86
+ ngram_lm_scale_0.08_attention_scale_2.2 5.86
+ ngram_lm_scale_0.1_attention_scale_0.9 5.86
+ ngram_lm_scale_0.1_attention_scale_2.3 5.86
+ ngram_lm_scale_0.1_attention_scale_2.5 5.86
+ ngram_lm_scale_1.0_attention_scale_0.08 5.86
+ ngram_lm_scale_2.2_attention_scale_1.3 5.86
+ ngram_lm_scale_0.05_attention_scale_1.5 5.87
+ ngram_lm_scale_0.05_attention_scale_1.7 5.87
+ ngram_lm_scale_0.05_attention_scale_1.9 5.87
+ ngram_lm_scale_0.05_attention_scale_2.0 5.87
+ ngram_lm_scale_0.05_attention_scale_2.5 5.87
+ ngram_lm_scale_0.08_attention_scale_0.9 5.87
+ ngram_lm_scale_0.08_attention_scale_1.1 5.87
+ ngram_lm_scale_0.08_attention_scale_2.3 5.87
+ ngram_lm_scale_0.08_attention_scale_2.5 5.87
+ ngram_lm_scale_0.1_attention_scale_3.0 5.87
+ ngram_lm_scale_0.9_attention_scale_0.01 5.87
+ ngram_lm_scale_0.01_attention_scale_1.5 5.88
+ ngram_lm_scale_0.01_attention_scale_1.7 5.88
+ ngram_lm_scale_0.01_attention_scale_2.5 5.88
+ ngram_lm_scale_0.01_attention_scale_3.0 5.88
+ ngram_lm_scale_0.05_attention_scale_1.0 5.88
+ ngram_lm_scale_0.05_attention_scale_1.1 5.88
+ ngram_lm_scale_0.05_attention_scale_1.2 5.88
+ ngram_lm_scale_0.05_attention_scale_2.1 5.88
+ ngram_lm_scale_0.05_attention_scale_2.2 5.88
+ ngram_lm_scale_0.05_attention_scale_2.3 5.88
+ ngram_lm_scale_0.08_attention_scale_3.0 5.88
+ ngram_lm_scale_0.3_attention_scale_0.1 5.88
+ ngram_lm_scale_3.0_attention_scale_2.2 5.88
+ ngram_lm_scale_0.01_attention_scale_1.9 5.89
+ ngram_lm_scale_0.01_attention_scale_2.0 5.89
+ ngram_lm_scale_0.01_attention_scale_2.1 5.89
+ ngram_lm_scale_0.05_attention_scale_0.9 5.89
+ ngram_lm_scale_0.05_attention_scale_3.0 5.89
+ ngram_lm_scale_0.08_attention_scale_4.0 5.89
+ ngram_lm_scale_0.1_attention_scale_0.7 5.89
+ ngram_lm_scale_0.1_attention_scale_4.0 5.89
+ ngram_lm_scale_0.1_attention_scale_5.0 5.89
+ ngram_lm_scale_0.3_attention_scale_0.08 5.89
+ ngram_lm_scale_0.01_attention_scale_1.3 5.9
+ ngram_lm_scale_0.01_attention_scale_2.2 5.9
+ ngram_lm_scale_0.01_attention_scale_2.3 5.9
+ ngram_lm_scale_0.05_attention_scale_4.0 5.9
+ ngram_lm_scale_0.05_attention_scale_5.0 5.9
+ ngram_lm_scale_0.08_attention_scale_0.7 5.9
+ ngram_lm_scale_0.08_attention_scale_5.0 5.9
999
+ ngram_lm_scale_0.01_attention_scale_1.1 5.91
1000
+ ngram_lm_scale_0.01_attention_scale_1.2 5.91
1001
+ ngram_lm_scale_0.01_attention_scale_4.0 5.91
1002
+ ngram_lm_scale_0.01_attention_scale_5.0 5.91
1003
+ ngram_lm_scale_0.1_attention_scale_0.6 5.91
1004
+ ngram_lm_scale_0.3_attention_scale_0.05 5.91
1005
+ ngram_lm_scale_1.3_attention_scale_0.3 5.91
1006
+ ngram_lm_scale_1.5_attention_scale_0.5 5.91
1007
+ ngram_lm_scale_1.7_attention_scale_0.7 5.91
1008
+ ngram_lm_scale_1.9_attention_scale_0.9 5.91
1009
+ ngram_lm_scale_2.0_attention_scale_1.0 5.91
1010
+ ngram_lm_scale_2.1_attention_scale_1.1 5.91
1011
+ ngram_lm_scale_0.01_attention_scale_0.9 5.92
1012
+ ngram_lm_scale_0.01_attention_scale_1.0 5.92
1013
+ ngram_lm_scale_0.05_attention_scale_0.6 5.92
1014
+ ngram_lm_scale_0.05_attention_scale_0.7 5.92
1015
+ ngram_lm_scale_0.08_attention_scale_0.6 5.92
1016
+ ngram_lm_scale_0.1_attention_scale_0.5 5.92
1017
+ ngram_lm_scale_1.0_attention_scale_0.05 5.92
1018
+ ngram_lm_scale_0.01_attention_scale_0.7 5.94
1019
+ ngram_lm_scale_0.05_attention_scale_0.5 5.94
1020
+ ngram_lm_scale_0.08_attention_scale_0.5 5.94
1021
+ ngram_lm_scale_2.2_attention_scale_1.2 5.94
1022
+ ngram_lm_scale_2.3_attention_scale_1.3 5.95
1023
+ ngram_lm_scale_3.0_attention_scale_2.1 5.95
1024
+ ngram_lm_scale_0.01_attention_scale_0.6 5.96
1025
+ ngram_lm_scale_0.3_attention_scale_0.01 5.96
1026
+ ngram_lm_scale_1.1_attention_scale_0.1 5.97
1027
+ ngram_lm_scale_1.0_attention_scale_0.01 5.98
1028
+ ngram_lm_scale_2.5_attention_scale_1.5 5.98
1029
+ ngram_lm_scale_0.01_attention_scale_0.5 5.99
1030
+ ngram_lm_scale_0.08_attention_scale_0.3 5.99
1031
+ ngram_lm_scale_0.1_attention_scale_0.3 5.99
1032
+ ngram_lm_scale_3.0_attention_scale_2.0 6.0
1033
+ ngram_lm_scale_1.1_attention_scale_0.08 6.01
1034
+ ngram_lm_scale_1.7_attention_scale_0.6 6.01
1035
+ ngram_lm_scale_0.05_attention_scale_0.3 6.02
1036
+ ngram_lm_scale_2.1_attention_scale_1.0 6.02
1037
+ ngram_lm_scale_2.0_attention_scale_0.9 6.03
1038
+ ngram_lm_scale_2.2_attention_scale_1.1 6.03
1039
+ ngram_lm_scale_2.3_attention_scale_1.2 6.03
1040
+ ngram_lm_scale_0.01_attention_scale_0.3 6.04
1041
+ ngram_lm_scale_4.0_attention_scale_3.0 6.05
1042
+ ngram_lm_scale_3.0_attention_scale_1.9 6.07
1043
+ ngram_lm_scale_0.08_attention_scale_0.08 6.08
1044
+ ngram_lm_scale_0.08_attention_scale_0.1 6.08
1045
+ ngram_lm_scale_0.1_attention_scale_0.08 6.08
1046
+ ngram_lm_scale_0.1_attention_scale_0.1 6.08
1047
+ ngram_lm_scale_1.1_attention_scale_0.05 6.08
1048
+ ngram_lm_scale_5.0_attention_scale_4.0 6.08
1049
+ ngram_lm_scale_0.1_attention_scale_0.05 6.09
1050
+ ngram_lm_scale_0.05_attention_scale_0.1 6.11
1051
+ ngram_lm_scale_0.08_attention_scale_0.05 6.11
1052
+ ngram_lm_scale_0.05_attention_scale_0.08 6.12
1053
+ ngram_lm_scale_0.1_attention_scale_0.01 6.12
1054
+ ngram_lm_scale_2.1_attention_scale_0.9 6.12
1055
+ ngram_lm_scale_2.2_attention_scale_1.0 6.12
1056
+ ngram_lm_scale_2.5_attention_scale_1.3 6.12
1057
+ ngram_lm_scale_0.05_attention_scale_0.05 6.13
1058
+ ngram_lm_scale_0.08_attention_scale_0.01 6.13
1059
+ ngram_lm_scale_1.2_attention_scale_0.1 6.13
1060
+ ngram_lm_scale_2.3_attention_scale_1.1 6.13
1061
+ ngram_lm_scale_0.01_attention_scale_0.1 6.14
1062
+ ngram_lm_scale_1.2_attention_scale_0.08 6.15
1063
+ ngram_lm_scale_1.9_attention_scale_0.7 6.15
1064
+ ngram_lm_scale_1.1_attention_scale_0.01 6.16
1065
+ ngram_lm_scale_0.01_attention_scale_0.08 6.17
1066
+ ngram_lm_scale_1.7_attention_scale_0.5 6.17
1067
+ ngram_lm_scale_0.05_attention_scale_0.01 6.18
1068
+ ngram_lm_scale_1.5_attention_scale_0.3 6.2
1069
+ ngram_lm_scale_0.01_attention_scale_0.05 6.21
1070
+ ngram_lm_scale_1.2_attention_scale_0.05 6.21
1071
+ ngram_lm_scale_2.2_attention_scale_0.9 6.24
1072
+ ngram_lm_scale_0.01_attention_scale_0.01 6.25
1073
+ ngram_lm_scale_1.3_attention_scale_0.1 6.25
1074
+ ngram_lm_scale_2.3_attention_scale_1.0 6.25
1075
+ ngram_lm_scale_3.0_attention_scale_1.7 6.25
1076
+ ngram_lm_scale_2.0_attention_scale_0.7 6.26
1077
+ ngram_lm_scale_1.9_attention_scale_0.6 6.27
1078
+ ngram_lm_scale_2.5_attention_scale_1.2 6.27
1079
+ ngram_lm_scale_1.2_attention_scale_0.01 6.28
1080
+ ngram_lm_scale_1.3_attention_scale_0.08 6.29
1081
+ ngram_lm_scale_2.5_attention_scale_1.1 6.35
1082
+ ngram_lm_scale_2.3_attention_scale_0.9 6.36
1083
+ ngram_lm_scale_4.0_attention_scale_2.5 6.36
1084
+ ngram_lm_scale_1.3_attention_scale_0.05 6.38
1085
+ ngram_lm_scale_2.0_attention_scale_0.6 6.39
1086
+ ngram_lm_scale_2.1_attention_scale_0.7 6.39
1087
+ ngram_lm_scale_3.0_attention_scale_1.5 6.4
1088
+ ngram_lm_scale_1.9_attention_scale_0.5 6.42
1089
+ ngram_lm_scale_2.5_attention_scale_1.0 6.44
1090
+ ngram_lm_scale_1.3_attention_scale_0.01 6.45
1091
+ ngram_lm_scale_4.0_attention_scale_2.3 6.49
1092
+ ngram_lm_scale_1.7_attention_scale_0.3 6.5
1093
+ ngram_lm_scale_2.2_attention_scale_0.7 6.5
1094
+ ngram_lm_scale_2.1_attention_scale_0.6 6.51
1095
+ ngram_lm_scale_2.0_attention_scale_0.5 6.54
1096
+ ngram_lm_scale_4.0_attention_scale_2.2 6.54
1097
+ ngram_lm_scale_2.5_attention_scale_0.9 6.56
1098
+ ngram_lm_scale_3.0_attention_scale_1.3 6.59
1099
+ ngram_lm_scale_5.0_attention_scale_3.0 6.59
1100
+ ngram_lm_scale_1.5_attention_scale_0.1 6.62
1101
+ ngram_lm_scale_4.0_attention_scale_2.1 6.63
1102
+ ngram_lm_scale_2.3_attention_scale_0.7 6.64
1103
+ ngram_lm_scale_1.5_attention_scale_0.08 6.67
1104
+ ngram_lm_scale_2.2_attention_scale_0.6 6.68
1105
+ ngram_lm_scale_2.1_attention_scale_0.5 6.7
1106
+ ngram_lm_scale_3.0_attention_scale_1.2 6.72
1107
+ ngram_lm_scale_4.0_attention_scale_2.0 6.73
1108
+ ngram_lm_scale_1.5_attention_scale_0.05 6.75
1109
+ ngram_lm_scale_2.3_attention_scale_0.6 6.79
1110
+ ngram_lm_scale_4.0_attention_scale_1.9 6.82
1111
+ ngram_lm_scale_1.9_attention_scale_0.3 6.83
1112
+ ngram_lm_scale_2.2_attention_scale_0.5 6.83
1113
+ ngram_lm_scale_1.5_attention_scale_0.01 6.84
1114
+ ngram_lm_scale_3.0_attention_scale_1.1 6.84
1115
+ ngram_lm_scale_2.5_attention_scale_0.7 6.86
1116
+ ngram_lm_scale_5.0_attention_scale_2.5 6.9
1117
+ ngram_lm_scale_3.0_attention_scale_1.0 6.92
1118
+ ngram_lm_scale_1.7_attention_scale_0.1 6.93
1119
+ ngram_lm_scale_2.0_attention_scale_0.3 6.95
1120
+ ngram_lm_scale_4.0_attention_scale_1.7 6.96
1121
+ ngram_lm_scale_1.7_attention_scale_0.08 6.98
1122
+ ngram_lm_scale_2.3_attention_scale_0.5 6.98
1123
+ ngram_lm_scale_5.0_attention_scale_2.3 7.02
1124
+ ngram_lm_scale_2.5_attention_scale_0.6 7.04
1125
+ ngram_lm_scale_1.7_attention_scale_0.05 7.06
1126
+ ngram_lm_scale_3.0_attention_scale_0.9 7.07
1127
+ ngram_lm_scale_5.0_attention_scale_2.2 7.09
1128
+ ngram_lm_scale_2.1_attention_scale_0.3 7.1
1129
+ ngram_lm_scale_4.0_attention_scale_1.5 7.14
1130
+ ngram_lm_scale_1.7_attention_scale_0.01 7.15
1131
+ ngram_lm_scale_5.0_attention_scale_2.1 7.18
1132
+ ngram_lm_scale_2.5_attention_scale_0.5 7.23
1133
+ ngram_lm_scale_2.2_attention_scale_0.3 7.24
1134
+ ngram_lm_scale_5.0_attention_scale_2.0 7.25
1135
+ ngram_lm_scale_1.9_attention_scale_0.1 7.26
1136
+ ngram_lm_scale_1.9_attention_scale_0.08 7.31
1137
+ ngram_lm_scale_4.0_attention_scale_1.3 7.34
1138
+ ngram_lm_scale_2.3_attention_scale_0.3 7.35
1139
+ ngram_lm_scale_5.0_attention_scale_1.9 7.35
1140
+ ngram_lm_scale_3.0_attention_scale_0.7 7.36
1141
+ ngram_lm_scale_1.9_attention_scale_0.05 7.38
1142
+ ngram_lm_scale_2.0_attention_scale_0.1 7.39
1143
+ ngram_lm_scale_2.0_attention_scale_0.08 7.42
1144
+ ngram_lm_scale_4.0_attention_scale_1.2 7.46
1145
+ ngram_lm_scale_1.9_attention_scale_0.01 7.49
1146
+ ngram_lm_scale_2.0_attention_scale_0.05 7.49
1147
+ ngram_lm_scale_2.1_attention_scale_0.1 7.51
1148
+ ngram_lm_scale_3.0_attention_scale_0.6 7.52
1149
+ ngram_lm_scale_5.0_attention_scale_1.7 7.55
1150
+ ngram_lm_scale_2.1_attention_scale_0.08 7.56
1151
+ ngram_lm_scale_2.5_attention_scale_0.3 7.56
1152
+ ngram_lm_scale_4.0_attention_scale_1.1 7.57
1153
+ ngram_lm_scale_2.0_attention_scale_0.01 7.61
1154
+ ngram_lm_scale_2.1_attention_scale_0.05 7.62
1155
+ ngram_lm_scale_2.2_attention_scale_0.1 7.62
1156
+ ngram_lm_scale_3.0_attention_scale_0.5 7.65
1157
+ ngram_lm_scale_2.2_attention_scale_0.08 7.66
1158
+ ngram_lm_scale_4.0_attention_scale_1.0 7.69
1159
+ ngram_lm_scale_5.0_attention_scale_1.5 7.7
1160
+ ngram_lm_scale_2.3_attention_scale_0.1 7.71
1161
+ ngram_lm_scale_2.1_attention_scale_0.01 7.72
1162
+ ngram_lm_scale_2.2_attention_scale_0.05 7.72
1163
+ ngram_lm_scale_2.3_attention_scale_0.08 7.75
1164
+ ngram_lm_scale_2.2_attention_scale_0.01 7.79
1165
+ ngram_lm_scale_4.0_attention_scale_0.9 7.79
1166
+ ngram_lm_scale_2.3_attention_scale_0.05 7.81
1167
+ ngram_lm_scale_5.0_attention_scale_1.3 7.89
1168
+ ngram_lm_scale_2.3_attention_scale_0.01 7.92
1169
+ ngram_lm_scale_2.5_attention_scale_0.1 7.92
1170
+ ngram_lm_scale_2.5_attention_scale_0.08 7.96
1171
+ ngram_lm_scale_3.0_attention_scale_0.3 7.97
1172
+ ngram_lm_scale_5.0_attention_scale_1.2 7.98
1173
+ ngram_lm_scale_2.5_attention_scale_0.05 8.01
1174
+ ngram_lm_scale_4.0_attention_scale_0.7 8.04
1175
+ ngram_lm_scale_5.0_attention_scale_1.1 8.06
1176
+ ngram_lm_scale_2.5_attention_scale_0.01 8.08
1177
+ ngram_lm_scale_4.0_attention_scale_0.6 8.14
1178
+ ngram_lm_scale_5.0_attention_scale_1.0 8.17
1179
+ ngram_lm_scale_5.0_attention_scale_0.9 8.27
1180
+ ngram_lm_scale_4.0_attention_scale_0.5 8.28
1181
+ ngram_lm_scale_3.0_attention_scale_0.1 8.3
1182
+ ngram_lm_scale_3.0_attention_scale_0.08 8.34
1183
+ ngram_lm_scale_3.0_attention_scale_0.05 8.38
1184
+ ngram_lm_scale_5.0_attention_scale_0.7 8.43
1185
+ ngram_lm_scale_3.0_attention_scale_0.01 8.46
1186
+ ngram_lm_scale_4.0_attention_scale_0.3 8.48
1187
+ ngram_lm_scale_5.0_attention_scale_0.6 8.51
1188
+ ngram_lm_scale_5.0_attention_scale_0.5 8.58
1189
+ ngram_lm_scale_4.0_attention_scale_0.1 8.68
1190
+ ngram_lm_scale_4.0_attention_scale_0.08 8.71
1191
+ ngram_lm_scale_4.0_attention_scale_0.05 8.73
1192
+ ngram_lm_scale_5.0_attention_scale_0.3 8.75
1193
+ ngram_lm_scale_4.0_attention_scale_0.01 8.8
1194
+ ngram_lm_scale_5.0_attention_scale_0.1 8.97
1195
+ ngram_lm_scale_5.0_attention_scale_0.08 8.98
1196
+ ngram_lm_scale_5.0_attention_scale_0.05 9.0
1197
+ ngram_lm_scale_5.0_attention_scale_0.01 9.03
1198
+
1199
+ 2022-07-20 21:46:22,023 INFO [decode.py:868] Done!
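
Each `ngram_lm_scale_*_attention_scale_*` row above is one point of a joint grid search: the decoding output is rescored with an n-gram LM contribution weighted by `ngram_lm_scale` and an attention-decoder contribution weighted by `attention_scale`, and the WER of the resulting best paths is reported per setting. A minimal sketch of that selection step, assuming per-path acoustic, n-gram, and attention log-scores are already available; the function, field names, and numbers below are illustrative only, not icefall's actual API:

```python
# Illustrative sketch only (not icefall's code): pick the best path under one
# (ngram_lm_scale, attention_scale) setting by linearly combining log-scores.
def best_text(paths, ngram_lm_scale, attention_scale):
    # paths: list of dicts with 'am', 'ngram', 'attn' log-scores and 'text'.
    best = max(
        paths,
        key=lambda p: p["am"]
        + ngram_lm_scale * p["ngram"]
        + attention_scale * p["attn"],
    )
    return best["text"]

# Made-up candidate paths for a single utterance.
paths = [
    {"am": -10.2, "ngram": -4.1, "attn": -3.0, "text": "hello world"},
    {"am": -10.5, "ngram": -3.2, "attn": -2.8, "text": "hello word"},
]
for lm_s, at_s in [(0.5, 1.7), (2.0, 0.1)]:
    print(f"ngram_lm_scale_{lm_s}_attention_scale_{at_s}",
          best_text(paths, lm_s, at_s))
```

Sweeping both scales over the same list of values is what produces the grid of rows above, one WER per pair.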
log/log-ctc-decoding/log-decode-2022-07-21-23-23-55 ADDED
@@ -0,0 +1,27 @@
+ 2022-07-21 23:23:55,621 INFO [decode.py:743] Decoding started
+ 2022-07-21 23:23:55,622 INFO [decode.py:744] {'subsampling_factor': 4, 'feature_dim': 80, 'nhead': 8, 'dim_feedforward': 2048, 'encoder_dim': 512, 'num_encoder_layers': 12, 'search_beam': 20, 'output_beam': 8, 'min_active_states': 30, 'max_active_states': 10000, 'use_double_scores': True, 'env_info': {'k2-version': '1.17', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '9e88318adfd7c80290a96f8a888d279d45dc1564', 'k2-git-date': 'Mon Jul 18 16:26:06 2022', 'lhotse-version': '1.3.0.dev+git.232f3eb.dirty', 'torch-version': '1.10.0+cu102', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'conformer_ctc2', 'icefall-git-sha1': 'b9fda2c-dirty', 'icefall-git-date': 'Thu Jul 21 22:06:09 2022', 'icefall-path': '/home/storage23/wangquandong/experiment/open_source/icefall', 'k2-path': '/home/storage23/wangquandong/tools/k2/k2/python/k2/__init__.py', 'lhotse-path': '/home/storage23/wangquandong/tools/lhotse/lhotse/__init__.py', 'hostname': 'tj1-asr-train-v100-30.kscn', 'IP address': '10.38.22.183'}, 'epoch': 30, 'iter': 0, 'avg': 8, 'method': 'ctc-decoding', 'use_averaged_model': True, 'num_decoder_layers': 6, 'num_paths': 100, 'nbest_scale': 0.5, 'exp_dir': PosixPath('pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp'), 'lang_dir': PosixPath('data/lang_bpe_500'), 'lm_dir': PosixPath('data/lm'), 'rnn_lm_exp_dir': 'rnn_lm/exp', 'rnn_lm_epoch': 7, 'rnn_lm_avg': 2, 'rnn_lm_embedding_dim': 2048, 'rnn_lm_hidden_dim': 2048, 'rnn_lm_num_layers': 4, 'rnn_lm_tie_weights': False, 'full_libri': True, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 200, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures'}
+ 2022-07-21 23:23:56,049 INFO [lexicon.py:176] Loading pre-compiled data/lang_bpe_500/Linv.pt
+ 2022-07-21 23:23:56,127 INFO [decode.py:754] device: cuda:0
+ 2022-07-21 23:24:03,033 INFO [decode.py:916] Calculating the averaged model over epoch range from 22 (excluded) to 30
+ 2022-07-21 23:24:05,616 INFO [decode.py:932] Number of model parameters: 103071035
+ 2022-07-21 23:24:05,616 INFO [asr_datamodule.py:444] About to get test-clean cuts
+ 2022-07-21 23:24:05,632 INFO [asr_datamodule.py:451] About to get test-other cuts
+ 2022-07-21 23:24:06,872 INFO [decode.py:678] batch 0/?, cuts processed until now is 15
+ 2022-07-21 23:24:41,077 INFO [decode.py:678] batch 100/?, cuts processed until now is 2578
+ 2022-07-21 23:24:41,935 INFO [decode.py:699] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-ctc-decoding.txt
+ 2022-07-21 23:24:42,014 INFO [utils.py:416] [test-clean-ctc-decoding] %WER 2.98% [1568 / 52576, 162 ins, 113 del, 1293 sub ]
+ 2022-07-21 23:24:42,241 INFO [decode.py:711] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-ctc-decoding.txt
+ 2022-07-21 23:24:42,244 INFO [decode.py:727]
+ For test-clean, WER of different settings are:
+ ctc-decoding 2.98 best for test-clean
+
+ 2022-07-21 23:24:43,208 INFO [decode.py:678] batch 0/?, cuts processed until now is 19
+ 2022-07-21 23:25:17,339 INFO [decode.py:678] batch 100/?, cuts processed until now is 2888
+ 2022-07-21 23:25:17,768 INFO [decode.py:699] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-ctc-decoding.txt
+ 2022-07-21 23:25:17,849 INFO [utils.py:416] [test-other-ctc-decoding] %WER 7.14% [3735 / 52343, 375 ins, 280 del, 3080 sub ]
+ 2022-07-21 23:25:18,086 INFO [decode.py:711] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-ctc-decoding.txt
+ 2022-07-21 23:25:18,087 INFO [decode.py:727]
+ For test-other, WER of different settings are:
+ ctc-decoding 7.14 best for test-other
+
+ 2022-07-21 23:25:18,087 INFO [decode.py:989] Done!
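
The `%WER` lines printed by utils.py follow the standard word error rate arithmetic: insertions, deletions, and substitutions summed, divided by the number of reference words. A quick check against the test-clean numbers in the log above:

```python
# Sketch of the WER arithmetic behind a line like
#   "%WER 2.98% [1568 / 52576, 162 ins, 113 del, 1293 sub ]"
ins, dels, subs = 162, 113, 1293
ref_words = 52576
errors = ins + dels + subs            # 1568
wer = 100.0 * errors / ref_words      # ~2.98
print(f"%WER {wer:.2f}% [{errors} / {ref_words}, "
      f"{ins} ins, {dels} del, {subs} sub ]")
```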
log/log-ctc-greedy-search/log-decode-2022-07-20-15-54-39 ADDED
@@ -0,0 +1,27 @@
+ 2022-07-20 15:54:39,939 INFO [decode.py:653] Decoding started
+ 2022-07-20 15:54:39,939 INFO [decode.py:654] {'subsampling_factor': 4, 'feature_dim': 80, 'nhead': 8, 'dim_feedforward': 2048, 'encoder_dim': 512, 'num_encoder_layers': 12, 'search_beam': 4, 'output_beam': 4, 'min_active_states': 30, 'max_active_states': 10000, 'use_double_scores': True, 'env_info': {'k2-version': '1.15.1', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '350e449fe5d7b4231f77eb0add764782eed9f5d2', 'k2-git-date': 'Thu May 26 22:56:24 2022', 'lhotse-version': '1.3.0.dev+git.232f3eb.dirty', 'torch-version': '1.10.0+cu102', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'conformer_ctc2', 'icefall-git-sha1': '0738a44-dirty', 'icefall-git-date': 'Mon Jul 18 14:44:26 2022', 'icefall-path': '/home/storage23/wangquandong/experiment/icefall', 'k2-path': '/home/storage23/wangquandong/tools/k2/k2/python/k2/__init__.py', 'lhotse-path': '/home/storage23/wangquandong/tools/lhotse/lhotse/__init__.py', 'hostname': 'tj1-asr-train-v100-30.kscn', 'IP address': '10.38.22.183'}, 'epoch': 30, 'iter': 0, 'avg': 8, 'method': 'ctc-greedy-search', 'use_averaged_model': True, 'num_decoder_layers': 6, 'num_paths': 100, 'nbest_scale': 0.5, 'exp_dir': PosixPath('pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp'), 'lang_dir': PosixPath('data/lang_bpe_500'), 'lm_dir': PosixPath('data/lm'), 'full_libri': True, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 200, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures'}
+ 2022-07-20 15:54:40,359 INFO [lexicon.py:176] Loading pre-compiled data/lang_bpe_500/Linv.pt
+ 2022-07-20 15:54:40,416 INFO [decode.py:664] device: cuda:0
+ 2022-07-20 15:54:46,648 INFO [decode.py:817] Calculating the averaged model over epoch range from 22 (excluded) to 30
+ 2022-07-20 15:54:49,543 INFO [decode.py:833] Number of model parameters: 103071035
+ 2022-07-20 15:54:49,544 INFO [asr_datamodule.py:444] About to get test-clean cuts
+ 2022-07-20 15:54:49,602 INFO [asr_datamodule.py:451] About to get test-other cuts
+ 2022-07-20 15:54:50,853 INFO [decode.py:588] batch 0/?, cuts processed until now is 15
+ 2022-07-20 15:55:16,670 INFO [decode.py:588] batch 100/?, cuts processed until now is 2578
+ 2022-07-20 15:55:17,385 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-ctc-greedy-search.txt
+ 2022-07-20 15:55:17,464 INFO [utils.py:410] [test-clean-ctc-greedy-search] %WER 2.98% [1568 / 52576, 162 ins, 113 del, 1293 sub ]
+ 2022-07-20 15:55:17,698 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-ctc-greedy-search.txt
+ 2022-07-20 15:55:17,699 INFO [decode.py:637]
+ For test-clean, WER of different settings are:
+ ctc-greedy-search 2.98 best for test-clean
+
+ 2022-07-20 15:55:18,620 INFO [decode.py:588] batch 0/?, cuts processed until now is 19
+ 2022-07-20 15:55:43,222 INFO [decode.py:588] batch 100/?, cuts processed until now is 2888
+ 2022-07-20 15:55:43,554 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-ctc-greedy-search.txt
+ 2022-07-20 15:55:43,643 INFO [utils.py:410] [test-other-ctc-greedy-search] %WER 7.14% [3735 / 52343, 375 ins, 280 del, 3080 sub ]
+ 2022-07-20 15:55:43,890 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-ctc-greedy-search.txt
+ 2022-07-20 15:55:43,891 INFO [decode.py:637]
+ For test-other, WER of different settings are:
+ ctc-greedy-search 7.14 best for test-other
+
+ 2022-07-20 15:55:43,891 INFO [decode.py:864] Done!
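
For context on the `ctc-greedy-search` method above: greedy CTC decoding takes the per-frame argmax over the token posteriors, collapses consecutive repeats, and removes blanks. A self-contained NumPy sketch of that idea (not icefall's implementation; the probabilities are made up):

```python
# Minimal sketch of CTC greedy search: argmax per frame, collapse repeats,
# drop blanks. Purely illustrative data with T=6 frames and V=4 tokens.
import numpy as np

blank = 0
log_probs = np.log(np.array([
    [0.6, 0.2, 0.1, 0.1],
    [0.1, 0.7, 0.1, 0.1],
    [0.1, 0.7, 0.1, 0.1],
    [0.8, 0.1, 0.05, 0.05],
    [0.1, 0.1, 0.7, 0.1],
    [0.7, 0.1, 0.1, 0.1],
]))

best = log_probs.argmax(axis=1)                  # per-frame argmax: [0,1,1,0,2,0]
collapsed = [t for i, t in enumerate(best)
             if i == 0 or t != best[i - 1]]      # collapse repeats: [0,1,0,2,0]
tokens = [int(t) for t in collapsed if t != blank]  # remove blanks
print(tokens)  # [1, 2]
```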
log/log-nbest-oracle/log-decode-2022-07-20-20-48-51 ADDED
@@ -0,0 +1,27 @@
+ 2022-07-20 20:48:51,656 INFO [decode.py:653] Decoding started
+ 2022-07-20 20:48:51,657 INFO [decode.py:654] {'subsampling_factor': 4, 'feature_dim': 80, 'nhead': 8, 'dim_feedforward': 2048, 'encoder_dim': 512, 'num_encoder_layers': 12, 'search_beam': 20, 'output_beam': 8, 'min_active_states': 30, 'max_active_states': 10000, 'use_double_scores': True, 'env_info': {'k2-version': '1.15.1', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '350e449fe5d7b4231f77eb0add764782eed9f5d2', 'k2-git-date': 'Thu May 26 22:56:24 2022', 'lhotse-version': '1.3.0.dev+git.232f3eb.dirty', 'torch-version': '1.10.0+cu102', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'conformer_ctc2', 'icefall-git-sha1': 'd812f82-dirty', 'icefall-git-date': 'Wed Jul 20 19:47:36 2022', 'icefall-path': '/home/storage23/wangquandong/experiment/open_source/icefall', 'k2-path': '/home/storage23/wangquandong/tools/k2/k2/python/k2/__init__.py', 'lhotse-path': '/home/storage23/wangquandong/tools/lhotse/lhotse/__init__.py', 'hostname': 'tj1-asr-train-v100-30.kscn', 'IP address': '10.38.22.183'}, 'epoch': 30, 'iter': 0, 'avg': 8, 'method': 'nbest-oracle', 'use_averaged_model': True, 'num_decoder_layers': 6, 'num_paths': 100, 'nbest_scale': 0.5, 'exp_dir': PosixPath('pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp'), 'lang_dir': PosixPath('data/lang_bpe_500'), 'lm_dir': PosixPath('data/lm'), 'full_libri': True, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 200, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures'}
+ 2022-07-20 20:48:52,068 INFO [lexicon.py:176] Loading pre-compiled data/lang_bpe_500/Linv.pt
+ 2022-07-20 20:48:52,133 INFO [decode.py:664] device: cuda:0
+ 2022-07-20 20:48:58,588 INFO [decode.py:821] Calculating the averaged model over epoch range from 22 (excluded) to 30
+ 2022-07-20 20:49:01,210 INFO [decode.py:837] Number of model parameters: 103071035
+ 2022-07-20 20:49:01,211 INFO [asr_datamodule.py:444] About to get test-clean cuts
+ 2022-07-20 20:49:01,216 INFO [asr_datamodule.py:451] About to get test-other cuts
+ 2022-07-20 20:49:02,456 INFO [decode.py:588] batch 0/?, cuts processed until now is 15
+ 2022-07-20 20:49:33,689 INFO [decode.py:588] batch 100/?, cuts processed until now is 2578
+ 2022-07-20 20:49:34,651 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-oracle_100_nbest_scale_0.5.txt
+ 2022-07-20 20:49:34,728 INFO [utils.py:416] [test-clean-oracle_100_nbest_scale_0.5] %WER 1.53% [805 / 52576, 56 ins, 111 del, 638 sub ]
+ 2022-07-20 20:49:34,958 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-oracle_100_nbest_scale_0.5.txt
+ 2022-07-20 20:49:34,958 INFO [decode.py:637]
+ For test-clean, WER of different settings are:
+ oracle_100_nbest_scale_0.5 1.53 best for test-clean
+
+ 2022-07-20 20:49:35,889 INFO [decode.py:588] batch 0/?, cuts processed until now is 19
+ 2022-07-20 20:50:05,816 INFO [decode.py:588] batch 100/?, cuts processed until now is 2888
+ 2022-07-20 20:50:06,184 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-oracle_100_nbest_scale_0.5.txt
+ 2022-07-20 20:50:06,262 INFO [utils.py:416] [test-other-oracle_100_nbest_scale_0.5] %WER 3.47% [1818 / 52343, 111 ins, 321 del, 1386 sub ]
+ 2022-07-20 20:50:06,486 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-oracle_100_nbest_scale_0.5.txt
+ 2022-07-20 20:50:06,487 INFO [decode.py:637]
+ For test-other, WER of different settings are:
+ oracle_100_nbest_scale_0.5 3.47 best for test-other
+
+ 2022-07-20 20:50:06,487 INFO [decode.py:868] Done!
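
The `nbest-oracle` numbers above are a lower bound rather than a decoding result: for each utterance, every one of the `num_paths` (100) sampled hypotheses is compared against the reference and the best-matching one is scored. A minimal sketch of that selection, using a plain word-level Levenshtein distance and made-up data (none of this is icefall's code):

```python
# Sketch of "oracle" n-best WER: keep the candidate with the fewest word errors.
def word_errors(hyp, ref):
    """Word-level Levenshtein distance via a single rolling DP row."""
    h, r = hyp.split(), ref.split()
    d = list(range(len(h) + 1))
    for j, rw in enumerate(r, 1):
        prev, d[0] = d[0], j
        for i, hw in enumerate(h, 1):
            prev, d[i] = d[i], min(d[i] + 1, d[i - 1] + 1, prev + (hw != rw))
    return d[len(h)]

ref = "the quick brown fox"
nbest = ["the quick brow fox", "the quick brown fox", "a quick brown fox"]
oracle = min(nbest, key=lambda hyp: word_errors(hyp, ref))
print(oracle, word_errors(oracle, ref))  # the quick brown fox 0
```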
log/log-nbest-rescoring/log-decode-2022-07-20-21-16-48 ADDED
@@ -0,0 +1,192 @@
+ 2022-07-20 21:16:48,757 INFO [decode.py:653] Decoding started
+ 2022-07-20 21:16:48,757 INFO [decode.py:654] {'subsampling_factor': 4, 'feature_dim': 80, 'nhead': 8, 'dim_feedforward': 2048, 'encoder_dim': 512, 'num_encoder_layers': 12, 'search_beam': 20, 'output_beam': 8, 'min_active_states': 30, 'max_active_states': 10000, 'use_double_scores': True, 'env_info': {'k2-version': '1.15.1', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '350e449fe5d7b4231f77eb0add764782eed9f5d2', 'k2-git-date': 'Thu May 26 22:56:24 2022', 'lhotse-version': '1.3.0.dev+git.232f3eb.dirty', 'torch-version': '1.10.0+cu102', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'conformer_ctc2', 'icefall-git-sha1': 'd812f82-dirty', 'icefall-git-date': 'Wed Jul 20 19:47:36 2022', 'icefall-path': '/home/storage23/wangquandong/experiment/icefall', 'k2-path': '/home/storage23/wangquandong/tools/k2/k2/python/k2/__init__.py', 'lhotse-path': '/home/storage23/wangquandong/tools/lhotse/lhotse/__init__.py', 'hostname': 'tj1-asr-train-v100-30.kscn', 'IP address': '10.38.22.183'}, 'epoch': 30, 'iter': 0, 'avg': 8, 'method': 'nbest-rescoring', 'use_averaged_model': True, 'num_decoder_layers': 6, 'num_paths': 100, 'nbest_scale': 0.5, 'exp_dir': PosixPath('pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp'), 'lang_dir': PosixPath('data/lang_bpe_500'), 'lm_dir': PosixPath('data/lm'), 'full_libri': True, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 30, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures'}
+ 2022-07-20 21:16:49,153 INFO [lexicon.py:176] Loading pre-compiled data/lang_bpe_500/Linv.pt
+ 2022-07-20 21:16:49,217 INFO [decode.py:664] device: cuda:0
+ 2022-07-20 21:16:54,771 INFO [decode.py:730] Loading pre-compiled G_4_gram.pt
+ 2022-07-20 21:16:59,308 INFO [decode.py:821] Calculating the averaged model over epoch range from 22 (excluded) to 30
+ 2022-07-20 21:17:01,993 INFO [decode.py:837] Number of model parameters: 103071035
+ 2022-07-20 21:17:01,993 INFO [asr_datamodule.py:444] About to get test-clean cuts
+ 2022-07-20 21:17:01,997 INFO [asr_datamodule.py:451] About to get test-other cuts
+ 2022-07-20 21:17:03,382 INFO [decode.py:588] batch 0/?, cuts processed until now is 2
+ 2022-07-20 21:17:47,578 INFO [decode.py:588] batch 100/?, cuts processed until now is 326
+ 2022-07-20 21:18:29,336 INFO [decode.py:588] batch 200/?, cuts processed until now is 709
+ 2022-07-20 21:19:07,578 INFO [decode.py:588] batch 300/?, cuts processed until now is 1126
+ 2022-07-20 21:19:46,337 INFO [decode.py:588] batch 400/?, cuts processed until now is 1489
+ 2022-07-20 21:20:27,408 INFO [decode.py:588] batch 500/?, cuts processed until now is 1823
+ 2022-07-20 21:21:06,308 INFO [decode.py:588] batch 600/?, cuts processed until now is 2211
+ 2022-07-20 21:21:50,370 INFO [decode.py:588] batch 700/?, cuts processed until now is 2474
+ 2022-07-20 21:22:35,645 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.1.txt
+ 2022-07-20 21:22:35,734 INFO [utils.py:410] [test-clean-lm_scale_0.1] %WER 2.81% [1476 / 52576, 202 ins, 102 del, 1172 sub ]
+ 2022-07-20 21:22:36,154 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.1.txt
+ 2022-07-20 21:22:36,184 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.2.txt
+ 2022-07-20 21:22:36,264 INFO [utils.py:410] [test-clean-lm_scale_0.2] %WER 2.75% [1447 / 52576, 199 ins, 105 del, 1143 sub ]
+ 2022-07-20 21:22:36,496 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.2.txt
+ 2022-07-20 21:22:36,525 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.3.txt
+ 2022-07-20 21:22:36,607 INFO [utils.py:410] [test-clean-lm_scale_0.3] %WER 2.72% [1428 / 52576, 195 ins, 107 del, 1126 sub ]
+ 2022-07-20 21:22:36,840 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.3.txt
+ 2022-07-20 21:22:36,870 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.4.txt
+ 2022-07-20 21:22:36,949 INFO [utils.py:410] [test-clean-lm_scale_0.4] %WER 2.69% [1412 / 52576, 192 ins, 112 del, 1108 sub ]
+ 2022-07-20 21:22:37,184 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.4.txt
+ 2022-07-20 21:22:37,213 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.5.txt
+ 2022-07-20 21:22:37,288 INFO [utils.py:410] [test-clean-lm_scale_0.5] %WER 2.68% [1409 / 52576, 188 ins, 115 del, 1106 sub ]
+ 2022-07-20 21:22:37,516 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.5.txt
+ 2022-07-20 21:22:37,544 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.6.txt
+ 2022-07-20 21:22:37,621 INFO [utils.py:410] [test-clean-lm_scale_0.6] %WER 2.68% [1411 / 52576, 181 ins, 121 del, 1109 sub ]
+ 2022-07-20 21:22:37,845 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.6.txt
+ 2022-07-20 21:22:37,872 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.7.txt
+ 2022-07-20 21:22:37,948 INFO [utils.py:410] [test-clean-lm_scale_0.7] %WER 2.68% [1409 / 52576, 173 ins, 131 del, 1105 sub ]
+ 2022-07-20 21:22:38,179 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.7.txt
+ 2022-07-20 21:22:38,206 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.8.txt
+ 2022-07-20 21:22:38,282 INFO [utils.py:410] [test-clean-lm_scale_0.8] %WER 2.71% [1425 / 52576, 169 ins, 144 del, 1112 sub ]
+ 2022-07-20 21:22:38,656 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.8.txt
+ 2022-07-20 21:22:38,684 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.9.txt
+ 2022-07-20 21:22:38,762 INFO [utils.py:410] [test-clean-lm_scale_0.9] %WER 2.74% [1441 / 52576, 156 ins, 164 del, 1121 sub ]
+ 2022-07-20 21:22:38,987 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.9.txt
+ 2022-07-20 21:22:39,016 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.0.txt
+ 2022-07-20 21:22:39,102 INFO [utils.py:410] [test-clean-lm_scale_1.0] %WER 2.82% [1485 / 52576, 156 ins, 197 del, 1132 sub ]
+ 2022-07-20 21:22:39,328 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.0.txt
+ 2022-07-20 21:22:39,356 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.1.txt
+ 2022-07-20 21:22:39,432 INFO [utils.py:410] [test-clean-lm_scale_1.1] %WER 2.91% [1530 / 52576, 153 ins, 234 del, 1143 sub ]
+ 2022-07-20 21:22:39,667 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.1.txt
+ 2022-07-20 21:22:39,703 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.2.txt
+ 2022-07-20 21:22:39,779 INFO [utils.py:410] [test-clean-lm_scale_1.2] %WER 3.04% [1600 / 52576, 150 ins, 283 del, 1167 sub ]
+ 2022-07-20 21:22:40,004 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.2.txt
+ 2022-07-20 21:22:40,036 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.3.txt
+ 2022-07-20 21:22:40,112 INFO [utils.py:410] [test-clean-lm_scale_1.3] %WER 3.23% [1697 / 52576, 150 ins, 347 del, 1200 sub ]
+ 2022-07-20 21:22:40,341 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.3.txt
+ 2022-07-20 21:22:40,372 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.4.txt
+ 2022-07-20 21:22:40,448 INFO [utils.py:410] [test-clean-lm_scale_1.4] %WER 3.41% [1792 / 52576, 149 ins, 420 del, 1223 sub ]
+ 2022-07-20 21:22:40,674 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.4.txt
+ 2022-07-20 21:22:40,705 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.5.txt
+ 2022-07-20 21:22:40,950 INFO [utils.py:410] [test-clean-lm_scale_1.5] %WER 3.62% [1902 / 52576, 146 ins, 493 del, 1263 sub ]
+ 2022-07-20 21:22:41,181 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.5.txt
+ 2022-07-20 21:22:41,209 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.6.txt
+ 2022-07-20 21:22:41,286 INFO [utils.py:410] [test-clean-lm_scale_1.6] %WER 3.81% [2001 / 52576, 140 ins, 573 del, 1288 sub ]
+ 2022-07-20 21:22:41,511 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.6.txt
+ 2022-07-20 21:22:41,538 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.7.txt
+ 2022-07-20 21:22:41,614 INFO [utils.py:410] [test-clean-lm_scale_1.7] %WER 3.99% [2100 / 52576, 142 ins, 651 del, 1307 sub ]
+ 2022-07-20 21:22:41,841 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.7.txt
+ 2022-07-20 21:22:41,868 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.8.txt
+ 2022-07-20 21:22:41,949 INFO [utils.py:410] [test-clean-lm_scale_1.8] %WER 4.16% [2187 / 52576, 143 ins, 722 del, 1322 sub ]
+ 2022-07-20 21:22:42,176 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.8.txt
+ 2022-07-20 21:22:42,204 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.9.txt
+ 2022-07-20 21:22:42,281 INFO [utils.py:410] [test-clean-lm_scale_1.9] %WER 4.32% [2272 / 52576, 141 ins, 776 del, 1355 sub ]
+ 2022-07-20 21:22:42,508 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.9.txt
+ 2022-07-20 21:22:42,536 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_2.0.txt
+ 2022-07-20 21:22:42,612 INFO [utils.py:410] [test-clean-lm_scale_2.0] %WER 4.54% [2386 / 52576, 143 ins, 847 del, 1396 sub ]
+ 2022-07-20 21:22:42,843 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_2.0.txt
+ 2022-07-20 21:22:42,844 INFO [decode.py:637]
+ For test-clean, WER of different settings are:
+ lm_scale_0.5 2.68 best for test-clean
+ lm_scale_0.6 2.68
+ lm_scale_0.7 2.68
+ lm_scale_0.4 2.69
+ lm_scale_0.8 2.71
+ lm_scale_0.3 2.72
+ lm_scale_0.9 2.74
+ lm_scale_0.2 2.75
+ lm_scale_0.1 2.81
+ lm_scale_1.0 2.82
+ lm_scale_1.1 2.91
+ lm_scale_1.2 3.04
+ lm_scale_1.3 3.23
+ lm_scale_1.4 3.41
+ lm_scale_1.5 3.62
+ lm_scale_1.6 3.81
+ lm_scale_1.7 3.99
+ lm_scale_1.8 4.16
+ lm_scale_1.9 4.32
+ lm_scale_2.0 4.54
+
+ 2022-07-20 21:22:43,967 INFO [decode.py:588] batch 0/?, cuts processed until now is 2
+ 2022-07-20 21:23:26,720 INFO [decode.py:588] batch 100/?, cuts processed until now is 377
+ 2022-07-20 21:24:06,356 INFO [decode.py:588] batch 200/?, cuts processed until now is 811
+ 2022-07-20 21:24:45,879 INFO [decode.py:588] batch 300/?, cuts processed until now is 1265
+ 2022-07-20 21:25:24,826 INFO [decode.py:588] batch 400/?, cuts processed until now is 1684
+ 2022-07-20 21:26:06,471 INFO [decode.py:588] batch 500/?, cuts processed until now is 2061
+ 2022-07-20 21:26:50,778 INFO [decode.py:588] batch 600/?, cuts processed until now is 2495
+ 2022-07-20 21:27:36,625 INFO [decode.py:588] batch 700/?, cuts processed until now is 2781
+ 2022-07-20 21:28:29,094 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.1.txt
+ 2022-07-20 21:28:29,184 INFO [utils.py:410] [test-other-lm_scale_0.1] %WER 6.16% [3224 / 52343, 397 ins, 263 del, 2564 sub ]
+ 2022-07-20 21:28:29,431 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.1.txt
+ 2022-07-20 21:28:29,464 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.2.txt
+ 2022-07-20 21:28:29,553 INFO [utils.py:410] [test-other-lm_scale_0.2] %WER 6.01% [3148 / 52343, 379 ins, 262 del, 2507 sub ]
+ 2022-07-20 21:28:29,800 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.2.txt
+ 2022-07-20 21:28:29,830 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.3.txt
+ 2022-07-20 21:28:29,912 INFO [utils.py:410] [test-other-lm_scale_0.3] %WER 5.93% [3105 / 52343, 365 ins, 269 del, 2471 sub ]
+ 2022-07-20 21:28:30,172 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.3.txt
+ 2022-07-20 21:28:30,206 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.4.txt
+ 2022-07-20 21:28:30,286 INFO [utils.py:410] [test-other-lm_scale_0.4] %WER 5.84% [3057 / 52343, 347 ins, 291 del, 2419 sub ]
+ 2022-07-20 21:28:30,523 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.4.txt
+ 2022-07-20 21:28:30,556 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.5.txt
+ 2022-07-20 21:28:30,640 INFO [utils.py:410] [test-other-lm_scale_0.5] %WER 5.77% [3021 / 52343, 333 ins, 301 del, 2387 sub ]
+ 2022-07-20 21:28:30,885 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.5.txt
+ 2022-07-20 21:28:30,920 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.6.txt
+ 2022-07-20 21:28:31,004 INFO [utils.py:410] [test-other-lm_scale_0.6] %WER 5.78% [3027 / 52343, 326 ins, 333 del, 2368 sub ]
+ 2022-07-20 21:28:31,246 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.6.txt
+ 2022-07-20 21:28:31,285 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.7.txt
+ 2022-07-20 21:28:31,367 INFO [utils.py:410] [test-other-lm_scale_0.7] %WER 5.82% [3046 / 52343, 314 ins, 369 del, 2363 sub ]
+ 2022-07-20 21:28:31,607 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.7.txt
+ 2022-07-20 21:28:31,641 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.8.txt
+ 2022-07-20 21:28:31,732 INFO [utils.py:410] [test-other-lm_scale_0.8] %WER 5.82% [3045 / 52343, 297 ins, 412 del, 2336 sub ]
+ 2022-07-20 21:28:31,974 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.8.txt
+ 2022-07-20 21:28:32,006 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.9.txt
+ 2022-07-20 21:28:32,086 INFO [utils.py:410] [test-other-lm_scale_0.9] %WER 5.89% [3084 / 52343, 285 ins, 463 del, 2336 sub ]
+ 2022-07-20 21:28:32,543 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.9.txt
+ 2022-07-20 21:28:32,576 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.0.txt
+ 2022-07-20 21:28:32,660 INFO [utils.py:410] [test-other-lm_scale_1.0] %WER 6.04% [3159 / 52343, 273 ins, 530 del, 2356 sub ]
+ 2022-07-20 21:28:32,903 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.0.txt
+ 2022-07-20 21:28:32,936 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.1.txt
+ 2022-07-20 21:28:33,019 INFO [utils.py:410] [test-other-lm_scale_1.1] %WER 6.20% [3244 / 52343, 268 ins, 598 del, 2378 sub ]
+ 2022-07-20 21:28:33,282 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.1.txt
+ 2022-07-20 21:28:33,312 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.2.txt
+ 2022-07-20 21:28:33,397 INFO [utils.py:410] [test-other-lm_scale_1.2] %WER 6.31% [3302 / 52343, 257 ins, 667 del, 2378 sub ]
+ 2022-07-20 21:28:33,635 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.2.txt
+ 2022-07-20 21:28:33,667 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.3.txt
+ 2022-07-20 21:28:33,750 INFO [utils.py:410] [test-other-lm_scale_1.3] %WER 6.58% [3444 / 52343, 250 ins, 787 del, 2407 sub ]
+ 2022-07-20 21:28:33,989 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.3.txt
+ 2022-07-20 21:28:34,021 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.4.txt
+ 2022-07-20 21:28:34,105 INFO [utils.py:410] [test-other-lm_scale_1.4] %WER 6.88% [3603 / 52343, 246 ins, 921 del, 2436 sub ]
+ 2022-07-20 21:28:34,341 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.4.txt
+ 2022-07-20 21:28:34,373 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.5.txt
+ 2022-07-20 21:28:34,468 INFO [utils.py:410] [test-other-lm_scale_1.5] %WER 7.09% [3713 / 52343, 235 ins, 1031 del, 2447 sub ]
+ 2022-07-20 21:28:34,706 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.5.txt
+ 2022-07-20 21:28:34,738 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.6.txt
+ 2022-07-20 21:28:34,820 INFO [utils.py:410] [test-other-lm_scale_1.6] %WER 7.32% [3834 / 52343, 227 ins, 1131 del, 2476 sub ]
+ 2022-07-20 21:28:35,258 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.6.txt
+ 2022-07-20 21:28:35,287 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.7.txt
+ 2022-07-20 21:28:35,380 INFO [utils.py:410] [test-other-lm_scale_1.7] %WER 7.54% [3945 / 52343, 223 ins, 1227 del, 2495 sub ]
+ 2022-07-20 21:28:35,616 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.7.txt
+ 2022-07-20 21:28:35,649 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.8.txt
+ 2022-07-20 21:28:35,732 INFO [utils.py:410] [test-other-lm_scale_1.8] %WER 7.75% [4056 / 52343, 221 ins, 1320 del, 2515 sub ]
+ 2022-07-20 21:28:35,969 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.8.txt
+ 2022-07-20 21:28:35,998 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.9.txt
+ 2022-07-20 21:28:36,106 INFO [utils.py:410] [test-other-lm_scale_1.9] %WER 7.97% [4170 / 52343, 222 ins, 1401 del, 2547 sub ]
+ 2022-07-20 21:28:36,339 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.9.txt
+ 2022-07-20 21:28:36,367 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_2.0.txt
+ 2022-07-20 21:28:36,447 INFO [utils.py:410] [test-other-lm_scale_2.0] %WER 8.18% [4281 / 52343, 220 ins, 1479 del, 2582 sub ]
+ 2022-07-20 21:28:36,681 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_2.0.txt
+ 2022-07-20 21:28:36,681 INFO [decode.py:637]
+ For test-other, WER of different settings are:
+ lm_scale_0.5 5.77 best for test-other
+ lm_scale_0.6 5.78
+ lm_scale_0.7 5.82
+ lm_scale_0.8 5.82
+ lm_scale_0.4 5.84
+ lm_scale_0.9 5.89
+ lm_scale_0.3 5.93
+ lm_scale_0.2 6.01
+ lm_scale_1.0 6.04
+ lm_scale_0.1 6.16
+ lm_scale_1.1 6.2
+ lm_scale_1.2 6.31
+ lm_scale_1.3 6.58
+ lm_scale_1.4 6.88
+ lm_scale_1.5 7.09
+ lm_scale_1.6 7.32
+ lm_scale_1.7 7.54
+ lm_scale_1.8 7.75
+ lm_scale_1.9 7.97
+ lm_scale_2.0 8.18
+
+ 2022-07-20 21:28:36,682 INFO [decode.py:868] Done!
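
The per-scale summaries above are plain text, so the best setting can also be recovered mechanically from these logs. A small sketch with a hypothetical parser run over a few of the rows above:

```python
# Hypothetical helper (not part of icefall): pull per-scale WERs out of a
# summary block like the one above and report the best setting.
import re

summary = """
lm_scale_0.5 5.77 best for test-other
lm_scale_0.6 5.78
lm_scale_0.7 5.82
"""

pattern = re.compile(r"^(lm_scale_[\d.]+)\s+([\d.]+)", re.M)
wers = {name: float(wer) for name, wer in pattern.findall(summary)}
best = min(wers, key=wers.get)
print(best, wers[best])  # lm_scale_0.5 5.77
```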
log/log-nbest/log-decode-2022-07-20-21-02-14 ADDED
@@ -0,0 +1,48 @@
1
+ 2022-07-20 21:02:14,473 INFO [decode.py:653] Decoding started
2
+ 2022-07-20 21:02:14,473 INFO [decode.py:654] {'subsampling_factor': 4, 'feature_dim': 80, 'nhead': 8, 'dim_feedforward': 2048, 'encoder_dim': 512, 'num_encoder_layers': 12, 'search_beam': 20, 'output_beam': 8, 'min_active_states': 30, 'max_active_states': 10000, 'use_double_scores': True, 'env_info': {'k2-version': '1.15.1', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '350e449fe5d7b4231f77eb0add764782eed9f5d2', 'k2-git-date': 'Thu May 26 22:56:24 2022', 'lhotse-version': '1.3.0.dev+git.232f3eb.dirty', 'torch-version': '1.10.0+cu102', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'conformer_ctc2', 'icefall-git-sha1': 'd812f82-dirty', 'icefall-git-date': 'Wed Jul 20 19:47:36 2022', 'icefall-path': '/home/storage23/wangquandong/experiment/open_source/icefall', 'k2-path': '/home/storage23/wangquandong/tools/k2/k2/python/k2/__init__.py', 'lhotse-path': '/home/storage23/wangquandong/tools/lhotse/lhotse/__init__.py', 'hostname': 'tj1-asr-train-v100-30.kscn', 'IP address': '10.38.22.183'}, 'epoch': 30, 'iter': 0, 'avg': 8, 'method': 'nbest', 'use_averaged_model': True, 'num_decoder_layers': 6, 'num_paths': 100, 'nbest_scale': 0.5, 'exp_dir': PosixPath('pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp'), 'lang_dir': PosixPath('data/lang_bpe_500'), 'lm_dir': PosixPath('data/lm'), 'full_libri': True, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 20, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures'}
3
+ 2022-07-20 21:02:14,869 INFO [lexicon.py:176] Loading pre-compiled data/lang_bpe_500/Linv.pt
4
+ 2022-07-20 21:02:14,927 INFO [decode.py:664] device: cuda:0
5
+ 2022-07-20 21:02:22,144 INFO [decode.py:821] Calculating the averaged model over epoch range from 22 (excluded) to 30
6
+ 2022-07-20 21:02:24,985 INFO [decode.py:837] Number of model parameters: 103071035
7
+ 2022-07-20 21:02:24,986 INFO [asr_datamodule.py:444] About to get test-clean cuts
8
+ 2022-07-20 21:02:24,988 INFO [asr_datamodule.py:451] About to get test-other cuts
9
+ 2022-07-20 21:02:26,237 INFO [decode.py:588] batch 0/?, cuts processed until now is 1
10
+ 2022-07-20 21:03:00,467 INFO [decode.py:588] batch 100/?, cuts processed until now is 212
11
+ 2022-07-20 21:03:32,099 INFO [decode.py:588] batch 200/?, cuts processed until now is 459
12
+ 2022-07-20 21:04:01,741 INFO [decode.py:588] batch 300/?, cuts processed until now is 717
13
+ 2022-07-20 21:04:31,150 INFO [decode.py:588] batch 400/?, cuts processed until now is 949
14
+ 2022-07-20 21:05:03,456 INFO [decode.py:588] batch 500/?, cuts processed until now is 1163
15
+ 2022-07-20 21:05:33,288 INFO [decode.py:588] batch 600/?, cuts processed until now is 1411
16
+ 2022-07-20 21:06:01,818 INFO [decode.py:588] batch 700/?, cuts processed until now is 1652
17
+ 2022-07-20 21:06:33,143 INFO [decode.py:588] batch 800/?, cuts processed until now is 1867
18
+ 2022-07-20 21:06:59,955 INFO [decode.py:588] batch 900/?, cuts processed until now is 2122
19
+ 2022-07-20 21:07:30,299 INFO [decode.py:588] batch 1000/?, cuts processed until now is 2321
20
+ 2022-07-20 21:08:02,342 INFO [decode.py:588] batch 1100/?, cuts processed until now is 2468
21
+ 2022-07-20 21:08:32,616 INFO [decode.py:588] batch 1200/?, cuts processed until now is 2601
22
+ 2022-07-20 21:08:37,175 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-no_rescore-nbest-scale-0.5-100.txt
+ 2022-07-20 21:08:37,260 INFO [utils.py:416] [test-clean-no_rescore-nbest-scale-0.5-100] %WER 2.94% [1547 / 52576, 149 ins, 218 del, 1180 sub ]
+ 2022-07-20 21:08:37,488 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-no_rescore-nbest-scale-0.5-100.txt
+ 2022-07-20 21:08:37,489 INFO [decode.py:637]
+ For test-clean, WER of different settings are:
+ no_rescore-nbest-scale-0.5-100	2.94	best for test-clean
+ 
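A note on the %WER lines above: the bracketed counts decompose total errors into insertions, deletions and substitutions over the reference word count, e.g. 149 + 218 + 1180 = 1547 errors over 52576 words gives the 2.94% reported for test-clean. A minimal sketch of that arithmetic (the helper name is ours, not icefall's):

def wer_percent(ins: int, dels: int, subs: int, ref_words: int) -> float:
    # wer_percent(149, 218, 1180, 52576) -> 2.94, matching the
    # [test-clean-no_rescore-nbest-scale-0.5-100] line above.
    return round(100.0 * (ins + dels + subs) / ref_words, 2)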
+ 2022-07-20 21:08:38,299 INFO [decode.py:588] batch 0/?, cuts processed until now is 2
+ 2022-07-20 21:09:09,190 INFO [decode.py:588] batch 100/?, cuts processed until now is 249
+ 2022-07-20 21:09:38,759 INFO [decode.py:588] batch 200/?, cuts processed until now is 540
+ 2022-07-20 21:10:06,577 INFO [decode.py:588] batch 300/?, cuts processed until now is 834
+ 2022-07-20 21:10:34,377 INFO [decode.py:588] batch 400/?, cuts processed until now is 1104
+ 2022-07-20 21:11:05,680 INFO [decode.py:588] batch 500/?, cuts processed until now is 1350
+ 2022-07-20 21:11:36,410 INFO [decode.py:588] batch 600/?, cuts processed until now is 1640
+ 2022-07-20 21:12:03,632 INFO [decode.py:588] batch 700/?, cuts processed until now is 1933
+ 2022-07-20 21:12:32,513 INFO [decode.py:588] batch 800/?, cuts processed until now is 2175
+ 2022-07-20 21:13:02,611 INFO [decode.py:588] batch 900/?, cuts processed until now is 2440
+ 2022-07-20 21:13:33,536 INFO [decode.py:588] batch 1000/?, cuts processed until now is 2656
+ 2022-07-20 21:14:04,886 INFO [decode.py:588] batch 1100/?, cuts processed until now is 2839
+ 2022-07-20 21:14:34,022 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-no_rescore-nbest-scale-0.5-100.txt
+ 2022-07-20 21:14:34,110 INFO [utils.py:416] [test-other-no_rescore-nbest-scale-0.5-100] %WER 6.39% [3346 / 52343, 259 ins, 590 del, 2497 sub ]
+ 2022-07-20 21:14:34,427 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-no_rescore-nbest-scale-0.5-100.txt
+ 2022-07-20 21:14:34,428 INFO [decode.py:637]
+ For test-other, WER of different settings are:
+ no_rescore-nbest-scale-0.5-100	6.39	best for test-other
+ 
+ 2022-07-20 21:14:34,428 INFO [decode.py:868] Done!
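This log used 'method': 'nbest' (see the parameter dump at the top): instead of rescoring the lattice, decode.py samples 'num_paths': 100 paths from it, with lattice scores scaled by 'nbest_scale': 0.5 beforehand so that sampling is not dominated by one high-scoring path, and keeps the best sampled path; hence the no_rescore-nbest-scale-0.5-100 suffix on the result files. A rough sketch of the idea (scale_scores, sample_paths and total_score are placeholders, not the actual k2/icefall API):

def nbest_decode(lattice, scale_scores, sample_paths, total_score,
                 num_paths=100, nbest_scale=0.5):
    # Flatten the lattice score distribution, sample candidate paths,
    # then keep the single best-scoring path.
    scaled = scale_scores(lattice, nbest_scale)
    paths = sample_paths(scaled, num_paths)
    return max(paths, key=total_score)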
log/log-whole-lattice-rescoring/log-decode-2022-07-20-21-02-14 ADDED
@@ -0,0 +1,207 @@
+ 2022-07-20 21:02:14,439 INFO [decode.py:653] Decoding started
+ 2022-07-20 21:02:14,440 INFO [decode.py:654] {'subsampling_factor': 4, 'feature_dim': 80, 'nhead': 8, 'dim_feedforward': 2048, 'encoder_dim': 512, 'num_encoder_layers': 12, 'search_beam': 20, 'output_beam': 8, 'min_active_states': 30, 'max_active_states': 10000, 'use_double_scores': True, 'env_info': {'k2-version': '1.15.1', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '350e449fe5d7b4231f77eb0add764782eed9f5d2', 'k2-git-date': 'Thu May 26 22:56:24 2022', 'lhotse-version': '1.3.0.dev+git.232f3eb.dirty', 'torch-version': '1.10.0+cu102', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'conformer_ctc2', 'icefall-git-sha1': 'd812f82-dirty', 'icefall-git-date': 'Wed Jul 20 19:47:36 2022', 'icefall-path': '/home/storage23/wangquandong/experiment/open_source/icefall', 'k2-path': '/home/storage23/wangquandong/tools/k2/k2/python/k2/__init__.py', 'lhotse-path': '/home/storage23/wangquandong/tools/lhotse/lhotse/__init__.py', 'hostname': 'tj1-asr-train-v100-30.kscn', 'IP address': '10.38.22.183'}, 'epoch': 30, 'iter': 0, 'avg': 8, 'method': 'whole-lattice-rescoring', 'use_averaged_model': True, 'num_decoder_layers': 6, 'num_paths': 100, 'nbest_scale': 0.5, 'exp_dir': PosixPath('pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp'), 'lang_dir': PosixPath('data/lang_bpe_500'), 'lm_dir': PosixPath('data/lm'), 'full_libri': True, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 20, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures'}
+ 2022-07-20 21:02:14,840 INFO [lexicon.py:176] Loading pre-compiled data/lang_bpe_500/Linv.pt
+ 2022-07-20 21:02:14,895 INFO [decode.py:664] device: cuda:0
+ 2022-07-20 21:02:21,292 INFO [decode.py:730] Loading pre-compiled G_4_gram.pt
+ 2022-07-20 21:02:26,887 INFO [decode.py:821] Calculating the averaged model over epoch range from 22 (excluded) to 30
+ 2022-07-20 21:02:30,017 INFO [decode.py:837] Number of model parameters: 103071035
+ 2022-07-20 21:02:30,018 INFO [asr_datamodule.py:444] About to get test-clean cuts
+ 2022-07-20 21:02:30,022 INFO [asr_datamodule.py:451] About to get test-other cuts
+ 2022-07-20 21:02:31,504 INFO [decode.py:588] batch 0/?, cuts processed until now is 1
+ 2022-07-20 21:03:17,856 INFO [decode.py:588] batch 100/?, cuts processed until now is 212
+ 2022-07-20 21:03:58,002 INFO [decode.py:588] batch 200/?, cuts processed until now is 459
+ 2022-07-20 21:04:39,673 INFO [decode.py:588] batch 300/?, cuts processed until now is 717
+ 2022-07-20 21:05:25,026 INFO [decode.py:588] batch 400/?, cuts processed until now is 949
+ 2022-07-20 21:06:10,220 INFO [decode.py:588] batch 500/?, cuts processed until now is 1163
+ 2022-07-20 21:06:51,165 INFO [decode.py:588] batch 600/?, cuts processed until now is 1411
+ 2022-07-20 21:07:30,465 INFO [decode.py:588] batch 700/?, cuts processed until now is 1652
+ 2022-07-20 21:08:14,019 INFO [decode.py:588] batch 800/?, cuts processed until now is 1867
+ 2022-07-20 21:08:52,156 INFO [decode.py:588] batch 900/?, cuts processed until now is 2122
+ 2022-07-20 21:09:33,815 INFO [decode.py:588] batch 1000/?, cuts processed until now is 2321
+ 2022-07-20 21:10:18,006 INFO [decode.py:588] batch 1100/?, cuts processed until now is 2468
+ 2022-07-20 21:11:00,039 INFO [decode.py:588] batch 1200/?, cuts processed until now is 2601
+ 2022-07-20 21:11:06,549 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.1.txt
+ 2022-07-20 21:11:06,631 INFO [utils.py:416] [test-clean-lm_scale_0.1] %WER 2.82% [1482 / 52576, 214 ins, 100 del, 1168 sub ]
+ 2022-07-20 21:11:06,864 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.1.txt
+ 2022-07-20 21:11:06,893 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.2.txt
+ 2022-07-20 21:11:06,973 INFO [utils.py:416] [test-clean-lm_scale_0.2] %WER 2.74% [1440 / 52576, 206 ins, 102 del, 1132 sub ]
+ 2022-07-20 21:11:07,204 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.2.txt
+ 2022-07-20 21:11:07,234 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.3.txt
+ 2022-07-20 21:11:07,313 INFO [utils.py:416] [test-clean-lm_scale_0.3] %WER 2.70% [1421 / 52576, 200 ins, 105 del, 1116 sub ]
+ 2022-07-20 21:11:07,543 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.3.txt
+ 2022-07-20 21:11:07,572 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.4.txt
+ 2022-07-20 21:11:07,649 INFO [utils.py:416] [test-clean-lm_scale_0.4] %WER 2.67% [1403 / 52576, 195 ins, 109 del, 1099 sub ]
+ 2022-07-20 21:11:07,879 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.4.txt
+ 2022-07-20 21:11:07,907 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.5.txt
+ 2022-07-20 21:11:07,984 INFO [utils.py:416] [test-clean-lm_scale_0.5] %WER 2.66% [1400 / 52576, 188 ins, 112 del, 1100 sub ]
+ 2022-07-20 21:11:08,213 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.5.txt
+ 2022-07-20 21:11:08,241 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.6.txt
+ 2022-07-20 21:11:08,318 INFO [utils.py:416] [test-clean-lm_scale_0.6] %WER 2.68% [1407 / 52576, 181 ins, 119 del, 1107 sub ]
+ 2022-07-20 21:11:08,549 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.6.txt
+ 2022-07-20 21:11:08,577 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.7.txt
+ 2022-07-20 21:11:08,794 INFO [utils.py:416] [test-clean-lm_scale_0.7] %WER 2.68% [1409 / 52576, 173 ins, 130 del, 1106 sub ]
+ 2022-07-20 21:11:09,022 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.7.txt
+ 2022-07-20 21:11:09,050 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.8.txt
+ 2022-07-20 21:11:09,127 INFO [utils.py:416] [test-clean-lm_scale_0.8] %WER 2.70% [1422 / 52576, 169 ins, 142 del, 1111 sub ]
+ 2022-07-20 21:11:09,369 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.8.txt
+ 2022-07-20 21:11:09,400 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_0.9.txt
+ 2022-07-20 21:11:09,476 INFO [utils.py:416] [test-clean-lm_scale_0.9] %WER 2.74% [1443 / 52576, 156 ins, 167 del, 1120 sub ]
+ 2022-07-20 21:11:09,705 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_0.9.txt
+ 2022-07-20 21:11:09,735 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.0.txt
+ 2022-07-20 21:11:09,811 INFO [utils.py:416] [test-clean-lm_scale_1.0] %WER 2.84% [1491 / 52576, 156 ins, 203 del, 1132 sub ]
+ 2022-07-20 21:11:10,041 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.0.txt
+ 2022-07-20 21:11:10,070 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.1.txt
+ 2022-07-20 21:11:10,147 INFO [utils.py:416] [test-clean-lm_scale_1.1] %WER 2.94% [1545 / 52576, 151 ins, 246 del, 1148 sub ]
+ 2022-07-20 21:11:10,377 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.1.txt
+ 2022-07-20 21:11:10,407 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.2.txt
+ 2022-07-20 21:11:10,485 INFO [utils.py:416] [test-clean-lm_scale_1.2] %WER 3.09% [1623 / 52576, 146 ins, 300 del, 1177 sub ]
+ 2022-07-20 21:11:10,714 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.2.txt
+ 2022-07-20 21:11:10,743 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.3.txt
+ 2022-07-20 21:11:10,820 INFO [utils.py:416] [test-clean-lm_scale_1.3] %WER 3.30% [1734 / 52576, 149 ins, 376 del, 1209 sub ]
+ 2022-07-20 21:11:11,203 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.3.txt
+ 2022-07-20 21:11:11,232 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.4.txt
+ 2022-07-20 21:11:11,310 INFO [utils.py:416] [test-clean-lm_scale_1.4] %WER 3.51% [1846 / 52576, 148 ins, 462 del, 1236 sub ]
+ 2022-07-20 21:11:11,539 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.4.txt
+ 2022-07-20 21:11:11,567 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.5.txt
+ 2022-07-20 21:11:11,644 INFO [utils.py:416] [test-clean-lm_scale_1.5] %WER 3.79% [1992 / 52576, 144 ins, 567 del, 1281 sub ]
+ 2022-07-20 21:11:11,873 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.5.txt
+ 2022-07-20 21:11:11,901 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.6.txt
+ 2022-07-20 21:11:11,978 INFO [utils.py:416] [test-clean-lm_scale_1.6] %WER 4.07% [2140 / 52576, 131 ins, 698 del, 1311 sub ]
+ 2022-07-20 21:11:12,209 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.6.txt
+ 2022-07-20 21:11:12,237 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.7.txt
+ 2022-07-20 21:11:12,315 INFO [utils.py:416] [test-clean-lm_scale_1.7] %WER 4.38% [2303 / 52576, 131 ins, 826 del, 1346 sub ]
+ 2022-07-20 21:11:12,544 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.7.txt
+ 2022-07-20 21:11:12,572 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.8.txt
+ 2022-07-20 21:11:12,649 INFO [utils.py:416] [test-clean-lm_scale_1.8] %WER 4.69% [2465 / 52576, 131 ins, 955 del, 1379 sub ]
+ 2022-07-20 21:11:12,881 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.8.txt
+ 2022-07-20 21:11:12,910 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_1.9.txt
+ 2022-07-20 21:11:12,989 INFO [utils.py:416] [test-clean-lm_scale_1.9] %WER 4.99% [2625 / 52576, 133 ins, 1068 del, 1424 sub ]
+ 2022-07-20 21:11:13,366 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_1.9.txt
+ 2022-07-20 21:11:13,405 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-clean-lm_scale_2.0.txt
+ 2022-07-20 21:11:13,494 INFO [utils.py:416] [test-clean-lm_scale_2.0] %WER 5.31% [2794 / 52576, 130 ins, 1197 del, 1467 sub ]
+ 2022-07-20 21:11:13,727 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-clean-lm_scale_2.0.txt
+ 2022-07-20 21:11:13,728 INFO [decode.py:637]
+ For test-clean, WER of different settings are:
+ lm_scale_0.5	2.66	best for test-clean
+ lm_scale_0.4	2.67
+ lm_scale_0.6	2.68
+ lm_scale_0.7	2.68
+ lm_scale_0.3	2.7
+ lm_scale_0.8	2.7
+ lm_scale_0.2	2.74
+ lm_scale_0.9	2.74
+ lm_scale_0.1	2.82
+ lm_scale_1.0	2.84
+ lm_scale_1.1	2.94
+ lm_scale_1.2	3.09
+ lm_scale_1.3	3.3
+ lm_scale_1.4	3.51
+ lm_scale_1.5	3.79
+ lm_scale_1.6	4.07
+ lm_scale_1.7	4.38
+ lm_scale_1.8	4.69
+ lm_scale_1.9	4.99
+ lm_scale_2.0	5.31
+ 
+ 2022-07-20 21:11:14,658 INFO [decode.py:588] batch 0/?, cuts processed until now is 2
+ 2022-07-20 21:11:54,814 INFO [decode.py:588] batch 100/?, cuts processed until now is 249
+ 2022-07-20 21:12:32,635 INFO [decode.py:588] batch 200/?, cuts processed until now is 540
+ 2022-07-20 21:13:09,226 INFO [decode.py:588] batch 300/?, cuts processed until now is 834
+ 2022-07-20 21:13:44,899 INFO [decode.py:588] batch 400/?, cuts processed until now is 1104
+ 2022-07-20 21:14:26,260 INFO [decode.py:588] batch 500/?, cuts processed until now is 1350
+ 2022-07-20 21:15:05,158 INFO [decode.py:588] batch 600/?, cuts processed until now is 1640
+ 2022-07-20 21:15:40,242 INFO [decode.py:588] batch 700/?, cuts processed until now is 1933
+ 2022-07-20 21:16:18,132 INFO [decode.py:588] batch 800/?, cuts processed until now is 2175
+ 2022-07-20 21:16:56,866 INFO [decode.py:588] batch 900/?, cuts processed until now is 2440
+ 2022-07-20 21:17:04,052 INFO [decode.py:783] Caught exception:
+ CUDA out of memory. Tried to allocate 740.00 MiB (GPU 0; 31.75 GiB total capacity; 28.28 GiB already allocated; 445.50 MiB free; 29.93 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-07-20 21:17:04,052 INFO [decode.py:789] num_arcs before pruning: 349313
+ 2022-07-20 21:17:04,052 INFO [decode.py:792] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-07-20 21:17:04,072 INFO [decode.py:803] num_arcs after pruning: 8796
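The OOM above is caught, not fatal: decode.py logs it, prunes the lattice from 349313 arcs down to 8796, and retries the rescoring, which is why decoding simply continues at batch 1000 below. A rough sketch of this catch-and-prune fallback (rescore_fn and prune_fn stand in for the real k2 calls):

import torch

def rescore_with_oom_fallback(lattice, rescore_fn, prune_fn):
    # Try whole-lattice rescoring; on CUDA OOM (a RuntimeError in torch 1.10),
    # free cached memory, prune the lattice to far fewer arcs, and retry.
    try:
        return rescore_fn(lattice)
    except RuntimeError as e:
        if "out of memory" not in str(e):
            raise
        torch.cuda.empty_cache()
        return rescore_fn(prune_fn(lattice))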
+ 2022-07-20 21:17:41,206 INFO [decode.py:588] batch 1000/?, cuts processed until now is 2656
+ 2022-07-20 21:18:21,669 INFO [decode.py:588] batch 1100/?, cuts processed until now is 2839
+ 2022-07-20 21:18:59,437 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.1.txt
+ 2022-07-20 21:18:59,529 INFO [utils.py:416] [test-other-lm_scale_0.1] %WER 6.24% [3266 / 52343, 440 ins, 240 del, 2586 sub ]
+ 2022-07-20 21:18:59,774 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.1.txt
+ 2022-07-20 21:18:59,807 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.2.txt
+ 2022-07-20 21:18:59,890 INFO [utils.py:416] [test-other-lm_scale_0.2] %WER 6.04% [3161 / 52343, 414 ins, 245 del, 2502 sub ]
+ 2022-07-20 21:19:00,128 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.2.txt
+ 2022-07-20 21:19:00,161 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.3.txt
+ 2022-07-20 21:19:00,243 INFO [utils.py:416] [test-other-lm_scale_0.3] %WER 5.94% [3107 / 52343, 396 ins, 249 del, 2462 sub ]
+ 2022-07-20 21:19:00,480 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.3.txt
+ 2022-07-20 21:19:00,512 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.4.txt
+ 2022-07-20 21:19:00,594 INFO [utils.py:416] [test-other-lm_scale_0.4] %WER 5.85% [3060 / 52343, 379 ins, 268 del, 2413 sub ]
+ 2022-07-20 21:19:01,006 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.4.txt
+ 2022-07-20 21:19:01,035 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.5.txt
+ 2022-07-20 21:19:01,118 INFO [utils.py:416] [test-other-lm_scale_0.5] %WER 5.77% [3022 / 52343, 351 ins, 286 del, 2385 sub ]
+ 2022-07-20 21:19:01,356 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.5.txt
+ 2022-07-20 21:19:01,385 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.6.txt
+ 2022-07-20 21:19:01,467 INFO [utils.py:416] [test-other-lm_scale_0.6] %WER 5.76% [3017 / 52343, 332 ins, 320 del, 2365 sub ]
+ 2022-07-20 21:19:01,708 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.6.txt
+ 2022-07-20 21:19:01,736 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.7.txt
+ 2022-07-20 21:19:01,816 INFO [utils.py:416] [test-other-lm_scale_0.7] %WER 5.77% [3020 / 52343, 318 ins, 348 del, 2354 sub ]
+ 2022-07-20 21:19:02,052 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.7.txt
+ 2022-07-20 21:19:02,081 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.8.txt
+ 2022-07-20 21:19:02,162 INFO [utils.py:416] [test-other-lm_scale_0.8] %WER 5.78% [3028 / 52343, 294 ins, 398 del, 2336 sub ]
+ 2022-07-20 21:19:02,398 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.8.txt
+ 2022-07-20 21:19:02,427 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_0.9.txt
+ 2022-07-20 21:19:02,507 INFO [utils.py:416] [test-other-lm_scale_0.9] %WER 5.86% [3066 / 52343, 278 ins, 464 del, 2324 sub ]
+ 2022-07-20 21:19:02,743 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_0.9.txt
+ 2022-07-20 21:19:02,771 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.0.txt
+ 2022-07-20 21:19:02,852 INFO [utils.py:416] [test-other-lm_scale_1.0] %WER 6.02% [3149 / 52343, 268 ins, 540 del, 2341 sub ]
+ 2022-07-20 21:19:03,087 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.0.txt
+ 2022-07-20 21:19:03,115 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.1.txt
+ 2022-07-20 21:19:03,196 INFO [utils.py:416] [test-other-lm_scale_1.1] %WER 6.20% [3247 / 52343, 258 ins, 636 del, 2353 sub ]
+ 2022-07-20 21:19:03,431 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.1.txt
+ 2022-07-20 21:19:03,460 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.2.txt
+ 2022-07-20 21:19:03,686 INFO [utils.py:416] [test-other-lm_scale_1.2] %WER 6.40% [3351 / 52343, 248 ins, 749 del, 2354 sub ]
+ 2022-07-20 21:19:03,923 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.2.txt
+ 2022-07-20 21:19:03,951 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.3.txt
+ 2022-07-20 21:19:04,036 INFO [utils.py:416] [test-other-lm_scale_1.3] %WER 6.67% [3489 / 52343, 241 ins, 891 del, 2357 sub ]
+ 2022-07-20 21:19:04,280 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.3.txt
+ 2022-07-20 21:19:04,309 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.4.txt
+ 2022-07-20 21:19:04,392 INFO [utils.py:416] [test-other-lm_scale_1.4] %WER 7.08% [3705 / 52343, 234 ins, 1071 del, 2400 sub ]
+ 2022-07-20 21:19:04,628 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.4.txt
+ 2022-07-20 21:19:04,656 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.5.txt
+ 2022-07-20 21:19:04,737 INFO [utils.py:416] [test-other-lm_scale_1.5] %WER 7.42% [3886 / 52343, 224 ins, 1253 del, 2409 sub ]
+ 2022-07-20 21:19:04,973 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.5.txt
+ 2022-07-20 21:19:05,001 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.6.txt
+ 2022-07-20 21:19:05,081 INFO [utils.py:416] [test-other-lm_scale_1.6] %WER 7.84% [4102 / 52343, 214 ins, 1437 del, 2451 sub ]
+ 2022-07-20 21:19:05,317 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.6.txt
+ 2022-07-20 21:19:05,346 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.7.txt
+ 2022-07-20 21:19:05,426 INFO [utils.py:416] [test-other-lm_scale_1.7] %WER 8.29% [4338 / 52343, 213 ins, 1641 del, 2484 sub ]
+ 2022-07-20 21:19:05,667 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.7.txt
+ 2022-07-20 21:19:05,696 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.8.txt
+ 2022-07-20 21:19:05,777 INFO [utils.py:416] [test-other-lm_scale_1.8] %WER 8.78% [4596 / 52343, 209 ins, 1859 del, 2528 sub ]
+ 2022-07-20 21:19:06,031 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.8.txt
+ 2022-07-20 21:19:06,082 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_1.9.txt
+ 2022-07-20 21:19:06,310 INFO [utils.py:416] [test-other-lm_scale_1.9] %WER 9.27% [4854 / 52343, 205 ins, 2085 del, 2564 sub ]
+ 2022-07-20 21:19:06,549 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_1.9.txt
+ 2022-07-20 21:19:06,578 INFO [decode.py:609] The transcripts are stored in pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/recogs-test-other-lm_scale_2.0.txt
+ 2022-07-20 21:19:06,659 INFO [utils.py:416] [test-other-lm_scale_2.0] %WER 9.72% [5090 / 52343, 204 ins, 2272 del, 2614 sub ]
+ 2022-07-20 21:19:06,896 INFO [decode.py:621] Wrote detailed error stats to pruned_transducer_stateless_smooth/quandongtest_8gpu_ctc_attnetion2-exp/errs-test-other-lm_scale_2.0.txt
+ 2022-07-20 21:19:06,897 INFO [decode.py:637]
+ For test-other, WER of different settings are:
+ lm_scale_0.6	5.76	best for test-other
+ lm_scale_0.5	5.77
+ lm_scale_0.7	5.77
+ lm_scale_0.8	5.78
+ lm_scale_0.4	5.85
+ lm_scale_0.9	5.86
+ lm_scale_0.3	5.94
+ lm_scale_1.0	6.02
+ lm_scale_0.2	6.04
+ lm_scale_1.1	6.2
+ lm_scale_0.1	6.24
+ lm_scale_1.2	6.4
+ lm_scale_1.3	6.67
+ lm_scale_1.4	7.08
+ lm_scale_1.5	7.42
+ lm_scale_1.6	7.84
+ lm_scale_1.7	8.29
+ lm_scale_1.8	8.78
+ lm_scale_1.9	9.27
+ lm_scale_2.0	9.72
+ 
+ 2022-07-20 21:19:06,897 INFO [decode.py:868] Done!
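The sweep above is the standard whole-lattice rescoring recipe: the lattice is composed with the pre-compiled G_4_gram LM and the best path is re-extracted once per LM weight. A small scale under-uses the LM while a large one favors short outputs; note the deletions climbing from 100 at lm_scale 0.1 to 1197 at 2.0 on test-clean, with the WER minimum at 0.5 for test-clean and 0.6 for test-other. A schematic of the score combination (not icefall's actual code):

def best_hyp_per_scale(paths, scales=tuple(x / 10 for x in range(1, 21))):
    # paths: iterable of (hyp, am_score, lm_score) triples; each scale keeps
    # the path maximizing am_score + lm_scale * lm_score, yielding one WER
    # row per scale in the summaries above.
    paths = list(paths)
    return {
        f"lm_scale_{s:.1f}": max(paths, key=lambda p: p[1] + s * p[2])[0]
        for s in scales
    }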