AmirHussein committed
Commit 647a6e0
1 Parent(s): d6e0834
- .gitattributes +1 -0
- decoding-results/log-attention-decoder/log-decode-2022-06-24-16-33-12 +52 -0
- decoding-results/log-attention-decoder/log-decode-2022-06-24-16-40-43 +0 -0
- decoding-results/log-attention-decoder/log-decode-2022-06-24-17-04-13 +1317 -0
- decoding-results/log-attention-decoder/log-decode-2022-06-24-17-22-16 +1428 -0
- decoding-results/log-attention-decoder/log-decode-2022-06-27-18-54-02 +6 -0
- decoding-results/log-attention-decoder/log-decode-2022-06-27-19-04-48 +1308 -0
- decoding-results/log-whole-lattice-rescoring/log-decode-2022-06-26-22-37-17 +202 -0
- decoding-results/log-whole-lattice-rescoring/log-decode-2022-06-26-23-11-51 +202 -0
- decoding-results/log-whole-lattice-rescoring/log-decode-2022-06-26-23-21-46 +0 -0
- decoding-results/log-whole-lattice-rescoring/log-decode-2022-06-27-18-46-45 +299 -0
.gitattributes
CHANGED
@@ -38,3 +38,4 @@ data/lang_bpe_5000/*.pt filter=lfs diff=lfs merge=lfs -text
 data/lang_bpe_5000/*.vocab filter=lfs diff=lfs merge=lfs -text
 data/lang_bpe_5000/*.arpa filter=lfs diff=lfs merge=lfs -text
 data/lang_bpe_5000/*.txt filter=lfs diff=lfs merge=lfs -text
+decoding-results/* filter=lfs diff=lfs merge=lfs -text
decoding-results/log-attention-decoder/log-decode-2022-06-24-16-33-12
ADDED
@@ -0,0 +1,52 @@
+2022-06-24 16:33:12,810 INFO [decode.py:548] Decoding started
+2022-06-24 16:33:12,810 INFO [decode.py:549] {'subsampling_factor': 4, 'vgg_frontend': False, 'use_feat_batchnorm': True, 'feature_dim': 80, 'nhead': 8, 'attention_dim': 512, 'num_decoder_layers': 6, 'search_beam': 20, 'output_beam': 8, 'min_active_states': 30, 'max_active_states': 10000, 'use_double_scores': True, 'env_info': {'k2-version': '1.11', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '', 'k2-git-date': '', 'lhotse-version': '1.3.0.dev+git.a07121a.clean', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'test', 'icefall-git-sha1': '7e72d78-dirty', 'icefall-git-date': 'Sat May 28 19:13:53 2022', 'icefall-path': '/alt-arabic/speech/amir/k2/tmp/icefall', 'k2-path': '/home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/k2/__init__.py', 'lhotse-path': '/alt-arabic/speech/amir/k2/tmp/lhotse/lhotse/__init__.py', 'hostname': 'crimv3mgpu003', 'IP address': '10.141.0.1'}, 'epoch': 39, 'avg': 10, 'method': 'attention-decoder', 'num_paths': 1000, 'nbest_scale': 0.5, 'exp_dir': PosixPath('conformer_ctc/exp_5000_att0.8'), 'lang_dir': PosixPath('data/lang_bpe_5000'), 'lm_dir': PosixPath('data/lm'), 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 20, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': False, 'drop_last': True, 'return_cuts': True, 'num_workers': 8, 'enable_spec_aug': False, 'spec_aug_time_warp_factor': 80, 'enable_musan': False}
+2022-06-24 16:33:13,088 INFO [lexicon.py:177] Loading pre-compiled data/lang_bpe_5000/Linv.pt
+2022-06-24 16:33:13,121 INFO [decode.py:559] device: cuda:0
+2022-06-24 16:33:17,106 INFO [decode.py:621] Loading pre-compiled G_4_gram.pt
+2022-06-24 16:33:17,873 INFO [decode.py:657] averaging ['conformer_ctc/exp_5000_att0.8/epoch-30.pt', 'conformer_ctc/exp_5000_att0.8/epoch-31.pt', 'conformer_ctc/exp_5000_att0.8/epoch-32.pt', 'conformer_ctc/exp_5000_att0.8/epoch-33.pt', 'conformer_ctc/exp_5000_att0.8/epoch-34.pt', 'conformer_ctc/exp_5000_att0.8/epoch-35.pt', 'conformer_ctc/exp_5000_att0.8/epoch-36.pt', 'conformer_ctc/exp_5000_att0.8/epoch-37.pt', 'conformer_ctc/exp_5000_att0.8/epoch-38.pt', 'conformer_ctc/exp_5000_att0.8/epoch-39.pt']
+2022-06-24 16:37:06,479 INFO [decode.py:664] Number of model parameters: 90786736
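Note: the `averaging [...]` line above reflects decoding with `--epoch 39 --avg 10`, i.e. the parameters of `epoch-30.pt` through `epoch-39.pt` are averaged element-wise before decoding. A minimal sketch of that step, assuming icefall's checkpoint layout (parameters stored under a `"model"` key); this is a simplified stand-in for icefall's `average_checkpoints` helper, not the literal code:

```python
# Sketch: element-wise averaging of the ten checkpoints listed above.
# Assumes each .pt file stores its parameters under a "model" key, as
# icefall checkpoints do; simplified from icefall's average_checkpoints.
import torch

def average_checkpoints(filenames):
    avg = torch.load(filenames[0], map_location="cpu")["model"]
    for f in filenames[1:]:
        state = torch.load(f, map_location="cpu")["model"]
        for k in avg:
            avg[k] += state[k]
    for k in avg:
        if avg[k].is_floating_point():
            avg[k] /= len(filenames)
        else:
            # Integer buffers cannot use true division; floor-divide instead.
            avg[k] //= len(filenames)
    return avg

filenames = [
    f"conformer_ctc/exp_5000_att0.8/epoch-{i}.pt" for i in range(30, 40)
]
# model.load_state_dict(average_checkpoints(filenames))
```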
+2022-06-24 16:37:06,480 INFO [asr_datamodule.py:374] About to get test cuts
+2022-06-24 16:37:06,523 INFO [asr_datamodule.py:367] About to get dev cuts
+2022-06-24 16:37:11,701 INFO [decode.py:483] batch 0/?, cuts processed until now is 2
+2022-06-24 16:37:57,309 INFO [decode.py:733] Caught exception:
+CUDA out of memory. Tried to allocate 452.00 MiB (GPU 0; 15.90 GiB total capacity; 13.98 GiB already allocated; 181.75 MiB free; 14.89 GiB reserved in total by PyTorch)
+Exception raised from malloc at /opt/conda/conda-bld/pytorch_1616554788289/work/c10/cuda/CUDACachingAllocator.cpp:288 (most recent call first):
+frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x42 (0x2aab1a77c2f2 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10.so)
+frame #1: <unknown function> + 0x1bc21 (0x2aab1a518c21 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #2: <unknown function> + 0x1c944 (0x2aab1a519944 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #3: <unknown function> + 0x1cf63 (0x2aab1a519f63 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #4: k2::PytorchCudaContext::Allocate(unsigned long, void**) + 0x5e (0x2aab381f0b0e in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/libk2context.so)
+frame #5: k2::NewRegion(std::shared_ptr<k2::Context>, unsigned long) + 0x11e (0x2aab37f2099e in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/libk2context.so)
+frame #6: k2::Array1<int>::Init(std::shared_ptr<k2::Context>, int, k2::Dtype) + 0xa9 (0x2aab37efba69 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/libk2context.so)
+frame #7: <unknown function> + 0x1dfd1d (0x2aab38018d1d in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/libk2context.so)
+frame #8: k2::Ragged<k2::Arc> k2::DeviceIntersector::FormatOutputTpl<k2::Hash::PackedAccessor>(k2::Array1<int>*, k2::Array1<int>*) + 0x786 (0x2aab38026836 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/libk2context.so)
+frame #9: k2::IntersectDevice(k2::Ragged<k2::Arc>&, int, k2::Ragged<k2::Arc>&, int, k2::Array1<int> const&, k2::Array1<int>*, k2::Array1<int>*, bool) + 0x328 (0x2aab38019108 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/libk2context.so)
+frame #10: <unknown function> + 0x70d49 (0x2aab33cabd49 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/_k2.cpython-38-x86_64-linux-gnu.so)
+frame #11: <unknown function> + 0x240e3 (0x2aab33c5f0e3 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/_k2.cpython-38-x86_64-linux-gnu.so)
+<omitting python frames>
+frame #41: __libc_start_main + 0xf5 (0x2aaaab616555 in /lib64/libc.so.6)
+
+
+2022-06-24 16:37:57,310 INFO [decode.py:734] num_arcs before pruning: 1010153
+2022-06-24 16:37:57,310 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+2022-06-24 16:37:57,329 INFO [decode.py:747] num_arcs after pruning: 2602
+2022-06-24 16:38:39,372 INFO [decode.py:733] Caught exception:
+CUDA out of memory. Tried to allocate 1.85 GiB (GPU 0; 15.90 GiB total capacity; 12.31 GiB already allocated; 419.75 MiB free; 14.66 GiB reserved in total by PyTorch)
+Exception raised from malloc at /opt/conda/conda-bld/pytorch_1616554788289/work/c10/cuda/CUDACachingAllocator.cpp:288 (most recent call first):
+frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x42 (0x2aab1a77c2f2 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10.so)
+frame #1: <unknown function> + 0x1bc21 (0x2aab1a518c21 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #2: <unknown function> + 0x1c944 (0x2aab1a519944 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #3: <unknown function> + 0x1cf63 (0x2aab1a519f63 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #4: k2::PytorchCudaContext::Allocate(unsigned long, void**) + 0x5e (0x2aab381f0b0e in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/libk2context.so)
+frame #5: k2::NewRegion(std::shared_ptr<k2::Context>, unsigned long) + 0x11e (0x2aab37f2099e in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/libk2context.so)
+frame #6: k2::Ragged<k2::Arc> k2::DeviceIntersector::FormatOutputTpl<k2::Hash::PackedAccessor>(k2::Array1<int>*, k2::Array1<int>*) + 0x6ab (0x2aab3802675b in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/libk2context.so)
+frame #7: k2::IntersectDevice(k2::Ragged<k2::Arc>&, int, k2::Ragged<k2::Arc>&, int, k2::Array1<int> const&, k2::Array1<int>*, k2::Array1<int>*, bool) + 0x328 (0x2aab38019108 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/libk2context.so)
+frame #8: <unknown function> + 0x70d49 (0x2aab33cabd49 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/_k2.cpython-38-x86_64-linux-gnu.so)
+frame #9: <unknown function> + 0x240e3 (0x2aab33c5f0e3 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/_k2.cpython-38-x86_64-linux-gnu.so)
+<omitting python frames>
+frame #39: __libc_start_main + 0xf5 (0x2aaaab616555 in /lib64/libc.so.6)
+
+
+2022-06-24 16:38:39,372 INFO [decode.py:734] num_arcs before pruning: 735461
+2022-06-24 16:38:39,372 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+2022-06-24 16:38:39,387 INFO [decode.py:747] num_arcs after pruning: 3691
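The two `Caught exception` blocks in this log are the expected OOM-and-retry behavior during lattice rescoring: the intersection of the lattice with the 4-gram G runs out of GPU memory, the lattice is pruned, and the intersection is retried, which is why `num_arcs` drops from 1010153 to 2602 and from 735461 to 3691. A rough sketch of that pattern, assuming a simple fixed threshold schedule (the real `decode.py` control flow may differ); `k2.intersect_device` and `k2.prune_on_arc_post` are existing k2 APIs:

```python
# Sketch of the catch-OOM-then-prune retry pattern behind the log messages
# above. The threshold schedule and structure are illustrative assumptions.
import logging

import k2

def intersect_with_pruning_retries(G, inv_lattice, b_to_a_map):
    """Intersect `inv_lattice` with `G`, pruning and retrying on CUDA OOM."""
    prune_thresholds = [1e-10, 1e-9, 1e-8, 1e-7, 1e-6]  # assumed schedule
    for th in prune_thresholds:
        try:
            return k2.intersect_device(
                G, inv_lattice, b_to_a_map=b_to_a_map, sorted_match_a=True
            )
        except RuntimeError as e:  # CUDA OOM surfaces as a RuntimeError
            logging.info(f"Caught exception:\n{e}\n")
            logging.info(f"num_arcs before pruning: {inv_lattice.arcs.num_elements()}")
            # Drop low-posterior arcs so the next attempt fits in memory.
            inv_lattice = k2.prune_on_arc_post(inv_lattice, th, True)
            logging.info(f"num_arcs after pruning: {inv_lattice.arcs.num_elements()}")
    # Final attempt after the strictest pruning; let any OOM propagate.
    return k2.intersect_device(
        G, inv_lattice, b_to_a_map=b_to_a_map, sorted_match_a=True
    )
```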
decoding-results/log-attention-decoder/log-decode-2022-06-24-16-40-43
ADDED
The diff for this file is too large to render.
See raw diff
decoding-results/log-attention-decoder/log-decode-2022-06-24-17-04-13
ADDED
@@ -0,0 +1,1317 @@
+2022-06-24 17:04:13,754 INFO [decode.py:548] Decoding started
+2022-06-24 17:04:13,754 INFO [decode.py:549] {'subsampling_factor': 4, 'vgg_frontend': False, 'use_feat_batchnorm': True, 'feature_dim': 80, 'nhead': 8, 'attention_dim': 512, 'num_decoder_layers': 6, 'search_beam': 20, 'output_beam': 8, 'min_active_states': 30, 'max_active_states': 10000, 'use_double_scores': True, 'env_info': {'k2-version': '1.11', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '', 'k2-git-date': '', 'lhotse-version': '1.3.0.dev+git.a07121a.clean', 'torch-cuda-available': False, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'test', 'icefall-git-sha1': '7e72d78-dirty', 'icefall-git-date': 'Sat May 28 19:13:53 2022', 'icefall-path': '/alt-arabic/speech/amir/k2/tmp/icefall', 'k2-path': '/home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/k2/__init__.py', 'lhotse-path': '/alt-arabic/speech/amir/k2/tmp/lhotse/lhotse/__init__.py', 'hostname': 'crimv3mgpu003', 'IP address': '10.141.0.1'}, 'epoch': 39, 'avg': 10, 'method': 'attention-decoder', 'num_paths': 1000, 'nbest_scale': 0.5, 'exp_dir': PosixPath('conformer_ctc/exp_5000_att0.8'), 'lang_dir': PosixPath('data/lang_bpe_5000'), 'lm_dir': PosixPath('data/lm'), 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 20, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': False, 'drop_last': True, 'return_cuts': True, 'num_workers': 20, 'enable_spec_aug': False, 'spec_aug_time_warp_factor': 80, 'enable_musan': False}
+2022-06-24 17:04:14,031 INFO [lexicon.py:177] Loading pre-compiled data/lang_bpe_5000/Linv.pt
+2022-06-24 17:04:14,064 INFO [decode.py:559] device: cpu
+2022-06-24 17:04:19,929 INFO [decode.py:621] Loading pre-compiled G_4_gram.pt
+2022-06-24 17:04:23,746 INFO [decode.py:657] averaging ['conformer_ctc/exp_5000_att0.8/epoch-30.pt', 'conformer_ctc/exp_5000_att0.8/epoch-31.pt', 'conformer_ctc/exp_5000_att0.8/epoch-32.pt', 'conformer_ctc/exp_5000_att0.8/epoch-33.pt', 'conformer_ctc/exp_5000_att0.8/epoch-34.pt', 'conformer_ctc/exp_5000_att0.8/epoch-35.pt', 'conformer_ctc/exp_5000_att0.8/epoch-36.pt', 'conformer_ctc/exp_5000_att0.8/epoch-37.pt', 'conformer_ctc/exp_5000_att0.8/epoch-38.pt', 'conformer_ctc/exp_5000_att0.8/epoch-39.pt']
+2022-06-24 17:04:30,866 INFO [decode.py:664] Number of model parameters: 90786736
+2022-06-24 17:04:30,866 INFO [asr_datamodule.py:374] About to get test cuts
+2022-06-24 17:04:30,868 INFO [asr_datamodule.py:367] About to get dev cuts
+2022-06-24 17:04:41,864 INFO [decode.py:483] batch 0/?, cuts processed until now is 2
+2022-06-24 18:34:46,534 INFO [decode.py:483] batch 100/?, cuts processed until now is 250
+2022-06-24 18:47:04,383 INFO [decode.py:733] Caught exception:
+
+Some bad things happened. Please read the above error messages and stack
+trace. If you are using Python, the following command may be helpful:
+
+gdb --args python /path/to/your/code.py
+
+(You can use `gdb` to debug the code. Please consider compiling
+a debug version of k2.).
+
+If you are unable to fix it, please open an issue at:
+
+https://github.com/k2-fsa/k2/issues/new
+
+
+2022-06-24 18:47:04,383 INFO [decode.py:734] num_arcs before pruning: 2310868
+2022-06-24 18:47:04,384 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+2022-06-24 18:47:04,733 INFO [decode.py:747] num_arcs after pruning: 5012
+2022-06-24 20:10:31,760 INFO [decode.py:483] batch 200/?, cuts processed until now is 516
+2022-06-24 21:18:41,734 INFO [decode.py:733] Caught exception:
+
+Some bad things happened. Please read the above error messages and stack
+trace. If you are using Python, the following command may be helpful:
+
+gdb --args python /path/to/your/code.py
+
+(You can use `gdb` to debug the code. Please consider compiling
+a debug version of k2.).
+
+If you are unable to fix it, please open an issue at:
+
+https://github.com/k2-fsa/k2/issues/new
+
+
+2022-06-24 21:18:41,735 INFO [decode.py:734] num_arcs before pruning: 1083317
+2022-06-24 21:18:41,736 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+2022-06-24 21:18:41,887 INFO [decode.py:747] num_arcs after pruning: 12010
+2022-06-24 21:41:53,864 INFO [decode.py:483] batch 300/?, cuts processed until now is 791
+2022-06-24 23:01:45,578 INFO [decode.py:483] batch 400/?, cuts processed until now is 1047
+2022-06-25 00:48:28,171 INFO [decode.py:483] batch 500/?, cuts processed until now is 1298
+2022-06-25 01:51:39,834 INFO [decode.py:483] batch 600/?, cuts processed until now is 1572
+2022-06-25 02:49:30,229 INFO [decode.py:483] batch 700/?, cuts processed until now is 1843
+2022-06-25 03:53:42,536 INFO [decode.py:483] batch 800/?, cuts processed until now is 2097
+2022-06-25 05:16:00,106 INFO [decode.py:483] batch 900/?, cuts processed until now is 2366
+2022-06-25 06:57:14,063 INFO [decode.py:483] batch 1000/?, cuts processed until now is 2626
+2022-06-25 08:35:50,225 INFO [decode.py:483] batch 1100/?, cuts processed until now is 2887
+2022-06-25 10:09:36,309 INFO [decode.py:483] batch 1200/?, cuts processed until now is 3155
+2022-06-25 11:48:04,046 INFO [decode.py:483] batch 1300/?, cuts processed until now is 3410
+2022-06-25 13:12:06,293 INFO [decode.py:483] batch 1400/?, cuts processed until now is 3670
+2022-06-25 14:30:59,601 INFO [decode.py:483] batch 1500/?, cuts processed until now is 3933
+2022-06-25 15:45:48,988 INFO [decode.py:733] Caught exception:
+
+Some bad things happened. Please read the above error messages and stack
+trace. If you are using Python, the following command may be helpful:
+
+gdb --args python /path/to/your/code.py
+
+(You can use `gdb` to debug the code. Please consider compiling
+a debug version of k2.).
+
+If you are unable to fix it, please open an issue at:
+
+https://github.com/k2-fsa/k2/issues/new
+
+
+2022-06-25 15:45:48,990 INFO [decode.py:734] num_arcs before pruning: 1453325
+2022-06-25 15:45:48,990 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+2022-06-25 15:45:49,219 INFO [decode.py:747] num_arcs after pruning: 4879
+2022-06-25 16:01:22,610 INFO [decode.py:733] Caught exception:
+
+Some bad things happened. Please read the above error messages and stack
+trace. If you are using Python, the following command may be helpful:
+
+gdb --args python /path/to/your/code.py
+
+(You can use `gdb` to debug the code. Please consider compiling
+a debug version of k2.).
+
+If you are unable to fix it, please open an issue at:
+
+https://github.com/k2-fsa/k2/issues/new
+
+
+2022-06-25 16:01:22,611 INFO [decode.py:734] num_arcs before pruning: 1195732
+2022-06-25 16:01:22,611 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+2022-06-25 16:01:22,789 INFO [decode.py:747] num_arcs after pruning: 5734
+2022-06-25 16:06:11,843 INFO [decode.py:483] batch 1600/?, cuts processed until now is 4197
+2022-06-25 18:07:06,556 INFO [decode.py:483] batch 1700/?, cuts processed until now is 4440
+2022-06-25 19:32:17,442 INFO [decode.py:483] batch 1800/?, cuts processed until now is 4683
+2022-06-25 20:30:10,100 INFO [decode.py:483] batch 1900/?, cuts processed until now is 4911
+2022-06-25 21:27:10,127 INFO [decode.py:483] batch 2000/?, cuts processed until now is 5136
+2022-06-25 22:15:23,452 INFO [decode.py:483] batch 2100/?, cuts processed until now is 5362
+2022-06-25 22:20:43,949 INFO [decode.py:532]
+For test, WER of different settings are:
+ngram_lm_scale_0.01_attention_scale_0.5 14.97 best for test
+ngram_lm_scale_0.05_attention_scale_0.5 14.98
+ngram_lm_scale_0.01_attention_scale_0.3 14.99
+ngram_lm_scale_0.01_attention_scale_0.6 15.0
+ngram_lm_scale_0.01_attention_scale_0.7 15.02
+ngram_lm_scale_0.05_attention_scale_0.3 15.02
+ngram_lm_scale_0.05_attention_scale_0.6 15.02
+ngram_lm_scale_0.08_attention_scale_0.5 15.02
+ngram_lm_scale_0.1_attention_scale_0.5 15.04
+ngram_lm_scale_0.08_attention_scale_0.6 15.05
+ngram_lm_scale_0.05_attention_scale_0.7 15.06
+ngram_lm_scale_0.08_attention_scale_0.3 15.07
+ngram_lm_scale_0.1_attention_scale_0.3 15.07
+ngram_lm_scale_0.01_attention_scale_0.9 15.08
+ngram_lm_scale_0.08_attention_scale_0.7 15.08
+ngram_lm_scale_0.1_attention_scale_0.6 15.08
+ngram_lm_scale_0.01_attention_scale_0.1 15.09
+ngram_lm_scale_0.05_attention_scale_0.9 15.09
+ngram_lm_scale_0.01_attention_scale_1.0 15.1
+ngram_lm_scale_0.05_attention_scale_0.1 15.1
+ngram_lm_scale_0.1_attention_scale_0.7 15.1
+ngram_lm_scale_0.01_attention_scale_0.08 15.11
+ngram_lm_scale_0.05_attention_scale_1.0 15.11
+ngram_lm_scale_0.08_attention_scale_0.9 15.11
+ngram_lm_scale_0.01_attention_scale_1.1 15.12
+ngram_lm_scale_0.05_attention_scale_0.08 15.13
+ngram_lm_scale_0.05_attention_scale_1.1 15.13
+ngram_lm_scale_0.08_attention_scale_1.0 15.13
+ngram_lm_scale_0.1_attention_scale_0.9 15.13
+ngram_lm_scale_0.01_attention_scale_0.05 15.14
+ngram_lm_scale_0.1_attention_scale_1.0 15.14
+ngram_lm_scale_0.01_attention_scale_1.2 15.15
+ngram_lm_scale_0.08_attention_scale_0.08 15.15
+ngram_lm_scale_0.08_attention_scale_0.1 15.15
+ngram_lm_scale_0.01_attention_scale_1.3 15.16
+ngram_lm_scale_0.08_attention_scale_1.1 15.16
+ngram_lm_scale_0.1_attention_scale_0.1 15.16
+ngram_lm_scale_0.05_attention_scale_0.05 15.17
+ngram_lm_scale_0.05_attention_scale_1.2 15.17
+ngram_lm_scale_0.1_attention_scale_0.08 15.17
+ngram_lm_scale_0.1_attention_scale_1.1 15.17
+ngram_lm_scale_0.05_attention_scale_1.3 15.18
+ngram_lm_scale_0.08_attention_scale_0.05 15.19
+ngram_lm_scale_0.08_attention_scale_1.2 15.19
+ngram_lm_scale_0.1_attention_scale_1.2 15.2
+ngram_lm_scale_0.01_attention_scale_0.01 15.21
+ngram_lm_scale_0.01_attention_scale_1.5 15.21
+ngram_lm_scale_0.08_attention_scale_1.3 15.21
+ngram_lm_scale_0.1_attention_scale_0.05 15.21
+ngram_lm_scale_0.05_attention_scale_1.5 15.22
+ngram_lm_scale_0.1_attention_scale_1.3 15.22
+ngram_lm_scale_0.01_attention_scale_1.7 15.23
+ngram_lm_scale_0.05_attention_scale_1.7 15.25
+ngram_lm_scale_0.08_attention_scale_1.5 15.25
+ngram_lm_scale_0.01_attention_scale_1.9 15.26
+ngram_lm_scale_0.05_attention_scale_0.01 15.26
+ngram_lm_scale_0.1_attention_scale_1.5 15.26
+ngram_lm_scale_0.01_attention_scale_2.0 15.27
+ngram_lm_scale_0.08_attention_scale_0.01 15.27
+ngram_lm_scale_0.3_attention_scale_0.7 15.27
+ngram_lm_scale_0.3_attention_scale_0.9 15.27
+ngram_lm_scale_0.1_attention_scale_0.01 15.28
+ngram_lm_scale_0.3_attention_scale_1.0 15.28
+ngram_lm_scale_0.08_attention_scale_1.7 15.29
+ngram_lm_scale_0.3_attention_scale_0.6 15.29
+ngram_lm_scale_0.05_attention_scale_1.9 15.3
+ngram_lm_scale_0.1_attention_scale_1.7 15.3
+ngram_lm_scale_0.3_attention_scale_0.5 15.3
+ngram_lm_scale_0.01_attention_scale_2.1 15.31
+ngram_lm_scale_0.01_attention_scale_2.2 15.31
+ngram_lm_scale_0.05_attention_scale_2.0 15.31
+ngram_lm_scale_0.08_attention_scale_1.9 15.31
+ngram_lm_scale_0.3_attention_scale_1.1 15.31
+ngram_lm_scale_0.01_attention_scale_2.3 15.32
+ngram_lm_scale_0.05_attention_scale_2.1 15.32
+ngram_lm_scale_0.05_attention_scale_2.2 15.33
+ngram_lm_scale_0.05_attention_scale_2.3 15.33
+ngram_lm_scale_0.08_attention_scale_2.0 15.33
+ngram_lm_scale_0.1_attention_scale_1.9 15.33
+ngram_lm_scale_0.1_attention_scale_2.0 15.33
+ngram_lm_scale_0.01_attention_scale_2.5 15.34
+ngram_lm_scale_0.08_attention_scale_2.1 15.34
+ngram_lm_scale_0.1_attention_scale_2.1 15.34
+ngram_lm_scale_0.3_attention_scale_1.2 15.34
+ngram_lm_scale_0.08_attention_scale_2.2 15.35
+ngram_lm_scale_0.1_attention_scale_2.2 15.35
+ngram_lm_scale_0.05_attention_scale_2.5 15.36
+ngram_lm_scale_0.08_attention_scale_2.3 15.36
+ngram_lm_scale_0.1_attention_scale_2.3 15.36
+ngram_lm_scale_0.3_attention_scale_1.3 15.36
+ngram_lm_scale_0.08_attention_scale_2.5 15.37
+ngram_lm_scale_0.3_attention_scale_0.3 15.38
+ngram_lm_scale_0.01_attention_scale_3.0 15.39
+ngram_lm_scale_0.1_attention_scale_2.5 15.39
+ngram_lm_scale_0.05_attention_scale_3.0 15.41
+ngram_lm_scale_0.3_attention_scale_1.5 15.42
+ngram_lm_scale_0.08_attention_scale_3.0 15.45
+ngram_lm_scale_0.1_attention_scale_3.0 15.46
+ngram_lm_scale_0.3_attention_scale_1.7 15.47
+ngram_lm_scale_0.3_attention_scale_1.9 15.48
+ngram_lm_scale_0.3_attention_scale_2.1 15.48
+ngram_lm_scale_0.01_attention_scale_4.0 15.49
+ngram_lm_scale_0.3_attention_scale_2.0 15.49
+ngram_lm_scale_0.05_attention_scale_4.0 15.5
+ngram_lm_scale_0.3_attention_scale_2.2 15.5
+ngram_lm_scale_0.08_attention_scale_4.0 15.52
+ngram_lm_scale_0.3_attention_scale_2.3 15.52
+ngram_lm_scale_0.1_attention_scale_4.0 15.53
+ngram_lm_scale_0.3_attention_scale_0.1 15.55
+ngram_lm_scale_0.3_attention_scale_2.5 15.55
+ngram_lm_scale_0.01_attention_scale_5.0 15.56
+ngram_lm_scale_0.05_attention_scale_5.0 15.57
+ngram_lm_scale_0.08_attention_scale_5.0 15.59
+ngram_lm_scale_0.1_attention_scale_5.0 15.59
+ngram_lm_scale_0.3_attention_scale_0.08 15.59
+ngram_lm_scale_0.3_attention_scale_3.0 15.6
+ngram_lm_scale_0.5_attention_scale_1.5 15.62
+ngram_lm_scale_0.5_attention_scale_1.1 15.63
+ngram_lm_scale_0.5_attention_scale_1.3 15.63
+ngram_lm_scale_0.3_attention_scale_4.0 15.64
+ngram_lm_scale_0.5_attention_scale_1.2 15.64
+ngram_lm_scale_0.5_attention_scale_0.9 15.65
+ngram_lm_scale_0.5_attention_scale_1.0 15.65
+ngram_lm_scale_0.5_attention_scale_1.7 15.66
+ngram_lm_scale_0.3_attention_scale_5.0 15.67
+ngram_lm_scale_0.5_attention_scale_1.9 15.67
+ngram_lm_scale_0.5_attention_scale_2.0 15.67
+ngram_lm_scale_0.5_attention_scale_2.1 15.68
+ngram_lm_scale_0.5_attention_scale_2.2 15.68
+ngram_lm_scale_0.5_attention_scale_2.3 15.68
+ngram_lm_scale_0.3_attention_scale_0.05 15.69
+ngram_lm_scale_0.5_attention_scale_2.5 15.69
+ngram_lm_scale_0.5_attention_scale_0.7 15.71
+ngram_lm_scale_0.5_attention_scale_3.0 15.71
+ngram_lm_scale_0.5_attention_scale_4.0 15.76
+ngram_lm_scale_0.5_attention_scale_0.6 15.77
+ngram_lm_scale_0.5_attention_scale_5.0 15.79
+ngram_lm_scale_0.6_attention_scale_2.0 15.81
+ngram_lm_scale_0.3_attention_scale_0.01 15.82
+ngram_lm_scale_0.6_attention_scale_1.9 15.82
+ngram_lm_scale_0.6_attention_scale_2.1 15.82
+ngram_lm_scale_0.6_attention_scale_2.2 15.82
+ngram_lm_scale_0.6_attention_scale_2.3 15.82
+ngram_lm_scale_0.6_attention_scale_2.5 15.82
+ngram_lm_scale_0.6_attention_scale_3.0 15.82
+ngram_lm_scale_0.6_attention_scale_1.5 15.83
+ngram_lm_scale_0.5_attention_scale_0.5 15.84
+ngram_lm_scale_0.6_attention_scale_1.7 15.84
+ngram_lm_scale_0.6_attention_scale_1.1 15.85
+ngram_lm_scale_0.6_attention_scale_4.0 15.85
+ngram_lm_scale_0.6_attention_scale_1.2 15.86
+ngram_lm_scale_0.6_attention_scale_1.3 15.86
+ngram_lm_scale_0.6_attention_scale_5.0 15.86
+ngram_lm_scale_0.6_attention_scale_1.0 15.88
+ngram_lm_scale_0.6_attention_scale_0.9 15.92
+ngram_lm_scale_0.7_attention_scale_4.0 15.92
+ngram_lm_scale_0.7_attention_scale_5.0 15.92
+ngram_lm_scale_0.7_attention_scale_2.5 15.96
+ngram_lm_scale_0.7_attention_scale_3.0 15.96
+ngram_lm_scale_0.7_attention_scale_2.3 15.99
+ngram_lm_scale_0.7_attention_scale_2.2 16.0
+ngram_lm_scale_0.7_attention_scale_2.1 16.01
+ngram_lm_scale_0.5_attention_scale_0.3 16.02
+ngram_lm_scale_0.7_attention_scale_2.0 16.02
+ngram_lm_scale_0.7_attention_scale_1.9 16.03
+ngram_lm_scale_0.7_attention_scale_1.7 16.04
+ngram_lm_scale_0.7_attention_scale_1.5 16.06
+ngram_lm_scale_0.6_attention_scale_0.7 16.07
+ngram_lm_scale_0.7_attention_scale_1.3 16.1
+ngram_lm_scale_0.9_attention_scale_5.0 16.11
+ngram_lm_scale_0.6_attention_scale_0.6 16.13
+ngram_lm_scale_0.7_attention_scale_1.2 16.14
+ngram_lm_scale_0.9_attention_scale_4.0 16.16
+ngram_lm_scale_0.7_attention_scale_1.1 16.18
+ngram_lm_scale_1.0_attention_scale_5.0 16.19
+ngram_lm_scale_0.7_attention_scale_1.0 16.21
+ngram_lm_scale_0.6_attention_scale_0.5 16.24
+ngram_lm_scale_0.9_attention_scale_3.0 16.26
+ngram_lm_scale_1.0_attention_scale_4.0 16.26
+ngram_lm_scale_1.1_attention_scale_5.0 16.28
+ngram_lm_scale_0.7_attention_scale_0.9 16.29
+ngram_lm_scale_0.9_attention_scale_2.5 16.32
+ngram_lm_scale_0.9_attention_scale_2.3 16.37
+ngram_lm_scale_0.9_attention_scale_2.1 16.39
+ngram_lm_scale_0.9_attention_scale_2.2 16.39
+ngram_lm_scale_1.0_attention_scale_3.0 16.39
+ngram_lm_scale_1.1_attention_scale_4.0 16.39
+ngram_lm_scale_1.2_attention_scale_5.0 16.41
+ngram_lm_scale_0.9_attention_scale_2.0 16.42
+ngram_lm_scale_0.9_attention_scale_1.9 16.45
+ngram_lm_scale_1.0_attention_scale_2.5 16.47
+ngram_lm_scale_0.7_attention_scale_0.7 16.5
+ngram_lm_scale_1.3_attention_scale_5.0 16.5
+ngram_lm_scale_1.1_attention_scale_3.0 16.53
+ngram_lm_scale_1.2_attention_scale_4.0 16.53
+ngram_lm_scale_1.0_attention_scale_2.3 16.55
+ngram_lm_scale_0.9_attention_scale_1.7 16.56
+ngram_lm_scale_1.0_attention_scale_2.2 16.59
+ngram_lm_scale_0.7_attention_scale_0.6 16.61
+ngram_lm_scale_1.0_attention_scale_2.1 16.63
+ngram_lm_scale_0.5_attention_scale_0.1 16.66
+ngram_lm_scale_0.9_attention_scale_1.5 16.67
+ngram_lm_scale_0.6_attention_scale_0.3 16.68
+ngram_lm_scale_1.0_attention_scale_2.0 16.68
+ngram_lm_scale_1.3_attention_scale_4.0 16.69
+ngram_lm_scale_1.0_attention_scale_1.9 16.73
+ngram_lm_scale_1.1_attention_scale_2.5 16.73
+ngram_lm_scale_0.5_attention_scale_0.08 16.76
+ngram_lm_scale_1.5_attention_scale_5.0 16.78
+ngram_lm_scale_1.2_attention_scale_3.0 16.79
+ngram_lm_scale_0.7_attention_scale_0.5 16.84
+ngram_lm_scale_0.9_attention_scale_1.3 16.84
+ngram_lm_scale_1.1_attention_scale_2.3 16.84
+ngram_lm_scale_1.0_attention_scale_1.7 16.86
+ngram_lm_scale_1.1_attention_scale_2.2 16.89
+ngram_lm_scale_0.5_attention_scale_0.05 16.93
+ngram_lm_scale_0.9_attention_scale_1.2 16.94
+ngram_lm_scale_1.1_attention_scale_2.1 16.98
+ngram_lm_scale_1.3_attention_scale_3.0 17.02
+ngram_lm_scale_1.1_attention_scale_2.0 17.03
+ngram_lm_scale_1.2_attention_scale_2.5 17.03
+ngram_lm_scale_1.5_attention_scale_4.0 17.03
+ngram_lm_scale_1.7_attention_scale_5.0 17.03
+ngram_lm_scale_0.9_attention_scale_1.1 17.05
+ngram_lm_scale_1.0_attention_scale_1.5 17.05
+ngram_lm_scale_1.1_attention_scale_1.9 17.11
+ngram_lm_scale_1.2_attention_scale_2.3 17.16
+ngram_lm_scale_0.9_attention_scale_1.0 17.2
+ngram_lm_scale_0.5_attention_scale_0.01 17.21
+ngram_lm_scale_1.2_attention_scale_2.2 17.22
+ngram_lm_scale_1.0_attention_scale_1.3 17.29
+ngram_lm_scale_1.2_attention_scale_2.1 17.29
+ngram_lm_scale_1.1_attention_scale_1.7 17.3
+ngram_lm_scale_1.3_attention_scale_2.5 17.3
+ngram_lm_scale_0.9_attention_scale_0.9 17.36
+ngram_lm_scale_1.9_attention_scale_5.0 17.38
+ngram_lm_scale_1.2_attention_scale_2.0 17.41
+ngram_lm_scale_1.0_attention_scale_1.2 17.42
+ngram_lm_scale_1.7_attention_scale_4.0 17.46
+ngram_lm_scale_1.3_attention_scale_2.3 17.5
+ngram_lm_scale_1.1_attention_scale_1.5 17.52
+ngram_lm_scale_1.2_attention_scale_1.9 17.52
+ngram_lm_scale_0.7_attention_scale_0.3 17.55
+ngram_lm_scale_1.0_attention_scale_1.1 17.55
+ngram_lm_scale_1.3_attention_scale_2.2 17.58
+ngram_lm_scale_2.0_attention_scale_5.0 17.59
+ngram_lm_scale_1.5_attention_scale_3.0 17.61
+ngram_lm_scale_0.6_attention_scale_0.1 17.66
+ngram_lm_scale_1.3_attention_scale_2.1 17.71
+ngram_lm_scale_1.0_attention_scale_1.0 17.75
+ngram_lm_scale_1.2_attention_scale_1.7 17.76
+ngram_lm_scale_1.1_attention_scale_1.3 17.82
+ngram_lm_scale_0.6_attention_scale_0.08 17.84
+ngram_lm_scale_2.1_attention_scale_5.0 17.84
+ngram_lm_scale_1.3_attention_scale_2.0 17.85
+ngram_lm_scale_0.9_attention_scale_0.7 17.88
+ngram_lm_scale_1.9_attention_scale_4.0 17.98
+ngram_lm_scale_1.3_attention_scale_1.9 17.99
+ngram_lm_scale_1.1_attention_scale_1.2 18.05
+ngram_lm_scale_1.5_attention_scale_2.5 18.05
+ngram_lm_scale_2.2_attention_scale_5.0 18.06
+ngram_lm_scale_1.0_attention_scale_0.9 18.08
+ngram_lm_scale_1.2_attention_scale_1.5 18.08
+ngram_lm_scale_0.6_attention_scale_0.05 18.1
+ngram_lm_scale_1.7_attention_scale_3.0 18.27
+ngram_lm_scale_1.3_attention_scale_1.7 18.29
+ngram_lm_scale_2.0_attention_scale_4.0 18.29
+ngram_lm_scale_2.3_attention_scale_5.0 18.29
+ngram_lm_scale_0.9_attention_scale_0.6 18.3
+ngram_lm_scale_1.5_attention_scale_2.3 18.33
+ngram_lm_scale_1.1_attention_scale_1.1 18.36
+ngram_lm_scale_0.6_attention_scale_0.01 18.45
+ngram_lm_scale_1.5_attention_scale_2.2 18.51
+ngram_lm_scale_1.2_attention_scale_1.3 18.57
+ngram_lm_scale_2.1_attention_scale_4.0 18.58
+ngram_lm_scale_1.5_attention_scale_2.1 18.67
+ngram_lm_scale_1.1_attention_scale_1.0 18.71
+ngram_lm_scale_2.5_attention_scale_5.0 18.74
+ngram_lm_scale_0.9_attention_scale_0.5 18.8
+ngram_lm_scale_1.3_attention_scale_1.5 18.82
+ngram_lm_scale_2.2_attention_scale_4.0 18.86
+ngram_lm_scale_1.5_attention_scale_2.0 18.9
+ngram_lm_scale_1.0_attention_scale_0.7 18.92
+ngram_lm_scale_1.2_attention_scale_1.2 18.92
+ngram_lm_scale_0.7_attention_scale_0.1 18.95
+ngram_lm_scale_1.7_attention_scale_2.5 19.01
+ngram_lm_scale_1.9_attention_scale_3.0 19.09
+ngram_lm_scale_1.1_attention_scale_0.9 19.1
+ngram_lm_scale_0.7_attention_scale_0.08 19.17
+ngram_lm_scale_1.5_attention_scale_1.9 19.2
+ngram_lm_scale_2.3_attention_scale_4.0 19.21
+ngram_lm_scale_1.2_attention_scale_1.1 19.3
+ngram_lm_scale_1.7_attention_scale_2.3 19.46
+ngram_lm_scale_1.0_attention_scale_0.6 19.47
+ngram_lm_scale_0.7_attention_scale_0.05 19.5
+ngram_lm_scale_1.3_attention_scale_1.3 19.5
+ngram_lm_scale_2.0_attention_scale_3.0 19.61
+ngram_lm_scale_1.2_attention_scale_1.0 19.67
+ngram_lm_scale_1.7_attention_scale_2.2 19.76
+ngram_lm_scale_1.5_attention_scale_1.7 19.78
+ngram_lm_scale_1.3_attention_scale_1.2 19.83
+ngram_lm_scale_1.7_attention_scale_2.1 19.96
+ngram_lm_scale_0.7_attention_scale_0.01 20.02
+ngram_lm_scale_1.0_attention_scale_0.5 20.1
+ngram_lm_scale_2.5_attention_scale_4.0 20.1
+ngram_lm_scale_1.1_attention_scale_0.7 20.15
+ngram_lm_scale_1.2_attention_scale_0.9 20.19
+ngram_lm_scale_2.1_attention_scale_3.0 20.19
+ngram_lm_scale_1.9_attention_scale_2.5 20.2
+ngram_lm_scale_1.3_attention_scale_1.1 20.28
+ngram_lm_scale_1.7_attention_scale_2.0 20.28
+ngram_lm_scale_0.9_attention_scale_0.3 20.31
+ngram_lm_scale_3.0_attention_scale_5.0 20.41
+ngram_lm_scale_1.5_attention_scale_1.5 20.49
+ngram_lm_scale_1.7_attention_scale_1.9 20.63
+ngram_lm_scale_2.2_attention_scale_3.0 20.67
+ngram_lm_scale_1.9_attention_scale_2.3 20.78
+ngram_lm_scale_1.1_attention_scale_0.6 20.79
+ngram_lm_scale_2.0_attention_scale_2.5 20.81
+ngram_lm_scale_1.3_attention_scale_1.0 20.83
+ngram_lm_scale_1.9_attention_scale_2.2 21.05
+ngram_lm_scale_2.3_attention_scale_3.0 21.18
+ngram_lm_scale_1.7_attention_scale_1.7 21.32
+ngram_lm_scale_1.9_attention_scale_2.1 21.36
+ngram_lm_scale_1.5_attention_scale_1.3 21.39
+ngram_lm_scale_2.0_attention_scale_2.3 21.39
+ngram_lm_scale_2.1_attention_scale_2.5 21.44
+ngram_lm_scale_1.3_attention_scale_0.9 21.53
+ngram_lm_scale_1.2_attention_scale_0.7 21.54
+ngram_lm_scale_1.1_attention_scale_0.5 21.67
+ngram_lm_scale_1.9_attention_scale_2.0 21.78
+ngram_lm_scale_2.0_attention_scale_2.2 21.79
+ngram_lm_scale_1.5_attention_scale_1.2 21.93
+ngram_lm_scale_1.0_attention_scale_0.3 22.04
+ngram_lm_scale_2.2_attention_scale_2.5 22.15
+ngram_lm_scale_2.1_attention_scale_2.3 22.18
+ngram_lm_scale_2.0_attention_scale_2.1 22.2
+ngram_lm_scale_3.0_attention_scale_4.0 22.22
+ngram_lm_scale_1.9_attention_scale_1.9 22.23
+ngram_lm_scale_1.7_attention_scale_1.5 22.37
+ngram_lm_scale_1.2_attention_scale_0.6 22.38
+ngram_lm_scale_2.5_attention_scale_3.0 22.4
+ngram_lm_scale_2.1_attention_scale_2.2 22.54
+ngram_lm_scale_1.5_attention_scale_1.1 22.56
+ngram_lm_scale_2.0_attention_scale_2.0 22.61
+ngram_lm_scale_0.9_attention_scale_0.1 22.7
+ngram_lm_scale_2.3_attention_scale_2.5 22.84
+ngram_lm_scale_2.2_attention_scale_2.3 22.92
+ngram_lm_scale_0.9_attention_scale_0.08 23.02
+ngram_lm_scale_1.3_attention_scale_0.7 23.02
+ngram_lm_scale_2.1_attention_scale_2.1 23.03
+ngram_lm_scale_2.0_attention_scale_1.9 23.16
+ngram_lm_scale_1.9_attention_scale_1.7 23.31
+ngram_lm_scale_1.5_attention_scale_1.0 23.32
+ngram_lm_scale_1.2_attention_scale_0.5 23.4
+ngram_lm_scale_2.2_attention_scale_2.2 23.45
+ngram_lm_scale_2.1_attention_scale_2.0 23.55
+ngram_lm_scale_0.9_attention_scale_0.05 23.6
+ngram_lm_scale_1.7_attention_scale_1.3 23.65
+ngram_lm_scale_2.3_attention_scale_2.3 23.79
+ngram_lm_scale_2.2_attention_scale_2.1 23.94
+ngram_lm_scale_1.1_attention_scale_0.3 23.95
+ngram_lm_scale_1.3_attention_scale_0.6 24.04
+ngram_lm_scale_1.5_attention_scale_0.9 24.07
+ngram_lm_scale_2.1_attention_scale_1.9 24.07
+ngram_lm_scale_2.0_attention_scale_1.7 24.25
+ngram_lm_scale_2.3_attention_scale_2.2 24.27
+ngram_lm_scale_4.0_attention_scale_5.0 24.3
+ngram_lm_scale_1.7_attention_scale_1.2 24.31
+ngram_lm_scale_0.9_attention_scale_0.01 24.33
+ngram_lm_scale_2.5_attention_scale_2.5 24.42
+ngram_lm_scale_2.2_attention_scale_2.0 24.45
+ngram_lm_scale_1.9_attention_scale_1.5 24.52
+ngram_lm_scale_2.3_attention_scale_2.1 24.75
+ngram_lm_scale_2.2_attention_scale_1.9 24.97
+ngram_lm_scale_1.0_attention_scale_0.1 25.02
+ngram_lm_scale_1.7_attention_scale_1.1 25.03
+ngram_lm_scale_1.3_attention_scale_0.5 25.08
+ngram_lm_scale_2.1_attention_scale_1.7 25.16
+ngram_lm_scale_2.3_attention_scale_2.0 25.26
+ngram_lm_scale_2.5_attention_scale_2.3 25.32
+ngram_lm_scale_1.0_attention_scale_0.08 25.39
+ngram_lm_scale_2.0_attention_scale_1.5 25.4
+ngram_lm_scale_3.0_attention_scale_3.0 25.55
+ngram_lm_scale_2.5_attention_scale_2.2 25.72
+ngram_lm_scale_2.3_attention_scale_1.9 25.73
+ngram_lm_scale_1.9_attention_scale_1.3 25.78
+ngram_lm_scale_1.7_attention_scale_1.0 25.86
+ngram_lm_scale_1.0_attention_scale_0.05 25.93
+ngram_lm_scale_1.2_attention_scale_0.3 25.96
+ngram_lm_scale_1.5_attention_scale_0.7 25.99
+ngram_lm_scale_2.2_attention_scale_1.7 26.03
+ngram_lm_scale_2.5_attention_scale_2.1 26.23
+ngram_lm_scale_2.1_attention_scale_1.5 26.41
+ngram_lm_scale_1.9_attention_scale_1.2 26.54
+ngram_lm_scale_1.7_attention_scale_0.9 26.72
+ngram_lm_scale_1.0_attention_scale_0.01 26.74
+ngram_lm_scale_2.5_attention_scale_2.0 26.85
+ngram_lm_scale_2.0_attention_scale_1.3 26.88
+ngram_lm_scale_2.3_attention_scale_1.7 26.98
+ngram_lm_scale_4.0_attention_scale_4.0 27.04
+ngram_lm_scale_1.5_attention_scale_0.6 27.1
+ngram_lm_scale_1.1_attention_scale_0.1 27.23
+ngram_lm_scale_1.9_attention_scale_1.1 27.38
+ngram_lm_scale_2.5_attention_scale_1.9 27.39
+ngram_lm_scale_2.2_attention_scale_1.5 27.4
+ngram_lm_scale_1.1_attention_scale_0.08 27.59
+ngram_lm_scale_2.0_attention_scale_1.2 27.67
+ngram_lm_scale_3.0_attention_scale_2.5 27.73
+ngram_lm_scale_1.3_attention_scale_0.3 27.76
+ngram_lm_scale_2.1_attention_scale_1.3 27.87
+ngram_lm_scale_5.0_attention_scale_5.0 28.03
+ngram_lm_scale_1.1_attention_scale_0.05 28.14
+ngram_lm_scale_2.3_attention_scale_1.5 28.24
+ngram_lm_scale_1.9_attention_scale_1.0 28.26
+ngram_lm_scale_1.5_attention_scale_0.5 28.4
+ngram_lm_scale_2.0_attention_scale_1.1 28.49
+ngram_lm_scale_2.5_attention_scale_1.7 28.55
+ngram_lm_scale_2.1_attention_scale_1.2 28.65
+ngram_lm_scale_3.0_attention_scale_2.3 28.67
+ngram_lm_scale_2.2_attention_scale_1.3 28.82
+ngram_lm_scale_1.7_attention_scale_0.7 28.84
+ngram_lm_scale_1.1_attention_scale_0.01 28.96
+ngram_lm_scale_3.0_attention_scale_2.2 29.13
+ngram_lm_scale_1.9_attention_scale_0.9 29.16
+ngram_lm_scale_1.2_attention_scale_0.1 29.24
+ngram_lm_scale_2.0_attention_scale_1.0 29.32
+ngram_lm_scale_2.1_attention_scale_1.1 29.49
+ngram_lm_scale_2.2_attention_scale_1.2 29.6
+ngram_lm_scale_1.2_attention_scale_0.08 29.62
+ngram_lm_scale_3.0_attention_scale_2.1 29.65
+ngram_lm_scale_2.3_attention_scale_1.3 29.72
+ngram_lm_scale_2.5_attention_scale_1.5 29.86
+ngram_lm_scale_1.7_attention_scale_0.6 29.92
+ngram_lm_scale_1.2_attention_scale_0.05 30.17
+ngram_lm_scale_3.0_attention_scale_2.0 30.19
+ngram_lm_scale_2.0_attention_scale_0.9 30.24
+ngram_lm_scale_2.1_attention_scale_1.0 30.32
+ngram_lm_scale_2.2_attention_scale_1.1 30.38
+ngram_lm_scale_2.3_attention_scale_1.2 30.41
+ngram_lm_scale_4.0_attention_scale_3.0 30.56
+ngram_lm_scale_3.0_attention_scale_1.9 30.66
+ngram_lm_scale_5.0_attention_scale_4.0 30.83
+ngram_lm_scale_1.2_attention_scale_0.01 30.85
+ngram_lm_scale_1.3_attention_scale_0.1 30.98
+ngram_lm_scale_1.5_attention_scale_0.3 31.09
+ngram_lm_scale_1.7_attention_scale_0.5 31.12
+ngram_lm_scale_2.1_attention_scale_0.9 31.13
+ngram_lm_scale_2.5_attention_scale_1.3 31.13
+ngram_lm_scale_2.2_attention_scale_1.0 31.15
+ngram_lm_scale_2.3_attention_scale_1.1 31.15
+ngram_lm_scale_1.9_attention_scale_0.7 31.19
+ngram_lm_scale_1.3_attention_scale_0.08 31.3
+ngram_lm_scale_3.0_attention_scale_1.7 31.66
+ngram_lm_scale_1.3_attention_scale_0.05 31.79
+ngram_lm_scale_2.5_attention_scale_1.2 31.82
+ngram_lm_scale_2.3_attention_scale_1.0 31.9
+ngram_lm_scale_2.2_attention_scale_0.9 31.98
+ngram_lm_scale_2.0_attention_scale_0.7 32.08
+ngram_lm_scale_1.9_attention_scale_0.6 32.14
+ngram_lm_scale_4.0_attention_scale_2.5 32.25
+ngram_lm_scale_1.3_attention_scale_0.01 32.33
+ngram_lm_scale_2.5_attention_scale_1.1 32.52
+ngram_lm_scale_2.3_attention_scale_0.9 32.66
+ngram_lm_scale_3.0_attention_scale_1.5 32.7
+ngram_lm_scale_2.1_attention_scale_0.7 32.8
+ngram_lm_scale_2.0_attention_scale_0.6 32.94
+ngram_lm_scale_4.0_attention_scale_2.3 32.98
+ngram_lm_scale_1.9_attention_scale_0.5 33.05
+ngram_lm_scale_2.5_attention_scale_1.0 33.07
+ngram_lm_scale_1.7_attention_scale_0.3 33.24
+ngram_lm_scale_4.0_attention_scale_2.2 33.29
+ngram_lm_scale_5.0_attention_scale_3.0 33.37
+ngram_lm_scale_2.2_attention_scale_0.7 33.43
+ngram_lm_scale_1.5_attention_scale_0.1 33.5
+ngram_lm_scale_2.1_attention_scale_0.6 33.58
+ngram_lm_scale_1.5_attention_scale_0.08 33.7
+ngram_lm_scale_2.0_attention_scale_0.5 33.71
+ngram_lm_scale_4.0_attention_scale_2.1 33.71
+ngram_lm_scale_2.5_attention_scale_0.9 33.72
+ngram_lm_scale_3.0_attention_scale_1.3 33.72
+ngram_lm_scale_1.5_attention_scale_0.05 34.0
+ngram_lm_scale_2.3_attention_scale_0.7 34.03
+ngram_lm_scale_4.0_attention_scale_2.0 34.05
+ngram_lm_scale_2.2_attention_scale_0.6 34.13
+ngram_lm_scale_3.0_attention_scale_1.2 34.23
+ngram_lm_scale_2.1_attention_scale_0.5 34.25
+ngram_lm_scale_1.5_attention_scale_0.01 34.35
+ngram_lm_scale_4.0_attention_scale_1.9 34.39
+ngram_lm_scale_1.9_attention_scale_0.3 34.53
+ngram_lm_scale_2.3_attention_scale_0.6 34.62
+ngram_lm_scale_3.0_attention_scale_1.1 34.66
+ngram_lm_scale_5.0_attention_scale_2.5 34.72
+ngram_lm_scale_2.2_attention_scale_0.5 34.74
+ngram_lm_scale_2.5_attention_scale_0.7 34.87
+ngram_lm_scale_1.7_attention_scale_0.1 34.96
+ngram_lm_scale_4.0_attention_scale_1.7 35.0
+ngram_lm_scale_2.0_attention_scale_0.3 35.07
+ngram_lm_scale_3.0_attention_scale_1.0 35.1
+ngram_lm_scale_1.7_attention_scale_0.08 35.14
+ngram_lm_scale_2.3_attention_scale_0.5 35.18
+ngram_lm_scale_5.0_attention_scale_2.3 35.18
+ngram_lm_scale_1.7_attention_scale_0.05 35.37
+ngram_lm_scale_2.5_attention_scale_0.6 35.37
+ngram_lm_scale_5.0_attention_scale_2.2 35.38
+ngram_lm_scale_2.1_attention_scale_0.3 35.47
+ngram_lm_scale_3.0_attention_scale_0.9 35.47
+ngram_lm_scale_4.0_attention_scale_1.5 35.55
+ngram_lm_scale_5.0_attention_scale_2.1 35.59
+ngram_lm_scale_1.7_attention_scale_0.01 35.67
+ngram_lm_scale_2.5_attention_scale_0.5 35.77
+ngram_lm_scale_5.0_attention_scale_2.0 35.81
+ngram_lm_scale_2.2_attention_scale_0.3 35.86
+ngram_lm_scale_1.9_attention_scale_0.1 35.98
+ngram_lm_scale_5.0_attention_scale_1.9 36.06
+ngram_lm_scale_1.9_attention_scale_0.08 36.1
+ngram_lm_scale_4.0_attention_scale_1.3 36.1
+ngram_lm_scale_3.0_attention_scale_0.7 36.17
+ngram_lm_scale_2.3_attention_scale_0.3 36.21
+ngram_lm_scale_1.9_attention_scale_0.05 36.3
+ngram_lm_scale_4.0_attention_scale_1.2 36.33
+ngram_lm_scale_5.0_attention_scale_1.7 36.38
+ngram_lm_scale_2.0_attention_scale_0.1 36.4
+ngram_lm_scale_1.9_attention_scale_0.01 36.49
+ngram_lm_scale_2.0_attention_scale_0.08 36.49
+ngram_lm_scale_3.0_attention_scale_0.6 36.51
+ngram_lm_scale_4.0_attention_scale_1.1 36.56
+ngram_lm_scale_2.0_attention_scale_0.05 36.58
+ngram_lm_scale_2.1_attention_scale_0.1 36.64
+ngram_lm_scale_2.1_attention_scale_0.08 36.71
+ngram_lm_scale_2.5_attention_scale_0.3 36.71
+ngram_lm_scale_5.0_attention_scale_1.5 36.75
+ngram_lm_scale_2.0_attention_scale_0.01 36.76
+ngram_lm_scale_4.0_attention_scale_1.0 36.78
+ngram_lm_scale_2.1_attention_scale_0.05 36.83
+ngram_lm_scale_2.2_attention_scale_0.1 36.83
+ngram_lm_scale_3.0_attention_scale_0.5 36.84
+ngram_lm_scale_2.2_attention_scale_0.08 36.94
+ngram_lm_scale_4.0_attention_scale_0.9 37.01
+ngram_lm_scale_2.1_attention_scale_0.01 37.07
+ngram_lm_scale_2.3_attention_scale_0.1 37.08
+ngram_lm_scale_2.2_attention_scale_0.05 37.09
+ngram_lm_scale_5.0_attention_scale_1.3 37.11
+ngram_lm_scale_2.3_attention_scale_0.08 37.17
+ngram_lm_scale_5.0_attention_scale_1.2 37.29
+ngram_lm_scale_2.2_attention_scale_0.01 37.3
+ngram_lm_scale_2.3_attention_scale_0.05 37.3
+ngram_lm_scale_2.3_attention_scale_0.01 37.45
+ngram_lm_scale_2.5_attention_scale_0.1 37.45
+ngram_lm_scale_3.0_attention_scale_0.3 37.46
+ngram_lm_scale_5.0_attention_scale_1.1 37.5
+ngram_lm_scale_4.0_attention_scale_0.7 37.53
+ngram_lm_scale_2.5_attention_scale_0.08 37.54
+ngram_lm_scale_2.5_attention_scale_0.05 37.63
+ngram_lm_scale_5.0_attention_scale_1.0 37.64
+ngram_lm_scale_4.0_attention_scale_0.6 37.68
+ngram_lm_scale_2.5_attention_scale_0.01 37.76
+ngram_lm_scale_5.0_attention_scale_0.9 37.82
+ngram_lm_scale_4.0_attention_scale_0.5 37.94
+ngram_lm_scale_3.0_attention_scale_0.1 38.07
+ngram_lm_scale_3.0_attention_scale_0.08 38.11
+ngram_lm_scale_5.0_attention_scale_0.7 38.14
+ngram_lm_scale_3.0_attention_scale_0.05 38.2
+ngram_lm_scale_3.0_attention_scale_0.01 38.32
+ngram_lm_scale_4.0_attention_scale_0.3 38.32
+ngram_lm_scale_5.0_attention_scale_0.6 38.33
+ngram_lm_scale_5.0_attention_scale_0.5 38.47
+ngram_lm_scale_4.0_attention_scale_0.1 38.73
+ngram_lm_scale_4.0_attention_scale_0.08 38.77
+ngram_lm_scale_5.0_attention_scale_0.3 38.78
+ngram_lm_scale_4.0_attention_scale_0.05 38.85
+ngram_lm_scale_4.0_attention_scale_0.01 38.92
+ngram_lm_scale_5.0_attention_scale_0.1 39.07
+ngram_lm_scale_5.0_attention_scale_0.08 39.11
+ngram_lm_scale_5.0_attention_scale_0.05 39.17
+ngram_lm_scale_5.0_attention_scale_0.01 39.22
682 |
+
|
683 |
+
2022-06-25 22:21:00,565 INFO [decode.py:483] batch 0/?, cuts processed until now is 2
2022-06-25 23:27:30,947 INFO [decode.py:483] batch 100/?, cuts processed until now is 277
2022-06-26 00:57:14,858 INFO [decode.py:483] batch 200/?, cuts processed until now is 570
2022-06-26 02:12:55,830 INFO [decode.py:483] batch 300/?, cuts processed until now is 872
2022-06-26 03:29:19,515 INFO [decode.py:483] batch 400/?, cuts processed until now is 1159
2022-06-26 04:56:43,624 INFO [decode.py:483] batch 500/?, cuts processed until now is 1433
2022-06-26 06:10:18,732 INFO [decode.py:483] batch 600/?, cuts processed until now is 1723
2022-06-26 07:18:12,039 INFO [decode.py:483] batch 700/?, cuts processed until now is 2012
2022-06-26 08:44:29,122 INFO [decode.py:483] batch 800/?, cuts processed until now is 2287
2022-06-26 10:12:10,048 INFO [decode.py:483] batch 900/?, cuts processed until now is 2582
2022-06-26 11:16:50,308 INFO [decode.py:483] batch 1000/?, cuts processed until now is 2870
2022-06-26 12:28:04,120 INFO [decode.py:483] batch 1100/?, cuts processed until now is 3152
2022-06-26 13:37:56,712 INFO [decode.py:483] batch 1200/?, cuts processed until now is 3458
2022-06-26 15:01:44,066 INFO [decode.py:483] batch 1300/?, cuts processed until now is 3731
2022-06-26 16:36:03,534 INFO [decode.py:483] batch 1400/?, cuts processed until now is 4012
2022-06-26 18:51:27,441 INFO [decode.py:483] batch 1500/?, cuts processed until now is 4290
2022-06-26 19:03:44,177 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-26 19:03:44,177 INFO [decode.py:734] num_arcs before pruning: 2390934
2022-06-26 19:03:44,178 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-26 19:03:44,562 INFO [decode.py:747] num_arcs after pruning: 5752
2022-06-26 19:57:22,287 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-26 19:57:22,288 INFO [decode.py:734] num_arcs before pruning: 846107
2022-06-26 19:57:22,288 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-26 19:57:22,439 INFO [decode.py:747] num_arcs after pruning: 8745
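[Editor's note] The decode.py:733-747 lines above record a recovery path rather than a failure: when rescoring runs out of GPU memory, the lattice is pruned on arc posteriors (see the num_arcs before/after pruning counts) and the rescoring is retried. Below is a minimal sketch of that catch-and-prune pattern; `rescore_fn` is a hypothetical callable standing in for the recipe's rescoring step, and the threshold is illustrative, not necessarily what this recipe uses.

```python
# A sketch of the retry-with-pruning pattern suggested by the log
# lines above. k2.prune_on_arc_post is k2's API for pruning a lattice
# on arc posterior probabilities; rescore_fn is hypothetical.
import k2


def rescore_with_prune_on_oom(lattice: "k2.Fsa", rescore_fn):
    try:
        return rescore_fn(lattice)
    except RuntimeError:  # CUDA OOM surfaces as a RuntimeError
        print(f"num_arcs before pruning: {lattice.num_arcs}")
        # Drop low-posterior arcs so the retry fits in memory.
        lattice = k2.prune_on_arc_post(
            lattice, threshold_prob=1e-3, use_double_scores=True
        )
        print(f"num_arcs after pruning: {lattice.num_arcs}")
        return rescore_fn(lattice)
```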
2022-06-26 21:23:00,214 INFO [decode.py:483] batch 1600/?, cuts processed until now is 4539
2022-06-26 23:32:50,861 INFO [decode.py:483] batch 1700/?, cuts processed until now is 4765
2022-06-27 01:24:13,288 INFO [decode.py:483] batch 1800/?, cuts processed until now is 4973
2022-06-27 01:34:55,855 INFO [decode.py:532]
For dev, WER of different settings are:
ngram_lm_scale_0.01_attention_scale_0.5	15.78	best for dev
ngram_lm_scale_0.01_attention_scale_0.6	15.79
ngram_lm_scale_0.01_attention_scale_0.7	15.8
ngram_lm_scale_0.05_attention_scale_0.5	15.8
ngram_lm_scale_0.05_attention_scale_0.6	15.8
ngram_lm_scale_0.01_attention_scale_0.3	15.81
ngram_lm_scale_0.05_attention_scale_0.3	15.81
ngram_lm_scale_0.05_attention_scale_0.7	15.81
ngram_lm_scale_0.08_attention_scale_0.3	15.81
ngram_lm_scale_0.08_attention_scale_0.5	15.81
ngram_lm_scale_0.01_attention_scale_0.9	15.82
ngram_lm_scale_0.08_attention_scale_0.6	15.82
ngram_lm_scale_0.08_attention_scale_0.7	15.83
ngram_lm_scale_0.1_attention_scale_0.3	15.83
ngram_lm_scale_0.1_attention_scale_0.5	15.83
ngram_lm_scale_0.1_attention_scale_0.6	15.83
ngram_lm_scale_0.01_attention_scale_1.0	15.84
ngram_lm_scale_0.05_attention_scale_0.9	15.84
ngram_lm_scale_0.08_attention_scale_0.9	15.86
ngram_lm_scale_0.1_attention_scale_0.7	15.86
ngram_lm_scale_0.05_attention_scale_1.0	15.87
ngram_lm_scale_0.01_attention_scale_1.1	15.88
ngram_lm_scale_0.1_attention_scale_0.9	15.88
ngram_lm_scale_0.01_attention_scale_1.2	15.9
ngram_lm_scale_0.01_attention_scale_1.3	15.91
ngram_lm_scale_0.05_attention_scale_1.1	15.91
ngram_lm_scale_0.08_attention_scale_1.0	15.91
ngram_lm_scale_0.1_attention_scale_1.0	15.91
ngram_lm_scale_0.01_attention_scale_0.1	15.92
ngram_lm_scale_0.05_attention_scale_1.2	15.93
ngram_lm_scale_0.05_attention_scale_0.1	15.94
ngram_lm_scale_0.05_attention_scale_1.3	15.94
ngram_lm_scale_0.08_attention_scale_1.1	15.94
ngram_lm_scale_0.08_attention_scale_1.2	15.94
ngram_lm_scale_0.01_attention_scale_1.5	15.95
ngram_lm_scale_0.08_attention_scale_0.1	15.95
ngram_lm_scale_0.1_attention_scale_1.1	15.95
ngram_lm_scale_0.01_attention_scale_0.08	15.96
ngram_lm_scale_0.05_attention_scale_0.08	15.97
ngram_lm_scale_0.08_attention_scale_0.08	15.97
ngram_lm_scale_0.1_attention_scale_0.1	15.97
ngram_lm_scale_0.1_attention_scale_1.2	15.97
ngram_lm_scale_0.01_attention_scale_1.7	15.98
ngram_lm_scale_0.05_attention_scale_1.5	15.98
ngram_lm_scale_0.1_attention_scale_1.3	15.98
ngram_lm_scale_0.01_attention_scale_0.05	15.99
ngram_lm_scale_0.08_attention_scale_1.3	15.99
ngram_lm_scale_0.1_attention_scale_0.08	15.99
ngram_lm_scale_0.05_attention_scale_0.05	16.01
ngram_lm_scale_0.08_attention_scale_1.5	16.01
ngram_lm_scale_0.05_attention_scale_1.7	16.02
ngram_lm_scale_0.08_attention_scale_0.05	16.03
ngram_lm_scale_0.1_attention_scale_0.05	16.03
ngram_lm_scale_0.1_attention_scale_1.5	16.04
ngram_lm_scale_0.01_attention_scale_0.01	16.05
ngram_lm_scale_0.01_attention_scale_1.9	16.05
ngram_lm_scale_0.01_attention_scale_2.0	16.07
ngram_lm_scale_0.08_attention_scale_1.7	16.07
ngram_lm_scale_0.05_attention_scale_0.01	16.08
ngram_lm_scale_0.05_attention_scale_1.9	16.08
ngram_lm_scale_0.1_attention_scale_1.7	16.09
ngram_lm_scale_0.08_attention_scale_0.01	16.1
ngram_lm_scale_0.08_attention_scale_1.9	16.1
ngram_lm_scale_0.05_attention_scale_2.0	16.11
ngram_lm_scale_0.01_attention_scale_2.1	16.12
ngram_lm_scale_0.1_attention_scale_1.9	16.12
ngram_lm_scale_0.05_attention_scale_2.1	16.13
ngram_lm_scale_0.08_attention_scale_2.0	16.13
ngram_lm_scale_0.1_attention_scale_0.01	16.13
ngram_lm_scale_0.01_attention_scale_2.2	16.14
ngram_lm_scale_0.08_attention_scale_2.1	16.14
ngram_lm_scale_0.01_attention_scale_2.3	16.15
ngram_lm_scale_0.05_attention_scale_2.2	16.15
ngram_lm_scale_0.1_attention_scale_2.0	16.15
ngram_lm_scale_0.3_attention_scale_0.9	16.16
ngram_lm_scale_0.01_attention_scale_2.5	16.17
ngram_lm_scale_0.05_attention_scale_2.3	16.17
ngram_lm_scale_0.08_attention_scale_2.2	16.18
ngram_lm_scale_0.1_attention_scale_2.1	16.18
ngram_lm_scale_0.3_attention_scale_0.5	16.18
ngram_lm_scale_0.3_attention_scale_0.6	16.18
ngram_lm_scale_0.3_attention_scale_0.7	16.18
ngram_lm_scale_0.3_attention_scale_1.0	16.18
ngram_lm_scale_0.3_attention_scale_1.1	16.18
ngram_lm_scale_0.05_attention_scale_2.5	16.19
ngram_lm_scale_0.1_attention_scale_2.2	16.19
ngram_lm_scale_0.08_attention_scale_2.3	16.2
ngram_lm_scale_0.1_attention_scale_2.3	16.21
ngram_lm_scale_0.08_attention_scale_2.5	16.22
ngram_lm_scale_0.3_attention_scale_0.3	16.22
ngram_lm_scale_0.3_attention_scale_1.2	16.22
ngram_lm_scale_0.3_attention_scale_1.3	16.22
ngram_lm_scale_0.01_attention_scale_3.0	16.23
ngram_lm_scale_0.1_attention_scale_2.5	16.23
ngram_lm_scale_0.3_attention_scale_1.5	16.24
ngram_lm_scale_0.05_attention_scale_3.0	16.27
ngram_lm_scale_0.3_attention_scale_1.7	16.29
ngram_lm_scale_0.08_attention_scale_3.0	16.3
ngram_lm_scale_0.1_attention_scale_3.0	16.31
ngram_lm_scale_0.3_attention_scale_1.9	16.31
ngram_lm_scale_0.3_attention_scale_2.0	16.34
ngram_lm_scale_0.3_attention_scale_2.1	16.36
ngram_lm_scale_0.3_attention_scale_2.2	16.38
ngram_lm_scale_0.3_attention_scale_2.3	16.38
ngram_lm_scale_0.01_attention_scale_4.0	16.39
ngram_lm_scale_0.05_attention_scale_4.0	16.4
ngram_lm_scale_0.3_attention_scale_0.1	16.41
ngram_lm_scale_0.3_attention_scale_2.5	16.42
ngram_lm_scale_0.08_attention_scale_4.0	16.45
ngram_lm_scale_0.3_attention_scale_0.08	16.45
ngram_lm_scale_0.1_attention_scale_4.0	16.47
ngram_lm_scale_0.01_attention_scale_5.0	16.52
ngram_lm_scale_0.3_attention_scale_3.0	16.53
ngram_lm_scale_0.05_attention_scale_5.0	16.54
ngram_lm_scale_0.3_attention_scale_0.05	16.54
ngram_lm_scale_0.08_attention_scale_5.0	16.57
ngram_lm_scale_0.1_attention_scale_5.0	16.57
ngram_lm_scale_0.5_attention_scale_0.9	16.58
ngram_lm_scale_0.5_attention_scale_1.2	16.58
ngram_lm_scale_0.5_attention_scale_1.1	16.59
ngram_lm_scale_0.5_attention_scale_1.3	16.6
ngram_lm_scale_0.3_attention_scale_4.0	16.61
ngram_lm_scale_0.5_attention_scale_1.0	16.61
ngram_lm_scale_0.5_attention_scale_1.5	16.64
ngram_lm_scale_0.3_attention_scale_5.0	16.66
ngram_lm_scale_0.5_attention_scale_0.7	16.66
ngram_lm_scale_0.5_attention_scale_1.9	16.66
ngram_lm_scale_0.5_attention_scale_2.0	16.66
ngram_lm_scale_0.5_attention_scale_2.1	16.66
ngram_lm_scale_0.3_attention_scale_0.01	16.67
ngram_lm_scale_0.5_attention_scale_1.7	16.67
ngram_lm_scale_0.5_attention_scale_2.2	16.68
ngram_lm_scale_0.5_attention_scale_2.3	16.69
ngram_lm_scale_0.5_attention_scale_2.5	16.69
ngram_lm_scale_0.5_attention_scale_0.6	16.73
ngram_lm_scale_0.5_attention_scale_3.0	16.73
ngram_lm_scale_0.5_attention_scale_0.5	16.79
ngram_lm_scale_0.5_attention_scale_4.0	16.8
ngram_lm_scale_0.5_attention_scale_5.0	16.83
ngram_lm_scale_0.6_attention_scale_3.0	16.83
ngram_lm_scale_0.6_attention_scale_2.1	16.85
ngram_lm_scale_0.6_attention_scale_2.2	16.85
ngram_lm_scale_0.6_attention_scale_2.3	16.85
ngram_lm_scale_0.6_attention_scale_2.5	16.86
ngram_lm_scale_0.6_attention_scale_2.0	16.87
ngram_lm_scale_0.6_attention_scale_1.7	16.89
ngram_lm_scale_0.6_attention_scale_1.9	16.89
ngram_lm_scale_0.6_attention_scale_4.0	16.9
ngram_lm_scale_0.6_attention_scale_1.5	16.91
ngram_lm_scale_0.6_attention_scale_5.0	16.93
ngram_lm_scale_0.6_attention_scale_1.2	16.94
ngram_lm_scale_0.6_attention_scale_1.1	16.95
ngram_lm_scale_0.6_attention_scale_1.3	16.95
ngram_lm_scale_0.6_attention_scale_1.0	16.96
ngram_lm_scale_0.6_attention_scale_0.9	16.97
ngram_lm_scale_0.5_attention_scale_0.3	17.0
ngram_lm_scale_0.7_attention_scale_4.0	17.01
ngram_lm_scale_0.7_attention_scale_5.0	17.01
ngram_lm_scale_0.7_attention_scale_3.0	17.02
ngram_lm_scale_0.7_attention_scale_2.5	17.06
ngram_lm_scale_0.7_attention_scale_2.3	17.09
ngram_lm_scale_0.7_attention_scale_2.1	17.1
ngram_lm_scale_0.7_attention_scale_2.2	17.1
ngram_lm_scale_0.6_attention_scale_0.7	17.12
ngram_lm_scale_0.7_attention_scale_2.0	17.13
ngram_lm_scale_0.7_attention_scale_1.9	17.14
ngram_lm_scale_0.7_attention_scale_1.7	17.16
ngram_lm_scale_0.7_attention_scale_1.5	17.18
ngram_lm_scale_0.7_attention_scale_1.3	17.21
ngram_lm_scale_0.6_attention_scale_0.6	17.24
ngram_lm_scale_0.9_attention_scale_5.0	17.27
ngram_lm_scale_0.7_attention_scale_1.2	17.28
ngram_lm_scale_0.7_attention_scale_1.1	17.32
ngram_lm_scale_0.6_attention_scale_0.5	17.34
ngram_lm_scale_0.9_attention_scale_4.0	17.34
ngram_lm_scale_0.7_attention_scale_1.0	17.39
ngram_lm_scale_1.0_attention_scale_5.0	17.39
ngram_lm_scale_0.9_attention_scale_3.0	17.47
ngram_lm_scale_0.7_attention_scale_0.9	17.48
ngram_lm_scale_1.0_attention_scale_4.0	17.48
ngram_lm_scale_1.1_attention_scale_5.0	17.52
ngram_lm_scale_0.9_attention_scale_2.5	17.56
ngram_lm_scale_0.9_attention_scale_2.3	17.6
ngram_lm_scale_0.9_attention_scale_2.2	17.63
ngram_lm_scale_1.1_attention_scale_4.0	17.64
ngram_lm_scale_1.2_attention_scale_5.0	17.64
ngram_lm_scale_0.5_attention_scale_0.1	17.66
ngram_lm_scale_0.9_attention_scale_2.1	17.66
ngram_lm_scale_1.0_attention_scale_3.0	17.66
ngram_lm_scale_0.9_attention_scale_2.0	17.67
ngram_lm_scale_0.7_attention_scale_0.7	17.68
ngram_lm_scale_0.9_attention_scale_1.9	17.74
ngram_lm_scale_1.3_attention_scale_5.0	17.76
ngram_lm_scale_1.0_attention_scale_2.5	17.77
ngram_lm_scale_0.6_attention_scale_0.3	17.78
ngram_lm_scale_1.2_attention_scale_4.0	17.78
ngram_lm_scale_0.5_attention_scale_0.08	17.81
ngram_lm_scale_0.7_attention_scale_0.6	17.83
ngram_lm_scale_0.9_attention_scale_1.7	17.84
ngram_lm_scale_1.0_attention_scale_2.3	17.85
ngram_lm_scale_1.1_attention_scale_3.0	17.87
ngram_lm_scale_1.0_attention_scale_2.2	17.9
ngram_lm_scale_0.9_attention_scale_1.5	17.95
ngram_lm_scale_1.0_attention_scale_2.1	17.96
ngram_lm_scale_0.5_attention_scale_0.05	17.97
ngram_lm_scale_1.3_attention_scale_4.0	17.98
ngram_lm_scale_1.0_attention_scale_2.0	18.01
ngram_lm_scale_1.1_attention_scale_2.5	18.06
ngram_lm_scale_1.5_attention_scale_5.0	18.08
ngram_lm_scale_0.7_attention_scale_0.5	18.09
ngram_lm_scale_1.0_attention_scale_1.9	18.09
ngram_lm_scale_1.2_attention_scale_3.0	18.1
ngram_lm_scale_0.9_attention_scale_1.3	18.16
ngram_lm_scale_1.1_attention_scale_2.3	18.19
ngram_lm_scale_1.0_attention_scale_1.7	18.24
ngram_lm_scale_1.1_attention_scale_2.2	18.24
ngram_lm_scale_0.5_attention_scale_0.01	18.28
ngram_lm_scale_0.9_attention_scale_1.2	18.29
ngram_lm_scale_1.1_attention_scale_2.1	18.33
ngram_lm_scale_1.3_attention_scale_3.0	18.33
ngram_lm_scale_1.5_attention_scale_4.0	18.36
ngram_lm_scale_1.7_attention_scale_5.0	18.38
ngram_lm_scale_1.1_attention_scale_2.0	18.39
ngram_lm_scale_1.2_attention_scale_2.5	18.39
ngram_lm_scale_1.0_attention_scale_1.5	18.44
ngram_lm_scale_0.9_attention_scale_1.1	18.45
ngram_lm_scale_1.1_attention_scale_1.9	18.5
ngram_lm_scale_1.2_attention_scale_2.3	18.53
ngram_lm_scale_0.9_attention_scale_1.0	18.59
ngram_lm_scale_1.2_attention_scale_2.2	18.63
ngram_lm_scale_1.0_attention_scale_1.3	18.73
ngram_lm_scale_1.1_attention_scale_1.7	18.74
ngram_lm_scale_1.2_attention_scale_2.1	18.74
ngram_lm_scale_1.3_attention_scale_2.5	18.74
ngram_lm_scale_0.9_attention_scale_0.9	18.77
ngram_lm_scale_0.6_attention_scale_0.1	18.79
ngram_lm_scale_0.7_attention_scale_0.3	18.79
ngram_lm_scale_1.2_attention_scale_2.0	18.88
ngram_lm_scale_1.9_attention_scale_5.0	18.88
ngram_lm_scale_1.0_attention_scale_1.2	18.92
ngram_lm_scale_1.7_attention_scale_4.0	18.94
ngram_lm_scale_0.6_attention_scale_0.08	19.0
ngram_lm_scale_1.3_attention_scale_2.3	19.0
ngram_lm_scale_1.2_attention_scale_1.9	19.03
ngram_lm_scale_1.1_attention_scale_1.5	19.06
ngram_lm_scale_2.0_attention_scale_5.0	19.06
ngram_lm_scale_1.5_attention_scale_3.0	19.07
ngram_lm_scale_1.0_attention_scale_1.1	19.09
ngram_lm_scale_1.3_attention_scale_2.2	19.19
ngram_lm_scale_0.6_attention_scale_0.05	19.21
ngram_lm_scale_2.1_attention_scale_5.0	19.23
ngram_lm_scale_0.9_attention_scale_0.7	19.29
ngram_lm_scale_1.3_attention_scale_2.1	19.31
ngram_lm_scale_1.2_attention_scale_1.7	19.34
ngram_lm_scale_1.0_attention_scale_1.0	19.4
ngram_lm_scale_1.1_attention_scale_1.3	19.4
ngram_lm_scale_1.9_attention_scale_4.0	19.42
ngram_lm_scale_1.3_attention_scale_2.0	19.43
ngram_lm_scale_2.2_attention_scale_5.0	19.45
ngram_lm_scale_1.3_attention_scale_1.9	19.53
ngram_lm_scale_1.1_attention_scale_1.2	19.59
ngram_lm_scale_1.0_attention_scale_0.9	19.61
ngram_lm_scale_1.2_attention_scale_1.5	19.61
ngram_lm_scale_1.5_attention_scale_2.5	19.63
ngram_lm_scale_0.6_attention_scale_0.01	19.65
ngram_lm_scale_0.9_attention_scale_0.6	19.71
ngram_lm_scale_2.3_attention_scale_5.0	19.73
ngram_lm_scale_2.0_attention_scale_4.0	19.75
ngram_lm_scale_1.7_attention_scale_3.0	19.81
ngram_lm_scale_1.3_attention_scale_1.7	19.89
ngram_lm_scale_1.1_attention_scale_1.1	19.9
ngram_lm_scale_1.5_attention_scale_2.3	19.96
ngram_lm_scale_1.5_attention_scale_2.2	20.13
ngram_lm_scale_2.1_attention_scale_4.0	20.14
ngram_lm_scale_1.2_attention_scale_1.3	20.15
ngram_lm_scale_0.9_attention_scale_0.5	20.18
ngram_lm_scale_1.1_attention_scale_1.0	20.23
ngram_lm_scale_0.7_attention_scale_0.1	20.25
ngram_lm_scale_1.5_attention_scale_2.1	20.29
ngram_lm_scale_2.5_attention_scale_5.0	20.37
ngram_lm_scale_1.3_attention_scale_1.5	20.38
ngram_lm_scale_1.0_attention_scale_0.7	20.4
ngram_lm_scale_1.2_attention_scale_1.2	20.46
ngram_lm_scale_0.7_attention_scale_0.08	20.49
ngram_lm_scale_1.5_attention_scale_2.0	20.5
ngram_lm_scale_2.2_attention_scale_4.0	20.5
ngram_lm_scale_1.1_attention_scale_0.9	20.59
ngram_lm_scale_1.7_attention_scale_2.5	20.65
ngram_lm_scale_1.2_attention_scale_1.1	20.76
ngram_lm_scale_1.9_attention_scale_3.0	20.77
ngram_lm_scale_1.5_attention_scale_1.9	20.78
ngram_lm_scale_0.7_attention_scale_0.05	20.84
ngram_lm_scale_2.3_attention_scale_4.0	20.9
ngram_lm_scale_1.0_attention_scale_0.6	20.91
ngram_lm_scale_1.3_attention_scale_1.3	21.02
ngram_lm_scale_1.7_attention_scale_2.3	21.12
ngram_lm_scale_1.2_attention_scale_1.0	21.21
ngram_lm_scale_2.0_attention_scale_3.0	21.25
ngram_lm_scale_1.5_attention_scale_1.7	21.35
ngram_lm_scale_1.7_attention_scale_2.2	21.37
ngram_lm_scale_1.3_attention_scale_1.2	21.4
ngram_lm_scale_0.7_attention_scale_0.01	21.42
ngram_lm_scale_1.0_attention_scale_0.5	21.62
ngram_lm_scale_1.7_attention_scale_2.1	21.66
ngram_lm_scale_2.5_attention_scale_4.0	21.67
ngram_lm_scale_1.1_attention_scale_0.7	21.68
ngram_lm_scale_0.9_attention_scale_0.3	21.71
ngram_lm_scale_2.1_attention_scale_3.0	21.79
ngram_lm_scale_1.2_attention_scale_0.9	21.8
ngram_lm_scale_1.9_attention_scale_2.5	21.87
ngram_lm_scale_1.3_attention_scale_1.1	21.91
ngram_lm_scale_1.7_attention_scale_2.0	21.94
ngram_lm_scale_3.0_attention_scale_5.0	21.99
ngram_lm_scale_1.5_attention_scale_1.5	22.12
ngram_lm_scale_2.2_attention_scale_3.0	22.28
ngram_lm_scale_1.7_attention_scale_1.9	22.31
ngram_lm_scale_1.1_attention_scale_0.6	22.38
ngram_lm_scale_1.9_attention_scale_2.3	22.41
ngram_lm_scale_2.0_attention_scale_2.5	22.45
ngram_lm_scale_1.3_attention_scale_1.0	22.46
ngram_lm_scale_1.9_attention_scale_2.2	22.75
ngram_lm_scale_2.3_attention_scale_3.0	22.87
ngram_lm_scale_1.3_attention_scale_0.9	23.06
ngram_lm_scale_1.5_attention_scale_1.3	23.07
ngram_lm_scale_1.2_attention_scale_0.7	23.08
ngram_lm_scale_1.7_attention_scale_1.7	23.13
ngram_lm_scale_1.9_attention_scale_2.1	23.14
ngram_lm_scale_2.0_attention_scale_2.3	23.16
ngram_lm_scale_2.1_attention_scale_2.5	23.17
ngram_lm_scale_1.1_attention_scale_0.5	23.27
ngram_lm_scale_1.9_attention_scale_2.0	23.5
ngram_lm_scale_2.0_attention_scale_2.2	23.51
ngram_lm_scale_1.0_attention_scale_0.3	23.6
ngram_lm_scale_1.5_attention_scale_1.2	23.6
ngram_lm_scale_2.2_attention_scale_2.5	23.9
ngram_lm_scale_2.1_attention_scale_2.3	23.91
ngram_lm_scale_2.0_attention_scale_2.1	23.94
ngram_lm_scale_1.9_attention_scale_1.9	23.99
ngram_lm_scale_1.2_attention_scale_0.6	24.02
ngram_lm_scale_1.7_attention_scale_1.5	24.06
ngram_lm_scale_3.0_attention_scale_4.0	24.11
ngram_lm_scale_2.5_attention_scale_3.0	24.24
ngram_lm_scale_1.5_attention_scale_1.1	24.25
ngram_lm_scale_0.9_attention_scale_0.1	24.27
ngram_lm_scale_2.1_attention_scale_2.2	24.36
ngram_lm_scale_2.0_attention_scale_2.0	24.42
ngram_lm_scale_0.9_attention_scale_0.08	24.57
ngram_lm_scale_1.3_attention_scale_0.7	24.62
ngram_lm_scale_2.3_attention_scale_2.5	24.62
ngram_lm_scale_2.2_attention_scale_2.3	24.72
ngram_lm_scale_2.1_attention_scale_2.1	24.79
ngram_lm_scale_2.0_attention_scale_1.9	24.87
ngram_lm_scale_1.5_attention_scale_1.0	24.99
ngram_lm_scale_1.2_attention_scale_0.5	25.0
ngram_lm_scale_1.9_attention_scale_1.7	25.05
ngram_lm_scale_2.2_attention_scale_2.2	25.15
ngram_lm_scale_0.9_attention_scale_0.05	25.18
ngram_lm_scale_2.1_attention_scale_2.0	25.28
ngram_lm_scale_1.7_attention_scale_1.3	25.29
ngram_lm_scale_2.3_attention_scale_2.3	25.47
ngram_lm_scale_1.1_attention_scale_0.3	25.55
ngram_lm_scale_1.3_attention_scale_0.6	25.6
ngram_lm_scale_2.2_attention_scale_2.1	25.61
ngram_lm_scale_1.5_attention_scale_0.9	25.73
ngram_lm_scale_2.1_attention_scale_1.9	25.79
ngram_lm_scale_2.3_attention_scale_2.2	25.92
ngram_lm_scale_0.9_attention_scale_0.01	25.95
ngram_lm_scale_4.0_attention_scale_5.0	25.96
ngram_lm_scale_2.0_attention_scale_1.7	25.97
ngram_lm_scale_1.7_attention_scale_1.2	26.03
ngram_lm_scale_2.5_attention_scale_2.5	26.03
ngram_lm_scale_2.2_attention_scale_2.0	26.1
ngram_lm_scale_1.9_attention_scale_1.5	26.22
ngram_lm_scale_2.3_attention_scale_2.1	26.44
ngram_lm_scale_1.0_attention_scale_0.1	26.63
ngram_lm_scale_1.3_attention_scale_0.5	26.68
ngram_lm_scale_2.2_attention_scale_1.9	26.69
ngram_lm_scale_1.7_attention_scale_1.1	26.71
ngram_lm_scale_2.1_attention_scale_1.7	26.89
ngram_lm_scale_2.3_attention_scale_2.0	26.96
ngram_lm_scale_2.5_attention_scale_2.3	26.98
ngram_lm_scale_1.0_attention_scale_0.08	26.99
ngram_lm_scale_2.0_attention_scale_1.5	27.15
ngram_lm_scale_3.0_attention_scale_3.0	27.22
ngram_lm_scale_1.9_attention_scale_1.3	27.44
ngram_lm_scale_1.7_attention_scale_1.0	27.46
ngram_lm_scale_2.5_attention_scale_2.2	27.47
ngram_lm_scale_2.3_attention_scale_1.9	27.49
ngram_lm_scale_1.0_attention_scale_0.05	27.54
ngram_lm_scale_1.2_attention_scale_0.3	27.57
ngram_lm_scale_1.5_attention_scale_0.7	27.67
ngram_lm_scale_2.2_attention_scale_1.7	27.75
ngram_lm_scale_2.5_attention_scale_2.1	27.96
ngram_lm_scale_2.1_attention_scale_1.5	28.08
ngram_lm_scale_1.9_attention_scale_1.2	28.17
ngram_lm_scale_1.0_attention_scale_0.01	28.32
ngram_lm_scale_1.7_attention_scale_0.9	28.38
ngram_lm_scale_2.5_attention_scale_2.0	28.4
ngram_lm_scale_2.0_attention_scale_1.3	28.5
ngram_lm_scale_2.3_attention_scale_1.7	28.57
ngram_lm_scale_4.0_attention_scale_4.0	28.74
ngram_lm_scale_1.5_attention_scale_0.6	28.79
ngram_lm_scale_1.1_attention_scale_0.1	28.81
ngram_lm_scale_1.9_attention_scale_1.1	28.98
ngram_lm_scale_2.2_attention_scale_1.5	28.99
ngram_lm_scale_2.5_attention_scale_1.9	29.0
ngram_lm_scale_1.1_attention_scale_0.08	29.2
ngram_lm_scale_2.0_attention_scale_1.2	29.21
ngram_lm_scale_3.0_attention_scale_2.5	29.36
ngram_lm_scale_2.1_attention_scale_1.3	29.42
ngram_lm_scale_1.3_attention_scale_0.3	29.49
ngram_lm_scale_5.0_attention_scale_5.0	29.66
ngram_lm_scale_2.3_attention_scale_1.5	29.78
ngram_lm_scale_1.1_attention_scale_0.05	29.79
ngram_lm_scale_1.9_attention_scale_1.0	29.8
ngram_lm_scale_2.0_attention_scale_1.1	30.02
ngram_lm_scale_1.5_attention_scale_0.5	30.04
ngram_lm_scale_2.5_attention_scale_1.7	30.09
ngram_lm_scale_3.0_attention_scale_2.3	30.14
ngram_lm_scale_2.1_attention_scale_1.2	30.19
ngram_lm_scale_2.2_attention_scale_1.3	30.32
ngram_lm_scale_1.7_attention_scale_0.7	30.36
ngram_lm_scale_1.1_attention_scale_0.01	30.53
ngram_lm_scale_3.0_attention_scale_2.2	30.68
ngram_lm_scale_1.9_attention_scale_0.9	30.76
ngram_lm_scale_1.2_attention_scale_0.1	30.81
ngram_lm_scale_2.0_attention_scale_1.0	30.87
ngram_lm_scale_2.1_attention_scale_1.1	31.02
ngram_lm_scale_2.2_attention_scale_1.2	31.09
ngram_lm_scale_3.0_attention_scale_2.1	31.12
ngram_lm_scale_2.3_attention_scale_1.3	31.18
ngram_lm_scale_1.2_attention_scale_0.08	31.2
ngram_lm_scale_2.5_attention_scale_1.5	31.35
ngram_lm_scale_1.7_attention_scale_0.6	31.49
ngram_lm_scale_3.0_attention_scale_2.0	31.62
ngram_lm_scale_2.0_attention_scale_0.9	31.75
ngram_lm_scale_1.2_attention_scale_0.05	31.77
ngram_lm_scale_2.1_attention_scale_1.0	31.78
ngram_lm_scale_2.2_attention_scale_1.1	31.87
ngram_lm_scale_2.3_attention_scale_1.2	31.91
ngram_lm_scale_4.0_attention_scale_3.0	32.04
ngram_lm_scale_3.0_attention_scale_1.9	32.1
ngram_lm_scale_5.0_attention_scale_4.0	32.25
ngram_lm_scale_1.2_attention_scale_0.01	32.41
ngram_lm_scale_1.5_attention_scale_0.3	32.5
ngram_lm_scale_1.3_attention_scale_0.1	32.52
ngram_lm_scale_1.7_attention_scale_0.5	32.55
ngram_lm_scale_2.2_attention_scale_1.0	32.59
ngram_lm_scale_2.3_attention_scale_1.1	32.59
ngram_lm_scale_1.9_attention_scale_0.7	32.6
ngram_lm_scale_2.1_attention_scale_0.9	32.61
ngram_lm_scale_2.5_attention_scale_1.3	32.63
ngram_lm_scale_1.3_attention_scale_0.08	32.75
ngram_lm_scale_3.0_attention_scale_1.7	33.19
ngram_lm_scale_2.5_attention_scale_1.2	33.22
ngram_lm_scale_1.3_attention_scale_0.05	33.24
ngram_lm_scale_2.3_attention_scale_1.0	33.26
ngram_lm_scale_2.2_attention_scale_0.9	33.29
ngram_lm_scale_2.0_attention_scale_0.7	33.42
ngram_lm_scale_1.9_attention_scale_0.6	33.48
ngram_lm_scale_4.0_attention_scale_2.5	33.7
ngram_lm_scale_1.3_attention_scale_0.01	33.83
ngram_lm_scale_2.5_attention_scale_1.1	33.86
ngram_lm_scale_2.3_attention_scale_0.9	33.99
ngram_lm_scale_3.0_attention_scale_1.5	34.09
ngram_lm_scale_2.1_attention_scale_0.7	34.14
ngram_lm_scale_2.0_attention_scale_0.6	34.27
ngram_lm_scale_1.9_attention_scale_0.5	34.36
ngram_lm_scale_4.0_attention_scale_2.3	34.42
ngram_lm_scale_2.5_attention_scale_1.0	34.49
ngram_lm_scale_1.7_attention_scale_0.3	34.54
ngram_lm_scale_4.0_attention_scale_2.2	34.77
ngram_lm_scale_2.2_attention_scale_0.7	34.78
ngram_lm_scale_5.0_attention_scale_3.0	34.81
ngram_lm_scale_1.5_attention_scale_0.1	34.85
ngram_lm_scale_2.1_attention_scale_0.6	34.88
ngram_lm_scale_2.0_attention_scale_0.5	35.01
ngram_lm_scale_1.5_attention_scale_0.08	35.03
ngram_lm_scale_2.5_attention_scale_0.9	35.03
ngram_lm_scale_3.0_attention_scale_1.3	35.07
ngram_lm_scale_4.0_attention_scale_2.1	35.08
ngram_lm_scale_2.3_attention_scale_0.7	35.29
ngram_lm_scale_1.5_attention_scale_0.05	35.31
ngram_lm_scale_4.0_attention_scale_2.0	35.35
ngram_lm_scale_2.2_attention_scale_0.6	35.4
ngram_lm_scale_3.0_attention_scale_1.2	35.48
ngram_lm_scale_2.1_attention_scale_0.5	35.53
ngram_lm_scale_4.0_attention_scale_1.9	35.66
ngram_lm_scale_1.5_attention_scale_0.01	35.74
ngram_lm_scale_2.3_attention_scale_0.6	35.79
ngram_lm_scale_1.9_attention_scale_0.3	35.84
ngram_lm_scale_3.0_attention_scale_1.1	35.86
ngram_lm_scale_2.2_attention_scale_0.5	35.95
ngram_lm_scale_5.0_attention_scale_2.5	35.98
ngram_lm_scale_2.5_attention_scale_0.7	36.0
ngram_lm_scale_1.7_attention_scale_0.1	36.21
ngram_lm_scale_4.0_attention_scale_1.7	36.23
ngram_lm_scale_3.0_attention_scale_1.0	36.26
ngram_lm_scale_2.0_attention_scale_0.3	36.33
ngram_lm_scale_2.3_attention_scale_0.5	36.33
ngram_lm_scale_1.7_attention_scale_0.08	36.39
ngram_lm_scale_5.0_attention_scale_2.3	36.42
ngram_lm_scale_2.5_attention_scale_0.6	36.49
ngram_lm_scale_5.0_attention_scale_2.2	36.59
ngram_lm_scale_1.7_attention_scale_0.05	36.61
ngram_lm_scale_3.0_attention_scale_0.9	36.61
ngram_lm_scale_2.1_attention_scale_0.3	36.73
ngram_lm_scale_4.0_attention_scale_1.5	36.75
ngram_lm_scale_5.0_attention_scale_2.1	36.84
ngram_lm_scale_1.7_attention_scale_0.01	36.86
ngram_lm_scale_2.5_attention_scale_0.5	37.0
ngram_lm_scale_2.2_attention_scale_0.3	37.05
ngram_lm_scale_5.0_attention_scale_2.0	37.05
ngram_lm_scale_1.9_attention_scale_0.1	37.17
ngram_lm_scale_5.0_attention_scale_1.9	37.27
ngram_lm_scale_1.9_attention_scale_0.08	37.3
ngram_lm_scale_4.0_attention_scale_1.3	37.32
ngram_lm_scale_2.3_attention_scale_0.3	37.39
ngram_lm_scale_3.0_attention_scale_0.7	37.39
ngram_lm_scale_1.9_attention_scale_0.05	37.48
ngram_lm_scale_2.0_attention_scale_0.1	37.52
ngram_lm_scale_4.0_attention_scale_1.2	37.56
ngram_lm_scale_2.0_attention_scale_0.08	37.63
ngram_lm_scale_5.0_attention_scale_1.7	37.66
ngram_lm_scale_1.9_attention_scale_0.01	37.72
ngram_lm_scale_2.1_attention_scale_0.1	37.78
ngram_lm_scale_3.0_attention_scale_0.6	37.79
ngram_lm_scale_2.0_attention_scale_0.05	37.8
ngram_lm_scale_4.0_attention_scale_1.1	37.82
ngram_lm_scale_2.1_attention_scale_0.08	37.87
ngram_lm_scale_2.5_attention_scale_0.3	37.91
ngram_lm_scale_2.0_attention_scale_0.01	37.99
ngram_lm_scale_2.1_attention_scale_0.05	38.03
ngram_lm_scale_5.0_attention_scale_1.5	38.04
ngram_lm_scale_2.2_attention_scale_0.1	38.05
ngram_lm_scale_4.0_attention_scale_1.0	38.11
ngram_lm_scale_3.0_attention_scale_0.5	38.13
ngram_lm_scale_2.2_attention_scale_0.08	38.16
ngram_lm_scale_2.1_attention_scale_0.01	38.23
ngram_lm_scale_2.3_attention_scale_0.1	38.27
ngram_lm_scale_2.2_attention_scale_0.05	38.28
ngram_lm_scale_4.0_attention_scale_0.9	38.33
ngram_lm_scale_2.3_attention_scale_0.08	38.35
ngram_lm_scale_5.0_attention_scale_1.3	38.41
ngram_lm_scale_2.2_attention_scale_0.01	38.45
ngram_lm_scale_2.3_attention_scale_0.05	38.47
ngram_lm_scale_5.0_attention_scale_1.2	38.58
ngram_lm_scale_2.5_attention_scale_0.1	38.61
ngram_lm_scale_2.3_attention_scale_0.01	38.65
ngram_lm_scale_2.5_attention_scale_0.08	38.68
ngram_lm_scale_3.0_attention_scale_0.3	38.72
ngram_lm_scale_5.0_attention_scale_1.1	38.73
ngram_lm_scale_2.5_attention_scale_0.05	38.77
ngram_lm_scale_4.0_attention_scale_0.7	38.77
ngram_lm_scale_5.0_attention_scale_1.0	38.9
ngram_lm_scale_2.5_attention_scale_0.01	38.93
ngram_lm_scale_4.0_attention_scale_0.6	38.94
ngram_lm_scale_5.0_attention_scale_0.9	39.04
ngram_lm_scale_4.0_attention_scale_0.5	39.11
ngram_lm_scale_3.0_attention_scale_0.1	39.23
ngram_lm_scale_3.0_attention_scale_0.08	39.3
ngram_lm_scale_5.0_attention_scale_0.7	39.33
ngram_lm_scale_3.0_attention_scale_0.05	39.38
ngram_lm_scale_5.0_attention_scale_0.6	39.44
ngram_lm_scale_3.0_attention_scale_0.01	39.48
ngram_lm_scale_4.0_attention_scale_0.3	39.49
ngram_lm_scale_5.0_attention_scale_0.5	39.59
ngram_lm_scale_4.0_attention_scale_0.1	39.91
ngram_lm_scale_4.0_attention_scale_0.08	39.94
ngram_lm_scale_5.0_attention_scale_0.3	39.95
ngram_lm_scale_4.0_attention_scale_0.05	39.98
ngram_lm_scale_4.0_attention_scale_0.01	40.02
ngram_lm_scale_5.0_attention_scale_0.1	40.22
ngram_lm_scale_5.0_attention_scale_0.08	40.25
ngram_lm_scale_5.0_attention_scale_0.05	40.3
ngram_lm_scale_5.0_attention_scale_0.01	40.35

2022-06-27 01:34:55,856 INFO [decode.py:695] Done!
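
[Editor's note] Each row key in the tables above is one point on a two-dimensional grid: for every n-best path, the acoustic score is combined with the 4-gram LM score and the attention-decoder score as am + ngram_lm_scale * lm + attention_scale * attention, and the (ngram_lm_scale, attention_scale) pair with the lowest dev WER is the one to apply on test. A minimal sketch of that sweep, assuming per-path score tensors (hypothetical names) have already been extracted from the lattice:

```python
# Sketch of the (ngram_lm_scale, attention_scale) sweep behind the
# tables above; am_scores, ngram_lm_scores and attention_scores are
# hypothetical 1-D tensors of per-path scores for one utterance.
import torch

SCALES = [0.01, 0.05, 0.08, 0.1, 0.3, 0.5, 0.6, 0.7, 0.9, 1.0, 1.1,
          1.2, 1.3, 1.5, 1.7, 1.9, 2.0, 2.1, 2.2, 2.3, 2.5, 3.0, 4.0, 5.0]


def pick_paths(am_scores, ngram_lm_scores, attention_scores):
    """Return {setting_key: index of the highest-scoring path}."""
    ans = {}
    for lm_scale in SCALES:
        for att_scale in SCALES:
            tot = (am_scores
                   + lm_scale * ngram_lm_scores
                   + att_scale * attention_scores)
            key = f"ngram_lm_scale_{lm_scale}_attention_scale_{att_scale}"
            ans[key] = int(tot.argmax())
    return ans
```

The 24 x 24 grid gives exactly the 576 rows of the dev table above; WER is then computed per key, which is how the sorted lists in these logs are produced.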
decoding-results/log-attention-decoder/log-decode-2022-06-24-17-22-16
ADDED
@@ -0,0 +1,1428 @@
2022-06-24 17:22:16,163 INFO [decode.py:548] Decoding started
2022-06-24 17:22:16,164 INFO [decode.py:549] {'subsampling_factor': 4, 'vgg_frontend': False, 'use_feat_batchnorm': True, 'feature_dim': 80, 'nhead': 8, 'attention_dim': 512, 'num_decoder_layers': 6, 'search_beam': 20, 'output_beam': 8, 'min_active_states': 30, 'max_active_states': 10000, 'use_double_scores': True, 'env_info': {'k2-version': '1.11', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '', 'k2-git-date': '', 'lhotse-version': '1.3.0.dev+git.a07121a.clean', 'torch-cuda-available': False, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'test', 'icefall-git-sha1': '7e72d78-dirty', 'icefall-git-date': 'Sat May 28 19:13:53 2022', 'icefall-path': '/alt-arabic/speech/amir/k2/tmp/icefall', 'k2-path': '/home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/k2/__init__.py', 'lhotse-path': '/alt-arabic/speech/amir/k2/tmp/lhotse/lhotse/__init__.py', 'hostname': 'crimv3srv026', 'IP address': '10.141.0.21'}, 'epoch': 39, 'avg': 10, 'method': 'attention-decoder', 'num_paths': 1000, 'nbest_scale': 0.5, 'exp_dir': PosixPath('conformer_ctc/exp_5000_att0.8'), 'lang_dir': PosixPath('data/lang_bpe_5000'), 'lm_dir': PosixPath('data/lm'), 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 100, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': False, 'drop_last': True, 'return_cuts': True, 'num_workers': 20, 'enable_spec_aug': False, 'spec_aug_time_warp_factor': 80, 'enable_musan': False}
2022-06-24 17:22:16,482 INFO [lexicon.py:177] Loading pre-compiled data/lang_bpe_5000/Linv.pt
2022-06-24 17:22:16,882 INFO [decode.py:559] device: cpu
2022-06-24 17:22:40,122 INFO [decode.py:621] Loading pre-compiled G_4_gram.pt
2022-06-24 17:22:46,616 INFO [decode.py:657] averaging ['conformer_ctc/exp_5000_att0.8/epoch-30.pt', 'conformer_ctc/exp_5000_att0.8/epoch-31.pt', 'conformer_ctc/exp_5000_att0.8/epoch-32.pt', 'conformer_ctc/exp_5000_att0.8/epoch-33.pt', 'conformer_ctc/exp_5000_att0.8/epoch-34.pt', 'conformer_ctc/exp_5000_att0.8/epoch-35.pt', 'conformer_ctc/exp_5000_att0.8/epoch-36.pt', 'conformer_ctc/exp_5000_att0.8/epoch-37.pt', 'conformer_ctc/exp_5000_att0.8/epoch-38.pt', 'conformer_ctc/exp_5000_att0.8/epoch-39.pt']
2022-06-24 17:26:25,489 INFO [decode.py:664] Number of model parameters: 90786736
2022-06-24 17:26:25,489 INFO [asr_datamodule.py:374] About to get test cuts
2022-06-24 17:26:25,524 INFO [asr_datamodule.py:367] About to get dev cuts
2022-06-24 17:29:23,798 INFO [decode.py:483] batch 0/?, cuts processed until now is 13
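[Editor's note] The `'epoch': 39, 'avg': 10` setting above corresponds to the decode.py:657 "averaging" line: the weights of epochs 30-39 are averaged element-wise before decoding. A minimal sketch of checkpoint averaging, assuming each epoch-*.pt stores its state dict under a "model" key (as icefall checkpoints do); icefall's own average_checkpoints may differ in detail:

```python
# Sketch of element-wise checkpoint averaging for --epoch 39 --avg 10.
import torch


def average_checkpoints(filenames):
    n = len(filenames)
    avg = torch.load(filenames[0], map_location="cpu")["model"]
    for f in filenames[1:]:
        state = torch.load(f, map_location="cpu")["model"]
        for k in avg:
            avg[k] += state[k]
    for k in avg:
        if avg[k].is_floating_point():
            avg[k] /= n
        else:
            avg[k] //= n  # integer buffers use floor division
    return avg
```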
2022-06-24 18:44:36,966 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-24 18:44:36,969 INFO [decode.py:734] num_arcs before pruning: 3793830
2022-06-24 18:44:36,969 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-24 18:44:37,490 INFO [decode.py:747] num_arcs after pruning: 15363
2022-06-24 18:49:44,859 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-24 18:49:44,860 INFO [decode.py:734] num_arcs before pruning: 7793289
2022-06-24 18:49:44,860 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-24 18:49:45,947 INFO [decode.py:747] num_arcs after pruning: 21157
2022-06-24 19:57:06,281 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-24 19:57:06,283 INFO [decode.py:734] num_arcs before pruning: 4231973
2022-06-24 19:57:06,284 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-24 19:57:06,903 INFO [decode.py:747] num_arcs after pruning: 24242
2022-06-25 00:20:28,938 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-25 00:20:28,940 INFO [decode.py:734] num_arcs before pruning: 6579893
2022-06-25 00:20:28,940 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-25 00:20:29,808 INFO [decode.py:747] num_arcs after pruning: 22371
2022-06-25 00:28:47,575 INFO [decode.py:483] batch 100/?, cuts processed until now is 1483
2022-06-25 06:00:11,280 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-25 06:00:11,282 INFO [decode.py:734] num_arcs before pruning: 7814892
2022-06-25 06:00:11,282 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-25 06:00:12,279 INFO [decode.py:747] num_arcs after pruning: 23294
2022-06-25 07:10:42,039 INFO [decode.py:483] batch 200/?, cuts processed until now is 3045
2022-06-25 08:39:07,227 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-25 08:39:07,231 INFO [decode.py:734] num_arcs before pruning: 5093346
2022-06-25 08:39:07,231 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-25 08:39:08,033 INFO [decode.py:747] num_arcs after pruning: 21764
2022-06-25 09:11:54,803 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-25 09:11:54,804 INFO [decode.py:734] num_arcs before pruning: 10821654
2022-06-25 09:11:54,804 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-25 09:11:56,386 INFO [decode.py:747] num_arcs after pruning: 28715
2022-06-25 14:37:16,871 INFO [decode.py:483] batch 300/?, cuts processed until now is 4598
2022-06-25 14:58:22,045 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-25 14:58:22,046 INFO [decode.py:734] num_arcs before pruning: 5552108
2022-06-25 14:58:22,046 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-25 14:58:22,815 INFO [decode.py:747] num_arcs after pruning: 21203
2022-06-25 17:28:28,538 INFO [decode.py:532]
For test, WER of different settings are:
ngram_lm_scale_0.01_attention_scale_0.5	15.07	best for test
ngram_lm_scale_0.05_attention_scale_0.5	15.07
ngram_lm_scale_0.01_attention_scale_0.3	15.08
ngram_lm_scale_0.01_attention_scale_0.7	15.08
ngram_lm_scale_0.08_attention_scale_0.5	15.08
ngram_lm_scale_0.01_attention_scale_0.6	15.09
ngram_lm_scale_0.05_attention_scale_0.6	15.09
ngram_lm_scale_0.05_attention_scale_0.3	15.1
ngram_lm_scale_0.01_attention_scale_0.9	15.11
ngram_lm_scale_0.05_attention_scale_0.7	15.11
ngram_lm_scale_0.08_attention_scale_0.3	15.11
ngram_lm_scale_0.08_attention_scale_0.6	15.11
ngram_lm_scale_0.08_attention_scale_0.7	15.11
ngram_lm_scale_0.1_attention_scale_0.5	15.11
ngram_lm_scale_0.1_attention_scale_0.3	15.12
ngram_lm_scale_0.1_attention_scale_0.6	15.12
ngram_lm_scale_0.01_attention_scale_1.0	15.13
ngram_lm_scale_0.05_attention_scale_0.9	15.13
ngram_lm_scale_0.1_attention_scale_0.7	15.13
ngram_lm_scale_0.01_attention_scale_0.1	15.14
ngram_lm_scale_0.01_attention_scale_1.1	15.14
ngram_lm_scale_0.05_attention_scale_1.0	15.14
ngram_lm_scale_0.08_attention_scale_0.9	15.14
ngram_lm_scale_0.1_attention_scale_0.9	15.14
ngram_lm_scale_0.01_attention_scale_1.2	15.16
ngram_lm_scale_0.05_attention_scale_0.1	15.16
ngram_lm_scale_0.08_attention_scale_1.0	15.16
ngram_lm_scale_0.05_attention_scale_1.1	15.17
ngram_lm_scale_0.01_attention_scale_0.08	15.18
ngram_lm_scale_0.01_attention_scale_1.3	15.18
ngram_lm_scale_0.05_attention_scale_0.08	15.18
ngram_lm_scale_0.08_attention_scale_0.08	15.18
ngram_lm_scale_0.08_attention_scale_0.1	15.18
ngram_lm_scale_0.1_attention_scale_1.0	15.18
ngram_lm_scale_0.05_attention_scale_1.2	15.19
ngram_lm_scale_0.1_attention_scale_0.1	15.19
ngram_lm_scale_0.08_attention_scale_1.1	15.2
ngram_lm_scale_0.05_attention_scale_1.3	15.21
ngram_lm_scale_0.1_attention_scale_0.08	15.21
ngram_lm_scale_0.1_attention_scale_1.1	15.21
ngram_lm_scale_0.01_attention_scale_0.05	15.22
ngram_lm_scale_0.05_attention_scale_0.05	15.22
ngram_lm_scale_0.08_attention_scale_1.2	15.22
ngram_lm_scale_0.01_attention_scale_1.5	15.23
ngram_lm_scale_0.08_attention_scale_0.05	15.24
ngram_lm_scale_0.1_attention_scale_1.2	15.24
ngram_lm_scale_0.05_attention_scale_1.5	15.25
ngram_lm_scale_0.08_attention_scale_1.3	15.25
ngram_lm_scale_0.1_attention_scale_0.05	15.25
ngram_lm_scale_0.01_attention_scale_1.7	15.26
ngram_lm_scale_0.1_attention_scale_1.3	15.26
ngram_lm_scale_0.01_attention_scale_0.01	15.27
ngram_lm_scale_0.05_attention_scale_1.7	15.28
ngram_lm_scale_0.08_attention_scale_1.5	15.28
ngram_lm_scale_0.1_attention_scale_1.5	15.28
ngram_lm_scale_0.01_attention_scale_1.9	15.29
ngram_lm_scale_0.01_attention_scale_2.0	15.29
ngram_lm_scale_0.01_attention_scale_2.1	15.29
ngram_lm_scale_0.05_attention_scale_1.9	15.29
ngram_lm_scale_0.08_attention_scale_0.01	15.29
ngram_lm_scale_0.08_attention_scale_1.7	15.29
ngram_lm_scale_0.1_attention_scale_0.01	15.29
ngram_lm_scale_0.1_attention_scale_1.7	15.29
ngram_lm_scale_0.3_attention_scale_0.6	15.29
ngram_lm_scale_0.3_attention_scale_0.7	15.29
ngram_lm_scale_0.01_attention_scale_2.2	15.3
ngram_lm_scale_0.05_attention_scale_0.01	15.3
ngram_lm_scale_0.01_attention_scale_2.3	15.31
ngram_lm_scale_0.05_attention_scale_2.0	15.31
ngram_lm_scale_0.05_attention_scale_2.1	15.31
ngram_lm_scale_0.05_attention_scale_2.2	15.31
ngram_lm_scale_0.08_attention_scale_1.9	15.31
ngram_lm_scale_0.08_attention_scale_2.0	15.31
ngram_lm_scale_0.3_attention_scale_0.5	15.31
ngram_lm_scale_0.3_attention_scale_0.9	15.31
ngram_lm_scale_0.1_attention_scale_1.9	15.32
ngram_lm_scale_0.3_attention_scale_1.0	15.32
ngram_lm_scale_0.01_attention_scale_2.5	15.33
ngram_lm_scale_0.05_attention_scale_2.3	15.33
ngram_lm_scale_0.05_attention_scale_2.5	15.33
ngram_lm_scale_0.08_attention_scale_2.1	15.33
ngram_lm_scale_0.1_attention_scale_2.0	15.33
ngram_lm_scale_0.1_attention_scale_2.2	15.33
ngram_lm_scale_0.3_attention_scale_1.1	15.33
ngram_lm_scale_0.08_attention_scale_2.2	15.34
ngram_lm_scale_0.08_attention_scale_2.3	15.34
ngram_lm_scale_0.1_attention_scale_2.1	15.34
ngram_lm_scale_0.1_attention_scale_2.3	15.35
ngram_lm_scale_0.3_attention_scale_1.2	15.35
ngram_lm_scale_0.01_attention_scale_3.0	15.36
ngram_lm_scale_0.08_attention_scale_2.5	15.36
ngram_lm_scale_0.1_attention_scale_2.5	15.37
ngram_lm_scale_0.3_attention_scale_1.3	15.37
ngram_lm_scale_0.05_attention_scale_3.0	15.38
ngram_lm_scale_0.3_attention_scale_0.3	15.38
ngram_lm_scale_0.08_attention_scale_3.0	15.39
ngram_lm_scale_0.1_attention_scale_3.0	15.4
ngram_lm_scale_0.3_attention_scale_1.5	15.41
ngram_lm_scale_0.01_attention_scale_4.0	15.43
ngram_lm_scale_0.3_attention_scale_1.7	15.43
ngram_lm_scale_0.05_attention_scale_4.0	15.45
ngram_lm_scale_0.08_attention_scale_4.0	15.45
ngram_lm_scale_0.3_attention_scale_1.9	15.45
ngram_lm_scale_0.3_attention_scale_2.0	15.45
ngram_lm_scale_0.3_attention_scale_2.1	15.45
ngram_lm_scale_0.1_attention_scale_4.0	15.46
ngram_lm_scale_0.3_attention_scale_2.2	15.46
ngram_lm_scale_0.3_attention_scale_2.3	15.47
ngram_lm_scale_0.01_attention_scale_5.0	15.48
ngram_lm_scale_0.05_attention_scale_5.0	15.49
ngram_lm_scale_0.3_attention_scale_2.5	15.49
ngram_lm_scale_0.08_attention_scale_5.0	15.51
ngram_lm_scale_0.1_attention_scale_5.0	15.51
ngram_lm_scale_0.3_attention_scale_3.0	15.52
ngram_lm_scale_0.3_attention_scale_0.1	15.55
ngram_lm_scale_0.3_attention_scale_4.0	15.57
ngram_lm_scale_0.3_attention_scale_0.08	15.59
ngram_lm_scale_0.3_attention_scale_5.0	15.61
ngram_lm_scale_0.5_attention_scale_1.1	15.64
ngram_lm_scale_0.5_attention_scale_1.2	15.65
ngram_lm_scale_0.5_attention_scale_1.5	15.65
ngram_lm_scale_0.5_attention_scale_1.9	15.65
ngram_lm_scale_0.5_attention_scale_2.0	15.65
ngram_lm_scale_0.5_attention_scale_2.1	15.65
ngram_lm_scale_0.5_attention_scale_2.5	15.65
ngram_lm_scale_0.5_attention_scale_0.9	15.66
ngram_lm_scale_0.5_attention_scale_1.0	15.66
ngram_lm_scale_0.5_attention_scale_1.3	15.66
ngram_lm_scale_0.5_attention_scale_1.7	15.66
ngram_lm_scale_0.5_attention_scale_2.2	15.66
ngram_lm_scale_0.5_attention_scale_2.3	15.66
ngram_lm_scale_0.3_attention_scale_0.05	15.67
ngram_lm_scale_0.5_attention_scale_3.0	15.69
|
293 |
+
ngram_lm_scale_0.5_attention_scale_0.7 15.72
|
294 |
+
ngram_lm_scale_0.5_attention_scale_0.6 15.74
|
295 |
+
ngram_lm_scale_0.5_attention_scale_4.0 15.74
|
296 |
+
ngram_lm_scale_0.5_attention_scale_5.0 15.76
|
297 |
+
ngram_lm_scale_0.3_attention_scale_0.01 15.78
|
298 |
+
ngram_lm_scale_0.6_attention_scale_2.0 15.79
|
299 |
+
ngram_lm_scale_0.6_attention_scale_2.2 15.79
|
300 |
+
ngram_lm_scale_0.5_attention_scale_0.5 15.8
|
301 |
+
ngram_lm_scale_0.6_attention_scale_1.9 15.8
|
302 |
+
ngram_lm_scale_0.6_attention_scale_2.1 15.8
|
303 |
+
ngram_lm_scale_0.6_attention_scale_2.3 15.8
|
304 |
+
ngram_lm_scale_0.6_attention_scale_2.5 15.8
|
305 |
+
ngram_lm_scale_0.6_attention_scale_1.7 15.81
|
306 |
+
ngram_lm_scale_0.6_attention_scale_1.5 15.82
|
307 |
+
ngram_lm_scale_0.6_attention_scale_3.0 15.82
|
308 |
+
ngram_lm_scale_0.6_attention_scale_5.0 15.82
|
309 |
+
ngram_lm_scale_0.6_attention_scale_1.2 15.83
|
310 |
+
ngram_lm_scale_0.6_attention_scale_1.3 15.84
|
311 |
+
ngram_lm_scale_0.6_attention_scale_4.0 15.84
|
312 |
+
ngram_lm_scale_0.6_attention_scale_1.0 15.85
|
313 |
+
ngram_lm_scale_0.6_attention_scale_1.1 15.85
|
314 |
+
ngram_lm_scale_0.6_attention_scale_0.9 15.87
|
315 |
+
ngram_lm_scale_0.7_attention_scale_5.0 15.9
|
316 |
+
ngram_lm_scale_0.7_attention_scale_2.5 15.91
|
317 |
+
ngram_lm_scale_0.7_attention_scale_4.0 15.91
|
318 |
+
ngram_lm_scale_0.7_attention_scale_3.0 15.92
|
319 |
+
ngram_lm_scale_0.7_attention_scale_2.2 15.93
|
320 |
+
ngram_lm_scale_0.7_attention_scale_2.3 15.93
|
321 |
+
ngram_lm_scale_0.7_attention_scale_2.0 15.96
|
322 |
+
ngram_lm_scale_0.7_attention_scale_2.1 15.97
|
323 |
+
ngram_lm_scale_0.6_attention_scale_0.7 15.98
|
324 |
+
ngram_lm_scale_0.7_attention_scale_1.7 15.98
|
325 |
+
ngram_lm_scale_0.7_attention_scale_1.9 15.98
|
326 |
+
ngram_lm_scale_0.5_attention_scale_0.3 16.0
|
327 |
+
ngram_lm_scale_0.7_attention_scale_1.5 16.01
|
328 |
+
ngram_lm_scale_0.9_attention_scale_5.0 16.05
|
329 |
+
ngram_lm_scale_0.7_attention_scale_1.3 16.06
|
330 |
+
ngram_lm_scale_0.6_attention_scale_0.6 16.07
|
331 |
+
ngram_lm_scale_0.7_attention_scale_1.2 16.08
|
332 |
+
ngram_lm_scale_0.9_attention_scale_4.0 16.1
|
333 |
+
ngram_lm_scale_0.7_attention_scale_1.1 16.12
|
334 |
+
ngram_lm_scale_0.7_attention_scale_1.0 16.15
|
335 |
+
ngram_lm_scale_1.0_attention_scale_5.0 16.17
|
336 |
+
ngram_lm_scale_0.9_attention_scale_3.0 16.2
|
337 |
+
ngram_lm_scale_0.7_attention_scale_0.9 16.22
|
338 |
+
ngram_lm_scale_1.1_attention_scale_5.0 16.22
|
339 |
+
ngram_lm_scale_1.0_attention_scale_4.0 16.23
|
340 |
+
ngram_lm_scale_0.6_attention_scale_0.5 16.24
|
341 |
+
ngram_lm_scale_0.9_attention_scale_2.5 16.28
|
342 |
+
ngram_lm_scale_0.9_attention_scale_2.2 16.3
|
343 |
+
ngram_lm_scale_0.9_attention_scale_2.3 16.31
|
344 |
+
ngram_lm_scale_0.9_attention_scale_2.1 16.32
|
345 |
+
ngram_lm_scale_0.9_attention_scale_2.0 16.34
|
346 |
+
ngram_lm_scale_1.0_attention_scale_3.0 16.35
|
347 |
+
ngram_lm_scale_1.1_attention_scale_4.0 16.35
|
348 |
+
ngram_lm_scale_1.2_attention_scale_5.0 16.35
|
349 |
+
ngram_lm_scale_0.9_attention_scale_1.9 16.4
|
350 |
+
ngram_lm_scale_0.7_attention_scale_0.7 16.43
|
351 |
+
ngram_lm_scale_1.0_attention_scale_2.5 16.44
|
352 |
+
ngram_lm_scale_0.9_attention_scale_1.7 16.46
|
353 |
+
ngram_lm_scale_1.2_attention_scale_4.0 16.47
|
354 |
+
ngram_lm_scale_1.3_attention_scale_5.0 16.48
|
355 |
+
ngram_lm_scale_1.0_attention_scale_2.3 16.5
|
356 |
+
ngram_lm_scale_1.1_attention_scale_3.0 16.51
|
357 |
+
ngram_lm_scale_1.0_attention_scale_2.2 16.53
|
358 |
+
ngram_lm_scale_1.0_attention_scale_2.1 16.55
|
359 |
+
ngram_lm_scale_0.7_attention_scale_0.6 16.59
|
360 |
+
ngram_lm_scale_0.9_attention_scale_1.5 16.59
|
361 |
+
ngram_lm_scale_1.0_attention_scale_2.0 16.6
|
362 |
+
ngram_lm_scale_1.3_attention_scale_4.0 16.61
|
363 |
+
ngram_lm_scale_0.5_attention_scale_0.1 16.64
|
364 |
+
ngram_lm_scale_1.1_attention_scale_2.5 16.66
|
365 |
+
ngram_lm_scale_0.6_attention_scale_0.3 16.67
|
366 |
+
ngram_lm_scale_1.0_attention_scale_1.9 16.67
|
367 |
+
ngram_lm_scale_1.2_attention_scale_3.0 16.7
|
368 |
+
ngram_lm_scale_1.5_attention_scale_5.0 16.7
|
369 |
+
ngram_lm_scale_0.5_attention_scale_0.08 16.75
|
370 |
+
ngram_lm_scale_0.9_attention_scale_1.3 16.75
|
371 |
+
ngram_lm_scale_1.1_attention_scale_2.3 16.77
|
372 |
+
ngram_lm_scale_0.7_attention_scale_0.5 16.81
|
373 |
+
ngram_lm_scale_1.0_attention_scale_1.7 16.82
|
374 |
+
ngram_lm_scale_1.1_attention_scale_2.2 16.85
|
375 |
+
ngram_lm_scale_0.9_attention_scale_1.2 16.87
|
376 |
+
ngram_lm_scale_1.1_attention_scale_2.1 16.9
|
377 |
+
ngram_lm_scale_0.5_attention_scale_0.05 16.91
|
378 |
+
ngram_lm_scale_1.3_attention_scale_3.0 16.96
|
379 |
+
ngram_lm_scale_1.2_attention_scale_2.5 16.97
|
380 |
+
ngram_lm_scale_1.5_attention_scale_4.0 16.98
|
381 |
+
ngram_lm_scale_1.1_attention_scale_2.0 17.0
|
382 |
+
ngram_lm_scale_1.7_attention_scale_5.0 17.01
|
383 |
+
ngram_lm_scale_0.9_attention_scale_1.1 17.05
|
384 |
+
ngram_lm_scale_1.0_attention_scale_1.5 17.05
|
385 |
+
ngram_lm_scale_1.1_attention_scale_1.9 17.08
|
386 |
+
ngram_lm_scale_1.2_attention_scale_2.3 17.12
|
387 |
+
ngram_lm_scale_0.5_attention_scale_0.01 17.16
|
388 |
+
ngram_lm_scale_1.2_attention_scale_2.2 17.2
|
389 |
+
ngram_lm_scale_0.9_attention_scale_1.0 17.21
|
390 |
+
ngram_lm_scale_1.1_attention_scale_1.7 17.27
|
391 |
+
ngram_lm_scale_1.2_attention_scale_2.1 17.27
|
392 |
+
ngram_lm_scale_1.3_attention_scale_2.5 17.27
|
393 |
+
ngram_lm_scale_1.0_attention_scale_1.3 17.33
|
394 |
+
ngram_lm_scale_0.9_attention_scale_0.9 17.36
|
395 |
+
ngram_lm_scale_1.9_attention_scale_5.0 17.36
|
396 |
+
ngram_lm_scale_1.2_attention_scale_2.0 17.37
|
397 |
+
ngram_lm_scale_1.7_attention_scale_4.0 17.41
|
398 |
+
ngram_lm_scale_1.3_attention_scale_2.3 17.46
|
399 |
+
ngram_lm_scale_1.0_attention_scale_1.2 17.48
|
400 |
+
ngram_lm_scale_1.2_attention_scale_1.9 17.49
|
401 |
+
ngram_lm_scale_1.5_attention_scale_3.0 17.55
|
402 |
+
ngram_lm_scale_1.1_attention_scale_1.5 17.56
|
403 |
+
ngram_lm_scale_2.0_attention_scale_5.0 17.56
|
404 |
+
ngram_lm_scale_0.7_attention_scale_0.3 17.57
|
405 |
+
ngram_lm_scale_1.3_attention_scale_2.2 17.58
|
406 |
+
ngram_lm_scale_1.0_attention_scale_1.1 17.59
|
407 |
+
ngram_lm_scale_0.6_attention_scale_0.1 17.62
|
408 |
+
ngram_lm_scale_1.3_attention_scale_2.1 17.71
|
409 |
+
ngram_lm_scale_2.1_attention_scale_5.0 17.78
|
410 |
+
ngram_lm_scale_0.6_attention_scale_0.08 17.79
|
411 |
+
ngram_lm_scale_1.2_attention_scale_1.7 17.8
|
412 |
+
ngram_lm_scale_1.0_attention_scale_1.0 17.84
|
413 |
+
ngram_lm_scale_1.1_attention_scale_1.3 17.84
|
414 |
+
ngram_lm_scale_1.3_attention_scale_2.0 17.84
|
415 |
+
ngram_lm_scale_0.9_attention_scale_0.7 17.92
|
416 |
+
ngram_lm_scale_1.9_attention_scale_4.0 17.93
|
417 |
+
ngram_lm_scale_1.3_attention_scale_1.9 17.98
|
418 |
+
ngram_lm_scale_2.2_attention_scale_5.0 17.99
|
419 |
+
ngram_lm_scale_0.6_attention_scale_0.05 18.07
|
420 |
+
ngram_lm_scale_1.1_attention_scale_1.2 18.07
|
421 |
+
ngram_lm_scale_1.5_attention_scale_2.5 18.1
|
422 |
+
ngram_lm_scale_1.2_attention_scale_1.5 18.12
|
423 |
+
ngram_lm_scale_1.0_attention_scale_0.9 18.17
|
424 |
+
ngram_lm_scale_2.3_attention_scale_5.0 18.24
|
425 |
+
ngram_lm_scale_2.0_attention_scale_4.0 18.25
|
426 |
+
ngram_lm_scale_0.9_attention_scale_0.6 18.26
|
427 |
+
ngram_lm_scale_1.7_attention_scale_3.0 18.26
|
428 |
+
ngram_lm_scale_1.3_attention_scale_1.7 18.35
|
429 |
+
ngram_lm_scale_1.5_attention_scale_2.3 18.37
|
430 |
+
ngram_lm_scale_1.1_attention_scale_1.1 18.38
|
431 |
+
ngram_lm_scale_0.6_attention_scale_0.01 18.41
|
432 |
+
ngram_lm_scale_1.5_attention_scale_2.2 18.52
|
433 |
+
ngram_lm_scale_2.1_attention_scale_4.0 18.52
|
434 |
+
ngram_lm_scale_1.2_attention_scale_1.3 18.59
|
435 |
+
ngram_lm_scale_1.5_attention_scale_2.1 18.66
|
436 |
+
ngram_lm_scale_1.1_attention_scale_1.0 18.72
|
437 |
+
ngram_lm_scale_2.5_attention_scale_5.0 18.72
|
438 |
+
ngram_lm_scale_0.9_attention_scale_0.5 18.73
|
439 |
+
ngram_lm_scale_2.2_attention_scale_4.0 18.8
|
440 |
+
ngram_lm_scale_1.3_attention_scale_1.5 18.81
|
441 |
+
ngram_lm_scale_1.5_attention_scale_2.0 18.82
|
442 |
+
ngram_lm_scale_1.2_attention_scale_1.2 18.9
|
443 |
+
ngram_lm_scale_1.7_attention_scale_2.5 18.9
|
444 |
+
ngram_lm_scale_0.7_attention_scale_0.1 18.92
|
445 |
+
ngram_lm_scale_1.0_attention_scale_0.7 18.94
|
446 |
+
ngram_lm_scale_1.9_attention_scale_3.0 19.04
|
447 |
+
ngram_lm_scale_1.1_attention_scale_0.9 19.06
|
448 |
+
ngram_lm_scale_1.5_attention_scale_1.9 19.07
|
449 |
+
ngram_lm_scale_0.7_attention_scale_0.08 19.14
|
450 |
+
ngram_lm_scale_2.3_attention_scale_4.0 19.23
|
451 |
+
ngram_lm_scale_1.2_attention_scale_1.1 19.27
|
452 |
+
ngram_lm_scale_1.7_attention_scale_2.3 19.38
|
453 |
+
ngram_lm_scale_1.3_attention_scale_1.3 19.43
|
454 |
+
ngram_lm_scale_1.0_attention_scale_0.6 19.44
|
455 |
+
ngram_lm_scale_0.7_attention_scale_0.05 19.5
|
456 |
+
ngram_lm_scale_2.0_attention_scale_3.0 19.57
|
457 |
+
ngram_lm_scale_1.2_attention_scale_1.0 19.63
|
458 |
+
ngram_lm_scale_1.7_attention_scale_2.2 19.67
|
459 |
+
ngram_lm_scale_1.5_attention_scale_1.7 19.71
|
460 |
+
ngram_lm_scale_1.3_attention_scale_1.2 19.85
|
461 |
+
ngram_lm_scale_1.7_attention_scale_2.1 19.96
|
462 |
+
ngram_lm_scale_2.5_attention_scale_4.0 19.98
|
463 |
+
ngram_lm_scale_0.7_attention_scale_0.01 20.01
|
464 |
+
ngram_lm_scale_2.1_attention_scale_3.0 20.02
|
465 |
+
ngram_lm_scale_1.0_attention_scale_0.5 20.06
|
466 |
+
ngram_lm_scale_1.1_attention_scale_0.7 20.1
|
467 |
+
ngram_lm_scale_1.9_attention_scale_2.5 20.13
|
468 |
+
ngram_lm_scale_1.2_attention_scale_0.9 20.16
|
469 |
+
ngram_lm_scale_0.9_attention_scale_0.3 20.25
|
470 |
+
ngram_lm_scale_1.3_attention_scale_1.1 20.25
|
471 |
+
ngram_lm_scale_1.7_attention_scale_2.0 20.26
|
472 |
+
ngram_lm_scale_3.0_attention_scale_5.0 20.31
|
473 |
+
ngram_lm_scale_1.5_attention_scale_1.5 20.44
|
474 |
+
ngram_lm_scale_1.7_attention_scale_1.9 20.58
|
475 |
+
ngram_lm_scale_2.2_attention_scale_3.0 20.6
|
476 |
+
ngram_lm_scale_1.9_attention_scale_2.3 20.74
|
477 |
+
ngram_lm_scale_1.1_attention_scale_0.6 20.78
|
478 |
+
ngram_lm_scale_2.0_attention_scale_2.5 20.79
|
479 |
+
ngram_lm_scale_1.3_attention_scale_1.0 20.84
|
480 |
+
ngram_lm_scale_1.9_attention_scale_2.2 21.1
|
481 |
+
ngram_lm_scale_2.3_attention_scale_3.0 21.19
|
482 |
+
ngram_lm_scale_1.5_attention_scale_1.3 21.36
|
483 |
+
ngram_lm_scale_1.7_attention_scale_1.7 21.39
|
484 |
+
ngram_lm_scale_1.9_attention_scale_2.1 21.44
|
485 |
+
ngram_lm_scale_2.0_attention_scale_2.3 21.44
|
486 |
+
ngram_lm_scale_1.3_attention_scale_0.9 21.48
|
487 |
+
ngram_lm_scale_1.2_attention_scale_0.7 21.49
|
488 |
+
ngram_lm_scale_2.1_attention_scale_2.5 21.49
|
489 |
+
ngram_lm_scale_1.1_attention_scale_0.5 21.63
|
490 |
+
ngram_lm_scale_2.0_attention_scale_2.2 21.79
|
491 |
+
ngram_lm_scale_1.9_attention_scale_2.0 21.81
|
492 |
+
ngram_lm_scale_1.0_attention_scale_0.3 21.92
|
493 |
+
ngram_lm_scale_1.5_attention_scale_1.2 21.94
|
494 |
+
ngram_lm_scale_2.2_attention_scale_2.5 22.14
|
495 |
+
ngram_lm_scale_2.1_attention_scale_2.3 22.17
|
496 |
+
ngram_lm_scale_2.0_attention_scale_2.1 22.2
|
497 |
+
ngram_lm_scale_3.0_attention_scale_4.0 22.21
|
498 |
+
ngram_lm_scale_1.9_attention_scale_1.9 22.23
|
499 |
+
ngram_lm_scale_1.2_attention_scale_0.6 22.37
|
500 |
+
ngram_lm_scale_1.7_attention_scale_1.5 22.37
|
501 |
+
ngram_lm_scale_2.5_attention_scale_3.0 22.41
|
502 |
+
ngram_lm_scale_1.5_attention_scale_1.1 22.59
|
503 |
+
ngram_lm_scale_0.9_attention_scale_0.1 22.6
|
504 |
+
ngram_lm_scale_2.1_attention_scale_2.2 22.62
|
505 |
+
ngram_lm_scale_2.0_attention_scale_2.0 22.69
|
506 |
+
ngram_lm_scale_2.3_attention_scale_2.5 22.91
|
507 |
+
ngram_lm_scale_0.9_attention_scale_0.08 22.92
|
508 |
+
ngram_lm_scale_1.3_attention_scale_0.7 22.98
|
509 |
+
ngram_lm_scale_2.2_attention_scale_2.3 22.98
|
510 |
+
ngram_lm_scale_2.1_attention_scale_2.1 23.1
|
511 |
+
ngram_lm_scale_2.0_attention_scale_1.9 23.21
|
512 |
+
ngram_lm_scale_1.9_attention_scale_1.7 23.32
|
513 |
+
ngram_lm_scale_1.2_attention_scale_0.5 23.38
|
514 |
+
ngram_lm_scale_1.5_attention_scale_1.0 23.38
|
515 |
+
ngram_lm_scale_2.2_attention_scale_2.2 23.43
|
516 |
+
ngram_lm_scale_0.9_attention_scale_0.05 23.51
|
517 |
+
ngram_lm_scale_2.1_attention_scale_2.0 23.56
|
518 |
+
ngram_lm_scale_1.7_attention_scale_1.3 23.71
|
519 |
+
ngram_lm_scale_2.3_attention_scale_2.3 23.81
|
520 |
+
ngram_lm_scale_1.1_attention_scale_0.3 23.9
|
521 |
+
ngram_lm_scale_2.2_attention_scale_2.1 23.95
|
522 |
+
ngram_lm_scale_1.3_attention_scale_0.6 24.01
|
523 |
+
ngram_lm_scale_2.1_attention_scale_1.9 24.11
|
524 |
+
ngram_lm_scale_1.5_attention_scale_0.9 24.18
|
525 |
+
ngram_lm_scale_2.3_attention_scale_2.2 24.27
|
526 |
+
ngram_lm_scale_0.9_attention_scale_0.01 24.31
|
527 |
+
ngram_lm_scale_2.0_attention_scale_1.7 24.32
|
528 |
+
ngram_lm_scale_1.7_attention_scale_1.2 24.35
|
529 |
+
ngram_lm_scale_4.0_attention_scale_5.0 24.35
|
530 |
+
ngram_lm_scale_2.5_attention_scale_2.5 24.46
|
531 |
+
ngram_lm_scale_2.2_attention_scale_2.0 24.47
|
532 |
+
ngram_lm_scale_1.9_attention_scale_1.5 24.57
|
533 |
+
ngram_lm_scale_2.3_attention_scale_2.1 24.84
|
534 |
+
ngram_lm_scale_1.0_attention_scale_0.1 24.91
|
535 |
+
ngram_lm_scale_2.2_attention_scale_1.9 25.0
|
536 |
+
ngram_lm_scale_1.3_attention_scale_0.5 25.02
|
537 |
+
ngram_lm_scale_1.7_attention_scale_1.1 25.12
|
538 |
+
ngram_lm_scale_2.1_attention_scale_1.7 25.2
|
539 |
+
ngram_lm_scale_1.0_attention_scale_0.08 25.3
|
540 |
+
ngram_lm_scale_2.3_attention_scale_2.0 25.32
|
541 |
+
ngram_lm_scale_2.5_attention_scale_2.3 25.38
|
542 |
+
ngram_lm_scale_2.0_attention_scale_1.5 25.49
|
543 |
+
ngram_lm_scale_3.0_attention_scale_3.0 25.62
|
544 |
+
ngram_lm_scale_2.5_attention_scale_2.2 25.8
|
545 |
+
ngram_lm_scale_2.3_attention_scale_1.9 25.81
|
546 |
+
ngram_lm_scale_1.0_attention_scale_0.05 25.82
|
547 |
+
ngram_lm_scale_1.9_attention_scale_1.3 25.88
|
548 |
+
ngram_lm_scale_1.7_attention_scale_1.0 25.92
|
549 |
+
ngram_lm_scale_1.2_attention_scale_0.3 25.93
|
550 |
+
ngram_lm_scale_1.5_attention_scale_0.7 26.0
|
551 |
+
ngram_lm_scale_2.2_attention_scale_1.7 26.12
|
552 |
+
ngram_lm_scale_2.5_attention_scale_2.1 26.33
|
553 |
+
ngram_lm_scale_2.1_attention_scale_1.5 26.49
|
554 |
+
ngram_lm_scale_1.9_attention_scale_1.2 26.64
|
555 |
+
ngram_lm_scale_1.0_attention_scale_0.01 26.66
|
556 |
+
ngram_lm_scale_1.7_attention_scale_0.9 26.77
|
557 |
+
ngram_lm_scale_2.5_attention_scale_2.0 26.91
|
558 |
+
ngram_lm_scale_2.0_attention_scale_1.3 26.93
|
559 |
+
ngram_lm_scale_2.3_attention_scale_1.7 27.04
|
560 |
+
ngram_lm_scale_1.5_attention_scale_0.6 27.13
|
561 |
+
ngram_lm_scale_4.0_attention_scale_4.0 27.2
|
562 |
+
ngram_lm_scale_1.1_attention_scale_0.1 27.21
|
563 |
+
ngram_lm_scale_2.2_attention_scale_1.5 27.45
|
564 |
+
ngram_lm_scale_2.5_attention_scale_1.9 27.46
|
565 |
+
ngram_lm_scale_1.9_attention_scale_1.1 27.47
|
566 |
+
ngram_lm_scale_1.1_attention_scale_0.08 27.58
|
567 |
+
ngram_lm_scale_1.3_attention_scale_0.3 27.75
|
568 |
+
ngram_lm_scale_2.0_attention_scale_1.2 27.76
|
569 |
+
ngram_lm_scale_3.0_attention_scale_2.5 27.82
|
570 |
+
ngram_lm_scale_2.1_attention_scale_1.3 27.95
|
571 |
+
ngram_lm_scale_1.1_attention_scale_0.05 28.1
|
572 |
+
ngram_lm_scale_5.0_attention_scale_5.0 28.15
|
573 |
+
ngram_lm_scale_2.3_attention_scale_1.5 28.33
|
574 |
+
ngram_lm_scale_1.5_attention_scale_0.5 28.37
|
575 |
+
ngram_lm_scale_1.9_attention_scale_1.0 28.4
|
576 |
+
ngram_lm_scale_2.0_attention_scale_1.1 28.59
|
577 |
+
ngram_lm_scale_2.5_attention_scale_1.7 28.61
|
578 |
+
ngram_lm_scale_3.0_attention_scale_2.3 28.67
|
579 |
+
ngram_lm_scale_2.1_attention_scale_1.2 28.76
|
580 |
+
ngram_lm_scale_1.1_attention_scale_0.01 28.87
|
581 |
+
ngram_lm_scale_2.2_attention_scale_1.3 28.88
|
582 |
+
ngram_lm_scale_1.7_attention_scale_0.7 28.94
|
583 |
+
ngram_lm_scale_1.2_attention_scale_0.1 29.15
|
584 |
+
ngram_lm_scale_3.0_attention_scale_2.2 29.17
|
585 |
+
ngram_lm_scale_1.9_attention_scale_0.9 29.29
|
586 |
+
ngram_lm_scale_2.0_attention_scale_1.0 29.41
|
587 |
+
ngram_lm_scale_1.2_attention_scale_0.08 29.49
|
588 |
+
ngram_lm_scale_2.1_attention_scale_1.1 29.55
|
589 |
+
ngram_lm_scale_2.2_attention_scale_1.2 29.65
|
590 |
+
ngram_lm_scale_3.0_attention_scale_2.1 29.67
|
591 |
+
ngram_lm_scale_2.3_attention_scale_1.3 29.75
|
592 |
+
ngram_lm_scale_2.5_attention_scale_1.5 29.93
|
593 |
+
ngram_lm_scale_1.7_attention_scale_0.6 29.94
|
594 |
+
ngram_lm_scale_1.2_attention_scale_0.05 30.03
|
595 |
+
ngram_lm_scale_3.0_attention_scale_2.0 30.2
|
596 |
+
ngram_lm_scale_2.0_attention_scale_0.9 30.31
|
597 |
+
ngram_lm_scale_2.1_attention_scale_1.0 30.35
|
598 |
+
ngram_lm_scale_2.2_attention_scale_1.1 30.38
|
599 |
+
ngram_lm_scale_2.3_attention_scale_1.2 30.44
|
600 |
+
ngram_lm_scale_4.0_attention_scale_3.0 30.53
|
601 |
+
ngram_lm_scale_3.0_attention_scale_1.9 30.65
|
602 |
+
ngram_lm_scale_5.0_attention_scale_4.0 30.71
|
603 |
+
ngram_lm_scale_1.2_attention_scale_0.01 30.77
|
604 |
+
ngram_lm_scale_1.3_attention_scale_0.1 30.86
|
605 |
+
ngram_lm_scale_1.5_attention_scale_0.3 31.01
|
606 |
+
ngram_lm_scale_1.7_attention_scale_0.5 31.1
|
607 |
+
ngram_lm_scale_2.1_attention_scale_0.9 31.15
|
608 |
+
ngram_lm_scale_1.3_attention_scale_0.08 31.19
|
609 |
+
ngram_lm_scale_2.2_attention_scale_1.0 31.19
|
610 |
+
ngram_lm_scale_2.5_attention_scale_1.3 31.2
|
611 |
+
ngram_lm_scale_2.3_attention_scale_1.1 31.21
|
612 |
+
ngram_lm_scale_1.9_attention_scale_0.7 31.23
|
613 |
+
ngram_lm_scale_1.3_attention_scale_0.05 31.68
|
614 |
+
ngram_lm_scale_3.0_attention_scale_1.7 31.69
|
615 |
+
ngram_lm_scale_2.5_attention_scale_1.2 31.84
|
616 |
+
ngram_lm_scale_2.3_attention_scale_1.0 31.88
|
617 |
+
ngram_lm_scale_2.2_attention_scale_0.9 31.91
|
618 |
+
ngram_lm_scale_2.0_attention_scale_0.7 32.0
|
619 |
+
ngram_lm_scale_1.9_attention_scale_0.6 32.05
|
620 |
+
ngram_lm_scale_1.3_attention_scale_0.01 32.25
|
621 |
+
ngram_lm_scale_4.0_attention_scale_2.5 32.3
|
622 |
+
ngram_lm_scale_2.5_attention_scale_1.1 32.43
|
623 |
+
ngram_lm_scale_2.3_attention_scale_0.9 32.57
|
624 |
+
ngram_lm_scale_3.0_attention_scale_1.5 32.7
|
625 |
+
ngram_lm_scale_2.1_attention_scale_0.7 32.73
|
626 |
+
ngram_lm_scale_2.0_attention_scale_0.6 32.8
|
627 |
+
ngram_lm_scale_1.9_attention_scale_0.5 32.9
|
628 |
+
ngram_lm_scale_4.0_attention_scale_2.3 33.0
|
629 |
+
ngram_lm_scale_2.5_attention_scale_1.0 33.08
|
630 |
+
ngram_lm_scale_1.7_attention_scale_0.3 33.11
|
631 |
+
ngram_lm_scale_4.0_attention_scale_2.2 33.33
|
632 |
+
ngram_lm_scale_2.2_attention_scale_0.7 33.34
|
633 |
+
ngram_lm_scale_1.5_attention_scale_0.1 33.35
|
634 |
+
ngram_lm_scale_5.0_attention_scale_3.0 33.43
|
635 |
+
ngram_lm_scale_2.1_attention_scale_0.6 33.46
|
636 |
+
ngram_lm_scale_1.5_attention_scale_0.08 33.58
|
637 |
+
ngram_lm_scale_2.0_attention_scale_0.5 33.62
|
638 |
+
ngram_lm_scale_2.5_attention_scale_0.9 33.66
|
639 |
+
ngram_lm_scale_3.0_attention_scale_1.3 33.71
|
640 |
+
ngram_lm_scale_4.0_attention_scale_2.1 33.71
|
641 |
+
ngram_lm_scale_2.3_attention_scale_0.7 33.86
|
642 |
+
ngram_lm_scale_1.5_attention_scale_0.05 33.91
|
643 |
+
ngram_lm_scale_2.2_attention_scale_0.6 33.98
|
644 |
+
ngram_lm_scale_4.0_attention_scale_2.0 34.03
|
645 |
+
ngram_lm_scale_3.0_attention_scale_1.2 34.1
|
646 |
+
ngram_lm_scale_2.1_attention_scale_0.5 34.15
|
647 |
+
ngram_lm_scale_1.5_attention_scale_0.01 34.29
|
648 |
+
ngram_lm_scale_4.0_attention_scale_1.9 34.35
|
649 |
+
ngram_lm_scale_1.9_attention_scale_0.3 34.45
|
650 |
+
ngram_lm_scale_2.3_attention_scale_0.6 34.47
|
651 |
+
ngram_lm_scale_3.0_attention_scale_1.1 34.53
|
652 |
+
ngram_lm_scale_2.2_attention_scale_0.5 34.6
|
653 |
+
ngram_lm_scale_2.5_attention_scale_0.7 34.69
|
654 |
+
ngram_lm_scale_5.0_attention_scale_2.5 34.7
|
655 |
+
ngram_lm_scale_1.7_attention_scale_0.1 34.92
|
656 |
+
ngram_lm_scale_3.0_attention_scale_1.0 34.92
|
657 |
+
ngram_lm_scale_2.0_attention_scale_0.3 34.94
|
658 |
+
ngram_lm_scale_4.0_attention_scale_1.7 34.96
|
659 |
+
ngram_lm_scale_2.3_attention_scale_0.5 34.98
|
660 |
+
ngram_lm_scale_1.7_attention_scale_0.08 35.07
|
661 |
+
ngram_lm_scale_5.0_attention_scale_2.3 35.14
|
662 |
+
ngram_lm_scale_2.5_attention_scale_0.6 35.18
|
663 |
+
ngram_lm_scale_1.7_attention_scale_0.05 35.27
|
664 |
+
ngram_lm_scale_3.0_attention_scale_0.9 35.35
|
665 |
+
ngram_lm_scale_5.0_attention_scale_2.2 35.36
|
666 |
+
ngram_lm_scale_2.1_attention_scale_0.3 35.39
|
667 |
+
ngram_lm_scale_4.0_attention_scale_1.5 35.48
|
668 |
+
ngram_lm_scale_5.0_attention_scale_2.1 35.53
|
669 |
+
ngram_lm_scale_1.7_attention_scale_0.01 35.6
|
670 |
+
ngram_lm_scale_2.5_attention_scale_0.5 35.66
|
671 |
+
ngram_lm_scale_2.2_attention_scale_0.3 35.74
|
672 |
+
ngram_lm_scale_5.0_attention_scale_2.0 35.74
|
673 |
+
ngram_lm_scale_1.9_attention_scale_0.1 35.84
|
674 |
+
ngram_lm_scale_1.9_attention_scale_0.08 35.95
|
675 |
+
ngram_lm_scale_5.0_attention_scale_1.9 35.97
|
676 |
+
ngram_lm_scale_4.0_attention_scale_1.3 36.01
|
677 |
+
ngram_lm_scale_3.0_attention_scale_0.7 36.05
|
678 |
+
ngram_lm_scale_2.3_attention_scale_0.3 36.08
|
679 |
+
ngram_lm_scale_1.9_attention_scale_0.05 36.14
|
680 |
+
ngram_lm_scale_2.0_attention_scale_0.1 36.17
|
681 |
+
ngram_lm_scale_4.0_attention_scale_1.2 36.24
|
682 |
+
ngram_lm_scale_2.0_attention_scale_0.08 36.27
|
683 |
+
ngram_lm_scale_5.0_attention_scale_1.7 36.31
|
684 |
+
ngram_lm_scale_1.9_attention_scale_0.01 36.34
|
685 |
+
ngram_lm_scale_2.0_attention_scale_0.05 36.42
|
686 |
+
ngram_lm_scale_3.0_attention_scale_0.6 36.43
|
687 |
+
ngram_lm_scale_2.1_attention_scale_0.1 36.46
|
688 |
+
ngram_lm_scale_4.0_attention_scale_1.1 36.48
|
689 |
+
ngram_lm_scale_2.5_attention_scale_0.3 36.55
|
690 |
+
ngram_lm_scale_2.1_attention_scale_0.08 36.57
|
691 |
+
ngram_lm_scale_2.0_attention_scale_0.01 36.63
|
692 |
+
ngram_lm_scale_2.1_attention_scale_0.05 36.68
|
693 |
+
ngram_lm_scale_2.2_attention_scale_0.1 36.72
|
694 |
+
ngram_lm_scale_5.0_attention_scale_1.5 36.74
|
695 |
+
ngram_lm_scale_4.0_attention_scale_1.0 36.75
|
696 |
+
ngram_lm_scale_3.0_attention_scale_0.5 36.76
|
697 |
+
ngram_lm_scale_2.2_attention_scale_0.08 36.79
|
698 |
+
ngram_lm_scale_2.1_attention_scale_0.01 36.9
|
699 |
+
ngram_lm_scale_2.2_attention_scale_0.05 36.96
|
700 |
+
ngram_lm_scale_2.3_attention_scale_0.1 36.96
|
701 |
+
ngram_lm_scale_4.0_attention_scale_0.9 36.97
|
702 |
+
ngram_lm_scale_2.3_attention_scale_0.08 37.03
|
703 |
+
ngram_lm_scale_5.0_attention_scale_1.3 37.07
|
704 |
+
ngram_lm_scale_2.2_attention_scale_0.01 37.11
|
705 |
+
ngram_lm_scale_2.3_attention_scale_0.05 37.12
|
706 |
+
ngram_lm_scale_5.0_attention_scale_1.2 37.24
|
707 |
+
ngram_lm_scale_2.3_attention_scale_0.01 37.26
|
708 |
+
ngram_lm_scale_2.5_attention_scale_0.1 37.29
|
709 |
+
ngram_lm_scale_3.0_attention_scale_0.3 37.33
|
710 |
+
ngram_lm_scale_2.5_attention_scale_0.08 37.35
|
711 |
+
ngram_lm_scale_4.0_attention_scale_0.7 37.39
|
712 |
+
ngram_lm_scale_5.0_attention_scale_1.1 37.4
|
713 |
+
ngram_lm_scale_2.5_attention_scale_0.05 37.42
|
714 |
+
ngram_lm_scale_2.5_attention_scale_0.01 37.55
|
715 |
+
ngram_lm_scale_4.0_attention_scale_0.6 37.58
|
716 |
+
ngram_lm_scale_5.0_attention_scale_1.0 37.58
|
717 |
+
ngram_lm_scale_5.0_attention_scale_0.9 37.76
|
718 |
+
ngram_lm_scale_4.0_attention_scale_0.5 37.78
|
719 |
+
ngram_lm_scale_3.0_attention_scale_0.1 37.9
|
720 |
+
ngram_lm_scale_3.0_attention_scale_0.08 37.94
|
721 |
+
ngram_lm_scale_3.0_attention_scale_0.05 38.03
|
722 |
+
ngram_lm_scale_5.0_attention_scale_0.7 38.05
|
723 |
+
ngram_lm_scale_3.0_attention_scale_0.01 38.13
|
724 |
+
ngram_lm_scale_5.0_attention_scale_0.6 38.2
|
725 |
+
ngram_lm_scale_4.0_attention_scale_0.3 38.21
|
726 |
+
ngram_lm_scale_5.0_attention_scale_0.5 38.37
|
727 |
+
ngram_lm_scale_4.0_attention_scale_0.1 38.57
|
728 |
+
ngram_lm_scale_4.0_attention_scale_0.08 38.61
|
729 |
+
ngram_lm_scale_5.0_attention_scale_0.3 38.65
|
730 |
+
ngram_lm_scale_4.0_attention_scale_0.05 38.67
|
731 |
+
ngram_lm_scale_4.0_attention_scale_0.01 38.74
|
732 |
+
ngram_lm_scale_5.0_attention_scale_0.1 38.93
|
733 |
+
ngram_lm_scale_5.0_attention_scale_0.08 38.97
|
734 |
+
ngram_lm_scale_5.0_attention_scale_0.05 39.01
|
735 |
+
ngram_lm_scale_5.0_attention_scale_0.01 39.06
|
736 |
+
|
737 |
+
2022-06-25 17:31:30,127 INFO [decode.py:483] batch 0/?, cuts processed until now is 13
2022-06-26 00:14:14,317 INFO [decode.py:483] batch 100/?, cuts processed until now is 1548
2022-06-26 05:48:34,635 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-26 05:48:34,636 INFO [decode.py:734] num_arcs before pruning: 6585058
2022-06-26 05:48:34,636 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-26 05:48:35,670 INFO [decode.py:747] num_arcs after pruning: 26992
2022-06-26 06:38:15,357 INFO [decode.py:483] batch 200/?, cuts processed until now is 3195
2022-06-26 07:23:18,776 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-26 07:23:18,777 INFO [decode.py:734] num_arcs before pruning: 5733129
2022-06-26 07:23:18,778 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-26 07:23:19,605 INFO [decode.py:747] num_arcs after pruning: 28851
2022-06-26 07:50:08,702 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-26 07:50:08,703 INFO [decode.py:734] num_arcs before pruning: 6461523
2022-06-26 07:50:08,703 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-26 07:50:09,674 INFO [decode.py:747] num_arcs after pruning: 22617
2022-06-26 09:38:10,983 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-26 09:38:10,984 INFO [decode.py:734] num_arcs before pruning: 5956089
2022-06-26 09:38:10,984 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-26 09:38:11,799 INFO [decode.py:747] num_arcs after pruning: 20569
2022-06-26 11:43:09,309 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-26 11:43:09,311 INFO [decode.py:734] num_arcs before pruning: 4422748
2022-06-26 11:43:09,311 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-26 11:43:09,913 INFO [decode.py:747] num_arcs after pruning: 16921
2022-06-26 11:59:44,567 INFO [decode.py:733] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-26 11:59:44,568 INFO [decode.py:734] num_arcs before pruning: 5991196
2022-06-26 11:59:44,568 INFO [decode.py:737] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-26 11:59:45,380 INFO [decode.py:747] num_arcs after pruning: 35987
2022-06-26 14:20:44,143 INFO [decode.py:483] batch 300/?, cuts processed until now is 4803
2022-06-26 15:43:53,294 INFO [decode.py:532]
For dev, WER of different settings are:
ngram_lm_scale_0.01_attention_scale_0.5	15.78	best for dev
ngram_lm_scale_0.01_attention_scale_0.6	15.78
ngram_lm_scale_0.01_attention_scale_0.7	15.78
ngram_lm_scale_0.05_attention_scale_0.6	15.79
ngram_lm_scale_0.01_attention_scale_0.9	15.8
ngram_lm_scale_0.05_attention_scale_0.5	15.81
ngram_lm_scale_0.05_attention_scale_0.7	15.81
ngram_lm_scale_0.08_attention_scale_0.7	15.81
ngram_lm_scale_0.01_attention_scale_0.3	15.82
ngram_lm_scale_0.01_attention_scale_1.0	15.82
ngram_lm_scale_0.05_attention_scale_0.9	15.82
ngram_lm_scale_0.08_attention_scale_0.5	15.82
ngram_lm_scale_0.08_attention_scale_0.6	15.82
ngram_lm_scale_0.08_attention_scale_0.9	15.82
ngram_lm_scale_0.1_attention_scale_0.6	15.82
ngram_lm_scale_0.1_attention_scale_0.7	15.83
ngram_lm_scale_0.05_attention_scale_1.0	15.84
ngram_lm_scale_0.1_attention_scale_0.5	15.84
ngram_lm_scale_0.01_attention_scale_1.1	15.85
ngram_lm_scale_0.05_attention_scale_0.3	15.85
ngram_lm_scale_0.05_attention_scale_1.1	15.85
ngram_lm_scale_0.08_attention_scale_0.3	15.85
ngram_lm_scale_0.1_attention_scale_0.3	15.85
ngram_lm_scale_0.1_attention_scale_0.9	15.86
ngram_lm_scale_0.1_attention_scale_1.0	15.86
ngram_lm_scale_0.01_attention_scale_1.2	15.87
ngram_lm_scale_0.08_attention_scale_1.0	15.87
ngram_lm_scale_0.01_attention_scale_1.3	15.89
ngram_lm_scale_0.08_attention_scale_1.1	15.89
ngram_lm_scale_0.01_attention_scale_0.1	15.9
ngram_lm_scale_0.05_attention_scale_1.2	15.9
ngram_lm_scale_0.1_attention_scale_1.1	15.9
ngram_lm_scale_0.05_attention_scale_1.3	15.91
ngram_lm_scale_0.08_attention_scale_1.2	15.91
ngram_lm_scale_0.01_attention_scale_0.08	15.92
ngram_lm_scale_0.01_attention_scale_1.5	15.92
ngram_lm_scale_0.05_attention_scale_0.1	15.93
ngram_lm_scale_0.1_attention_scale_1.2	15.93
ngram_lm_scale_0.05_attention_scale_0.08	15.94
ngram_lm_scale_0.08_attention_scale_0.1	15.94
ngram_lm_scale_0.08_attention_scale_1.3	15.94
ngram_lm_scale_0.08_attention_scale_0.08	15.96
ngram_lm_scale_0.1_attention_scale_0.1	15.96
ngram_lm_scale_0.1_attention_scale_1.3	15.96
ngram_lm_scale_0.01_attention_scale_1.7	15.97
ngram_lm_scale_0.05_attention_scale_1.5	15.97
ngram_lm_scale_0.01_attention_scale_0.05	15.99
ngram_lm_scale_0.05_attention_scale_0.05	15.99
ngram_lm_scale_0.05_attention_scale_1.7	15.99
ngram_lm_scale_0.08_attention_scale_1.5	15.99
ngram_lm_scale_0.1_attention_scale_0.08	15.99
ngram_lm_scale_0.01_attention_scale_0.01	16.0
ngram_lm_scale_0.01_attention_scale_1.9	16.0
ngram_lm_scale_0.08_attention_scale_0.05	16.0
ngram_lm_scale_0.01_attention_scale_2.0	16.01
ngram_lm_scale_0.1_attention_scale_1.5	16.01
ngram_lm_scale_0.1_attention_scale_0.05	16.02
ngram_lm_scale_0.05_attention_scale_1.9	16.03
ngram_lm_scale_0.08_attention_scale_1.7	16.03
ngram_lm_scale_0.08_attention_scale_1.9	16.03
ngram_lm_scale_0.01_attention_scale_2.1	16.04
ngram_lm_scale_0.05_attention_scale_2.0	16.04
ngram_lm_scale_0.1_attention_scale_1.7	16.04
ngram_lm_scale_0.01_attention_scale_2.2	16.05
ngram_lm_scale_0.05_attention_scale_0.01	16.05
ngram_lm_scale_0.05_attention_scale_2.1	16.05
ngram_lm_scale_0.1_attention_scale_1.9	16.05
ngram_lm_scale_0.08_attention_scale_2.0	16.06
ngram_lm_scale_0.01_attention_scale_2.3	16.07
ngram_lm_scale_0.1_attention_scale_2.0	16.07
ngram_lm_scale_0.3_attention_scale_0.9	16.07
ngram_lm_scale_0.05_attention_scale_2.2	16.08
ngram_lm_scale_0.08_attention_scale_0.01	16.08
ngram_lm_scale_0.08_attention_scale_2.1	16.09
ngram_lm_scale_0.08_attention_scale_2.2	16.09
ngram_lm_scale_0.1_attention_scale_2.1	16.09
ngram_lm_scale_0.1_attention_scale_2.2	16.09
ngram_lm_scale_0.3_attention_scale_1.0	16.09
ngram_lm_scale_0.01_attention_scale_2.5	16.1
ngram_lm_scale_0.05_attention_scale_2.3	16.1
ngram_lm_scale_0.08_attention_scale_2.3	16.1
ngram_lm_scale_0.3_attention_scale_1.1	16.1
ngram_lm_scale_0.3_attention_scale_0.7	16.11
ngram_lm_scale_0.05_attention_scale_2.5	16.12
ngram_lm_scale_0.1_attention_scale_0.01	16.12
ngram_lm_scale_0.1_attention_scale_2.3	16.12
ngram_lm_scale_0.3_attention_scale_0.6	16.12
ngram_lm_scale_0.3_attention_scale_1.2	16.12
ngram_lm_scale_0.3_attention_scale_1.3	16.13
ngram_lm_scale_0.08_attention_scale_2.5	16.14
ngram_lm_scale_0.1_attention_scale_2.5	16.14
ngram_lm_scale_0.3_attention_scale_1.5	16.14
ngram_lm_scale_0.3_attention_scale_0.5	16.16
ngram_lm_scale_0.01_attention_scale_3.0	16.18
ngram_lm_scale_0.3_attention_scale_1.7	16.2
ngram_lm_scale_0.3_attention_scale_0.3	16.21
ngram_lm_scale_0.05_attention_scale_3.0	16.22
ngram_lm_scale_0.08_attention_scale_3.0	16.24
ngram_lm_scale_0.3_attention_scale_1.9	16.25
ngram_lm_scale_0.1_attention_scale_3.0	16.27
ngram_lm_scale_0.3_attention_scale_2.0	16.28
ngram_lm_scale_0.3_attention_scale_2.1	16.3
ngram_lm_scale_0.3_attention_scale_2.2	16.31
ngram_lm_scale_0.3_attention_scale_2.3	16.32
ngram_lm_scale_0.01_attention_scale_4.0	16.34
ngram_lm_scale_0.05_attention_scale_4.0	16.35
ngram_lm_scale_0.3_attention_scale_2.5	16.35
ngram_lm_scale_0.08_attention_scale_4.0	16.37
ngram_lm_scale_0.1_attention_scale_4.0	16.39
ngram_lm_scale_0.3_attention_scale_0.1	16.4
ngram_lm_scale_0.3_attention_scale_3.0	16.41
ngram_lm_scale_0.3_attention_scale_0.08	16.42
ngram_lm_scale_0.01_attention_scale_5.0	16.43
ngram_lm_scale_0.05_attention_scale_5.0	16.45
ngram_lm_scale_0.08_attention_scale_5.0	16.47
ngram_lm_scale_0.1_attention_scale_5.0	16.49
ngram_lm_scale_0.3_attention_scale_0.05	16.49
ngram_lm_scale_0.5_attention_scale_1.2	16.53
ngram_lm_scale_0.3_attention_scale_4.0	16.54
ngram_lm_scale_0.5_attention_scale_0.9	16.54
ngram_lm_scale_0.5_attention_scale_1.3	16.54
ngram_lm_scale_0.5_attention_scale_1.0	16.55
ngram_lm_scale_0.5_attention_scale_1.1	16.55
ngram_lm_scale_0.5_attention_scale_1.9	16.55
ngram_lm_scale_0.5_attention_scale_1.7	16.56
ngram_lm_scale_0.5_attention_scale_2.0	16.56
ngram_lm_scale_0.5_attention_scale_1.5	16.57
ngram_lm_scale_0.5_attention_scale_2.1	16.58
ngram_lm_scale_0.5_attention_scale_2.2	16.58
ngram_lm_scale_0.3_attention_scale_0.01	16.59
ngram_lm_scale_0.5_attention_scale_2.3	16.6
ngram_lm_scale_0.5_attention_scale_2.5	16.6
ngram_lm_scale_0.3_attention_scale_5.0	16.61
ngram_lm_scale_0.5_attention_scale_0.7	16.62
ngram_lm_scale_0.5_attention_scale_3.0	16.65
ngram_lm_scale_0.5_attention_scale_0.6	16.69
ngram_lm_scale_0.5_attention_scale_4.0	16.72
ngram_lm_scale_0.5_attention_scale_0.5	16.74
ngram_lm_scale_0.5_attention_scale_5.0	16.75
ngram_lm_scale_0.6_attention_scale_2.1	16.79
ngram_lm_scale_0.6_attention_scale_2.2	16.79
ngram_lm_scale_0.6_attention_scale_2.3	16.79
ngram_lm_scale_0.6_attention_scale_3.0	16.79
ngram_lm_scale_0.6_attention_scale_2.0	16.81
ngram_lm_scale_0.6_attention_scale_2.5	16.81
ngram_lm_scale_0.6_attention_scale_1.9	16.82
ngram_lm_scale_0.6_attention_scale_1.7	16.83
ngram_lm_scale_0.6_attention_scale_4.0	16.83
ngram_lm_scale_0.6_attention_scale_1.5	16.85
ngram_lm_scale_0.6_attention_scale_1.3	16.87
ngram_lm_scale_0.6_attention_scale_1.2	16.88
ngram_lm_scale_0.6_attention_scale_1.1	16.89
ngram_lm_scale_0.6_attention_scale_5.0	16.89
ngram_lm_scale_0.6_attention_scale_1.0	16.94
ngram_lm_scale_0.7_attention_scale_4.0	16.95
ngram_lm_scale_0.6_attention_scale_0.9	16.96
ngram_lm_scale_0.7_attention_scale_3.0	16.97
ngram_lm_scale_0.7_attention_scale_5.0	16.97
ngram_lm_scale_0.5_attention_scale_0.3	16.98
ngram_lm_scale_0.7_attention_scale_2.5	17.0
ngram_lm_scale_0.7_attention_scale_2.1	17.02
ngram_lm_scale_0.7_attention_scale_2.2	17.02
ngram_lm_scale_0.7_attention_scale_2.3	17.02
ngram_lm_scale_0.7_attention_scale_2.0	17.04
ngram_lm_scale_0.7_attention_scale_1.9	17.06
ngram_lm_scale_0.7_attention_scale_1.7	17.09
ngram_lm_scale_0.6_attention_scale_0.7	17.1
ngram_lm_scale_0.7_attention_scale_1.5	17.15
ngram_lm_scale_0.9_attention_scale_5.0	17.18
ngram_lm_scale_0.7_attention_scale_1.3	17.2
ngram_lm_scale_0.6_attention_scale_0.6	17.23
ngram_lm_scale_0.7_attention_scale_1.2	17.24
ngram_lm_scale_0.9_attention_scale_4.0	17.24
ngram_lm_scale_0.7_attention_scale_1.1	17.26
ngram_lm_scale_1.0_attention_scale_5.0	17.28
ngram_lm_scale_0.6_attention_scale_0.5	17.31
ngram_lm_scale_0.7_attention_scale_1.0	17.33
ngram_lm_scale_0.9_attention_scale_3.0	17.35
ngram_lm_scale_1.0_attention_scale_4.0	17.38
ngram_lm_scale_1.1_attention_scale_5.0	17.43
ngram_lm_scale_0.7_attention_scale_0.9	17.45
ngram_lm_scale_0.9_attention_scale_2.5	17.47
ngram_lm_scale_0.9_attention_scale_2.3	17.53
ngram_lm_scale_1.1_attention_scale_4.0	17.55
ngram_lm_scale_0.9_attention_scale_2.2	17.57
ngram_lm_scale_1.2_attention_scale_5.0	17.57
ngram_lm_scale_1.0_attention_scale_3.0	17.59
ngram_lm_scale_0.9_attention_scale_2.1	17.61
ngram_lm_scale_0.9_attention_scale_2.0	17.64
ngram_lm_scale_0.5_attention_scale_0.1	17.65
ngram_lm_scale_0.7_attention_scale_0.7	17.67
ngram_lm_scale_0.9_attention_scale_1.9	17.69
ngram_lm_scale_1.3_attention_scale_5.0	17.69
ngram_lm_scale_1.2_attention_scale_4.0	17.73
ngram_lm_scale_0.5_attention_scale_0.08	17.76
ngram_lm_scale_1.0_attention_scale_2.5	17.76
ngram_lm_scale_0.6_attention_scale_0.3	17.78
ngram_lm_scale_0.7_attention_scale_0.6	17.8
ngram_lm_scale_1.0_attention_scale_2.3	17.81
ngram_lm_scale_1.1_attention_scale_3.0	17.82
ngram_lm_scale_0.9_attention_scale_1.7	17.83
ngram_lm_scale_1.0_attention_scale_2.2	17.87
ngram_lm_scale_1.3_attention_scale_4.0	17.9
ngram_lm_scale_1.0_attention_scale_2.1	17.94
ngram_lm_scale_0.5_attention_scale_0.05	17.95
ngram_lm_scale_0.9_attention_scale_1.5	17.96
ngram_lm_scale_1.1_attention_scale_2.5	17.97
ngram_lm_scale_1.5_attention_scale_5.0	17.99
ngram_lm_scale_1.2_attention_scale_3.0	18.0
ngram_lm_scale_1.0_attention_scale_2.0	18.01
ngram_lm_scale_1.0_attention_scale_1.9	18.04
ngram_lm_scale_0.7_attention_scale_0.5	18.09
ngram_lm_scale_1.1_attention_scale_2.3	18.12
ngram_lm_scale_0.9_attention_scale_1.3	18.14
ngram_lm_scale_1.1_attention_scale_2.2	18.21
ngram_lm_scale_1.0_attention_scale_1.7	18.23
ngram_lm_scale_0.9_attention_scale_1.2	18.25
ngram_lm_scale_0.5_attention_scale_0.01	18.27
ngram_lm_scale_1.1_attention_scale_2.1	18.27
ngram_lm_scale_1.3_attention_scale_3.0	18.32
ngram_lm_scale_1.5_attention_scale_4.0	18.35
ngram_lm_scale_1.7_attention_scale_5.0	18.36
ngram_lm_scale_1.1_attention_scale_2.0	18.37
ngram_lm_scale_1.2_attention_scale_2.5	18.37
ngram_lm_scale_0.9_attention_scale_1.1	18.38
ngram_lm_scale_1.0_attention_scale_1.5	18.44
ngram_lm_scale_1.1_attention_scale_1.9	18.47
ngram_lm_scale_1.2_attention_scale_2.3	18.5
ngram_lm_scale_0.9_attention_scale_1.0	18.57
ngram_lm_scale_1.2_attention_scale_2.2	18.59
ngram_lm_scale_1.1_attention_scale_1.7	18.69
ngram_lm_scale_0.9_attention_scale_0.9	18.7
ngram_lm_scale_1.2_attention_scale_2.1	18.7
ngram_lm_scale_1.0_attention_scale_1.3	18.71
ngram_lm_scale_1.3_attention_scale_2.5	18.71
ngram_lm_scale_0.7_attention_scale_0.3	18.76
ngram_lm_scale_0.6_attention_scale_0.1	18.78
ngram_lm_scale_1.9_attention_scale_5.0	18.78
ngram_lm_scale_1.2_attention_scale_2.0	18.81
ngram_lm_scale_1.7_attention_scale_4.0	18.83
ngram_lm_scale_1.0_attention_scale_1.2	18.88
ngram_lm_scale_1.3_attention_scale_2.3	18.91
ngram_lm_scale_1.2_attention_scale_1.9	18.95
ngram_lm_scale_1.5_attention_scale_3.0	18.95
ngram_lm_scale_0.6_attention_scale_0.08	18.96
ngram_lm_scale_2.0_attention_scale_5.0	18.96
ngram_lm_scale_1.1_attention_scale_1.5	18.97
ngram_lm_scale_1.0_attention_scale_1.1	19.04
ngram_lm_scale_1.3_attention_scale_2.2	19.08
ngram_lm_scale_1.3_attention_scale_2.1	19.17
ngram_lm_scale_2.1_attention_scale_5.0	19.18
ngram_lm_scale_0.6_attention_scale_0.05	19.21
ngram_lm_scale_0.9_attention_scale_0.7	19.27
ngram_lm_scale_1.2_attention_scale_1.7	19.28
ngram_lm_scale_1.3_attention_scale_2.0	19.34
ngram_lm_scale_1.9_attention_scale_4.0	19.36
ngram_lm_scale_1.0_attention_scale_1.0	19.4
ngram_lm_scale_2.2_attention_scale_5.0	19.41
ngram_lm_scale_1.1_attention_scale_1.3	19.45
ngram_lm_scale_1.3_attention_scale_1.9	19.53
ngram_lm_scale_1.5_attention_scale_2.5	19.54
ngram_lm_scale_1.0_attention_scale_0.9	19.63
ngram_lm_scale_0.6_attention_scale_0.01	19.64
ngram_lm_scale_0.9_attention_scale_0.6	19.66
ngram_lm_scale_1.1_attention_scale_1.2	19.67
ngram_lm_scale_1.2_attention_scale_1.5	19.67
ngram_lm_scale_2.3_attention_scale_5.0	19.68
ngram_lm_scale_2.0_attention_scale_4.0	19.72
ngram_lm_scale_1.7_attention_scale_3.0	19.77
ngram_lm_scale_1.5_attention_scale_2.3	19.91
ngram_lm_scale_1.1_attention_scale_1.1	19.94
ngram_lm_scale_1.3_attention_scale_1.7	19.94
ngram_lm_scale_2.1_attention_scale_4.0	20.03
ngram_lm_scale_1.5_attention_scale_2.2	20.1
ngram_lm_scale_1.2_attention_scale_1.3	20.19
ngram_lm_scale_0.9_attention_scale_0.5	20.2
ngram_lm_scale_0.7_attention_scale_0.1	20.26
ngram_lm_scale_1.1_attention_scale_1.0	20.26
ngram_lm_scale_2.5_attention_scale_5.0	20.27
ngram_lm_scale_1.5_attention_scale_2.1	20.3
ngram_lm_scale_2.2_attention_scale_4.0	20.38
ngram_lm_scale_1.3_attention_scale_1.5	20.4
ngram_lm_scale_1.0_attention_scale_0.7	20.47
ngram_lm_scale_1.2_attention_scale_1.2	20.48
ngram_lm_scale_1.5_attention_scale_2.0	20.51
ngram_lm_scale_0.7_attention_scale_0.08	20.53
ngram_lm_scale_1.7_attention_scale_2.5	20.58
ngram_lm_scale_1.1_attention_scale_0.9	20.61
ngram_lm_scale_1.9_attention_scale_3.0	20.67
ngram_lm_scale_1.5_attention_scale_1.9	20.76
ngram_lm_scale_1.2_attention_scale_1.1	20.8
ngram_lm_scale_0.7_attention_scale_0.05	20.83
ngram_lm_scale_2.3_attention_scale_4.0	20.84
ngram_lm_scale_1.0_attention_scale_0.6	20.96
ngram_lm_scale_1.7_attention_scale_2.3	21.03
ngram_lm_scale_1.3_attention_scale_1.3	21.05
ngram_lm_scale_2.0_attention_scale_3.0	21.15
ngram_lm_scale_1.5_attention_scale_1.7	21.25
ngram_lm_scale_1.2_attention_scale_1.0	21.26
ngram_lm_scale_1.7_attention_scale_2.2	21.27
ngram_lm_scale_0.7_attention_scale_0.01	21.36
ngram_lm_scale_1.3_attention_scale_1.2	21.39
ngram_lm_scale_1.7_attention_scale_2.1	21.55
ngram_lm_scale_2.5_attention_scale_4.0	21.61
ngram_lm_scale_1.0_attention_scale_0.5	21.62
ngram_lm_scale_2.1_attention_scale_3.0	21.68
ngram_lm_scale_1.1_attention_scale_0.7	21.69
ngram_lm_scale_0.9_attention_scale_0.3	21.71
ngram_lm_scale_1.2_attention_scale_0.9	21.75
ngram_lm_scale_1.9_attention_scale_2.5	21.79
ngram_lm_scale_1.7_attention_scale_2.0	21.83
ngram_lm_scale_1.3_attention_scale_1.1	21.84
ngram_lm_scale_3.0_attention_scale_5.0	21.94
ngram_lm_scale_1.5_attention_scale_1.5	22.0
ngram_lm_scale_1.7_attention_scale_1.9	22.2
ngram_lm_scale_2.2_attention_scale_3.0	22.2
ngram_lm_scale_1.9_attention_scale_2.3	22.28
ngram_lm_scale_1.3_attention_scale_1.0	22.33
ngram_lm_scale_2.0_attention_scale_2.5	22.33
ngram_lm_scale_1.1_attention_scale_0.6	22.38
ngram_lm_scale_1.9_attention_scale_2.2	22.6
ngram_lm_scale_2.3_attention_scale_3.0	22.7
ngram_lm_scale_1.3_attention_scale_0.9	22.96
ngram_lm_scale_1.5_attention_scale_1.3	22.96
ngram_lm_scale_2.1_attention_scale_2.5	22.98
ngram_lm_scale_1.9_attention_scale_2.1	22.99
ngram_lm_scale_2.0_attention_scale_2.3	22.99
ngram_lm_scale_1.7_attention_scale_1.7	23.0
ngram_lm_scale_1.2_attention_scale_0.7	23.04
ngram_lm_scale_1.1_attention_scale_0.5	23.26
ngram_lm_scale_2.0_attention_scale_2.2	23.34
ngram_lm_scale_1.9_attention_scale_2.0	23.37
ngram_lm_scale_1.5_attention_scale_1.2	23.5
ngram_lm_scale_1.0_attention_scale_0.3	23.52
ngram_lm_scale_2.2_attention_scale_2.5	23.63
ngram_lm_scale_2.1_attention_scale_2.3	23.69
ngram_lm_scale_2.0_attention_scale_2.1	23.74
ngram_lm_scale_1.9_attention_scale_1.9	23.8
ngram_lm_scale_3.0_attention_scale_4.0	23.81
ngram_lm_scale_1.2_attention_scale_0.6	23.91
ngram_lm_scale_1.7_attention_scale_1.5	23.91
ngram_lm_scale_2.5_attention_scale_3.0	23.93
ngram_lm_scale_2.1_attention_scale_2.2	24.09
ngram_lm_scale_1.5_attention_scale_1.1	24.15
ngram_lm_scale_2.0_attention_scale_2.0	24.15
ngram_lm_scale_0.9_attention_scale_0.1	24.2
ngram_lm_scale_2.3_attention_scale_2.5	24.35
ngram_lm_scale_2.2_attention_scale_2.3	24.38
ngram_lm_scale_2.1_attention_scale_2.1	24.5
ngram_lm_scale_1.3_attention_scale_0.7	24.53
ngram_lm_scale_0.9_attention_scale_0.08	24.55
ngram_lm_scale_2.0_attention_scale_1.9	24.62
ngram_lm_scale_1.2_attention_scale_0.5	24.77
ngram_lm_scale_1.9_attention_scale_1.7	24.77
ngram_lm_scale_1.5_attention_scale_1.0	24.85
ngram_lm_scale_2.2_attention_scale_2.2	24.85
ngram_lm_scale_2.1_attention_scale_2.0	24.98
ngram_lm_scale_0.9_attention_scale_0.05	25.09
ngram_lm_scale_1.7_attention_scale_1.3	25.1
ngram_lm_scale_2.3_attention_scale_2.3	25.13
ngram_lm_scale_2.2_attention_scale_2.1	25.26
ngram_lm_scale_1.1_attention_scale_0.3	25.41
ngram_lm_scale_1.3_attention_scale_0.6	25.44
ngram_lm_scale_2.1_attention_scale_1.9	25.45
ngram_lm_scale_1.5_attention_scale_0.9	25.59
ngram_lm_scale_2.3_attention_scale_2.2	25.61
ngram_lm_scale_2.0_attention_scale_1.7	25.68
ngram_lm_scale_4.0_attention_scale_5.0	25.73
ngram_lm_scale_1.7_attention_scale_1.2	25.82
ngram_lm_scale_2.5_attention_scale_2.5	25.82
ngram_lm_scale_2.2_attention_scale_2.0	25.84
ngram_lm_scale_0.9_attention_scale_0.01	25.88
ngram_lm_scale_1.9_attention_scale_1.5	26.0
ngram_lm_scale_2.3_attention_scale_2.1	26.21
ngram_lm_scale_2.2_attention_scale_1.9	26.47
ngram_lm_scale_1.0_attention_scale_0.1	26.5
ngram_lm_scale_1.3_attention_scale_0.5	26.6
ngram_lm_scale_1.7_attention_scale_1.1	26.6
ngram_lm_scale_2.1_attention_scale_1.7	26.77
ngram_lm_scale_2.3_attention_scale_2.0	26.77
ngram_lm_scale_2.5_attention_scale_2.3	26.85
ngram_lm_scale_1.0_attention_scale_0.08	26.86
ngram_lm_scale_2.0_attention_scale_1.5	27.06
ngram_lm_scale_3.0_attention_scale_3.0	27.06
ngram_lm_scale_1.9_attention_scale_1.3	27.31
ngram_lm_scale_1.7_attention_scale_1.0	27.35
ngram_lm_scale_2.5_attention_scale_2.2	27.35
ngram_lm_scale_2.3_attention_scale_1.9	27.37
ngram_lm_scale_1.2_attention_scale_0.3	27.4
ngram_lm_scale_1.0_attention_scale_0.05	27.43
ngram_lm_scale_1.5_attention_scale_0.7	27.59
ngram_lm_scale_2.2_attention_scale_1.7	27.63
ngram_lm_scale_2.5_attention_scale_2.1	27.83
ngram_lm_scale_2.1_attention_scale_1.5	27.96
ngram_lm_scale_1.9_attention_scale_1.2	28.08
ngram_lm_scale_1.0_attention_scale_0.01	28.25
ngram_lm_scale_1.7_attention_scale_0.9	28.3
ngram_lm_scale_2.5_attention_scale_2.0	28.31
ngram_lm_scale_2.0_attention_scale_1.3	28.4
ngram_lm_scale_2.3_attention_scale_1.7	28.44
ngram_lm_scale_4.0_attention_scale_4.0	28.62
ngram_lm_scale_1.1_attention_scale_0.1	28.76
ngram_lm_scale_1.5_attention_scale_0.6	28.76
ngram_lm_scale_2.5_attention_scale_1.9	28.86
ngram_lm_scale_1.9_attention_scale_1.1	28.89
ngram_lm_scale_2.2_attention_scale_1.5	28.92
ngram_lm_scale_1.1_attention_scale_0.08	29.15
ngram_lm_scale_2.0_attention_scale_1.2	29.15
ngram_lm_scale_3.0_attention_scale_2.5	29.22
ngram_lm_scale_2.1_attention_scale_1.3	29.35
ngram_lm_scale_1.3_attention_scale_0.3	29.44
ngram_lm_scale_5.0_attention_scale_5.0	29.61
ngram_lm_scale_2.3_attention_scale_1.5	29.69
ngram_lm_scale_1.9_attention_scale_1.0	29.75
ngram_lm_scale_1.1_attention_scale_0.05	29.77
ngram_lm_scale_1.5_attention_scale_0.5	29.94
ngram_lm_scale_2.0_attention_scale_1.1	29.96
ngram_lm_scale_2.5_attention_scale_1.7	30.06
ngram_lm_scale_2.1_attention_scale_1.2	30.1
ngram_lm_scale_3.0_attention_scale_2.3	30.14
ngram_lm_scale_2.2_attention_scale_1.3	30.29
ngram_lm_scale_1.7_attention_scale_0.7	30.3
ngram_lm_scale_1.1_attention_scale_0.01	30.59
ngram_lm_scale_3.0_attention_scale_2.2	30.62
ngram_lm_scale_1.9_attention_scale_0.9	30.64
ngram_lm_scale_2.0_attention_scale_1.0	30.76
ngram_lm_scale_1.2_attention_scale_0.1	30.86
ngram_lm_scale_2.1_attention_scale_1.1	30.88
ngram_lm_scale_2.2_attention_scale_1.2	30.98
ngram_lm_scale_2.3_attention_scale_1.3	31.04
ngram_lm_scale_3.0_attention_scale_2.1	31.1
ngram_lm_scale_1.2_attention_scale_0.08	31.19
ngram_lm_scale_2.5_attention_scale_1.5	31.23
ngram_lm_scale_1.7_attention_scale_0.6	31.43
ngram_lm_scale_3.0_attention_scale_2.0	31.56
ngram_lm_scale_2.0_attention_scale_0.9	31.68
ngram_lm_scale_2.1_attention_scale_1.0	31.71
ngram_lm_scale_1.2_attention_scale_0.05	31.72
ngram_lm_scale_2.2_attention_scale_1.1	31.73
ngram_lm_scale_2.3_attention_scale_1.2	31.79
ngram_lm_scale_4.0_attention_scale_3.0	31.95
ngram_lm_scale_3.0_attention_scale_1.9	32.03
ngram_lm_scale_5.0_attention_scale_4.0	32.15
ngram_lm_scale_1.2_attention_scale_0.01	32.34
ngram_lm_scale_2.1_attention_scale_0.9	32.53
ngram_lm_scale_2.3_attention_scale_1.1	32.53
ngram_lm_scale_2.5_attention_scale_1.3	32.53
ngram_lm_scale_1.9_attention_scale_0.7	32.54
ngram_lm_scale_1.7_attention_scale_0.5	32.55
ngram_lm_scale_2.2_attention_scale_1.0	32.55
ngram_lm_scale_1.3_attention_scale_0.1	32.56
ngram_lm_scale_1.5_attention_scale_0.3	32.57
ngram_lm_scale_1.3_attention_scale_0.08	32.82
ngram_lm_scale_3.0_attention_scale_1.7	33.11
ngram_lm_scale_1.3_attention_scale_0.05	33.24
ngram_lm_scale_2.5_attention_scale_1.2	33.24
ngram_lm_scale_2.3_attention_scale_1.0	33.28
ngram_lm_scale_2.0_attention_scale_0.7	33.33
ngram_lm_scale_2.2_attention_scale_0.9	33.33
ngram_lm_scale_1.9_attention_scale_0.6	33.39
ngram_lm_scale_4.0_attention_scale_2.5	33.71
ngram_lm_scale_1.3_attention_scale_0.01	33.8
ngram_lm_scale_2.5_attention_scale_1.1	33.84
ngram_lm_scale_2.3_attention_scale_0.9	33.98
ngram_lm_scale_3.0_attention_scale_1.5	34.1
ngram_lm_scale_2.1_attention_scale_0.7	34.13
|
1317 |
+
ngram_lm_scale_2.0_attention_scale_0.6 34.25
|
1318 |
+
ngram_lm_scale_4.0_attention_scale_2.3 34.31
|
1319 |
+
ngram_lm_scale_1.9_attention_scale_0.5 34.37
|
1320 |
+
ngram_lm_scale_2.5_attention_scale_1.0 34.49
|
1321 |
+
ngram_lm_scale_1.7_attention_scale_0.3 34.65
|
1322 |
+
ngram_lm_scale_4.0_attention_scale_2.2 34.68
|
1323 |
+
ngram_lm_scale_5.0_attention_scale_3.0 34.68
|
1324 |
+
ngram_lm_scale_2.2_attention_scale_0.7 34.85
|
1325 |
+
ngram_lm_scale_1.5_attention_scale_0.1 34.93
|
1326 |
+
ngram_lm_scale_2.1_attention_scale_0.6 34.96
|
1327 |
+
ngram_lm_scale_4.0_attention_scale_2.1 35.05
|
1328 |
+
ngram_lm_scale_2.0_attention_scale_0.5 35.1
|
1329 |
+
ngram_lm_scale_1.5_attention_scale_0.08 35.12
|
1330 |
+
ngram_lm_scale_3.0_attention_scale_1.3 35.12
|
1331 |
+
ngram_lm_scale_2.5_attention_scale_0.9 35.15
|
1332 |
+
ngram_lm_scale_4.0_attention_scale_2.0 35.36
|
1333 |
+
ngram_lm_scale_1.5_attention_scale_0.05 35.38
|
1334 |
+
ngram_lm_scale_2.3_attention_scale_0.7 35.4
|
1335 |
+
ngram_lm_scale_3.0_attention_scale_1.2 35.5
|
1336 |
+
ngram_lm_scale_2.2_attention_scale_0.6 35.56
|
1337 |
+
ngram_lm_scale_4.0_attention_scale_1.9 35.66
|
1338 |
+
ngram_lm_scale_2.1_attention_scale_0.5 35.7
|
1339 |
+
ngram_lm_scale_1.5_attention_scale_0.01 35.81
|
1340 |
+
ngram_lm_scale_3.0_attention_scale_1.1 35.88
|
1341 |
+
ngram_lm_scale_2.3_attention_scale_0.6 35.94
|
1342 |
+
ngram_lm_scale_5.0_attention_scale_2.5 35.98
|
1343 |
+
ngram_lm_scale_1.9_attention_scale_0.3 35.99
|
1344 |
+
ngram_lm_scale_2.2_attention_scale_0.5 36.09
|
1345 |
+
ngram_lm_scale_2.5_attention_scale_0.7 36.09
|
1346 |
+
ngram_lm_scale_4.0_attention_scale_1.7 36.2
|
1347 |
+
ngram_lm_scale_3.0_attention_scale_1.0 36.26
|
1348 |
+
ngram_lm_scale_1.7_attention_scale_0.1 36.33
|
1349 |
+
ngram_lm_scale_5.0_attention_scale_2.3 36.37
|
1350 |
+
ngram_lm_scale_2.3_attention_scale_0.5 36.45
|
1351 |
+
ngram_lm_scale_2.0_attention_scale_0.3 36.46
|
1352 |
+
ngram_lm_scale_1.7_attention_scale_0.08 36.48
|
1353 |
+
ngram_lm_scale_5.0_attention_scale_2.2 36.55
|
1354 |
+
ngram_lm_scale_2.5_attention_scale_0.6 36.59
|
1355 |
+
ngram_lm_scale_1.7_attention_scale_0.05 36.66
|
1356 |
+
ngram_lm_scale_3.0_attention_scale_0.9 36.66
|
1357 |
+
ngram_lm_scale_4.0_attention_scale_1.5 36.74
|
1358 |
+
ngram_lm_scale_5.0_attention_scale_2.1 36.77
|
1359 |
+
ngram_lm_scale_2.1_attention_scale_0.3 36.82
|
1360 |
+
ngram_lm_scale_1.7_attention_scale_0.01 36.93
|
1361 |
+
ngram_lm_scale_5.0_attention_scale_2.0 36.99
|
1362 |
+
ngram_lm_scale_2.5_attention_scale_0.5 37.06
|
1363 |
+
ngram_lm_scale_2.2_attention_scale_0.3 37.11
|
1364 |
+
ngram_lm_scale_5.0_attention_scale_1.9 37.22
|
1365 |
+
ngram_lm_scale_1.9_attention_scale_0.1 37.25
|
1366 |
+
ngram_lm_scale_4.0_attention_scale_1.3 37.3
|
1367 |
+
ngram_lm_scale_1.9_attention_scale_0.08 37.35
|
1368 |
+
ngram_lm_scale_2.3_attention_scale_0.3 37.42
|
1369 |
+
ngram_lm_scale_3.0_attention_scale_0.7 37.44
|
1370 |
+
ngram_lm_scale_1.9_attention_scale_0.05 37.55
|
1371 |
+
ngram_lm_scale_2.0_attention_scale_0.1 37.58
|
1372 |
+
ngram_lm_scale_4.0_attention_scale_1.2 37.6
|
1373 |
+
ngram_lm_scale_5.0_attention_scale_1.7 37.65
|
1374 |
+
ngram_lm_scale_2.0_attention_scale_0.08 37.7
|
1375 |
+
ngram_lm_scale_1.9_attention_scale_0.01 37.78
|
1376 |
+
ngram_lm_scale_3.0_attention_scale_0.6 37.8
|
1377 |
+
ngram_lm_scale_4.0_attention_scale_1.1 37.84
|
1378 |
+
ngram_lm_scale_2.1_attention_scale_0.1 37.86
|
1379 |
+
ngram_lm_scale_2.0_attention_scale_0.05 37.87
|
1380 |
+
ngram_lm_scale_2.5_attention_scale_0.3 37.92
|
1381 |
+
ngram_lm_scale_2.1_attention_scale_0.08 37.96
|
1382 |
+
ngram_lm_scale_5.0_attention_scale_1.5 38.04
|
1383 |
+
ngram_lm_scale_2.0_attention_scale_0.01 38.07
|
1384 |
+
ngram_lm_scale_4.0_attention_scale_1.0 38.07
|
1385 |
+
ngram_lm_scale_2.2_attention_scale_0.1 38.1
|
1386 |
+
ngram_lm_scale_2.1_attention_scale_0.05 38.11
|
1387 |
+
ngram_lm_scale_3.0_attention_scale_0.5 38.14
|
1388 |
+
ngram_lm_scale_2.2_attention_scale_0.08 38.23
|
1389 |
+
ngram_lm_scale_2.1_attention_scale_0.01 38.29
|
1390 |
+
ngram_lm_scale_4.0_attention_scale_0.9 38.33
|
1391 |
+
ngram_lm_scale_2.2_attention_scale_0.05 38.36
|
1392 |
+
ngram_lm_scale_2.3_attention_scale_0.1 38.37
|
1393 |
+
ngram_lm_scale_5.0_attention_scale_1.3 38.42
|
1394 |
+
ngram_lm_scale_2.3_attention_scale_0.08 38.44
|
1395 |
+
ngram_lm_scale_2.2_attention_scale_0.01 38.52
|
1396 |
+
ngram_lm_scale_2.3_attention_scale_0.05 38.58
|
1397 |
+
ngram_lm_scale_5.0_attention_scale_1.2 38.58
|
1398 |
+
ngram_lm_scale_2.5_attention_scale_0.1 38.71
|
1399 |
+
ngram_lm_scale_2.3_attention_scale_0.01 38.73
|
1400 |
+
ngram_lm_scale_3.0_attention_scale_0.3 38.75
|
1401 |
+
ngram_lm_scale_5.0_attention_scale_1.1 38.77
|
1402 |
+
ngram_lm_scale_2.5_attention_scale_0.08 38.8
|
1403 |
+
ngram_lm_scale_4.0_attention_scale_0.7 38.8
|
1404 |
+
ngram_lm_scale_2.5_attention_scale_0.05 38.87
|
1405 |
+
ngram_lm_scale_5.0_attention_scale_1.0 38.96
|
1406 |
+
ngram_lm_scale_4.0_attention_scale_0.6 38.99
|
1407 |
+
ngram_lm_scale_2.5_attention_scale_0.01 39.05
|
1408 |
+
ngram_lm_scale_5.0_attention_scale_0.9 39.14
|
1409 |
+
ngram_lm_scale_4.0_attention_scale_0.5 39.18
|
1410 |
+
ngram_lm_scale_3.0_attention_scale_0.1 39.3
|
1411 |
+
ngram_lm_scale_3.0_attention_scale_0.08 39.37
|
1412 |
+
ngram_lm_scale_3.0_attention_scale_0.05 39.45
|
1413 |
+
ngram_lm_scale_5.0_attention_scale_0.7 39.48
|
1414 |
+
ngram_lm_scale_3.0_attention_scale_0.01 39.55
|
1415 |
+
ngram_lm_scale_4.0_attention_scale_0.3 39.6
|
1416 |
+
ngram_lm_scale_5.0_attention_scale_0.6 39.65
|
1417 |
+
ngram_lm_scale_5.0_attention_scale_0.5 39.77
|
1418 |
+
ngram_lm_scale_4.0_attention_scale_0.1 39.96
|
1419 |
+
ngram_lm_scale_4.0_attention_scale_0.08 40.0
|
1420 |
+
ngram_lm_scale_5.0_attention_scale_0.3 40.03
|
1421 |
+
ngram_lm_scale_4.0_attention_scale_0.05 40.07
|
1422 |
+
ngram_lm_scale_4.0_attention_scale_0.01 40.12
|
1423 |
+
ngram_lm_scale_5.0_attention_scale_0.1 40.32
|
1424 |
+
ngram_lm_scale_5.0_attention_scale_0.08 40.34
|
1425 |
+
ngram_lm_scale_5.0_attention_scale_0.05 40.38
|
1426 |
+
ngram_lm_scale_5.0_attention_scale_0.01 40.45
|
1427 |
+
|
1428 |
+
2022-06-26 15:43:53,296 INFO [decode.py:695] Done!
|
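The table above is the result of a plain grid search: during attention-decoder rescoring, each candidate path is scored by combining its lattice score with a 4-gram LM score weighted by ngram_lm_scale and an attention-decoder score weighted by attention_scale, and the WER is then reported for every pair of scales. A minimal sketch of that sweep, assuming a hypothetical per-path record with precomputed scores rather than icefall's actual data structures:

from dataclasses import dataclass
from itertools import product

@dataclass
class Path:               # hypothetical n-best path record
    utt_id: str
    words: str
    am_score: float       # score of the path in the HLG decoding lattice
    ngram_lm_score: float # 4-gram LM score of the word sequence
    att_score: float      # attention-decoder log-probability

def rescore(paths, lm_scale, att_scale):
    """Pick the best path per utterance for one (lm, attention) scale pair."""
    best = {}
    for p in paths:
        total = p.am_score + lm_scale * p.ngram_lm_score + att_scale * p.att_score
        if p.utt_id not in best or total > best[p.utt_id][0]:
            best[p.utt_id] = (total, p.words)
    return {u: w for u, (_, w) in best.items()}

def sweep(paths, scales):
    """Yield hypotheses for every scale pair, mirroring the grid above."""
    for lm_scale, att_scale in product(scales, scales):
        yield (lm_scale, att_scale), rescore(paths, lm_scale, att_scale)

Sweeping all pairs is cheap because the per-path scores are computed once; only the linear combination changes between entries of the table.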
decoding-results/log-attention-decoder/log-decode-2022-06-27-18-54-02
ADDED
@@ -0,0 +1,6 @@
+2022-06-27 18:54:02,244 INFO [decode.py:548] Decoding started
+2022-06-27 18:54:02,245 INFO [decode.py:549] {'subsampling_factor': 4, 'vgg_frontend': False, 'use_feat_batchnorm': True, 'feature_dim': 80, 'nhead': 8, 'attention_dim': 512, 'num_decoder_layers': 6, 'search_beam': 20, 'output_beam': 8, 'min_active_states': 30, 'max_active_states': 5000, 'use_double_scores': True, 'env_info': {'k2-version': '1.16', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '3c606c27045750bbbb7a289d8b2b09825dea521a', 'k2-git-date': 'Mon Jun 27 03:06:58 2022', 'lhotse-version': '1.3.0.dev+git.a07121a.clean', 'torch-version': '1.7.1', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'test', 'icefall-git-sha1': 'e24e6ac-dirty', 'icefall-git-date': 'Mon Jun 27 01:23:06 2022', 'icefall-path': '/alt-arabic/speech/amir/k2/tmp/icefall', 'k2-path': '/alt-arabic/speech/amir/k2/tmp/k2/k2/python/k2/__init__.py', 'lhotse-path': '/alt-arabic/speech/amir/k2/tmp/lhotse/lhotse/__init__.py', 'hostname': 'crimv3srv031', 'IP address': '10.141.0.13'}, 'epoch': 45, 'avg': 10, 'method': 'attention-decoder', 'num_paths': 1000, 'nbest_scale': 0.5, 'exp_dir': PosixPath('conformer_ctc/exp_5000_att0.8'), 'lang_dir': PosixPath('data/lang_bpe_5000'), 'lm_dir': PosixPath('data/lm'), 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 30, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': False, 'drop_last': True, 'return_cuts': True, 'num_workers': 20, 'enable_spec_aug': False, 'spec_aug_time_warp_factor': 80, 'enable_musan': False}
+2022-06-27 18:54:02,500 INFO [lexicon.py:177] Loading pre-compiled data/lang_bpe_5000/Linv.pt
+2022-06-27 18:54:02,544 INFO [decode.py:559] device: cuda:0
+2022-06-27 18:54:33,545 INFO [decode.py:621] Loading pre-compiled G_4_gram.pt
+2022-06-27 18:54:34,545 INFO [decode.py:657] averaging ['conformer_ctc/exp_5000_att0.8/epoch-36.pt', 'conformer_ctc/exp_5000_att0.8/epoch-37.pt', 'conformer_ctc/exp_5000_att0.8/epoch-38.pt', 'conformer_ctc/exp_5000_att0.8/epoch-39.pt', 'conformer_ctc/exp_5000_att0.8/epoch-40.pt', 'conformer_ctc/exp_5000_att0.8/epoch-41.pt', 'conformer_ctc/exp_5000_att0.8/epoch-42.pt', 'conformer_ctc/exp_5000_att0.8/epoch-43.pt', 'conformer_ctc/exp_5000_att0.8/epoch-44.pt', 'conformer_ctc/exp_5000_att0.8/epoch-45.pt']
decoding-results/log-attention-decoder/log-decode-2022-06-27-19-04-48
ADDED
@@ -0,0 +1,1308 @@
+2022-06-27 19:04:48,400 INFO [decode.py:548] Decoding started
+2022-06-27 19:04:48,401 INFO [decode.py:549] {'subsampling_factor': 4, 'vgg_frontend': False, 'use_feat_batchnorm': True, 'feature_dim': 80, 'nhead': 8, 'attention_dim': 512, 'num_decoder_layers': 6, 'search_beam': 20, 'output_beam': 8, 'min_active_states': 30, 'max_active_states': 5000, 'use_double_scores': True, 'env_info': {'k2-version': '1.16', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '3c606c27045750bbbb7a289d8b2b09825dea521a', 'k2-git-date': 'Mon Jun 27 03:06:58 2022', 'lhotse-version': '1.3.0.dev+git.a07121a.clean', 'torch-version': '1.7.1', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'test', 'icefall-git-sha1': 'e24e6ac-dirty', 'icefall-git-date': 'Mon Jun 27 01:23:06 2022', 'icefall-path': '/alt-arabic/speech/amir/k2/tmp/icefall', 'k2-path': '/alt-arabic/speech/amir/k2/tmp/k2/k2/python/k2/__init__.py', 'lhotse-path': '/alt-arabic/speech/amir/k2/tmp/lhotse/lhotse/__init__.py', 'hostname': 'crimv3mgpu016', 'IP address': '10.141.0.3'}, 'epoch': 45, 'avg': 5, 'method': 'attention-decoder', 'num_paths': 1000, 'nbest_scale': 0.5, 'exp_dir': PosixPath('conformer_ctc/exp_5000_att0.8'), 'lang_dir': PosixPath('data/lang_bpe_5000'), 'lm_dir': PosixPath('data/lm'), 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 30, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': False, 'drop_last': True, 'return_cuts': True, 'num_workers': 20, 'enable_spec_aug': False, 'spec_aug_time_warp_factor': 80, 'enable_musan': False}
+2022-06-27 19:04:48,655 INFO [lexicon.py:177] Loading pre-compiled data/lang_bpe_5000/Linv.pt
+2022-06-27 19:04:48,686 INFO [decode.py:559] device: cuda:0
+2022-06-27 19:05:17,818 INFO [decode.py:621] Loading pre-compiled G_4_gram.pt
+2022-06-27 19:05:18,546 INFO [decode.py:657] averaging ['conformer_ctc/exp_5000_att0.8/epoch-41.pt', 'conformer_ctc/exp_5000_att0.8/epoch-42.pt', 'conformer_ctc/exp_5000_att0.8/epoch-43.pt', 'conformer_ctc/exp_5000_att0.8/epoch-44.pt', 'conformer_ctc/exp_5000_att0.8/epoch-45.pt']
+2022-06-27 19:05:21,863 INFO [decode.py:664] Number of model parameters: 90786736
+2022-06-27 19:05:21,864 INFO [asr_datamodule.py:362] About to get test cuts
+2022-06-27 19:05:21,867 INFO [asr_datamodule.py:357] About to get dev cuts
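The "averaging [...]" line above shows that the script first averages the parameters of the last --avg checkpoints (epochs 41-45 here) before decoding. A rough sketch of that element-wise averaging, assuming each checkpoint stores its weights under a "model" key as icefall checkpoints do; the real helper is icefall.checkpoint.average_checkpoints, which may differ in detail:

import torch

def average_checkpoints(filenames, device=torch.device("cpu")):
    """Element-wise mean of the model parameters over several checkpoints."""
    n = len(filenames)
    avg = torch.load(filenames[0], map_location=device)["model"]
    for f in filenames[1:]:
        state = torch.load(f, map_location=device)["model"]
        for k in avg:
            avg[k] += state[k]
    for k in avg:
        if avg[k].is_floating_point():
            avg[k] /= n
        else:
            avg[k] //= n  # integer buffers: floor division keeps the dtype
    return avg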
+2022-06-27 19:05:24,421 INFO [decode.py:483] batch 0/?, cuts processed until now is 4
+2022-06-27 19:07:22,224 INFO [decode.py:783] Caught exception:
+CUDA out of memory. Tried to allocate 1.70 GiB (GPU 0; 31.75 GiB total capacity; 27.32 GiB already allocated; 470.50 MiB free; 30.09 GiB reserved in total by PyTorch)
+Exception raised from malloc at /pytorch/c10/cuda/CUDACachingAllocator.cpp:272 (most recent call first):
+frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x42 (0x2aab0258d8b2 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10.so)
+frame #1: <unknown function> + 0x2021b (0x2aab0232721b in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #2: <unknown function> + 0x21034 (0x2aab02328034 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #3: <unknown function> + 0x2167d (0x2aab0232867d in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #4: k2::PytorchCudaContext::Allocate(unsigned long, void**) + 0x3a (0x2aab1173401a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #5: k2::NewRegion(std::shared_ptr<k2::Context>, unsigned long) + 0x112 (0x2aab11465b72 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #6: k2::Array1<int>::Init(std::shared_ptr<k2::Context>, int, k2::Dtype) + 0x71 (0x2aab11432f51 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #7: <unknown function> + 0x2472bd (0x2aab115c32bd in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #8: k2::RaggedShapeFromTotSizes(std::shared_ptr<k2::Context>, int, int const*) + 0x213 (0x2aab115c3b83 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #9: k2::IndexAxis0(k2::RaggedShape&, k2::Array1<int> const&, k2::Array1<int>*) + 0x32c (0x2aab115d77ec in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #10: k2::Index(k2::RaggedShape&, int, k2::Array1<int> const&, k2::Array1<int>*) + 0x353 (0x2aab115dc943 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #11: k2::Ragged<k2::Arc> k2::DeviceIntersector::FormatOutputTpl<k2::Hash::PackedAccessor>(k2::Array1<int>*, k2::Array1<int>*) + 0x407 (0x2aab11552327 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #12: k2::IntersectDevice(k2::Ragged<k2::Arc>&, int, k2::Ragged<k2::Arc>&, int, k2::Array1<int> const&, k2::Array1<int>*, k2::Array1<int>*, bool) + 0x3a2 (0x2aab11545682 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #13: <unknown function> + 0x8eb5a (0x2aab1032cb5a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
+frame #14: <unknown function> + 0x3628c (0x2aab102d428c in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
+<omitting python frames>
+frame #44: __libc_start_main + 0xf5 (0x2aaaab616555 in /lib64/libc.so.6)
+
+
+2022-06-27 19:07:22,225 INFO [decode.py:789] num_arcs before pruning: 940457
+2022-06-27 19:07:22,225 INFO [decode.py:792] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+2022-06-27 19:07:22,238 INFO [decode.py:803] num_arcs after pruning: 8198
+2022-06-27 19:08:00,940 INFO [decode.py:483] batch 100/?, cuts processed until now is 407
+2022-06-27 19:10:32,748 INFO [decode.py:483] batch 200/?, cuts processed until now is 839
+2022-06-27 19:12:53,794 INFO [decode.py:483] batch 300/?, cuts processed until now is 1272
+2022-06-27 19:15:03,281 INFO [decode.py:483] batch 400/?, cuts processed until now is 1702
+2022-06-27 19:17:04,528 INFO [decode.py:483] batch 500/?, cuts processed until now is 2109
+2022-06-27 19:19:19,810 INFO [decode.py:483] batch 600/?, cuts processed until now is 2544
+2022-06-27 19:21:50,925 INFO [decode.py:483] batch 700/?, cuts processed until now is 2978
+2022-06-27 19:24:17,295 INFO [decode.py:483] batch 800/?, cuts processed until now is 3384
+2022-06-27 19:26:40,070 INFO [decode.py:483] batch 900/?, cuts processed until now is 3811
+2022-06-27 19:28:46,514 INFO [decode.py:783] Caught exception:
+CUDA out of memory. Tried to allocate 1.67 GiB (GPU 0; 31.75 GiB total capacity; 27.27 GiB already allocated; 990.50 MiB free; 29.58 GiB reserved in total by PyTorch)
+Exception raised from malloc at /pytorch/c10/cuda/CUDACachingAllocator.cpp:272 (most recent call first):
+frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x42 (0x2aab0258d8b2 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10.so)
+frame #1: <unknown function> + 0x2021b (0x2aab0232721b in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #2: <unknown function> + 0x21034 (0x2aab02328034 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #3: <unknown function> + 0x2167d (0x2aab0232867d in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #4: k2::PytorchCudaContext::Allocate(unsigned long, void**) + 0x3a (0x2aab1173401a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #5: k2::NewRegion(std::shared_ptr<k2::Context>, unsigned long) + 0x112 (0x2aab11465b72 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #6: k2::Array1<int>::Init(std::shared_ptr<k2::Context>, int, k2::Dtype) + 0x71 (0x2aab11432f51 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #7: <unknown function> + 0x2472bd (0x2aab115c32bd in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #8: k2::RaggedShapeFromTotSizes(std::shared_ptr<k2::Context>, int, int const*) + 0x213 (0x2aab115c3b83 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #9: k2::IndexAxis0(k2::RaggedShape&, k2::Array1<int> const&, k2::Array1<int>*) + 0x32c (0x2aab115d77ec in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #10: k2::Index(k2::RaggedShape&, int, k2::Array1<int> const&, k2::Array1<int>*) + 0x353 (0x2aab115dc943 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #11: k2::Ragged<k2::Arc> k2::DeviceIntersector::FormatOutputTpl<k2::Hash::PackedAccessor>(k2::Array1<int>*, k2::Array1<int>*) + 0x407 (0x2aab11552327 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #12: k2::IntersectDevice(k2::Ragged<k2::Arc>&, int, k2::Ragged<k2::Arc>&, int, k2::Array1<int> const&, k2::Array1<int>*, k2::Array1<int>*, bool) + 0x3a2 (0x2aab11545682 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #13: <unknown function> + 0x8eb5a (0x2aab1032cb5a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
+frame #14: <unknown function> + 0x3628c (0x2aab102d428c in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
+<omitting python frames>
+frame #44: __libc_start_main + 0xf5 (0x2aaaab616555 in /lib64/libc.so.6)
+
+
+2022-06-27 19:28:46,515 INFO [decode.py:789] num_arcs before pruning: 1034414
+2022-06-27 19:28:46,515 INFO [decode.py:792] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+2022-06-27 19:28:46,527 INFO [decode.py:803] num_arcs after pruning: 5251
+2022-06-27 19:29:03,366 INFO [decode.py:483] batch 1000/?, cuts processed until now is 4220
+2022-06-27 19:31:12,021 INFO [decode.py:483] batch 1100/?, cuts processed until now is 4631
+2022-06-27 19:33:19,271 INFO [decode.py:483] batch 1200/?, cuts processed until now is 5033
+2022-06-27 19:35:04,562 INFO [decode.py:783] Caught exception:
+CUDA out of memory. Tried to allocate 1.23 GiB (GPU 0; 31.75 GiB total capacity; 26.53 GiB already allocated; 1010.50 MiB free; 29.56 GiB reserved in total by PyTorch)
+Exception raised from malloc at /pytorch/c10/cuda/CUDACachingAllocator.cpp:272 (most recent call first):
+frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x42 (0x2aab0258d8b2 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10.so)
+frame #1: <unknown function> + 0x2021b (0x2aab0232721b in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #2: <unknown function> + 0x21034 (0x2aab02328034 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #3: <unknown function> + 0x2167d (0x2aab0232867d in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #4: k2::PytorchCudaContext::Allocate(unsigned long, void**) + 0x3a (0x2aab1173401a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #5: k2::NewRegion(std::shared_ptr<k2::Context>, unsigned long) + 0x112 (0x2aab11465b72 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #6: k2::Array1<int>::Init(std::shared_ptr<k2::Context>, int, k2::Dtype) + 0x71 (0x2aab11432f51 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #7: <unknown function> + 0x2472bd (0x2aab115c32bd in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #8: k2::IndexAxis0(k2::RaggedShape&, k2::Array1<int> const&, k2::Array1<int>*) + 0x2d9 (0x2aab115d7799 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #9: k2::Index(k2::RaggedShape&, int, k2::Array1<int> const&, k2::Array1<int>*) + 0x353 (0x2aab115dc943 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #10: k2::Ragged<k2::Arc> k2::DeviceIntersector::FormatOutputTpl<k2::Hash::PackedAccessor>(k2::Array1<int>*, k2::Array1<int>*) + 0x407 (0x2aab11552327 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #11: k2::IntersectDevice(k2::Ragged<k2::Arc>&, int, k2::Ragged<k2::Arc>&, int, k2::Array1<int> const&, k2::Array1<int>*, k2::Array1<int>*, bool) + 0x3a2 (0x2aab11545682 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #12: <unknown function> + 0x8eb5a (0x2aab1032cb5a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
+frame #13: <unknown function> + 0x3628c (0x2aab102d428c in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
+<omitting python frames>
+frame #43: __libc_start_main + 0xf5 (0x2aaaab616555 in /lib64/libc.so.6)
+
+
+2022-06-27 19:35:04,563 INFO [decode.py:789] num_arcs before pruning: 1081951
+2022-06-27 19:35:04,563 INFO [decode.py:792] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+2022-06-27 19:35:04,576 INFO [decode.py:803] num_arcs after pruning: 6154
+2022-06-27 19:35:29,511 INFO [decode.py:483] batch 1300/?, cuts processed until now is 5355
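The "Caught exception" / "num_arcs before pruning" / "num_arcs after pruning" triples above record a recovery path, not a failure: when composing the lattice with G_4_gram runs out of GPU memory, the lattice is pruned and the intersection is retried, which is why decoding continues after each OOM. A simplified sketch of that pattern, assuming k2's intersect_device and prune_on_arc_post; the threshold list and function name are illustrative, not the exact icefall code:

import logging
import k2

def intersect_with_retry(inv_lattice, G, b_to_a_map,
                         prune_th_list=(1e-10, 3e-10, 1e-9, 3e-9, 1e-8)):
    """Compose G with the inverted lattice, pruning and retrying on CUDA OOM."""
    for prune_th in prune_th_list:
        try:
            return k2.intersect_device(
                G, inv_lattice, b_to_a_map=b_to_a_map, sorted_match_a=True
            )
        except RuntimeError as e:
            logging.info(f"Caught exception:\n{e}")
            logging.info(
                f"num_arcs before pruning: {inv_lattice.arcs.num_elements()}"
            )
            # Drop low-posterior arcs so that the retry fits in GPU memory.
            inv_lattice = k2.prune_on_arc_post(inv_lattice, prune_th, True)
            logging.info(
                f"num_arcs after pruning: {inv_lattice.arcs.num_elements()}"
            )
    raise RuntimeError("intersection still runs out of memory after pruning")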
2022-06-27 19:39:19,317 INFO [decode.py:532]
|
99 |
+
For test, WER of different settings are:
|
100 |
+
ngram_lm_scale_0.01_attention_scale_0.3 15.08 best for test
|
101 |
+
ngram_lm_scale_0.01_attention_scale_0.5 15.1
|
102 |
+
ngram_lm_scale_0.01_attention_scale_0.6 15.1
|
103 |
+
ngram_lm_scale_0.01_attention_scale_0.7 15.11
|
104 |
+
ngram_lm_scale_0.05_attention_scale_0.3 15.11
|
105 |
+
ngram_lm_scale_0.05_attention_scale_0.5 15.11
|
106 |
+
ngram_lm_scale_0.01_attention_scale_0.9 15.13
|
107 |
+
ngram_lm_scale_0.05_attention_scale_0.6 15.13
|
108 |
+
ngram_lm_scale_0.05_attention_scale_0.7 15.13
|
109 |
+
ngram_lm_scale_0.08_attention_scale_0.5 15.13
|
110 |
+
ngram_lm_scale_0.08_attention_scale_0.3 15.14
|
111 |
+
ngram_lm_scale_0.08_attention_scale_0.6 15.14
|
112 |
+
ngram_lm_scale_0.1_attention_scale_0.3 15.14
|
113 |
+
ngram_lm_scale_0.01_attention_scale_1.0 15.15
|
114 |
+
ngram_lm_scale_0.08_attention_scale_0.7 15.15
|
115 |
+
ngram_lm_scale_0.1_attention_scale_0.6 15.16
|
116 |
+
ngram_lm_scale_0.1_attention_scale_0.5 15.17
|
117 |
+
ngram_lm_scale_0.1_attention_scale_0.7 15.17
|
118 |
+
ngram_lm_scale_0.01_attention_scale_1.1 15.18
|
119 |
+
ngram_lm_scale_0.05_attention_scale_0.9 15.18
|
120 |
+
ngram_lm_scale_0.01_attention_scale_0.1 15.19
|
121 |
+
ngram_lm_scale_0.05_attention_scale_1.0 15.19
|
122 |
+
ngram_lm_scale_0.08_attention_scale_0.9 15.19
|
123 |
+
ngram_lm_scale_0.01_attention_scale_1.2 15.2
|
124 |
+
ngram_lm_scale_0.05_attention_scale_0.1 15.2
|
125 |
+
ngram_lm_scale_0.1_attention_scale_0.9 15.2
|
126 |
+
ngram_lm_scale_0.05_attention_scale_1.1 15.21
|
127 |
+
ngram_lm_scale_0.01_attention_scale_0.08 15.22
|
128 |
+
ngram_lm_scale_0.01_attention_scale_1.3 15.22
|
129 |
+
ngram_lm_scale_0.08_attention_scale_1.0 15.22
|
130 |
+
ngram_lm_scale_0.01_attention_scale_0.05 15.23
|
131 |
+
ngram_lm_scale_0.05_attention_scale_0.08 15.23
|
132 |
+
ngram_lm_scale_0.05_attention_scale_1.2 15.23
|
133 |
+
ngram_lm_scale_0.08_attention_scale_0.1 15.23
|
134 |
+
ngram_lm_scale_0.08_attention_scale_1.1 15.23
|
135 |
+
ngram_lm_scale_0.1_attention_scale_1.0 15.23
|
136 |
+
ngram_lm_scale_0.05_attention_scale_0.05 15.24
|
137 |
+
ngram_lm_scale_0.1_attention_scale_0.1 15.24
|
138 |
+
ngram_lm_scale_0.1_attention_scale_1.1 15.24
|
139 |
+
ngram_lm_scale_0.05_attention_scale_1.3 15.25
|
140 |
+
ngram_lm_scale_0.08_attention_scale_0.08 15.25
|
141 |
+
ngram_lm_scale_0.08_attention_scale_1.2 15.25
|
142 |
+
ngram_lm_scale_0.01_attention_scale_1.5 15.26
|
143 |
+
ngram_lm_scale_0.08_attention_scale_1.3 15.26
|
144 |
+
ngram_lm_scale_0.05_attention_scale_1.5 15.27
|
145 |
+
ngram_lm_scale_0.1_attention_scale_0.08 15.27
|
146 |
+
ngram_lm_scale_0.01_attention_scale_0.01 15.28
|
147 |
+
ngram_lm_scale_0.08_attention_scale_0.05 15.28
|
148 |
+
ngram_lm_scale_0.1_attention_scale_1.2 15.28
|
149 |
+
ngram_lm_scale_0.1_attention_scale_1.3 15.28
|
150 |
+
ngram_lm_scale_0.01_attention_scale_1.7 15.29
|
151 |
+
ngram_lm_scale_0.08_attention_scale_1.5 15.3
|
152 |
+
ngram_lm_scale_0.1_attention_scale_1.5 15.3
|
153 |
+
ngram_lm_scale_0.05_attention_scale_0.01 15.31
|
154 |
+
ngram_lm_scale_0.05_attention_scale_1.7 15.31
|
155 |
+
ngram_lm_scale_0.1_attention_scale_0.05 15.32
|
156 |
+
ngram_lm_scale_0.08_attention_scale_0.01 15.33
|
157 |
+
ngram_lm_scale_0.08_attention_scale_1.7 15.33
|
158 |
+
ngram_lm_scale_0.01_attention_scale_1.9 15.34
|
159 |
+
ngram_lm_scale_0.01_attention_scale_2.0 15.34
|
160 |
+
ngram_lm_scale_0.1_attention_scale_0.01 15.35
|
161 |
+
ngram_lm_scale_0.01_attention_scale_2.1 15.37
|
162 |
+
ngram_lm_scale_0.05_attention_scale_1.9 15.37
|
163 |
+
ngram_lm_scale_0.3_attention_scale_0.5 15.37
|
164 |
+
ngram_lm_scale_0.3_attention_scale_0.6 15.37
|
165 |
+
ngram_lm_scale_0.05_attention_scale_2.0 15.38
|
166 |
+
ngram_lm_scale_0.1_attention_scale_1.7 15.38
|
167 |
+
ngram_lm_scale_0.3_attention_scale_0.7 15.38
|
168 |
+
ngram_lm_scale_0.01_attention_scale_2.2 15.39
|
169 |
+
ngram_lm_scale_0.08_attention_scale_1.9 15.39
|
170 |
+
ngram_lm_scale_0.3_attention_scale_0.9 15.39
|
171 |
+
ngram_lm_scale_0.01_attention_scale_2.3 15.4
|
172 |
+
ngram_lm_scale_0.01_attention_scale_2.5 15.4
|
173 |
+
ngram_lm_scale_0.05_attention_scale_2.1 15.4
|
174 |
+
ngram_lm_scale_0.05_attention_scale_2.2 15.41
|
175 |
+
ngram_lm_scale_0.05_attention_scale_2.3 15.41
|
176 |
+
ngram_lm_scale_0.08_attention_scale_2.0 15.41
|
177 |
+
ngram_lm_scale_0.1_attention_scale_1.9 15.41
|
178 |
+
ngram_lm_scale_0.05_attention_scale_2.5 15.42
|
179 |
+
ngram_lm_scale_0.08_attention_scale_2.1 15.42
|
180 |
+
ngram_lm_scale_0.08_attention_scale_2.2 15.42
|
181 |
+
ngram_lm_scale_0.08_attention_scale_2.3 15.43
|
182 |
+
ngram_lm_scale_0.08_attention_scale_2.5 15.43
|
183 |
+
ngram_lm_scale_0.1_attention_scale_2.0 15.43
|
184 |
+
ngram_lm_scale_0.1_attention_scale_2.1 15.43
|
185 |
+
ngram_lm_scale_0.1_attention_scale_2.2 15.43
|
186 |
+
ngram_lm_scale_0.1_attention_scale_2.3 15.43
|
187 |
+
ngram_lm_scale_0.3_attention_scale_1.0 15.43
|
188 |
+
ngram_lm_scale_0.1_attention_scale_2.5 15.44
|
189 |
+
ngram_lm_scale_0.01_attention_scale_3.0 15.45
|
190 |
+
ngram_lm_scale_0.3_attention_scale_0.3 15.45
|
191 |
+
ngram_lm_scale_0.3_attention_scale_1.1 15.45
|
192 |
+
ngram_lm_scale_0.3_attention_scale_1.2 15.45
|
193 |
+
ngram_lm_scale_0.3_attention_scale_1.3 15.45
|
194 |
+
ngram_lm_scale_0.05_attention_scale_3.0 15.48
|
195 |
+
ngram_lm_scale_0.3_attention_scale_1.5 15.48
|
196 |
+
ngram_lm_scale_0.08_attention_scale_3.0 15.49
|
197 |
+
ngram_lm_scale_0.1_attention_scale_3.0 15.5
|
198 |
+
ngram_lm_scale_0.3_attention_scale_1.7 15.52
|
199 |
+
ngram_lm_scale_0.3_attention_scale_1.9 15.54
|
200 |
+
ngram_lm_scale_0.01_attention_scale_4.0 15.56
|
201 |
+
ngram_lm_scale_0.3_attention_scale_2.0 15.56
|
202 |
+
ngram_lm_scale_0.3_attention_scale_2.1 15.57
|
203 |
+
ngram_lm_scale_0.3_attention_scale_2.2 15.57
|
204 |
+
ngram_lm_scale_0.05_attention_scale_4.0 15.58
|
205 |
+
ngram_lm_scale_0.3_attention_scale_2.3 15.58
|
206 |
+
ngram_lm_scale_0.08_attention_scale_4.0 15.59
|
207 |
+
ngram_lm_scale_0.1_attention_scale_4.0 15.6
|
208 |
+
ngram_lm_scale_0.3_attention_scale_2.5 15.61
|
209 |
+
ngram_lm_scale_0.01_attention_scale_5.0 15.64
|
210 |
+
ngram_lm_scale_0.3_attention_scale_0.1 15.64
|
211 |
+
ngram_lm_scale_0.3_attention_scale_3.0 15.66
|
212 |
+
ngram_lm_scale_0.05_attention_scale_5.0 15.67
|
213 |
+
ngram_lm_scale_0.08_attention_scale_5.0 15.69
|
214 |
+
ngram_lm_scale_0.3_attention_scale_0.08 15.69
|
215 |
+
ngram_lm_scale_0.1_attention_scale_5.0 15.7
|
216 |
+
ngram_lm_scale_0.3_attention_scale_4.0 15.75
|
217 |
+
ngram_lm_scale_0.3_attention_scale_0.05 15.78
|
218 |
+
ngram_lm_scale_0.3_attention_scale_5.0 15.81
|
219 |
+
ngram_lm_scale_0.5_attention_scale_1.5 15.81
|
220 |
+
ngram_lm_scale_0.5_attention_scale_2.0 15.81
|
221 |
+
ngram_lm_scale_0.5_attention_scale_2.1 15.81
|
222 |
+
ngram_lm_scale_0.5_attention_scale_1.3 15.82
|
223 |
+
ngram_lm_scale_0.5_attention_scale_1.7 15.82
|
224 |
+
ngram_lm_scale_0.5_attention_scale_1.9 15.82
|
225 |
+
ngram_lm_scale_0.5_attention_scale_2.2 15.82
|
226 |
+
ngram_lm_scale_0.5_attention_scale_2.3 15.82
|
227 |
+
ngram_lm_scale_0.5_attention_scale_0.9 15.84
|
228 |
+
ngram_lm_scale_0.5_attention_scale_1.2 15.84
|
229 |
+
ngram_lm_scale_0.5_attention_scale_1.0 15.85
|
230 |
+
ngram_lm_scale_0.5_attention_scale_1.1 15.85
|
231 |
+
ngram_lm_scale_0.5_attention_scale_2.5 15.85
|
232 |
+
ngram_lm_scale_0.5_attention_scale_3.0 15.88
|
233 |
+
ngram_lm_scale_0.5_attention_scale_4.0 15.89
|
234 |
+
ngram_lm_scale_0.5_attention_scale_0.7 15.9
|
235 |
+
ngram_lm_scale_0.5_attention_scale_0.6 15.91
|
236 |
+
ngram_lm_scale_0.3_attention_scale_0.01 15.94
|
237 |
+
ngram_lm_scale_0.5_attention_scale_0.5 15.95
|
238 |
+
ngram_lm_scale_0.5_attention_scale_5.0 15.95
|
239 |
+
ngram_lm_scale_0.6_attention_scale_2.3 15.97
|
240 |
+
ngram_lm_scale_0.6_attention_scale_2.2 15.98
|
241 |
+
ngram_lm_scale_0.6_attention_scale_1.9 15.99
|
242 |
+
ngram_lm_scale_0.6_attention_scale_2.0 15.99
|
243 |
+
ngram_lm_scale_0.6_attention_scale_2.1 15.99
|
244 |
+
ngram_lm_scale_0.6_attention_scale_2.5 15.99
|
245 |
+
ngram_lm_scale_0.6_attention_scale_3.0 15.99
|
246 |
+
ngram_lm_scale_0.6_attention_scale_1.7 16.0
|
247 |
+
ngram_lm_scale_0.6_attention_scale_1.5 16.01
|
248 |
+
ngram_lm_scale_0.6_attention_scale_4.0 16.01
|
249 |
+
ngram_lm_scale_0.6_attention_scale_1.3 16.04
|
250 |
+
ngram_lm_scale_0.6_attention_scale_5.0 16.05
|
251 |
+
ngram_lm_scale_0.6_attention_scale_1.1 16.07
|
252 |
+
ngram_lm_scale_0.6_attention_scale_1.2 16.07
|
253 |
+
ngram_lm_scale_0.6_attention_scale_1.0 16.11
|
254 |
+
ngram_lm_scale_0.6_attention_scale_0.9 16.13
|
255 |
+
ngram_lm_scale_0.7_attention_scale_4.0 16.15
|
256 |
+
ngram_lm_scale_0.7_attention_scale_5.0 16.15
|
257 |
+
ngram_lm_scale_0.5_attention_scale_0.3 16.16
|
258 |
+
ngram_lm_scale_0.7_attention_scale_3.0 16.16
|
259 |
+
ngram_lm_scale_0.7_attention_scale_2.5 16.17
|
260 |
+
ngram_lm_scale_0.7_attention_scale_2.3 16.18
|
261 |
+
ngram_lm_scale_0.7_attention_scale_2.2 16.21
|
262 |
+
ngram_lm_scale_0.7_attention_scale_2.1 16.22
|
263 |
+
ngram_lm_scale_0.6_attention_scale_0.7 16.23
|
264 |
+
ngram_lm_scale_0.7_attention_scale_1.9 16.23
|
265 |
+
ngram_lm_scale_0.7_attention_scale_2.0 16.23
|
266 |
+
ngram_lm_scale_0.7_attention_scale_1.7 16.25
|
267 |
+
ngram_lm_scale_0.7_attention_scale_1.5 16.29
|
268 |
+
ngram_lm_scale_0.6_attention_scale_0.6 16.31
|
269 |
+
ngram_lm_scale_0.7_attention_scale_1.3 16.33
|
270 |
+
ngram_lm_scale_0.9_attention_scale_5.0 16.36
|
271 |
+
ngram_lm_scale_0.7_attention_scale_1.2 16.38
|
272 |
+
ngram_lm_scale_0.9_attention_scale_4.0 16.38
|
273 |
+
ngram_lm_scale_0.7_attention_scale_1.1 16.42
|
274 |
+
ngram_lm_scale_0.6_attention_scale_0.5 16.43
|
275 |
+
ngram_lm_scale_0.7_attention_scale_1.0 16.45
|
276 |
+
ngram_lm_scale_1.0_attention_scale_5.0 16.47
|
277 |
+
ngram_lm_scale_0.9_attention_scale_3.0 16.5
|
278 |
+
ngram_lm_scale_1.0_attention_scale_4.0 16.55
|
279 |
+
ngram_lm_scale_0.7_attention_scale_0.9 16.56
|
280 |
+
ngram_lm_scale_1.1_attention_scale_5.0 16.58
|
281 |
+
ngram_lm_scale_0.9_attention_scale_2.5 16.59
|
282 |
+
ngram_lm_scale_0.9_attention_scale_2.3 16.64
|
283 |
+
ngram_lm_scale_0.9_attention_scale_2.2 16.67
|
284 |
+
ngram_lm_scale_1.0_attention_scale_3.0 16.67
|
285 |
+
ngram_lm_scale_1.1_attention_scale_4.0 16.69
|
286 |
+
ngram_lm_scale_1.2_attention_scale_5.0 16.69
|
287 |
+
ngram_lm_scale_0.9_attention_scale_2.1 16.7
|
288 |
+
ngram_lm_scale_0.9_attention_scale_2.0 16.73
|
289 |
+
ngram_lm_scale_0.7_attention_scale_0.7 16.75
|
290 |
+
ngram_lm_scale_0.9_attention_scale_1.9 16.76
|
291 |
+
ngram_lm_scale_0.5_attention_scale_0.1 16.77
|
292 |
+
ngram_lm_scale_1.0_attention_scale_2.5 16.8
|
293 |
+
ngram_lm_scale_1.3_attention_scale_5.0 16.81
|
294 |
+
ngram_lm_scale_0.6_attention_scale_0.3 16.84
|
295 |
+
ngram_lm_scale_1.2_attention_scale_4.0 16.85
|
296 |
+
ngram_lm_scale_0.9_attention_scale_1.7 16.86
|
297 |
+
ngram_lm_scale_1.1_attention_scale_3.0 16.88
|
298 |
+
ngram_lm_scale_0.7_attention_scale_0.6 16.9
|
299 |
+
ngram_lm_scale_1.0_attention_scale_2.3 16.91
|
300 |
+
ngram_lm_scale_0.5_attention_scale_0.08 16.92
|
301 |
+
ngram_lm_scale_1.0_attention_scale_2.2 16.95
|
302 |
+
ngram_lm_scale_0.9_attention_scale_1.5 16.99
|
303 |
+
ngram_lm_scale_1.0_attention_scale_2.1 17.03
|
304 |
+
ngram_lm_scale_1.3_attention_scale_4.0 17.04
|
305 |
+
ngram_lm_scale_1.0_attention_scale_2.0 17.06
|
306 |
+
ngram_lm_scale_0.7_attention_scale_0.5 17.1
|
307 |
+
ngram_lm_scale_1.0_attention_scale_1.9 17.11
|
308 |
+
ngram_lm_scale_1.1_attention_scale_2.5 17.11
|
309 |
+
ngram_lm_scale_1.5_attention_scale_5.0 17.13
|
310 |
+
ngram_lm_scale_0.5_attention_scale_0.05 17.16
|
311 |
+
ngram_lm_scale_1.2_attention_scale_3.0 17.16
|
312 |
+
ngram_lm_scale_1.1_attention_scale_2.3 17.2
|
313 |
+
ngram_lm_scale_1.1_attention_scale_2.2 17.25
|
314 |
+
ngram_lm_scale_0.9_attention_scale_1.3 17.26
|
315 |
+
ngram_lm_scale_1.0_attention_scale_1.7 17.26
|
316 |
+
ngram_lm_scale_1.1_attention_scale_2.1 17.33
|
317 |
+
ngram_lm_scale_0.9_attention_scale_1.2 17.34
|
318 |
+
ngram_lm_scale_1.3_attention_scale_3.0 17.36
|
319 |
+
ngram_lm_scale_1.2_attention_scale_2.5 17.37
|
320 |
+
ngram_lm_scale_1.5_attention_scale_4.0 17.38
|
321 |
+
ngram_lm_scale_1.1_attention_scale_2.0 17.4
|
322 |
+
ngram_lm_scale_1.7_attention_scale_5.0 17.43
|
323 |
+
ngram_lm_scale_0.5_attention_scale_0.01 17.45
|
324 |
+
ngram_lm_scale_1.0_attention_scale_1.5 17.46
|
325 |
+
ngram_lm_scale_1.1_attention_scale_1.9 17.47
|
326 |
+
ngram_lm_scale_0.9_attention_scale_1.1 17.48
|
327 |
+
ngram_lm_scale_1.2_attention_scale_2.3 17.48
|
328 |
+
ngram_lm_scale_1.2_attention_scale_2.2 17.58
|
329 |
+
ngram_lm_scale_0.9_attention_scale_1.0 17.61
|
330 |
+
ngram_lm_scale_1.1_attention_scale_1.7 17.68
|
331 |
+
ngram_lm_scale_1.2_attention_scale_2.1 17.69
|
332 |
+
ngram_lm_scale_1.3_attention_scale_2.5 17.71
|
333 |
+
ngram_lm_scale_1.0_attention_scale_1.3 17.72
|
334 |
+
ngram_lm_scale_0.9_attention_scale_0.9 17.77
|
335 |
+
ngram_lm_scale_1.2_attention_scale_2.0 17.77
|
336 |
+
ngram_lm_scale_1.9_attention_scale_5.0 17.81
|
337 |
+
ngram_lm_scale_0.7_attention_scale_0.3 17.82
|
338 |
+
ngram_lm_scale_1.0_attention_scale_1.2 17.88
|
339 |
+
ngram_lm_scale_1.7_attention_scale_4.0 17.88
|
340 |
+
ngram_lm_scale_0.6_attention_scale_0.1 17.91
|
341 |
+
ngram_lm_scale_1.2_attention_scale_1.9 17.91
|
342 |
+
ngram_lm_scale_1.3_attention_scale_2.3 17.91
|
343 |
+
ngram_lm_scale_1.1_attention_scale_1.5 17.97
|
344 |
+
ngram_lm_scale_1.5_attention_scale_3.0 18.02
|
345 |
+
ngram_lm_scale_1.3_attention_scale_2.2 18.03
|
346 |
+
ngram_lm_scale_2.0_attention_scale_5.0 18.05
|
347 |
+
ngram_lm_scale_1.0_attention_scale_1.1 18.06
|
348 |
+
ngram_lm_scale_0.6_attention_scale_0.08 18.07
|
349 |
+
ngram_lm_scale_1.3_attention_scale_2.1 18.17
|
350 |
+
ngram_lm_scale_2.1_attention_scale_5.0 18.21
|
351 |
+
ngram_lm_scale_1.2_attention_scale_1.7 18.24
|
352 |
+
ngram_lm_scale_0.9_attention_scale_0.7 18.25
|
353 |
+
ngram_lm_scale_1.0_attention_scale_1.0 18.27
|
354 |
+
ngram_lm_scale_1.3_attention_scale_2.0 18.3
|
355 |
+
ngram_lm_scale_1.1_attention_scale_1.3 18.34
|
356 |
+
ngram_lm_scale_0.6_attention_scale_0.05 18.35
|
357 |
+
ngram_lm_scale_1.9_attention_scale_4.0 18.4
|
358 |
+
ngram_lm_scale_1.3_attention_scale_1.9 18.45
|
359 |
+
ngram_lm_scale_2.2_attention_scale_5.0 18.47
|
360 |
+
ngram_lm_scale_1.5_attention_scale_2.5 18.52
|
361 |
+
ngram_lm_scale_1.0_attention_scale_0.9 18.55
|
362 |
+
ngram_lm_scale_1.1_attention_scale_1.2 18.56
|
363 |
+
ngram_lm_scale_1.2_attention_scale_1.5 18.63
|
364 |
+
ngram_lm_scale_0.9_attention_scale_0.6 18.67
|
365 |
+
ngram_lm_scale_2.3_attention_scale_5.0 18.72
|
366 |
+
ngram_lm_scale_2.0_attention_scale_4.0 18.74
|
367 |
+
ngram_lm_scale_1.7_attention_scale_3.0 18.75
|
368 |
+
ngram_lm_scale_1.3_attention_scale_1.7 18.77
|
369 |
+
ngram_lm_scale_1.5_attention_scale_2.3 18.77
|
370 |
+
ngram_lm_scale_1.1_attention_scale_1.1 18.81
|
371 |
+
ngram_lm_scale_0.6_attention_scale_0.01 18.82
|
372 |
+
ngram_lm_scale_1.5_attention_scale_2.2 18.98
|
373 |
+
ngram_lm_scale_1.2_attention_scale_1.3 19.01
|
374 |
+
ngram_lm_scale_2.1_attention_scale_4.0 19.04
|
375 |
+
ngram_lm_scale_1.1_attention_scale_1.0 19.15
|
376 |
+
ngram_lm_scale_1.5_attention_scale_2.1 19.19
|
377 |
+
ngram_lm_scale_2.5_attention_scale_5.0 19.21
|
378 |
+
ngram_lm_scale_1.3_attention_scale_1.5 19.27
|
379 |
+
ngram_lm_scale_0.9_attention_scale_0.5 19.29
|
380 |
+
ngram_lm_scale_2.2_attention_scale_4.0 19.33
|
381 |
+
ngram_lm_scale_1.2_attention_scale_1.2 19.36
|
382 |
+
ngram_lm_scale_1.5_attention_scale_2.0 19.38
|
383 |
+
ngram_lm_scale_0.7_attention_scale_0.1 19.4
|
384 |
+
ngram_lm_scale_1.0_attention_scale_0.7 19.5
|
385 |
+
ngram_lm_scale_1.7_attention_scale_2.5 19.52
|
386 |
+
ngram_lm_scale_1.1_attention_scale_0.9 19.59
|
387 |
+
ngram_lm_scale_1.9_attention_scale_3.0 19.6
|
388 |
+
ngram_lm_scale_0.7_attention_scale_0.08 19.65
|
389 |
+
ngram_lm_scale_1.5_attention_scale_1.9 19.66
|
390 |
+
ngram_lm_scale_2.3_attention_scale_4.0 19.69
|
391 |
+
ngram_lm_scale_1.2_attention_scale_1.1 19.8
|
392 |
+
ngram_lm_scale_1.3_attention_scale_1.3 19.97
|
393 |
+
ngram_lm_scale_1.7_attention_scale_2.3 19.97
|
394 |
+
ngram_lm_scale_1.0_attention_scale_0.6 20.09
|
395 |
+
ngram_lm_scale_2.0_attention_scale_3.0 20.09
|
396 |
+
ngram_lm_scale_0.7_attention_scale_0.05 20.1
|
397 |
+
ngram_lm_scale_1.7_attention_scale_2.2 20.26
|
398 |
+
ngram_lm_scale_1.2_attention_scale_1.0 20.33
|
399 |
+
ngram_lm_scale_1.5_attention_scale_1.7 20.33
|
400 |
+
ngram_lm_scale_1.3_attention_scale_1.2 20.47
|
401 |
+
ngram_lm_scale_1.7_attention_scale_2.1 20.54
|
402 |
+
ngram_lm_scale_2.5_attention_scale_4.0 20.56
|
403 |
+
ngram_lm_scale_2.1_attention_scale_3.0 20.65
|
404 |
+
ngram_lm_scale_1.9_attention_scale_2.5 20.74
|
405 |
+
ngram_lm_scale_0.7_attention_scale_0.01 20.76
|
406 |
+
ngram_lm_scale_1.0_attention_scale_0.5 20.83
|
407 |
+
ngram_lm_scale_3.0_attention_scale_5.0 20.84
|
408 |
+
ngram_lm_scale_1.1_attention_scale_0.7 20.88
|
409 |
+
ngram_lm_scale_1.2_attention_scale_0.9 20.88
|
410 |
+
ngram_lm_scale_1.7_attention_scale_2.0 20.92
|
411 |
+
ngram_lm_scale_0.9_attention_scale_0.3 20.97
|
412 |
+
ngram_lm_scale_1.3_attention_scale_1.1 20.98
|
413 |
+
ngram_lm_scale_1.5_attention_scale_1.5 21.09
|
414 |
+
ngram_lm_scale_1.7_attention_scale_1.9 21.28
|
415 |
+
ngram_lm_scale_2.2_attention_scale_3.0 21.29
|
416 |
+
ngram_lm_scale_1.9_attention_scale_2.3 21.41
|
417 |
+
ngram_lm_scale_2.0_attention_scale_2.5 21.48
|
418 |
+
ngram_lm_scale_1.3_attention_scale_1.0 21.57
|
419 |
+
ngram_lm_scale_1.1_attention_scale_0.6 21.61
|
420 |
+
ngram_lm_scale_1.9_attention_scale_2.2 21.68
|
421 |
+
ngram_lm_scale_2.3_attention_scale_3.0 21.84
|
422 |
+
ngram_lm_scale_1.5_attention_scale_1.3 22.04
|
423 |
+
ngram_lm_scale_1.7_attention_scale_1.7 22.04
|
424 |
+
ngram_lm_scale_1.9_attention_scale_2.1 22.05
|
425 |
+
ngram_lm_scale_2.0_attention_scale_2.3 22.05
|
426 |
+
ngram_lm_scale_2.1_attention_scale_2.5 22.05
|
427 |
+
ngram_lm_scale_1.3_attention_scale_0.9 22.2
|
428 |
+
ngram_lm_scale_1.2_attention_scale_0.7 22.32
|
429 |
+
ngram_lm_scale_2.0_attention_scale_2.2 22.45
|
430 |
+
ngram_lm_scale_1.9_attention_scale_2.0 22.48
|
431 |
+
ngram_lm_scale_1.1_attention_scale_0.5 22.53
|
432 |
+
ngram_lm_scale_1.5_attention_scale_1.2 22.64
|
433 |
+
ngram_lm_scale_2.2_attention_scale_2.5 22.77
|
434 |
+
ngram_lm_scale_2.1_attention_scale_2.3 22.86
|
435 |
+
ngram_lm_scale_2.0_attention_scale_2.1 22.92
|
436 |
+
ngram_lm_scale_3.0_attention_scale_4.0 22.92
|
437 |
+
ngram_lm_scale_1.0_attention_scale_0.3 22.94
|
438 |
+
ngram_lm_scale_1.9_attention_scale_1.9 22.99
|
439 |
+
ngram_lm_scale_2.5_attention_scale_3.0 23.05
|
440 |
+
ngram_lm_scale_1.7_attention_scale_1.5 23.11
|
441 |
+
ngram_lm_scale_1.2_attention_scale_0.6 23.21
|
442 |
+
ngram_lm_scale_2.1_attention_scale_2.2 23.3
|
443 |
+
ngram_lm_scale_1.5_attention_scale_1.1 23.35
|
444 |
+
ngram_lm_scale_2.0_attention_scale_2.0 23.35
|
445 |
+
ngram_lm_scale_2.3_attention_scale_2.5 23.66
|
446 |
+
ngram_lm_scale_2.2_attention_scale_2.3 23.72
|
447 |
+
ngram_lm_scale_0.9_attention_scale_0.1 23.75
|
448 |
+
ngram_lm_scale_1.3_attention_scale_0.7 23.81
|
449 |
+
ngram_lm_scale_2.1_attention_scale_2.1 23.81
|
450 |
+
ngram_lm_scale_2.0_attention_scale_1.9 23.85
|
451 |
+
ngram_lm_scale_1.9_attention_scale_1.7 24.02
|
452 |
+
ngram_lm_scale_0.9_attention_scale_0.08 24.16
|
453 |
+
ngram_lm_scale_1.5_attention_scale_1.0 24.16
|
454 |
+
ngram_lm_scale_2.2_attention_scale_2.2 24.16
|
455 |
+
ngram_lm_scale_1.2_attention_scale_0.5 24.29
|
456 |
+
ngram_lm_scale_2.1_attention_scale_2.0 24.31
|
457 |
+
ngram_lm_scale_1.7_attention_scale_1.3 24.36
|
458 |
+
ngram_lm_scale_2.3_attention_scale_2.3 24.49
|
459 |
+
ngram_lm_scale_2.2_attention_scale_2.1 24.62
|
460 |
+
ngram_lm_scale_0.9_attention_scale_0.05 24.74
|
461 |
+
ngram_lm_scale_2.1_attention_scale_1.9 24.83
|
462 |
+
ngram_lm_scale_1.3_attention_scale_0.6 24.9
|
463 |
+
ngram_lm_scale_2.3_attention_scale_2.2 24.95
|
464 |
+
ngram_lm_scale_2.5_attention_scale_2.5 25.02
|
465 |
+
ngram_lm_scale_1.1_attention_scale_0.3 25.03
|
466 |
+
ngram_lm_scale_2.0_attention_scale_1.7 25.05
|
467 |
+
ngram_lm_scale_1.5_attention_scale_0.9 25.06
|
468 |
+
ngram_lm_scale_4.0_attention_scale_5.0 25.06
|
469 |
+
ngram_lm_scale_2.2_attention_scale_2.0 25.13
|
470 |
+
ngram_lm_scale_1.7_attention_scale_1.2 25.22
|
471 |
+
ngram_lm_scale_1.9_attention_scale_1.5 25.34
|
472 |
+
ngram_lm_scale_2.3_attention_scale_2.1 25.43
|
473 |
+
ngram_lm_scale_0.9_attention_scale_0.01 25.64
|
474 |
+
ngram_lm_scale_2.2_attention_scale_1.9 25.69
|
475 |
+
ngram_lm_scale_2.1_attention_scale_1.7 25.96
|
476 |
+
ngram_lm_scale_2.3_attention_scale_2.0 25.98
|
477 |
+
ngram_lm_scale_2.5_attention_scale_2.3 26.04
|
478 |
+
ngram_lm_scale_1.7_attention_scale_1.1 26.05
|
479 |
+
ngram_lm_scale_1.3_attention_scale_0.5 26.14
|
480 |
+
ngram_lm_scale_3.0_attention_scale_3.0 26.24
|
481 |
+
ngram_lm_scale_2.0_attention_scale_1.5 26.28
|
482 |
+
ngram_lm_scale_1.0_attention_scale_0.1 26.31
ngram_lm_scale_2.5_attention_scale_2.2	26.54
ngram_lm_scale_2.3_attention_scale_1.9	26.56
ngram_lm_scale_1.9_attention_scale_1.3	26.7
ngram_lm_scale_1.0_attention_scale_0.08	26.74
ngram_lm_scale_1.7_attention_scale_1.0	26.88
ngram_lm_scale_2.2_attention_scale_1.7	26.92
ngram_lm_scale_2.5_attention_scale_2.1	27.08
ngram_lm_scale_1.5_attention_scale_0.7	27.13
ngram_lm_scale_1.2_attention_scale_0.3	27.25
ngram_lm_scale_1.0_attention_scale_0.05	27.33
ngram_lm_scale_2.1_attention_scale_1.5	27.36
ngram_lm_scale_1.9_attention_scale_1.2	27.52
ngram_lm_scale_2.5_attention_scale_2.0	27.71
ngram_lm_scale_1.7_attention_scale_0.9	27.86
ngram_lm_scale_2.0_attention_scale_1.3	27.87
ngram_lm_scale_4.0_attention_scale_4.0	27.91
ngram_lm_scale_2.3_attention_scale_1.7	27.92
ngram_lm_scale_1.0_attention_scale_0.01	28.24
ngram_lm_scale_2.5_attention_scale_1.9	28.37
ngram_lm_scale_1.5_attention_scale_0.6	28.42
ngram_lm_scale_2.2_attention_scale_1.5	28.43
ngram_lm_scale_1.9_attention_scale_1.1	28.49
ngram_lm_scale_3.0_attention_scale_2.5	28.64
ngram_lm_scale_1.1_attention_scale_0.1	28.71
ngram_lm_scale_2.0_attention_scale_1.2	28.76
ngram_lm_scale_5.0_attention_scale_5.0	28.94
ngram_lm_scale_2.1_attention_scale_1.3	28.98
ngram_lm_scale_1.1_attention_scale_0.08	29.13
ngram_lm_scale_1.3_attention_scale_0.3	29.25
ngram_lm_scale_2.3_attention_scale_1.5	29.34
ngram_lm_scale_1.9_attention_scale_1.0	29.45
ngram_lm_scale_2.0_attention_scale_1.1	29.62
ngram_lm_scale_2.5_attention_scale_1.7	29.63
ngram_lm_scale_3.0_attention_scale_2.3	29.63
ngram_lm_scale_1.5_attention_scale_0.5	29.71
ngram_lm_scale_1.1_attention_scale_0.05	29.74
ngram_lm_scale_2.1_attention_scale_1.2	29.78
ngram_lm_scale_2.2_attention_scale_1.3	29.92
ngram_lm_scale_1.7_attention_scale_0.7	30.08
ngram_lm_scale_3.0_attention_scale_2.2	30.18
ngram_lm_scale_1.9_attention_scale_0.9	30.39
ngram_lm_scale_2.0_attention_scale_1.0	30.48
ngram_lm_scale_1.1_attention_scale_0.01	30.56
ngram_lm_scale_2.1_attention_scale_1.1	30.61
ngram_lm_scale_3.0_attention_scale_2.1	30.65
ngram_lm_scale_2.2_attention_scale_1.2	30.76
ngram_lm_scale_1.2_attention_scale_0.1	30.8
ngram_lm_scale_2.3_attention_scale_1.3	30.82
ngram_lm_scale_2.5_attention_scale_1.5	30.9
ngram_lm_scale_3.0_attention_scale_2.0	31.15
ngram_lm_scale_1.2_attention_scale_0.08	31.18
ngram_lm_scale_1.7_attention_scale_0.6	31.23
ngram_lm_scale_2.0_attention_scale_0.9	31.43
ngram_lm_scale_4.0_attention_scale_3.0	31.46
ngram_lm_scale_2.1_attention_scale_1.0	31.48
ngram_lm_scale_2.2_attention_scale_1.1	31.52
ngram_lm_scale_2.3_attention_scale_1.2	31.56
ngram_lm_scale_5.0_attention_scale_4.0	31.6
ngram_lm_scale_3.0_attention_scale_1.9	31.72
ngram_lm_scale_1.2_attention_scale_0.05	31.77
ngram_lm_scale_2.3_attention_scale_1.1	32.34
ngram_lm_scale_2.5_attention_scale_1.3	32.37
ngram_lm_scale_2.1_attention_scale_0.9	32.38
ngram_lm_scale_2.2_attention_scale_1.0	32.39
ngram_lm_scale_1.9_attention_scale_0.7	32.41
ngram_lm_scale_1.7_attention_scale_0.5	32.42
ngram_lm_scale_1.2_attention_scale_0.01	32.46
ngram_lm_scale_1.5_attention_scale_0.3	32.5
ngram_lm_scale_1.3_attention_scale_0.1	32.52
ngram_lm_scale_1.3_attention_scale_0.08	32.84
ngram_lm_scale_3.0_attention_scale_1.7	32.85
ngram_lm_scale_2.5_attention_scale_1.2	33.03
ngram_lm_scale_2.2_attention_scale_0.9	33.12
ngram_lm_scale_2.3_attention_scale_1.0	33.12
ngram_lm_scale_2.0_attention_scale_0.7	33.27
ngram_lm_scale_4.0_attention_scale_2.5	33.32
ngram_lm_scale_1.3_attention_scale_0.05	33.33
ngram_lm_scale_1.9_attention_scale_0.6	33.39
ngram_lm_scale_2.5_attention_scale_1.1	33.65
ngram_lm_scale_2.3_attention_scale_0.9	33.81
ngram_lm_scale_1.3_attention_scale_0.01	33.86
ngram_lm_scale_3.0_attention_scale_1.5	33.86
ngram_lm_scale_2.1_attention_scale_0.7	33.99
ngram_lm_scale_4.0_attention_scale_2.3	34.1
ngram_lm_scale_2.0_attention_scale_0.6	34.13
ngram_lm_scale_2.5_attention_scale_1.0	34.24
ngram_lm_scale_1.9_attention_scale_0.5	34.3
ngram_lm_scale_4.0_attention_scale_2.2	34.4
ngram_lm_scale_5.0_attention_scale_3.0	34.41
ngram_lm_scale_1.7_attention_scale_0.3	34.6
ngram_lm_scale_2.2_attention_scale_0.7	34.63
ngram_lm_scale_4.0_attention_scale_2.1	34.72
ngram_lm_scale_3.0_attention_scale_1.3	34.74
ngram_lm_scale_2.1_attention_scale_0.6	34.76
ngram_lm_scale_2.5_attention_scale_0.9	34.83
ngram_lm_scale_1.5_attention_scale_0.1	34.93
ngram_lm_scale_2.0_attention_scale_0.5	34.93
ngram_lm_scale_4.0_attention_scale_2.0	35.0
ngram_lm_scale_1.5_attention_scale_0.08	35.12
ngram_lm_scale_2.3_attention_scale_0.7	35.17
ngram_lm_scale_3.0_attention_scale_1.2	35.25
ngram_lm_scale_4.0_attention_scale_1.9	35.32
ngram_lm_scale_2.2_attention_scale_0.6	35.34
ngram_lm_scale_1.5_attention_scale_0.05	35.44
ngram_lm_scale_2.1_attention_scale_0.5	35.46
ngram_lm_scale_3.0_attention_scale_1.1	35.65
ngram_lm_scale_5.0_attention_scale_2.5	35.65
ngram_lm_scale_2.3_attention_scale_0.6	35.73
ngram_lm_scale_1.5_attention_scale_0.01	35.85
ngram_lm_scale_1.9_attention_scale_0.3	35.87
ngram_lm_scale_2.2_attention_scale_0.5	35.9
ngram_lm_scale_2.5_attention_scale_0.7	35.95
ngram_lm_scale_4.0_attention_scale_1.7	35.95
ngram_lm_scale_3.0_attention_scale_1.0	36.05
ngram_lm_scale_5.0_attention_scale_2.3	36.07
ngram_lm_scale_1.7_attention_scale_0.1	36.25
ngram_lm_scale_2.3_attention_scale_0.5	36.28
ngram_lm_scale_2.0_attention_scale_0.3	36.29
ngram_lm_scale_5.0_attention_scale_2.2	36.3
ngram_lm_scale_2.5_attention_scale_0.6	36.38
ngram_lm_scale_1.7_attention_scale_0.08	36.42
ngram_lm_scale_3.0_attention_scale_0.9	36.47
ngram_lm_scale_4.0_attention_scale_1.5	36.52
ngram_lm_scale_5.0_attention_scale_2.1	36.53
ngram_lm_scale_1.7_attention_scale_0.05	36.62
ngram_lm_scale_2.1_attention_scale_0.3	36.67
ngram_lm_scale_5.0_attention_scale_2.0	36.72
ngram_lm_scale_2.5_attention_scale_0.5	36.89
ngram_lm_scale_5.0_attention_scale_1.9	36.94
ngram_lm_scale_1.7_attention_scale_0.01	36.95
ngram_lm_scale_2.2_attention_scale_0.3	36.99
ngram_lm_scale_4.0_attention_scale_1.3	37.0
ngram_lm_scale_1.9_attention_scale_0.1	37.16
ngram_lm_scale_3.0_attention_scale_0.7	37.17
ngram_lm_scale_2.3_attention_scale_0.3	37.28
ngram_lm_scale_4.0_attention_scale_1.2	37.28
ngram_lm_scale_1.9_attention_scale_0.08	37.3
ngram_lm_scale_5.0_attention_scale_1.7	37.35
ngram_lm_scale_1.9_attention_scale_0.05	37.5
ngram_lm_scale_2.0_attention_scale_0.1	37.51
ngram_lm_scale_3.0_attention_scale_0.6	37.51
ngram_lm_scale_4.0_attention_scale_1.1	37.55
ngram_lm_scale_2.0_attention_scale_0.08	37.6
ngram_lm_scale_1.9_attention_scale_0.01	37.67
ngram_lm_scale_5.0_attention_scale_1.5	37.73
ngram_lm_scale_2.5_attention_scale_0.3	37.74
ngram_lm_scale_2.0_attention_scale_0.05	37.75
ngram_lm_scale_2.1_attention_scale_0.1	37.77
ngram_lm_scale_4.0_attention_scale_1.0	37.79
ngram_lm_scale_2.1_attention_scale_0.08	37.85
ngram_lm_scale_3.0_attention_scale_0.5	37.85
ngram_lm_scale_2.0_attention_scale_0.01	37.94
ngram_lm_scale_2.2_attention_scale_0.1	37.96
ngram_lm_scale_2.1_attention_scale_0.05	37.98
ngram_lm_scale_4.0_attention_scale_0.9	38.03
ngram_lm_scale_2.2_attention_scale_0.08	38.05
ngram_lm_scale_5.0_attention_scale_1.3	38.1
ngram_lm_scale_2.3_attention_scale_0.1	38.18
ngram_lm_scale_2.1_attention_scale_0.01	38.19
ngram_lm_scale_2.2_attention_scale_0.05	38.21
ngram_lm_scale_2.3_attention_scale_0.08	38.28
ngram_lm_scale_5.0_attention_scale_1.2	38.3
ngram_lm_scale_2.2_attention_scale_0.01	38.36
ngram_lm_scale_2.3_attention_scale_0.05	38.38
ngram_lm_scale_5.0_attention_scale_1.1	38.45
ngram_lm_scale_2.5_attention_scale_0.1	38.5
ngram_lm_scale_4.0_attention_scale_0.7	38.53
ngram_lm_scale_2.3_attention_scale_0.01	38.54
ngram_lm_scale_3.0_attention_scale_0.3	38.55
ngram_lm_scale_2.5_attention_scale_0.08	38.57
ngram_lm_scale_2.5_attention_scale_0.05	38.67
ngram_lm_scale_5.0_attention_scale_1.0	38.7
ngram_lm_scale_4.0_attention_scale_0.6	38.76
ngram_lm_scale_2.5_attention_scale_0.01	38.83
ngram_lm_scale_5.0_attention_scale_0.9	38.89
ngram_lm_scale_4.0_attention_scale_0.5	39.02
ngram_lm_scale_3.0_attention_scale_0.1	39.18
ngram_lm_scale_5.0_attention_scale_0.7	39.22
ngram_lm_scale_3.0_attention_scale_0.08	39.23
ngram_lm_scale_3.0_attention_scale_0.05	39.31
ngram_lm_scale_5.0_attention_scale_0.6	39.39
ngram_lm_scale_3.0_attention_scale_0.01	39.4
ngram_lm_scale_4.0_attention_scale_0.3	39.4
ngram_lm_scale_5.0_attention_scale_0.5	39.52
ngram_lm_scale_4.0_attention_scale_0.1	39.75
ngram_lm_scale_4.0_attention_scale_0.08	39.77
ngram_lm_scale_5.0_attention_scale_0.3	39.77
ngram_lm_scale_4.0_attention_scale_0.05	39.84
ngram_lm_scale_4.0_attention_scale_0.01	39.92
ngram_lm_scale_5.0_attention_scale_0.1	40.05
ngram_lm_scale_5.0_attention_scale_0.08	40.07
ngram_lm_scale_5.0_attention_scale_0.05	40.1
ngram_lm_scale_5.0_attention_scale_0.01	40.17

2022-06-27 19:39:22,402 INFO [decode.py:483] batch 0/?, cuts processed until now is 4
2022-06-27 19:41:38,635 INFO [decode.py:483] batch 100/?, cuts processed until now is 428
2022-06-27 19:43:58,275 INFO [decode.py:483] batch 200/?, cuts processed until now is 888
2022-06-27 19:46:15,303 INFO [decode.py:483] batch 300/?, cuts processed until now is 1363
2022-06-27 19:48:20,796 INFO [decode.py:483] batch 400/?, cuts processed until now is 1815
2022-06-27 19:50:41,802 INFO [decode.py:483] batch 500/?, cuts processed until now is 2243
2022-06-27 19:52:47,093 INFO [decode.py:483] batch 600/?, cuts processed until now is 2717
2022-06-27 19:54:50,131 INFO [decode.py:483] batch 700/?, cuts processed until now is 3192
2022-06-27 19:56:50,896 INFO [decode.py:783] Caught exception:
CUDA out of memory. Tried to allocate 8.00 GiB (GPU 0; 31.75 GiB total capacity; 16.89 GiB already allocated; 3.44 GiB free; 27.11 GiB reserved in total by PyTorch)
Exception raised from malloc at /pytorch/c10/cuda/CUDACachingAllocator.cpp:272 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x42 (0x2aab0258d8b2 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x2021b (0x2aab0232721b in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
frame #2: <unknown function> + 0x21034 (0x2aab02328034 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
frame #3: <unknown function> + 0x2167d (0x2aab0232867d in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
frame #4: k2::PytorchCudaContext::Allocate(unsigned long, void**) + 0x3a (0x2aab1173401a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
frame #5: k2::NewRegion(std::shared_ptr<k2::Context>, unsigned long) + 0x112 (0x2aab11465b72 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
frame #6: k2::Hash::Hash(std::shared_ptr<k2::Context>, int, int, int) + 0x2f7 (0x2aab11538957 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
frame #7: k2::Hash::Resize(int, int, int, bool) + 0x1b4 (0x2aab1152e464 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
frame #8: k2::DeviceIntersector::ForwardSortedA() + 0x53e (0x2aab1156355e in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
frame #9: k2::IntersectDevice(k2::Ragged<k2::Arc>&, int, k2::Ragged<k2::Arc>&, int, k2::Array1<int> const&, k2::Array1<int>*, k2::Array1<int>*, bool) + 0x4cd (0x2aab115457ad in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
frame #10: <unknown function> + 0x8eb5a (0x2aab1032cb5a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
frame #11: <unknown function> + 0x3628c (0x2aab102d428c in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
<omitting python frames>
frame #41: __libc_start_main + 0xf5 (0x2aaaab616555 in /lib64/libc.so.6)


2022-06-27 19:56:50,897 INFO [decode.py:789] num_arcs before pruning: 1001553
2022-06-27 19:56:50,897 INFO [decode.py:792] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-27 19:56:50,909 INFO [decode.py:803] num_arcs after pruning: 5815
2022-06-27 19:57:04,620 INFO [decode.py:483] batch 800/?, cuts processed until now is 3635
2022-06-27 19:59:30,814 INFO [decode.py:483] batch 900/?, cuts processed until now is 4082
2022-06-27 20:00:05,093 INFO [decode.py:888] Caught exception:

Some bad things happened. Please read the above error messages and stack
trace. If you are using Python, the following command may be helpful:

gdb --args python /path/to/your/code.py

(You can use `gdb` to debug the code. Please consider compiling
a debug version of k2.).

If you are unable to fix it, please open an issue at:

https://github.com/k2-fsa/k2/issues/new


2022-06-27 20:00:05,094 INFO [decode.py:889] num_paths before decreasing: 1000
2022-06-27 20:00:05,094 INFO [decode.py:896] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-27 20:00:05,094 INFO [decode.py:902] num_paths after decreasing: 500
2022-06-27 20:02:41,083 INFO [decode.py:483] batch 1000/?, cuts processed until now is 4500
2022-06-27 20:05:44,787 INFO [decode.py:483] batch 1100/?, cuts processed until now is 4869
2022-06-27 20:10:33,885 INFO [decode.py:532]
For dev, WER of different settings are:
ngram_lm_scale_0.01_attention_scale_0.5	15.89	best for dev
ngram_lm_scale_0.01_attention_scale_0.3	15.9
ngram_lm_scale_0.01_attention_scale_0.6	15.9
ngram_lm_scale_0.01_attention_scale_0.7	15.9
ngram_lm_scale_0.05_attention_scale_0.6	15.92
ngram_lm_scale_0.08_attention_scale_0.6	15.92
ngram_lm_scale_0.01_attention_scale_0.9	15.93
ngram_lm_scale_0.08_attention_scale_0.5	15.93
ngram_lm_scale_0.05_attention_scale_0.5	15.94
ngram_lm_scale_0.05_attention_scale_0.7	15.94
ngram_lm_scale_0.05_attention_scale_0.3	15.95
ngram_lm_scale_0.01_attention_scale_1.0	15.96
ngram_lm_scale_0.1_attention_scale_0.5	15.96
ngram_lm_scale_0.1_attention_scale_0.6	15.96
ngram_lm_scale_0.05_attention_scale_0.9	15.97
ngram_lm_scale_0.08_attention_scale_0.7	15.97
ngram_lm_scale_0.08_attention_scale_0.3	15.98
ngram_lm_scale_0.1_attention_scale_0.7	15.98
ngram_lm_scale_0.01_attention_scale_0.1	16.0
ngram_lm_scale_0.01_attention_scale_1.1	16.0
ngram_lm_scale_0.05_attention_scale_1.0	16.01
ngram_lm_scale_0.1_attention_scale_0.3	16.01
ngram_lm_scale_0.01_attention_scale_0.08	16.02
ngram_lm_scale_0.08_attention_scale_0.9	16.02
ngram_lm_scale_0.01_attention_scale_1.2	16.03
ngram_lm_scale_0.08_attention_scale_1.0	16.03
ngram_lm_scale_0.1_attention_scale_0.9	16.03
ngram_lm_scale_0.1_attention_scale_1.0	16.03
ngram_lm_scale_0.05_attention_scale_1.1	16.04
ngram_lm_scale_0.08_attention_scale_1.1	16.04
ngram_lm_scale_0.05_attention_scale_0.1	16.05
ngram_lm_scale_0.05_attention_scale_0.08	16.06
ngram_lm_scale_0.01_attention_scale_0.05	16.07
ngram_lm_scale_0.05_attention_scale_1.2	16.07
ngram_lm_scale_0.08_attention_scale_0.1	16.07
ngram_lm_scale_0.1_attention_scale_1.1	16.08
ngram_lm_scale_0.01_attention_scale_1.3	16.09
ngram_lm_scale_0.08_attention_scale_0.08	16.09
ngram_lm_scale_0.1_attention_scale_0.1	16.09
ngram_lm_scale_0.08_attention_scale_1.2	16.1
ngram_lm_scale_0.1_attention_scale_0.08	16.1
ngram_lm_scale_0.05_attention_scale_0.05	16.11
ngram_lm_scale_0.05_attention_scale_1.3	16.13
ngram_lm_scale_0.1_attention_scale_1.2	16.13
ngram_lm_scale_0.01_attention_scale_1.5	16.15
ngram_lm_scale_0.08_attention_scale_0.05	16.15
ngram_lm_scale_0.08_attention_scale_1.3	16.15
ngram_lm_scale_0.01_attention_scale_0.01	16.16
ngram_lm_scale_0.1_attention_scale_1.3	16.16
ngram_lm_scale_0.1_attention_scale_0.05	16.17
ngram_lm_scale_0.05_attention_scale_0.01	16.18
ngram_lm_scale_0.05_attention_scale_1.5	16.19
ngram_lm_scale_0.01_attention_scale_1.7	16.21
ngram_lm_scale_0.08_attention_scale_1.5	16.21
ngram_lm_scale_0.01_attention_scale_1.9	16.22
ngram_lm_scale_0.05_attention_scale_1.7	16.22
ngram_lm_scale_0.08_attention_scale_0.01	16.23
ngram_lm_scale_0.1_attention_scale_1.5	16.24
ngram_lm_scale_0.01_attention_scale_2.0	16.26
ngram_lm_scale_0.08_attention_scale_1.7	16.26
ngram_lm_scale_0.1_attention_scale_0.01	16.28
ngram_lm_scale_0.05_attention_scale_1.9	16.29
ngram_lm_scale_0.1_attention_scale_1.7	16.29
ngram_lm_scale_0.01_attention_scale_2.1	16.31
ngram_lm_scale_0.05_attention_scale_2.0	16.31
ngram_lm_scale_0.08_attention_scale_1.9	16.32
ngram_lm_scale_0.3_attention_scale_0.3	16.32
ngram_lm_scale_0.3_attention_scale_0.5	16.32
ngram_lm_scale_0.3_attention_scale_0.6	16.32
ngram_lm_scale_0.3_attention_scale_0.7	16.33
ngram_lm_scale_0.05_attention_scale_2.1	16.34
ngram_lm_scale_0.08_attention_scale_2.0	16.34
ngram_lm_scale_0.01_attention_scale_2.2	16.35
ngram_lm_scale_0.01_attention_scale_2.3	16.36
ngram_lm_scale_0.1_attention_scale_1.9	16.36
ngram_lm_scale_0.1_attention_scale_2.0	16.36
ngram_lm_scale_0.3_attention_scale_0.9	16.36
ngram_lm_scale_0.08_attention_scale_2.1	16.37
ngram_lm_scale_0.05_attention_scale_2.2	16.38
ngram_lm_scale_0.1_attention_scale_2.1	16.4
ngram_lm_scale_0.01_attention_scale_2.5	16.41
ngram_lm_scale_0.05_attention_scale_2.3	16.41
ngram_lm_scale_0.08_attention_scale_2.2	16.41
ngram_lm_scale_0.3_attention_scale_1.0	16.41
ngram_lm_scale_0.08_attention_scale_2.3	16.43
ngram_lm_scale_0.1_attention_scale_2.2	16.43
ngram_lm_scale_0.1_attention_scale_2.3	16.43
ngram_lm_scale_0.3_attention_scale_1.1	16.44
ngram_lm_scale_0.05_attention_scale_2.5	16.45
ngram_lm_scale_0.08_attention_scale_2.5	16.45
ngram_lm_scale_0.01_attention_scale_3.0	16.47
ngram_lm_scale_0.1_attention_scale_2.5	16.47
ngram_lm_scale_0.3_attention_scale_1.2	16.49
ngram_lm_scale_0.3_attention_scale_1.3	16.49
ngram_lm_scale_0.3_attention_scale_1.5	16.52
ngram_lm_scale_0.05_attention_scale_3.0	16.53
ngram_lm_scale_0.08_attention_scale_3.0	16.54
ngram_lm_scale_0.1_attention_scale_3.0	16.56
ngram_lm_scale_0.3_attention_scale_0.1	16.57
ngram_lm_scale_0.3_attention_scale_1.7	16.57
ngram_lm_scale_0.3_attention_scale_1.9	16.63
ngram_lm_scale_0.01_attention_scale_4.0	16.64
ngram_lm_scale_0.3_attention_scale_0.08	16.64
ngram_lm_scale_0.3_attention_scale_2.0	16.64
ngram_lm_scale_0.3_attention_scale_2.1	16.66
ngram_lm_scale_0.05_attention_scale_4.0	16.67
ngram_lm_scale_0.08_attention_scale_4.0	16.69
ngram_lm_scale_0.3_attention_scale_0.05	16.69
ngram_lm_scale_0.3_attention_scale_2.2	16.69
ngram_lm_scale_0.3_attention_scale_2.3	16.7
ngram_lm_scale_0.1_attention_scale_4.0	16.71
ngram_lm_scale_0.3_attention_scale_2.5	16.74
ngram_lm_scale_0.01_attention_scale_5.0	16.75
ngram_lm_scale_0.05_attention_scale_5.0	16.77
ngram_lm_scale_0.08_attention_scale_5.0	16.79
ngram_lm_scale_0.1_attention_scale_5.0	16.79
ngram_lm_scale_0.3_attention_scale_3.0	16.82
ngram_lm_scale_0.3_attention_scale_0.01	16.84
ngram_lm_scale_0.3_attention_scale_4.0	16.86
ngram_lm_scale_0.3_attention_scale_5.0	16.94
ngram_lm_scale_0.5_attention_scale_1.5	16.94
ngram_lm_scale_0.5_attention_scale_1.3	16.95
ngram_lm_scale_0.5_attention_scale_1.2	16.96
ngram_lm_scale_0.5_attention_scale_1.0	16.97
ngram_lm_scale_0.5_attention_scale_1.1	16.98
ngram_lm_scale_0.5_attention_scale_1.7	16.99
ngram_lm_scale_0.5_attention_scale_0.9	17.0
ngram_lm_scale_0.5_attention_scale_1.9	17.0
ngram_lm_scale_0.5_attention_scale_2.0	17.01
ngram_lm_scale_0.5_attention_scale_2.2	17.01
ngram_lm_scale_0.5_attention_scale_2.1	17.02
ngram_lm_scale_0.5_attention_scale_0.7	17.04
ngram_lm_scale_0.5_attention_scale_2.3	17.04
ngram_lm_scale_0.5_attention_scale_2.5	17.05
ngram_lm_scale_0.5_attention_scale_3.0	17.05
ngram_lm_scale_0.5_attention_scale_0.6	17.06
ngram_lm_scale_0.5_attention_scale_4.0	17.09
ngram_lm_scale_0.5_attention_scale_0.5	17.1
ngram_lm_scale_0.5_attention_scale_5.0	17.11
ngram_lm_scale_0.6_attention_scale_3.0	17.22
ngram_lm_scale_0.6_attention_scale_5.0	17.22
ngram_lm_scale_0.6_attention_scale_1.7	17.23
ngram_lm_scale_0.6_attention_scale_2.3	17.23
ngram_lm_scale_0.6_attention_scale_2.5	17.23
ngram_lm_scale_0.6_attention_scale_4.0	17.23
ngram_lm_scale_0.6_attention_scale_1.5	17.24
ngram_lm_scale_0.6_attention_scale_1.9	17.24
ngram_lm_scale_0.6_attention_scale_2.1	17.24
ngram_lm_scale_0.6_attention_scale_2.2	17.24
ngram_lm_scale_0.6_attention_scale_2.0	17.25
ngram_lm_scale_0.6_attention_scale_1.3	17.26
ngram_lm_scale_0.6_attention_scale_1.1	17.27
ngram_lm_scale_0.5_attention_scale_0.3	17.28
ngram_lm_scale_0.6_attention_scale_1.2	17.28
ngram_lm_scale_0.6_attention_scale_1.0	17.31
ngram_lm_scale_0.6_attention_scale_0.9	17.33
ngram_lm_scale_0.7_attention_scale_5.0	17.33
ngram_lm_scale_0.7_attention_scale_4.0	17.37
ngram_lm_scale_0.6_attention_scale_0.7	17.44
ngram_lm_scale_0.7_attention_scale_3.0	17.44
ngram_lm_scale_0.7_attention_scale_2.2	17.49
ngram_lm_scale_0.7_attention_scale_2.3	17.49
ngram_lm_scale_0.7_attention_scale_2.5	17.49
ngram_lm_scale_0.7_attention_scale_2.0	17.5
ngram_lm_scale_0.7_attention_scale_2.1	17.5
ngram_lm_scale_0.7_attention_scale_1.7	17.51
ngram_lm_scale_0.7_attention_scale_1.9	17.51
ngram_lm_scale_0.6_attention_scale_0.6	17.55
ngram_lm_scale_0.7_attention_scale_1.5	17.56
ngram_lm_scale_0.9_attention_scale_5.0	17.61
ngram_lm_scale_0.7_attention_scale_1.3	17.62
ngram_lm_scale_0.7_attention_scale_1.2	17.68
ngram_lm_scale_0.9_attention_scale_4.0	17.69
ngram_lm_scale_0.6_attention_scale_0.5	17.72
ngram_lm_scale_0.7_attention_scale_1.1	17.74
ngram_lm_scale_0.9_attention_scale_3.0	17.77
ngram_lm_scale_1.0_attention_scale_5.0	17.77
ngram_lm_scale_0.7_attention_scale_1.0	17.79
ngram_lm_scale_1.0_attention_scale_4.0	17.82
ngram_lm_scale_0.7_attention_scale_0.9	17.84
ngram_lm_scale_0.9_attention_scale_2.5	17.87
ngram_lm_scale_1.1_attention_scale_5.0	17.88
ngram_lm_scale_0.9_attention_scale_2.3	17.93
ngram_lm_scale_0.5_attention_scale_0.1	17.95
ngram_lm_scale_0.9_attention_scale_2.2	17.96
ngram_lm_scale_0.9_attention_scale_2.1	18.0
ngram_lm_scale_1.0_attention_scale_3.0	18.0
ngram_lm_scale_1.1_attention_scale_4.0	18.01
ngram_lm_scale_1.2_attention_scale_5.0	18.02
ngram_lm_scale_0.9_attention_scale_2.0	18.03
ngram_lm_scale_0.5_attention_scale_0.08	18.05
ngram_lm_scale_0.7_attention_scale_0.7	18.07
ngram_lm_scale_0.9_attention_scale_1.9	18.09
ngram_lm_scale_0.6_attention_scale_0.3	18.1
ngram_lm_scale_1.3_attention_scale_5.0	18.15
ngram_lm_scale_1.0_attention_scale_2.5	18.17
ngram_lm_scale_1.2_attention_scale_4.0	18.18
ngram_lm_scale_0.9_attention_scale_1.7	18.2
ngram_lm_scale_0.5_attention_scale_0.05	18.24
ngram_lm_scale_0.7_attention_scale_0.6	18.24
ngram_lm_scale_1.0_attention_scale_2.3	18.28
ngram_lm_scale_1.1_attention_scale_3.0	18.28
ngram_lm_scale_1.0_attention_scale_2.2	18.31
ngram_lm_scale_0.9_attention_scale_1.5	18.36
ngram_lm_scale_1.0_attention_scale_2.1	18.37
ngram_lm_scale_1.0_attention_scale_2.0	18.41
ngram_lm_scale_1.3_attention_scale_4.0	18.42
ngram_lm_scale_0.7_attention_scale_0.5	18.45
ngram_lm_scale_1.0_attention_scale_1.9	18.48
ngram_lm_scale_1.1_attention_scale_2.5	18.49
ngram_lm_scale_1.5_attention_scale_5.0	18.5
ngram_lm_scale_1.2_attention_scale_3.0	18.52
ngram_lm_scale_0.9_attention_scale_1.3	18.57
ngram_lm_scale_1.1_attention_scale_2.3	18.58
ngram_lm_scale_0.5_attention_scale_0.01	18.61
ngram_lm_scale_0.9_attention_scale_1.2	18.66
ngram_lm_scale_1.0_attention_scale_1.7	18.66
ngram_lm_scale_1.1_attention_scale_2.2	18.66
ngram_lm_scale_1.1_attention_scale_2.1	18.78
ngram_lm_scale_1.2_attention_scale_2.5	18.83
ngram_lm_scale_1.3_attention_scale_3.0	18.83
ngram_lm_scale_0.9_attention_scale_1.1	18.85
ngram_lm_scale_1.5_attention_scale_4.0	18.88
ngram_lm_scale_1.1_attention_scale_2.0	18.89
ngram_lm_scale_1.7_attention_scale_5.0	18.89
ngram_lm_scale_1.0_attention_scale_1.5	18.94
ngram_lm_scale_1.1_attention_scale_1.9	19.04
ngram_lm_scale_0.9_attention_scale_1.0	19.07
ngram_lm_scale_1.2_attention_scale_2.3	19.07
ngram_lm_scale_1.2_attention_scale_2.2	19.15
ngram_lm_scale_0.6_attention_scale_0.1	19.17
ngram_lm_scale_0.7_attention_scale_0.3	19.17
ngram_lm_scale_1.0_attention_scale_1.3	19.21
ngram_lm_scale_1.1_attention_scale_1.7	19.24
ngram_lm_scale_1.2_attention_scale_2.1	19.28
ngram_lm_scale_0.9_attention_scale_0.9	19.29
ngram_lm_scale_1.3_attention_scale_2.5	19.3
ngram_lm_scale_1.9_attention_scale_5.0	19.32
ngram_lm_scale_0.6_attention_scale_0.08	19.35
ngram_lm_scale_1.2_attention_scale_2.0	19.36
ngram_lm_scale_1.7_attention_scale_4.0	19.41
ngram_lm_scale_1.0_attention_scale_1.2	19.43
ngram_lm_scale_1.2_attention_scale_1.9	19.49
ngram_lm_scale_1.3_attention_scale_2.3	19.52
ngram_lm_scale_1.1_attention_scale_1.5	19.54
ngram_lm_scale_2.0_attention_scale_5.0	19.56
ngram_lm_scale_1.5_attention_scale_3.0	19.59
ngram_lm_scale_1.3_attention_scale_2.2	19.66
ngram_lm_scale_0.6_attention_scale_0.05	19.67
ngram_lm_scale_1.0_attention_scale_1.1	19.68
ngram_lm_scale_2.1_attention_scale_5.0	19.79
ngram_lm_scale_1.3_attention_scale_2.1	19.8
ngram_lm_scale_1.2_attention_scale_1.7	19.86
ngram_lm_scale_0.9_attention_scale_0.7	19.92
ngram_lm_scale_1.3_attention_scale_2.0	19.93
ngram_lm_scale_1.0_attention_scale_1.0	19.94
ngram_lm_scale_1.1_attention_scale_1.3	20.03
ngram_lm_scale_1.9_attention_scale_4.0	20.08
ngram_lm_scale_1.3_attention_scale_1.9	20.11
ngram_lm_scale_2.2_attention_scale_5.0	20.16
ngram_lm_scale_1.5_attention_scale_2.5	20.2
ngram_lm_scale_0.6_attention_scale_0.01	20.22
ngram_lm_scale_1.0_attention_scale_0.9	20.26
ngram_lm_scale_1.1_attention_scale_1.2	20.27
ngram_lm_scale_0.9_attention_scale_0.6	20.29
ngram_lm_scale_1.2_attention_scale_1.5	20.34
ngram_lm_scale_2.3_attention_scale_5.0	20.42
ngram_lm_scale_2.0_attention_scale_4.0	20.47
ngram_lm_scale_1.7_attention_scale_3.0	20.48
ngram_lm_scale_1.3_attention_scale_1.7	20.56
ngram_lm_scale_1.1_attention_scale_1.1	20.57
ngram_lm_scale_1.5_attention_scale_2.3	20.61
ngram_lm_scale_1.2_attention_scale_1.3	20.8
ngram_lm_scale_1.5_attention_scale_2.2	20.8
ngram_lm_scale_2.1_attention_scale_4.0	20.81
ngram_lm_scale_0.9_attention_scale_0.5	20.85
ngram_lm_scale_1.1_attention_scale_1.0	20.87
ngram_lm_scale_0.7_attention_scale_0.1	20.93
ngram_lm_scale_1.5_attention_scale_2.1	20.98
ngram_lm_scale_2.5_attention_scale_5.0	21.02
ngram_lm_scale_1.3_attention_scale_1.5	21.06
ngram_lm_scale_1.0_attention_scale_0.7	21.09
ngram_lm_scale_2.2_attention_scale_4.0	21.14
ngram_lm_scale_1.2_attention_scale_1.2	21.21
ngram_lm_scale_1.5_attention_scale_2.0	21.24
ngram_lm_scale_0.7_attention_scale_0.08	21.25
ngram_lm_scale_1.7_attention_scale_2.5	21.32
ngram_lm_scale_1.1_attention_scale_0.9	21.38
ngram_lm_scale_1.9_attention_scale_3.0	21.4
ngram_lm_scale_1.5_attention_scale_1.9	21.5
ngram_lm_scale_2.3_attention_scale_4.0	21.53
ngram_lm_scale_1.2_attention_scale_1.1	21.61
ngram_lm_scale_0.7_attention_scale_0.05	21.66
ngram_lm_scale_1.0_attention_scale_0.6	21.68
ngram_lm_scale_1.7_attention_scale_2.3	21.76
ngram_lm_scale_1.3_attention_scale_1.3	21.78
ngram_lm_scale_2.0_attention_scale_3.0	21.87
ngram_lm_scale_1.7_attention_scale_2.2	21.99
ngram_lm_scale_1.5_attention_scale_1.7	22.06
ngram_lm_scale_1.2_attention_scale_1.0	22.09
ngram_lm_scale_1.3_attention_scale_1.2	22.2
ngram_lm_scale_2.5_attention_scale_4.0	22.25
ngram_lm_scale_1.7_attention_scale_2.1	22.26
ngram_lm_scale_0.7_attention_scale_0.01	22.32
ngram_lm_scale_2.1_attention_scale_3.0	22.35
ngram_lm_scale_1.9_attention_scale_2.5	22.46
ngram_lm_scale_3.0_attention_scale_5.0	22.52
ngram_lm_scale_1.1_attention_scale_0.7	22.56
ngram_lm_scale_1.0_attention_scale_0.5	22.58
ngram_lm_scale_0.9_attention_scale_0.3	22.61
ngram_lm_scale_1.2_attention_scale_0.9	22.62
ngram_lm_scale_1.7_attention_scale_2.0	22.63
ngram_lm_scale_1.3_attention_scale_1.1	22.66
ngram_lm_scale_1.5_attention_scale_1.5	22.76
ngram_lm_scale_1.7_attention_scale_1.9	22.94
ngram_lm_scale_2.2_attention_scale_3.0	22.94
ngram_lm_scale_1.9_attention_scale_2.3	23.06
ngram_lm_scale_2.0_attention_scale_2.5	23.14
ngram_lm_scale_1.3_attention_scale_1.0	23.29
ngram_lm_scale_1.1_attention_scale_0.6	23.36
ngram_lm_scale_1.9_attention_scale_2.2	23.44
ngram_lm_scale_2.3_attention_scale_3.0	23.55
ngram_lm_scale_1.9_attention_scale_2.1	23.78
ngram_lm_scale_1.7_attention_scale_1.7	23.79
ngram_lm_scale_2.0_attention_scale_2.3	23.79
ngram_lm_scale_1.5_attention_scale_1.3	23.8
ngram_lm_scale_2.1_attention_scale_2.5	23.83
ngram_lm_scale_1.3_attention_scale_0.9	23.96
ngram_lm_scale_1.2_attention_scale_0.7	24.08
ngram_lm_scale_1.9_attention_scale_2.0	24.22
ngram_lm_scale_2.0_attention_scale_2.2	24.22
ngram_lm_scale_1.1_attention_scale_0.5	24.3
ngram_lm_scale_1.5_attention_scale_1.2	24.47
ngram_lm_scale_2.2_attention_scale_2.5	24.56
ngram_lm_scale_1.0_attention_scale_0.3	24.59
ngram_lm_scale_2.1_attention_scale_2.3	24.6
ngram_lm_scale_2.0_attention_scale_2.1	24.66
ngram_lm_scale_3.0_attention_scale_4.0	24.67
ngram_lm_scale_1.9_attention_scale_1.9	24.74
ngram_lm_scale_2.5_attention_scale_3.0	24.82
ngram_lm_scale_1.7_attention_scale_1.5	24.87
ngram_lm_scale_1.2_attention_scale_0.6	25.02
ngram_lm_scale_2.1_attention_scale_2.2	25.05
ngram_lm_scale_2.0_attention_scale_2.0	25.16
ngram_lm_scale_1.5_attention_scale_1.1	25.21
ngram_lm_scale_2.3_attention_scale_2.5	25.31
ngram_lm_scale_2.2_attention_scale_2.3	25.44
ngram_lm_scale_2.1_attention_scale_2.1	25.49
ngram_lm_scale_0.9_attention_scale_0.1	25.53
ngram_lm_scale_2.0_attention_scale_1.9	25.63
ngram_lm_scale_1.3_attention_scale_0.7	25.68
ngram_lm_scale_1.9_attention_scale_1.7	25.78
ngram_lm_scale_2.2_attention_scale_2.2	25.85
ngram_lm_scale_0.9_attention_scale_0.08	25.91
ngram_lm_scale_1.5_attention_scale_1.0	25.93
ngram_lm_scale_2.1_attention_scale_2.0	26.0
ngram_lm_scale_1.2_attention_scale_0.5	26.09
ngram_lm_scale_1.7_attention_scale_1.3	26.16
ngram_lm_scale_2.3_attention_scale_2.3	26.19
ngram_lm_scale_2.2_attention_scale_2.1	26.34
ngram_lm_scale_0.9_attention_scale_0.05	26.44
ngram_lm_scale_2.1_attention_scale_1.9	26.5
ngram_lm_scale_4.0_attention_scale_5.0	26.63
ngram_lm_scale_2.3_attention_scale_2.2	26.64
ngram_lm_scale_2.0_attention_scale_1.7	26.7
ngram_lm_scale_1.3_attention_scale_0.6	26.71
ngram_lm_scale_1.5_attention_scale_0.9	26.72
ngram_lm_scale_2.5_attention_scale_2.5	26.78
ngram_lm_scale_1.1_attention_scale_0.3	26.79
ngram_lm_scale_2.2_attention_scale_2.0	26.83
ngram_lm_scale_1.7_attention_scale_1.2	26.89
ngram_lm_scale_1.9_attention_scale_1.5	26.98
ngram_lm_scale_2.3_attention_scale_2.1	27.17
ngram_lm_scale_0.9_attention_scale_0.01	27.35
ngram_lm_scale_2.2_attention_scale_1.9	27.42
ngram_lm_scale_1.7_attention_scale_1.1	27.64
ngram_lm_scale_2.1_attention_scale_1.7	27.66
ngram_lm_scale_2.3_attention_scale_2.0	27.69
ngram_lm_scale_2.5_attention_scale_2.3	27.71
ngram_lm_scale_1.0_attention_scale_0.1	27.84
ngram_lm_scale_3.0_attention_scale_3.0	27.9
ngram_lm_scale_1.3_attention_scale_0.5	27.92
ngram_lm_scale_2.0_attention_scale_1.5	28.0
ngram_lm_scale_2.5_attention_scale_2.2	28.22
ngram_lm_scale_1.0_attention_scale_0.08	28.24
ngram_lm_scale_2.3_attention_scale_1.9	28.24
ngram_lm_scale_1.9_attention_scale_1.3	28.34
ngram_lm_scale_1.7_attention_scale_1.0	28.47
ngram_lm_scale_2.2_attention_scale_1.7	28.6
ngram_lm_scale_2.5_attention_scale_2.1	28.74
ngram_lm_scale_1.5_attention_scale_0.7	28.75
ngram_lm_scale_1.2_attention_scale_0.3	28.83
ngram_lm_scale_1.0_attention_scale_0.05	28.91
ngram_lm_scale_2.1_attention_scale_1.5	28.96
ngram_lm_scale_1.9_attention_scale_1.2	29.09
ngram_lm_scale_2.5_attention_scale_2.0	29.23
ngram_lm_scale_2.0_attention_scale_1.3	29.38
ngram_lm_scale_2.3_attention_scale_1.7	29.39
ngram_lm_scale_1.7_attention_scale_0.9	29.4
ngram_lm_scale_4.0_attention_scale_4.0	29.41
ngram_lm_scale_1.0_attention_scale_0.01	29.71
ngram_lm_scale_2.5_attention_scale_1.9	29.79
ngram_lm_scale_1.5_attention_scale_0.6	29.84
ngram_lm_scale_2.2_attention_scale_1.5	29.84
ngram_lm_scale_1.9_attention_scale_1.1	29.97
ngram_lm_scale_3.0_attention_scale_2.5	30.05
ngram_lm_scale_2.0_attention_scale_1.2	30.18
ngram_lm_scale_1.1_attention_scale_0.1	30.22
ngram_lm_scale_5.0_attention_scale_5.0	30.24
ngram_lm_scale_2.1_attention_scale_1.3	30.34
ngram_lm_scale_1.1_attention_scale_0.08	30.61
ngram_lm_scale_1.3_attention_scale_0.3	30.69
ngram_lm_scale_2.3_attention_scale_1.5	30.69
ngram_lm_scale_1.9_attention_scale_1.0	30.8
ngram_lm_scale_2.5_attention_scale_1.7	30.96
ngram_lm_scale_3.0_attention_scale_2.3	31.0
ngram_lm_scale_2.0_attention_scale_1.1	31.01
ngram_lm_scale_1.5_attention_scale_0.5	31.06
ngram_lm_scale_1.1_attention_scale_0.05	31.14
ngram_lm_scale_2.1_attention_scale_1.2	31.14
ngram_lm_scale_2.2_attention_scale_1.3	31.28
ngram_lm_scale_1.7_attention_scale_0.7	31.42
ngram_lm_scale_3.0_attention_scale_2.2	31.48
ngram_lm_scale_1.9_attention_scale_0.9	31.72
ngram_lm_scale_2.0_attention_scale_1.0	31.83
ngram_lm_scale_2.1_attention_scale_1.1	31.92
ngram_lm_scale_1.1_attention_scale_0.01	31.94
ngram_lm_scale_3.0_attention_scale_2.1	31.98
ngram_lm_scale_2.2_attention_scale_1.2	32.0
ngram_lm_scale_2.3_attention_scale_1.3	32.07
ngram_lm_scale_1.2_attention_scale_0.1	32.24
ngram_lm_scale_2.5_attention_scale_1.5	32.27
ngram_lm_scale_3.0_attention_scale_2.0	32.49
ngram_lm_scale_1.7_attention_scale_0.6	32.53
ngram_lm_scale_1.2_attention_scale_0.08	32.63
ngram_lm_scale_2.0_attention_scale_0.9	32.71
ngram_lm_scale_2.1_attention_scale_1.0	32.79
ngram_lm_scale_2.2_attention_scale_1.1	32.82
ngram_lm_scale_4.0_attention_scale_3.0	32.82
ngram_lm_scale_2.3_attention_scale_1.2	32.89
ngram_lm_scale_5.0_attention_scale_4.0	32.97
ngram_lm_scale_3.0_attention_scale_1.9	33.03
ngram_lm_scale_1.2_attention_scale_0.05	33.09
ngram_lm_scale_2.5_attention_scale_1.3	33.56
ngram_lm_scale_2.1_attention_scale_0.9	33.57
ngram_lm_scale_2.2_attention_scale_1.0	33.58
ngram_lm_scale_2.3_attention_scale_1.1	33.58
ngram_lm_scale_1.9_attention_scale_0.7	33.66
ngram_lm_scale_1.7_attention_scale_0.5	33.67
ngram_lm_scale_1.5_attention_scale_0.3	33.73
ngram_lm_scale_1.2_attention_scale_0.01	33.78
ngram_lm_scale_1.3_attention_scale_0.1	33.86
ngram_lm_scale_3.0_attention_scale_1.7	34.01
ngram_lm_scale_1.3_attention_scale_0.08	34.16
ngram_lm_scale_2.5_attention_scale_1.2	34.22
ngram_lm_scale_2.3_attention_scale_1.0	34.32
ngram_lm_scale_2.2_attention_scale_0.9	34.37
ngram_lm_scale_2.0_attention_scale_0.7	34.46
ngram_lm_scale_1.9_attention_scale_0.6	34.54
ngram_lm_scale_4.0_attention_scale_2.5	34.54
ngram_lm_scale_1.3_attention_scale_0.05	34.59
ngram_lm_scale_2.5_attention_scale_1.1	34.83
ngram_lm_scale_2.3_attention_scale_0.9	34.99
ngram_lm_scale_3.0_attention_scale_1.5	35.02
ngram_lm_scale_1.3_attention_scale_0.01	35.15
ngram_lm_scale_2.1_attention_scale_0.7	35.19
ngram_lm_scale_4.0_attention_scale_2.3	35.21
ngram_lm_scale_2.0_attention_scale_0.6	35.3
ngram_lm_scale_1.9_attention_scale_0.5	35.46
ngram_lm_scale_2.5_attention_scale_1.0	35.46
ngram_lm_scale_4.0_attention_scale_2.2	35.58
ngram_lm_scale_5.0_attention_scale_3.0	35.61
ngram_lm_scale_2.2_attention_scale_0.7	35.8
ngram_lm_scale_1.7_attention_scale_0.3	35.87
ngram_lm_scale_4.0_attention_scale_2.1	35.92
ngram_lm_scale_2.1_attention_scale_0.6	35.98
ngram_lm_scale_3.0_attention_scale_1.3	36.05
ngram_lm_scale_2.5_attention_scale_0.9	36.08
ngram_lm_scale_2.0_attention_scale_0.5	36.11
ngram_lm_scale_1.5_attention_scale_0.1	36.23
ngram_lm_scale_4.0_attention_scale_2.0	36.31
ngram_lm_scale_2.3_attention_scale_0.7	36.34
ngram_lm_scale_1.5_attention_scale_0.08	36.47
ngram_lm_scale_3.0_attention_scale_1.2	36.47
ngram_lm_scale_2.2_attention_scale_0.6	36.51
ngram_lm_scale_4.0_attention_scale_1.9	36.59
ngram_lm_scale_2.1_attention_scale_0.5	36.72
ngram_lm_scale_1.5_attention_scale_0.05	36.74
ngram_lm_scale_5.0_attention_scale_2.5	36.86
ngram_lm_scale_3.0_attention_scale_1.1	36.93
ngram_lm_scale_2.3_attention_scale_0.6	36.98
ngram_lm_scale_1.5_attention_scale_0.01	37.11
ngram_lm_scale_1.9_attention_scale_0.3	37.17
ngram_lm_scale_2.5_attention_scale_0.7	37.17
ngram_lm_scale_2.2_attention_scale_0.5	37.2
ngram_lm_scale_4.0_attention_scale_1.7	37.21
ngram_lm_scale_5.0_attention_scale_2.3	37.29
ngram_lm_scale_3.0_attention_scale_1.0	37.35
ngram_lm_scale_5.0_attention_scale_2.2	37.51
ngram_lm_scale_2.3_attention_scale_0.5	37.61
ngram_lm_scale_1.7_attention_scale_0.1	37.63
ngram_lm_scale_2.0_attention_scale_0.3	37.63
ngram_lm_scale_3.0_attention_scale_0.9	37.73
ngram_lm_scale_5.0_attention_scale_2.1	37.75
ngram_lm_scale_4.0_attention_scale_1.5	37.76
ngram_lm_scale_2.5_attention_scale_0.6	37.78
ngram_lm_scale_1.7_attention_scale_0.08	37.81
ngram_lm_scale_5.0_attention_scale_2.0	37.96
ngram_lm_scale_2.1_attention_scale_0.3	37.98
ngram_lm_scale_1.7_attention_scale_0.05	38.01
ngram_lm_scale_2.5_attention_scale_0.5	38.19
ngram_lm_scale_5.0_attention_scale_1.9	38.2
ngram_lm_scale_1.7_attention_scale_0.01	38.25
ngram_lm_scale_2.2_attention_scale_0.3	38.28
ngram_lm_scale_4.0_attention_scale_1.3	38.3
ngram_lm_scale_1.9_attention_scale_0.1	38.43
ngram_lm_scale_3.0_attention_scale_0.7	38.48
ngram_lm_scale_2.3_attention_scale_0.3	38.53
ngram_lm_scale_1.9_attention_scale_0.08	38.55
ngram_lm_scale_4.0_attention_scale_1.2	38.56
ngram_lm_scale_5.0_attention_scale_1.7	38.62
ngram_lm_scale_1.9_attention_scale_0.05	38.69
ngram_lm_scale_2.0_attention_scale_0.1	38.74
ngram_lm_scale_3.0_attention_scale_0.6	38.82
ngram_lm_scale_2.0_attention_scale_0.08	38.83
ngram_lm_scale_4.0_attention_scale_1.1	38.84
ngram_lm_scale_1.9_attention_scale_0.01	38.91
ngram_lm_scale_2.0_attention_scale_0.05	38.96
ngram_lm_scale_2.1_attention_scale_0.1	38.97
ngram_lm_scale_2.5_attention_scale_0.3	38.97
ngram_lm_scale_5.0_attention_scale_1.5	39.04
ngram_lm_scale_2.1_attention_scale_0.08	39.05
ngram_lm_scale_4.0_attention_scale_1.0	39.07
ngram_lm_scale_3.0_attention_scale_0.5	39.14
ngram_lm_scale_2.0_attention_scale_0.01	39.18
ngram_lm_scale_2.2_attention_scale_0.1	39.22
ngram_lm_scale_2.1_attention_scale_0.05	39.24
ngram_lm_scale_2.2_attention_scale_0.08	39.31
ngram_lm_scale_4.0_attention_scale_0.9	39.33
ngram_lm_scale_5.0_attention_scale_1.3	39.41
ngram_lm_scale_2.1_attention_scale_0.01	39.43
ngram_lm_scale_2.3_attention_scale_0.1	39.46
ngram_lm_scale_2.2_attention_scale_0.05	39.48
ngram_lm_scale_2.3_attention_scale_0.08	39.52
ngram_lm_scale_5.0_attention_scale_1.2	39.58
ngram_lm_scale_2.2_attention_scale_0.01	39.6
ngram_lm_scale_2.3_attention_scale_0.05	39.62
ngram_lm_scale_2.5_attention_scale_0.1	39.71
ngram_lm_scale_3.0_attention_scale_0.3	39.72
ngram_lm_scale_2.3_attention_scale_0.01	39.74
ngram_lm_scale_4.0_attention_scale_0.7	39.74
ngram_lm_scale_5.0_attention_scale_1.1	39.74
ngram_lm_scale_2.5_attention_scale_0.08	39.77
ngram_lm_scale_2.5_attention_scale_0.05	39.89
ngram_lm_scale_5.0_attention_scale_1.0	39.91
ngram_lm_scale_4.0_attention_scale_0.6	39.95
ngram_lm_scale_2.5_attention_scale_0.01	40.03
ngram_lm_scale_5.0_attention_scale_0.9	40.07
ngram_lm_scale_4.0_attention_scale_0.5	40.14
ngram_lm_scale_3.0_attention_scale_0.1	40.27
ngram_lm_scale_3.0_attention_scale_0.08	40.31
ngram_lm_scale_5.0_attention_scale_0.7	40.34
ngram_lm_scale_3.0_attention_scale_0.05	40.38
ngram_lm_scale_3.0_attention_scale_0.01	40.47
ngram_lm_scale_4.0_attention_scale_0.3	40.47
ngram_lm_scale_5.0_attention_scale_0.6	40.49
ngram_lm_scale_5.0_attention_scale_0.5	40.64
ngram_lm_scale_4.0_attention_scale_0.1	40.84
ngram_lm_scale_4.0_attention_scale_0.08	40.87
ngram_lm_scale_4.0_attention_scale_0.05	40.91
ngram_lm_scale_5.0_attention_scale_0.3	40.91
ngram_lm_scale_4.0_attention_scale_0.01	40.96
ngram_lm_scale_5.0_attention_scale_0.1	41.12
ngram_lm_scale_5.0_attention_scale_0.08	41.17
ngram_lm_scale_5.0_attention_scale_0.05	41.22
ngram_lm_scale_5.0_attention_scale_0.01	41.29

2022-06-27 20:10:33,885 INFO [decode.py:695] Done!
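
How to read the two scales swept in the tables above (an editorial sketch, not part of the log itself): in attention-decoder rescoring, each n-best path carries a lattice/acoustic score, an n-gram LM score, and an attention-decoder score, and each table row reports the WER obtained when the two interpolation weights take the values in the row's name. A minimal illustration of that weighted combination follows; the path names and score values are invented for illustration and are not taken from decode.py.

import itertools

# Hypothetical per-path log scores for one utterance:
# (lattice/AM score, n-gram LM score, attention-decoder score).
paths = {
    "hyp A": (-10.2, -35.1, -7.9),
    "hyp B": (-11.0, -33.8, -7.1),
}

ngram_lm_scales = [0.01, 0.05, 0.08, 0.1, 0.3]
attention_scales = [0.3, 0.5, 0.7, 0.9, 1.0]

for lm_s, att_s in itertools.product(ngram_lm_scales, attention_scales):
    # A path's total score is a weighted sum of the three components.
    # The best path under each (ngram_lm_scale, attention_scale) pair is
    # what gets scored against the reference, yielding one WER row such
    # as "ngram_lm_scale_0.01_attention_scale_0.5 15.89".
    best = max(
        paths,
        key=lambda p: paths[p][0] + lm_s * paths[p][1] + att_s * paths[p][2],
    )
    print(f"ngram_lm_scale_{lm_s}_attention_scale_{att_s}: {best}")
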
decoding-results/log-whole-lattice-rescoring/log-decode-2022-06-26-22-37-17
ADDED
@@ -0,0 +1,202 @@
2022-06-26 22:37:17,938 INFO [decode.py:548] Decoding started
2022-06-26 22:37:17,939 INFO [decode.py:549] {'subsampling_factor': 4, 'vgg_frontend': False, 'use_feat_batchnorm': True, 'feature_dim': 80, 'nhead': 8, 'attention_dim': 512, 'num_decoder_layers': 6, 'search_beam': 20, 'output_beam': 4, 'min_active_states': 30, 'max_active_states': 1000, 'use_double_scores': True, 'env_info': {'k2-version': '1.11', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '', 'k2-git-date': '', 'lhotse-version': '1.3.0.dev+git.a07121a.clean', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'test', 'icefall-git-sha1': '7e72d78-dirty', 'icefall-git-date': 'Sat May 28 19:13:53 2022', 'icefall-path': '/alt-arabic/speech/amir/k2/tmp/icefall', 'k2-path': '/home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/k2/__init__.py', 'lhotse-path': '/alt-arabic/speech/amir/k2/tmp/lhotse/lhotse/__init__.py', 'hostname': 'crimv3mgpu008', 'IP address': '10.141.0.6'}, 'epoch': 44, 'avg': 5, 'method': 'whole-lattice-rescoring', 'num_paths': 100, 'nbest_scale': 0.2, 'exp_dir': PosixPath('conformer_ctc/exp_5000_att0.8'), 'lang_dir': PosixPath('data/lang_bpe_5000'), 'lm_dir': PosixPath('data/lm'), 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 30, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 8, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True}
2022-06-26 22:37:18,176 INFO [lexicon.py:177] Loading pre-compiled data/lang_bpe_5000/Linv.pt
2022-06-26 22:37:18,206 INFO [decode.py:559] device: cuda:0
2022-06-26 22:37:22,549 INFO [decode.py:621] Loading pre-compiled G_4_gram.pt
2022-06-26 22:37:23,265 INFO [decode.py:657] averaging ['conformer_ctc/exp_5000_att0.8/epoch-40.pt', 'conformer_ctc/exp_5000_att0.8/epoch-41.pt', 'conformer_ctc/exp_5000_att0.8/epoch-42.pt', 'conformer_ctc/exp_5000_att0.8/epoch-43.pt', 'conformer_ctc/exp_5000_att0.8/epoch-44.pt']
2022-06-26 22:38:42,767 INFO [decode.py:664] Number of model parameters: 90786736
2022-06-26 22:38:42,768 INFO [asr_datamodule.py:374] About to get test cuts
2022-06-26 22:38:42,799 INFO [asr_datamodule.py:367] About to get dev cuts
2022-06-26 22:38:46,538 INFO [decode.py:483] batch 0/?, cuts processed until now is 4
2022-06-26 22:39:19,008 INFO [decode.py:483] batch 100/?, cuts processed until now is 407
2022-06-26 22:39:51,978 INFO [decode.py:483] batch 200/?, cuts processed until now is 839
2022-06-26 22:40:24,708 INFO [decode.py:483] batch 300/?, cuts processed until now is 1272
2022-06-26 22:40:58,103 INFO [decode.py:483] batch 400/?, cuts processed until now is 1702
2022-06-26 22:41:30,530 INFO [decode.py:483] batch 500/?, cuts processed until now is 2109
2022-06-26 22:42:03,513 INFO [decode.py:483] batch 600/?, cuts processed until now is 2544
2022-06-26 22:42:35,204 INFO [decode.py:483] batch 700/?, cuts processed until now is 2978
2022-06-26 22:43:08,567 INFO [decode.py:483] batch 800/?, cuts processed until now is 3384
2022-06-26 22:43:40,827 INFO [decode.py:483] batch 900/?, cuts processed until now is 3811
2022-06-26 22:44:13,123 INFO [decode.py:483] batch 1000/?, cuts processed until now is 4220
2022-06-26 22:44:45,400 INFO [decode.py:483] batch 1100/?, cuts processed until now is 4631
2022-06-26 22:45:18,544 INFO [decode.py:483] batch 1200/?, cuts processed until now is 5033
2022-06-26 22:45:52,136 INFO [decode.py:483] batch 1300/?, cuts processed until now is 5355
2022-06-26 22:45:53,135 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.1.txt
2022-06-26 22:45:53,234 INFO [utils.py:404] [test-lm_scale_0.1] %WER 18.57% [11956 / 64388, 172 ins, 7090 del, 4694 sub ]
2022-06-26 22:45:53,689 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.1.txt
2022-06-26 22:45:53,741 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.2.txt
2022-06-26 22:45:53,828 INFO [utils.py:404] [test-lm_scale_0.2] %WER 18.59% [11972 / 64388, 167 ins, 7142 del, 4663 sub ]
2022-06-26 22:45:54,029 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.2.txt
2022-06-26 22:45:54,073 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.3.txt
2022-06-26 22:45:54,157 INFO [utils.py:404] [test-lm_scale_0.3] %WER 18.68% [12030 / 64388, 161 ins, 7235 del, 4634 sub ]
2022-06-26 22:45:54,357 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.3.txt
2022-06-26 22:45:54,401 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.4.txt
2022-06-26 22:45:54,496 INFO [utils.py:404] [test-lm_scale_0.4] %WER 18.90% [12170 / 64388, 156 ins, 7412 del, 4602 sub ]
2022-06-26 22:45:54,914 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.4.txt
2022-06-26 22:45:54,961 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.5.txt
2022-06-26 22:45:55,046 INFO [utils.py:404] [test-lm_scale_0.5] %WER 19.38% [12477 / 64388, 145 ins, 7769 del, 4563 sub ]
2022-06-26 22:45:55,247 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.5.txt
2022-06-26 22:45:55,302 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.6.txt
2022-06-26 22:45:55,389 INFO [utils.py:404] [test-lm_scale_0.6] %WER 20.31% [13075 / 64388, 134 ins, 8461 del, 4480 sub ]
2022-06-26 22:45:55,589 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.6.txt
2022-06-26 22:45:55,640 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.7.txt
2022-06-26 22:45:55,732 INFO [utils.py:404] [test-lm_scale_0.7] %WER 22.15% [14262 / 64388, 120 ins, 9817 del, 4325 sub ]
2022-06-26 22:45:55,933 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.7.txt
2022-06-26 22:45:55,980 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.8.txt
2022-06-26 22:45:56,274 INFO [utils.py:404] [test-lm_scale_0.8] %WER 24.89% [16029 / 64388, 99 ins, 11848 del, 4082 sub ]
2022-06-26 22:45:56,482 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.8.txt
2022-06-26 22:45:56,526 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.9.txt
2022-06-26 22:45:56,621 INFO [utils.py:404] [test-lm_scale_0.9] %WER 28.21% [18166 / 64388, 87 ins, 14210 del, 3869 sub ]
2022-06-26 22:45:56,822 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.9.txt
2022-06-26 22:45:56,862 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.0.txt
2022-06-26 22:45:56,945 INFO [utils.py:404] [test-lm_scale_1.0] %WER 32.19% [20725 / 64388, 73 ins, 17007 del, 3645 sub ]
2022-06-26 22:45:57,148 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.0.txt
2022-06-26 22:45:57,192 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.1.txt
2022-06-26 22:45:57,282 INFO [utils.py:404] [test-lm_scale_1.1] %WER 36.29% [23365 / 64388, 57 ins, 19898 del, 3410 sub ]
2022-06-26 22:45:57,487 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.1.txt
2022-06-26 22:45:57,526 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.2.txt
2022-06-26 22:45:57,816 INFO [utils.py:404] [test-lm_scale_1.2] %WER 40.47% [26055 / 64388, 48 ins, 22805 del, 3202 sub ]
2022-06-26 22:45:58,024 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.2.txt
2022-06-26 22:45:58,066 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.3.txt
2022-06-26 22:45:58,146 INFO [utils.py:404] [test-lm_scale_1.3] %WER 43.89% [28261 / 64388, 42 ins, 25155 del, 3064 sub ]
2022-06-26 22:45:58,352 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.3.txt
2022-06-26 22:45:58,390 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.4.txt
2022-06-26 22:45:58,481 INFO [utils.py:404] [test-lm_scale_1.4] %WER 46.33% [29833 / 64388, 38 ins, 26847 del, 2948 sub ]
2022-06-26 22:45:58,686 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.4.txt
2022-06-26 22:45:58,725 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.5.txt
2022-06-26 22:45:58,805 INFO [utils.py:404] [test-lm_scale_1.5] %WER 48.03% [30928 / 64388, 35 ins, 27980 del, 2913 sub ]
2022-06-26 22:45:59,011 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.5.txt
2022-06-26 22:45:59,048 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.6.txt
2022-06-26 22:45:59,339 INFO [utils.py:404] [test-lm_scale_1.6] %WER 49.31% [31747 / 64388, 34 ins, 28810 del, 2903 sub ]
2022-06-26 22:45:59,546 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.6.txt
2022-06-26 22:45:59,585 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.7.txt
2022-06-26 22:45:59,666 INFO [utils.py:404] [test-lm_scale_1.7] %WER 50.30% [32385 / 64388, 29 ins, 29439 del, 2917 sub ]
2022-06-26 22:45:59,872 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.7.txt
2022-06-26 22:45:59,919 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.8.txt
2022-06-26 22:46:00,003 INFO [utils.py:404] [test-lm_scale_1.8] %WER 51.09% [32893 / 64388, 30 ins, 29947 del, 2916 sub ]
2022-06-26 22:46:00,224 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.8.txt
2022-06-26 22:46:00,264 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.9.txt
2022-06-26 22:46:00,345 INFO [utils.py:404] [test-lm_scale_1.9] %WER 51.73% [33308 / 64388, 25 ins, 30381 del, 2902 sub ]
2022-06-26 22:46:00,769 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.9.txt
2022-06-26 22:46:00,817 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_2.0.txt
2022-06-26 22:46:00,898 INFO [utils.py:404] [test-lm_scale_2.0] %WER 52.17% [33592 / 64388, 27 ins, 30665 del, 2900 sub ]
2022-06-26 22:46:01,107 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_2.0.txt
2022-06-26 22:46:01,113 INFO [decode.py:532]
For test, WER of different settings are:
lm_scale_0.1	18.57	best for test
lm_scale_0.2	18.59
lm_scale_0.3	18.68
lm_scale_0.4	18.9
lm_scale_0.5	19.38
lm_scale_0.6	20.31
lm_scale_0.7	22.15
lm_scale_0.8	24.89
lm_scale_0.9	28.21
lm_scale_1.0	32.19
lm_scale_1.1	36.29
lm_scale_1.2	40.47
lm_scale_1.3	43.89
lm_scale_1.4	46.33
lm_scale_1.5	48.03
lm_scale_1.6	49.31
lm_scale_1.7	50.3
lm_scale_1.8	51.09
lm_scale_1.9	51.73
lm_scale_2.0	52.17

2022-06-26 22:46:02,285 INFO [decode.py:483] batch 0/?, cuts processed until now is 4
2022-06-26 22:46:35,071 INFO [decode.py:483] batch 100/?, cuts processed until now is 428
2022-06-26 22:47:08,653 INFO [decode.py:483] batch 200/?, cuts processed until now is 888
2022-06-26 22:47:40,809 INFO [decode.py:483] batch 300/?, cuts processed until now is 1363
2022-06-26 22:48:12,930 INFO [decode.py:483] batch 400/?, cuts processed until now is 1815
2022-06-26 22:48:46,311 INFO [decode.py:483] batch 500/?, cuts processed until now is 2243
2022-06-26 22:49:18,323 INFO [decode.py:483] batch 600/?, cuts processed until now is 2717
2022-06-26 22:49:51,593 INFO [decode.py:483] batch 700/?, cuts processed until now is 3192
2022-06-26 22:50:24,480 INFO [decode.py:483] batch 800/?, cuts processed until now is 3635
2022-06-26 22:50:56,383 INFO [decode.py:483] batch 900/?, cuts processed until now is 4082
2022-06-26 22:51:28,560 INFO [decode.py:483] batch 1000/?, cuts processed until now is 4500
2022-06-26 22:52:02,855 INFO [decode.py:483] batch 1100/?, cuts processed until now is 4869
2022-06-26 22:52:16,282 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.1.txt
2022-06-26 22:52:16,364 INFO [utils.py:404] [dev-lm_scale_0.1] %WER 20.01% [12039 / 60169, 166 ins, 7746 del, 4127 sub ]
2022-06-26 22:52:16,555 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.1.txt
2022-06-26 22:52:16,596 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.2.txt
2022-06-26 22:52:16,671 INFO [utils.py:404] [dev-lm_scale_0.2] %WER 20.07% [12077 / 60169, 155 ins, 7797 del, 4125 sub ]
2022-06-26 22:52:16,857 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.2.txt
2022-06-26 22:52:16,894 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.3.txt
|
126 |
+
2022-06-26 22:52:16,980 INFO [utils.py:404] [dev-lm_scale_0.3] %WER 20.20% [12155 / 60169, 150 ins, 7886 del, 4119 sub ]
|
127 |
+
2022-06-26 22:52:17,166 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.3.txt
|
128 |
+
2022-06-26 22:52:17,215 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.4.txt
|
129 |
+
2022-06-26 22:52:17,290 INFO [utils.py:404] [dev-lm_scale_0.4] %WER 20.44% [12301 / 60169, 142 ins, 8066 del, 4093 sub ]
|
130 |
+
2022-06-26 22:52:17,484 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.4.txt
|
131 |
+
2022-06-26 22:52:17,523 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.5.txt
|
132 |
+
2022-06-26 22:52:17,598 INFO [utils.py:404] [dev-lm_scale_0.5] %WER 20.92% [12589 / 60169, 133 ins, 8415 del, 4041 sub ]
|
133 |
+
2022-06-26 22:52:17,789 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.5.txt
|
134 |
+
2022-06-26 22:52:17,826 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.6.txt
|
135 |
+
2022-06-26 22:52:18,051 INFO [utils.py:404] [dev-lm_scale_0.6] %WER 21.95% [13210 / 60169, 114 ins, 9123 del, 3973 sub ]
|
136 |
+
2022-06-26 22:52:18,244 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.6.txt
|
137 |
+
2022-06-26 22:52:18,281 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.7.txt
|
138 |
+
2022-06-26 22:52:18,357 INFO [utils.py:404] [dev-lm_scale_0.7] %WER 23.88% [14368 / 60169, 105 ins, 10478 del, 3785 sub ]
|
139 |
+
2022-06-26 22:52:18,549 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.7.txt
|
140 |
+
2022-06-26 22:52:18,584 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.8.txt
|
141 |
+
2022-06-26 22:52:18,658 INFO [utils.py:404] [dev-lm_scale_0.8] %WER 26.64% [16029 / 60169, 89 ins, 12382 del, 3558 sub ]
|
142 |
+
2022-06-26 22:52:18,850 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.8.txt
|
143 |
+
2022-06-26 22:52:18,884 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.9.txt
|
144 |
+
2022-06-26 22:52:18,959 INFO [utils.py:404] [dev-lm_scale_0.9] %WER 30.08% [18099 / 60169, 76 ins, 14702 del, 3321 sub ]
|
145 |
+
2022-06-26 22:52:19,278 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.9.txt
|
146 |
+
2022-06-26 22:52:19,324 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.0.txt
|
147 |
+
2022-06-26 22:52:19,396 INFO [utils.py:404] [dev-lm_scale_1.0] %WER 33.80% [20337 / 60169, 67 ins, 17194 del, 3076 sub ]
|
148 |
+
2022-06-26 22:52:19,587 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.0.txt
|
149 |
+
2022-06-26 22:52:19,628 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.1.txt
|
150 |
+
2022-06-26 22:52:19,704 INFO [utils.py:404] [dev-lm_scale_1.1] %WER 38.05% [22893 / 60169, 59 ins, 19979 del, 2855 sub ]
|
151 |
+
2022-06-26 22:52:19,893 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.1.txt
|
152 |
+
2022-06-26 22:52:19,926 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.2.txt
|
153 |
+
2022-06-26 22:52:19,999 INFO [utils.py:404] [dev-lm_scale_1.2] %WER 42.04% [25295 / 60169, 47 ins, 22608 del, 2640 sub ]
|
154 |
+
2022-06-26 22:52:20,188 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.2.txt
|
155 |
+
2022-06-26 22:52:20,220 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.3.txt
|
156 |
+
2022-06-26 22:52:20,413 INFO [utils.py:404] [dev-lm_scale_1.3] %WER 45.64% [27462 / 60169, 43 ins, 24922 del, 2497 sub ]
|
157 |
+
2022-06-26 22:52:20,605 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.3.txt
|
158 |
+
2022-06-26 22:52:20,638 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.4.txt
|
159 |
+
2022-06-26 22:52:20,709 INFO [utils.py:404] [dev-lm_scale_1.4] %WER 48.05% [28913 / 60169, 42 ins, 26468 del, 2403 sub ]
|
160 |
+
2022-06-26 22:52:20,900 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.4.txt
|
161 |
+
2022-06-26 22:52:20,945 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.5.txt
|
162 |
+
2022-06-26 22:52:21,026 INFO [utils.py:404] [dev-lm_scale_1.5] %WER 49.60% [29841 / 60169, 40 ins, 27451 del, 2350 sub ]
|
163 |
+
2022-06-26 22:52:21,224 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.5.txt
|
164 |
+
2022-06-26 22:52:21,258 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.6.txt
|
165 |
+
2022-06-26 22:52:21,339 INFO [utils.py:404] [dev-lm_scale_1.6] %WER 50.69% [30499 / 60169, 40 ins, 28138 del, 2321 sub ]
|
166 |
+
2022-06-26 22:52:21,662 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.6.txt
|
167 |
+
2022-06-26 22:52:21,697 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.7.txt
|
168 |
+
2022-06-26 22:52:21,769 INFO [utils.py:404] [dev-lm_scale_1.7] %WER 51.66% [31086 / 60169, 41 ins, 28748 del, 2297 sub ]
|
169 |
+
2022-06-26 22:52:21,966 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.7.txt
|
170 |
+
2022-06-26 22:52:21,999 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.8.txt
|
171 |
+
2022-06-26 22:52:22,070 INFO [utils.py:404] [dev-lm_scale_1.8] %WER 52.42% [31540 / 60169, 39 ins, 29206 del, 2295 sub ]
|
172 |
+
2022-06-26 22:52:22,268 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.8.txt
|
173 |
+
2022-06-26 22:52:22,300 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.9.txt
|
174 |
+
2022-06-26 22:52:22,370 INFO [utils.py:404] [dev-lm_scale_1.9] %WER 52.97% [31873 / 60169, 36 ins, 29517 del, 2320 sub ]
|
175 |
+
2022-06-26 22:52:22,568 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.9.txt
|
176 |
+
2022-06-26 22:52:22,608 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_2.0.txt
|
177 |
+
2022-06-26 22:52:22,688 INFO [utils.py:404] [dev-lm_scale_2.0] %WER 53.42% [32142 / 60169, 37 ins, 29787 del, 2318 sub ]
|
178 |
+
2022-06-26 22:52:23,010 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_2.0.txt
|
179 |
+
2022-06-26 22:52:23,017 INFO [decode.py:532]
|
180 |
+
For dev, WER of different settings are:
|
181 |
+
lm_scale_0.1 20.01 best for dev
|
182 |
+
lm_scale_0.2 20.07
|
183 |
+
lm_scale_0.3 20.2
|
184 |
+
lm_scale_0.4 20.44
|
185 |
+
lm_scale_0.5 20.92
|
186 |
+
lm_scale_0.6 21.95
|
187 |
+
lm_scale_0.7 23.88
|
188 |
+
lm_scale_0.8 26.64
|
189 |
+
lm_scale_0.9 30.08
|
190 |
+
lm_scale_1.0 33.8
|
191 |
+
lm_scale_1.1 38.05
|
192 |
+
lm_scale_1.2 42.04
|
193 |
+
lm_scale_1.3 45.64
|
194 |
+
lm_scale_1.4 48.05
|
195 |
+
lm_scale_1.5 49.6
|
196 |
+
lm_scale_1.6 50.69
|
197 |
+
lm_scale_1.7 51.66
|
198 |
+
lm_scale_1.8 52.42
|
199 |
+
lm_scale_1.9 52.97
|
200 |
+
lm_scale_2.0 53.42
|
201 |
+
|
202 |
+
2022-06-26 22:52:23,017 INFO [decode.py:695] Done!
|
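The sweep above is typical of whole-lattice rescoring: WER is lowest at lm_scale 0.1 and grows monotonically with the scale, with deletions dominating the error counts. As a minimal sketch (not part of icefall; the helper name pick_best_lm_scale is invented here), the "best for test" marker in the summary can be reproduced by parsing the printed table and taking the argmin:

import re

def pick_best_lm_scale(summary):
    # Parse lines like "lm_scale_0.4 18.9" as printed by decode.py above
    # and return the (scale, WER) pair with the lowest WER.
    wers = {float(m.group(1)): float(m.group(2))
            for m in re.finditer(r"lm_scale_([0-9.]+)\s+([0-9.]+)", summary)}
    best = min(wers, key=wers.get)
    return best, wers[best]

print(pick_best_lm_scale("lm_scale_0.1 18.57\nlm_scale_1.0 32.19"))  # (0.1, 18.57)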
decoding-results/log-whole-lattice-rescoring/log-decode-2022-06-26-23-11-51
ADDED
@@ -0,0 +1,202 @@
+2022-06-26 23:11:51,173 INFO [decode.py:548] Decoding started
+2022-06-26 23:11:51,174 INFO [decode.py:549] {'subsampling_factor': 4, 'vgg_frontend': False, 'use_feat_batchnorm': True, 'feature_dim': 80, 'nhead': 8, 'attention_dim': 512, 'num_decoder_layers': 6, 'search_beam': 20, 'output_beam': 4, 'min_active_states': 30, 'max_active_states': 1000, 'use_double_scores': True, 'env_info': {'k2-version': '1.11', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '', 'k2-git-date': '', 'lhotse-version': '1.3.0.dev+git.a07121a.clean', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'test', 'icefall-git-sha1': '7e72d78-dirty', 'icefall-git-date': 'Sat May 28 19:13:53 2022', 'icefall-path': '/alt-arabic/speech/amir/k2/tmp/icefall', 'k2-path': '/home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/k2/__init__.py', 'lhotse-path': '/alt-arabic/speech/amir/k2/tmp/lhotse/lhotse/__init__.py', 'hostname': 'crimv3mgpu008', 'IP address': '10.141.0.6'}, 'epoch': 44, 'avg': 5, 'method': 'whole-lattice-rescoring', 'num_paths': 1000, 'nbest_scale': 0.5, 'exp_dir': PosixPath('conformer_ctc/exp_5000_att0.8'), 'lang_dir': PosixPath('data/lang_bpe_5000'), 'lm_dir': PosixPath('data/lm'), 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 30, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 8, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True}
+2022-06-26 23:11:51,423 INFO [lexicon.py:177] Loading pre-compiled data/lang_bpe_5000/Linv.pt
+2022-06-26 23:11:51,454 INFO [decode.py:559] device: cuda:0
+2022-06-26 23:11:55,905 INFO [decode.py:621] Loading pre-compiled G_4_gram.pt
+2022-06-26 23:11:56,647 INFO [decode.py:657] averaging ['conformer_ctc/exp_5000_att0.8/epoch-40.pt', 'conformer_ctc/exp_5000_att0.8/epoch-41.pt', 'conformer_ctc/exp_5000_att0.8/epoch-42.pt', 'conformer_ctc/exp_5000_att0.8/epoch-43.pt', 'conformer_ctc/exp_5000_att0.8/epoch-44.pt']
+2022-06-26 23:12:00,210 INFO [decode.py:664] Number of model parameters: 90786736
+2022-06-26 23:12:00,210 INFO [asr_datamodule.py:374] About to get test cuts
+2022-06-26 23:12:00,214 INFO [asr_datamodule.py:367] About to get dev cuts
+2022-06-26 23:12:01,821 INFO [decode.py:483] batch 0/?, cuts processed until now is 4
+2022-06-26 23:12:34,348 INFO [decode.py:483] batch 100/?, cuts processed until now is 407
+2022-06-26 23:13:06,294 INFO [decode.py:483] batch 200/?, cuts processed until now is 839
+2022-06-26 23:13:38,794 INFO [decode.py:483] batch 300/?, cuts processed until now is 1272
+2022-06-26 23:14:11,304 INFO [decode.py:483] batch 400/?, cuts processed until now is 1702
+2022-06-26 23:14:43,455 INFO [decode.py:483] batch 500/?, cuts processed until now is 2109
+2022-06-26 23:15:16,309 INFO [decode.py:483] batch 600/?, cuts processed until now is 2544
+2022-06-26 23:15:50,970 INFO [decode.py:483] batch 700/?, cuts processed until now is 2978
+2022-06-26 23:16:25,113 INFO [decode.py:483] batch 800/?, cuts processed until now is 3384
+2022-06-26 23:16:58,532 INFO [decode.py:483] batch 900/?, cuts processed until now is 3811
+2022-06-26 23:17:33,161 INFO [decode.py:483] batch 1000/?, cuts processed until now is 4220
+2022-06-26 23:18:07,464 INFO [decode.py:483] batch 1100/?, cuts processed until now is 4631
+2022-06-26 23:18:42,532 INFO [decode.py:483] batch 1200/?, cuts processed until now is 5033
+2022-06-26 23:19:18,733 INFO [decode.py:483] batch 1300/?, cuts processed until now is 5355
+2022-06-26 23:19:19,776 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.1.txt
+2022-06-26 23:19:19,874 INFO [utils.py:404] [test-lm_scale_0.1] %WER 18.57% [11956 / 64388, 172 ins, 7090 del, 4694 sub ]
+2022-06-26 23:19:20,304 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.1.txt
+2022-06-26 23:19:20,367 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.2.txt
+2022-06-26 23:19:20,463 INFO [utils.py:404] [test-lm_scale_0.2] %WER 18.59% [11972 / 64388, 167 ins, 7142 del, 4663 sub ]
+2022-06-26 23:19:20,662 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.2.txt
+2022-06-26 23:19:20,707 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.3.txt
+2022-06-26 23:19:20,794 INFO [utils.py:404] [test-lm_scale_0.3] %WER 18.68% [12030 / 64388, 161 ins, 7235 del, 4634 sub ]
+2022-06-26 23:19:20,991 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.3.txt
+2022-06-26 23:19:21,035 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.4.txt
+2022-06-26 23:19:21,119 INFO [utils.py:404] [test-lm_scale_0.4] %WER 18.90% [12170 / 64388, 156 ins, 7412 del, 4602 sub ]
+2022-06-26 23:19:21,318 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.4.txt
+2022-06-26 23:19:21,361 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.5.txt
+2022-06-26 23:19:21,629 INFO [utils.py:404] [test-lm_scale_0.5] %WER 19.38% [12477 / 64388, 145 ins, 7769 del, 4563 sub ]
+2022-06-26 23:19:21,833 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.5.txt
+2022-06-26 23:19:21,877 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.6.txt
+2022-06-26 23:19:21,967 INFO [utils.py:404] [test-lm_scale_0.6] %WER 20.31% [13075 / 64388, 134 ins, 8461 del, 4480 sub ]
+2022-06-26 23:19:22,164 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.6.txt
+2022-06-26 23:19:22,210 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.7.txt
+2022-06-26 23:19:22,312 INFO [utils.py:404] [test-lm_scale_0.7] %WER 22.15% [14262 / 64388, 120 ins, 9817 del, 4325 sub ]
+2022-06-26 23:19:22,514 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.7.txt
+2022-06-26 23:19:22,568 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.8.txt
+2022-06-26 23:19:22,653 INFO [utils.py:404] [test-lm_scale_0.8] %WER 24.89% [16029 / 64388, 99 ins, 11848 del, 4082 sub ]
+2022-06-26 23:19:23,035 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.8.txt
+2022-06-26 23:19:23,081 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.9.txt
+2022-06-26 23:19:23,168 INFO [utils.py:404] [test-lm_scale_0.9] %WER 28.21% [18166 / 64388, 87 ins, 14210 del, 3869 sub ]
+2022-06-26 23:19:23,369 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.9.txt
+2022-06-26 23:19:23,422 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.0.txt
+2022-06-26 23:19:23,517 INFO [utils.py:404] [test-lm_scale_1.0] %WER 32.19% [20725 / 64388, 73 ins, 17007 del, 3645 sub ]
+2022-06-26 23:19:23,720 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.0.txt
+2022-06-26 23:19:23,764 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.1.txt
+2022-06-26 23:19:23,857 INFO [utils.py:404] [test-lm_scale_1.1] %WER 36.29% [23365 / 64388, 57 ins, 19898 del, 3410 sub ]
+2022-06-26 23:19:24,060 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.1.txt
+2022-06-26 23:19:24,105 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.2.txt
+2022-06-26 23:19:24,197 INFO [utils.py:404] [test-lm_scale_1.2] %WER 40.47% [26055 / 64388, 48 ins, 22805 del, 3202 sub ]
+2022-06-26 23:19:24,583 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.2.txt
+2022-06-26 23:19:24,627 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.3.txt
+2022-06-26 23:19:24,712 INFO [utils.py:404] [test-lm_scale_1.3] %WER 43.89% [28261 / 64388, 42 ins, 25155 del, 3064 sub ]
+2022-06-26 23:19:24,916 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.3.txt
+2022-06-26 23:19:24,956 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.4.txt
+2022-06-26 23:19:25,047 INFO [utils.py:404] [test-lm_scale_1.4] %WER 46.33% [29833 / 64388, 38 ins, 26847 del, 2948 sub ]
+2022-06-26 23:19:25,252 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.4.txt
+2022-06-26 23:19:25,297 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.5.txt
+2022-06-26 23:19:25,377 INFO [utils.py:404] [test-lm_scale_1.5] %WER 48.03% [30928 / 64388, 35 ins, 27980 del, 2913 sub ]
+2022-06-26 23:19:25,581 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.5.txt
+2022-06-26 23:19:25,626 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.6.txt
+2022-06-26 23:19:25,893 INFO [utils.py:404] [test-lm_scale_1.6] %WER 49.31% [31747 / 64388, 34 ins, 28810 del, 2903 sub ]
+2022-06-26 23:19:26,107 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.6.txt
+2022-06-26 23:19:26,150 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.7.txt
+2022-06-26 23:19:26,232 INFO [utils.py:404] [test-lm_scale_1.7] %WER 50.30% [32385 / 64388, 29 ins, 29439 del, 2917 sub ]
+2022-06-26 23:19:26,437 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.7.txt
+2022-06-26 23:19:26,476 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.8.txt
+2022-06-26 23:19:26,556 INFO [utils.py:404] [test-lm_scale_1.8] %WER 51.09% [32893 / 64388, 30 ins, 29947 del, 2916 sub ]
+2022-06-26 23:19:26,763 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.8.txt
+2022-06-26 23:19:26,804 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.9.txt
+2022-06-26 23:19:26,894 INFO [utils.py:404] [test-lm_scale_1.9] %WER 51.73% [33308 / 64388, 25 ins, 30381 del, 2902 sub ]
+2022-06-26 23:19:27,100 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.9.txt
+2022-06-26 23:19:27,141 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_2.0.txt
+2022-06-26 23:19:27,221 INFO [utils.py:404] [test-lm_scale_2.0] %WER 52.17% [33592 / 64388, 27 ins, 30665 del, 2900 sub ]
+2022-06-26 23:19:27,604 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_2.0.txt
+2022-06-26 23:19:27,609 INFO [decode.py:532]
+For test, WER of different settings are:
+lm_scale_0.1 18.57 best for test
+lm_scale_0.2 18.59
+lm_scale_0.3 18.68
+lm_scale_0.4 18.9
+lm_scale_0.5 19.38
+lm_scale_0.6 20.31
+lm_scale_0.7 22.15
+lm_scale_0.8 24.89
+lm_scale_0.9 28.21
+lm_scale_1.0 32.19
+lm_scale_1.1 36.29
+lm_scale_1.2 40.47
+lm_scale_1.3 43.89
+lm_scale_1.4 46.33
+lm_scale_1.5 48.03
+lm_scale_1.6 49.31
+lm_scale_1.7 50.3
+lm_scale_1.8 51.09
+lm_scale_1.9 51.73
+lm_scale_2.0 52.17
+
+2022-06-26 23:19:28,540 INFO [decode.py:483] batch 0/?, cuts processed until now is 4
+2022-06-26 23:20:02,462 INFO [decode.py:483] batch 100/?, cuts processed until now is 428
+2022-06-26 23:20:38,014 INFO [decode.py:483] batch 200/?, cuts processed until now is 888
+2022-06-26 23:21:11,425 INFO [decode.py:483] batch 300/?, cuts processed until now is 1363
+2022-06-26 23:21:44,755 INFO [decode.py:483] batch 400/?, cuts processed until now is 1815
+2022-06-26 23:22:18,702 INFO [decode.py:483] batch 500/?, cuts processed until now is 2243
+2022-06-26 23:22:52,000 INFO [decode.py:483] batch 600/?, cuts processed until now is 2717
+2022-06-26 23:23:25,539 INFO [decode.py:483] batch 700/?, cuts processed until now is 3192
+2022-06-26 23:23:59,158 INFO [decode.py:483] batch 800/?, cuts processed until now is 3635
+2022-06-26 23:24:33,611 INFO [decode.py:483] batch 900/?, cuts processed until now is 4082
+2022-06-26 23:25:05,267 INFO [decode.py:483] batch 1000/?, cuts processed until now is 4500
+2022-06-26 23:25:38,452 INFO [decode.py:483] batch 1100/?, cuts processed until now is 4869
+2022-06-26 23:25:53,215 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.1.txt
+2022-06-26 23:25:53,522 INFO [utils.py:404] [dev-lm_scale_0.1] %WER 20.01% [12039 / 60169, 166 ins, 7746 del, 4127 sub ]
+2022-06-26 23:25:53,718 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.1.txt
+2022-06-26 23:25:53,767 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.2.txt
+2022-06-26 23:25:53,851 INFO [utils.py:404] [dev-lm_scale_0.2] %WER 20.07% [12077 / 60169, 155 ins, 7797 del, 4125 sub ]
+2022-06-26 23:25:54,041 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.2.txt
+2022-06-26 23:25:54,089 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.3.txt
+2022-06-26 23:25:54,170 INFO [utils.py:404] [dev-lm_scale_0.3] %WER 20.20% [12155 / 60169, 150 ins, 7886 del, 4119 sub ]
+2022-06-26 23:25:54,361 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.3.txt
+2022-06-26 23:25:54,409 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.4.txt
+2022-06-26 23:25:54,672 INFO [utils.py:404] [dev-lm_scale_0.4] %WER 20.44% [12301 / 60169, 142 ins, 8066 del, 4093 sub ]
+2022-06-26 23:25:54,866 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.4.txt
+2022-06-26 23:25:54,914 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.5.txt
+2022-06-26 23:25:54,994 INFO [utils.py:404] [dev-lm_scale_0.5] %WER 20.92% [12589 / 60169, 133 ins, 8415 del, 4041 sub ]
+2022-06-26 23:25:55,184 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.5.txt
+2022-06-26 23:25:55,234 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.6.txt
+2022-06-26 23:25:55,323 INFO [utils.py:404] [dev-lm_scale_0.6] %WER 21.95% [13210 / 60169, 114 ins, 9123 del, 3973 sub ]
+2022-06-26 23:25:55,512 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.6.txt
+2022-06-26 23:25:55,552 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.7.txt
+2022-06-26 23:25:55,632 INFO [utils.py:404] [dev-lm_scale_0.7] %WER 23.88% [14368 / 60169, 105 ins, 10478 del, 3785 sub ]
+2022-06-26 23:25:56,005 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.7.txt
+2022-06-26 23:25:56,053 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.8.txt
+2022-06-26 23:25:56,133 INFO [utils.py:404] [dev-lm_scale_0.8] %WER 26.64% [16029 / 60169, 89 ins, 12382 del, 3558 sub ]
+2022-06-26 23:25:56,324 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.8.txt
+2022-06-26 23:25:56,374 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.9.txt
+2022-06-26 23:25:56,463 INFO [utils.py:404] [dev-lm_scale_0.9] %WER 30.08% [18099 / 60169, 76 ins, 14702 del, 3321 sub ]
+2022-06-26 23:25:56,668 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.9.txt
+2022-06-26 23:25:56,720 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.0.txt
+2022-06-26 23:25:56,799 INFO [utils.py:404] [dev-lm_scale_1.0] %WER 33.80% [20337 / 60169, 67 ins, 17194 del, 3076 sub ]
+2022-06-26 23:25:57,000 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.0.txt
+2022-06-26 23:25:57,041 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.1.txt
+2022-06-26 23:25:57,119 INFO [utils.py:404] [dev-lm_scale_1.1] %WER 38.05% [22893 / 60169, 59 ins, 19979 del, 2855 sub ]
+2022-06-26 23:25:57,495 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.1.txt
+2022-06-26 23:25:57,535 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.2.txt
+2022-06-26 23:25:57,613 INFO [utils.py:404] [dev-lm_scale_1.2] %WER 42.04% [25295 / 60169, 47 ins, 22608 del, 2640 sub ]
+2022-06-26 23:25:57,806 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.2.txt
+2022-06-26 23:25:57,845 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.3.txt
+2022-06-26 23:25:57,931 INFO [utils.py:404] [dev-lm_scale_1.3] %WER 45.64% [27462 / 60169, 43 ins, 24922 del, 2497 sub ]
+2022-06-26 23:25:58,126 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.3.txt
+2022-06-26 23:25:58,167 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.4.txt
+2022-06-26 23:25:58,254 INFO [utils.py:404] [dev-lm_scale_1.4] %WER 48.05% [28913 / 60169, 42 ins, 26468 del, 2403 sub ]
+2022-06-26 23:25:58,462 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.4.txt
+2022-06-26 23:25:58,514 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.5.txt
+2022-06-26 23:25:58,762 INFO [utils.py:404] [dev-lm_scale_1.5] %WER 49.60% [29841 / 60169, 40 ins, 27451 del, 2350 sub ]
+2022-06-26 23:25:58,963 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.5.txt
+2022-06-26 23:25:59,014 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.6.txt
+2022-06-26 23:25:59,100 INFO [utils.py:404] [dev-lm_scale_1.6] %WER 50.69% [30499 / 60169, 40 ins, 28138 del, 2321 sub ]
+2022-06-26 23:25:59,297 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.6.txt
+2022-06-26 23:25:59,344 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.7.txt
+2022-06-26 23:25:59,431 INFO [utils.py:404] [dev-lm_scale_1.7] %WER 51.66% [31086 / 60169, 41 ins, 28748 del, 2297 sub ]
+2022-06-26 23:25:59,626 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.7.txt
+2022-06-26 23:25:59,665 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.8.txt
+2022-06-26 23:25:59,742 INFO [utils.py:404] [dev-lm_scale_1.8] %WER 52.42% [31540 / 60169, 39 ins, 29206 del, 2295 sub ]
+2022-06-26 23:25:59,936 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.8.txt
+2022-06-26 23:25:59,971 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.9.txt
+2022-06-26 23:26:00,234 INFO [utils.py:404] [dev-lm_scale_1.9] %WER 52.97% [31873 / 60169, 36 ins, 29517 del, 2320 sub ]
+2022-06-26 23:26:00,436 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.9.txt
+2022-06-26 23:26:00,486 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_2.0.txt
+2022-06-26 23:26:00,563 INFO [utils.py:404] [dev-lm_scale_2.0] %WER 53.42% [32142 / 60169, 37 ins, 29787 del, 2318 sub ]
+2022-06-26 23:26:00,760 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_2.0.txt
+2022-06-26 23:26:00,765 INFO [decode.py:532]
+For dev, WER of different settings are:
+lm_scale_0.1 20.01 best for dev
+lm_scale_0.2 20.07
+lm_scale_0.3 20.2
+lm_scale_0.4 20.44
+lm_scale_0.5 20.92
+lm_scale_0.6 21.95
+lm_scale_0.7 23.88
+lm_scale_0.8 26.64
+lm_scale_0.9 30.08
+lm_scale_1.0 33.8
+lm_scale_1.1 38.05
+lm_scale_1.2 42.04
+lm_scale_1.3 45.64
+lm_scale_1.4 48.05
+lm_scale_1.5 49.6
+lm_scale_1.6 50.69
+lm_scale_1.7 51.66
+lm_scale_1.8 52.42
+lm_scale_1.9 52.97
+lm_scale_2.0 53.42
+
+2022-06-26 23:26:00,765 INFO [decode.py:695] Done!
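Each [utils.py] line above reports WER together with its insertion/deletion/substitution breakdown. As a hedged, self-contained sketch of where those four numbers come from (icefall's own error-stats code in utils.py produced the lines above and differs in detail), the counts fall out of a standard Levenshtein alignment between reference and hypothesis word sequences:

def wer_counts(ref, hyp):
    # dp[i][j] = (cost, ins, del, sub) for aligning ref[:i] against hyp[:j].
    n, m = len(ref), len(hyp)
    dp = [[None] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = (0, 0, 0, 0)
    for i in range(1, n + 1):  # empty hypothesis: everything is a deletion
        c = dp[i - 1][0]
        dp[i][0] = (c[0] + 1, c[1], c[2] + 1, c[3])
    for j in range(1, m + 1):  # empty reference: everything is an insertion
        c = dp[0][j - 1]
        dp[0][j] = (c[0] + 1, c[1] + 1, c[2], c[3])
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = 0 if ref[i - 1] == hyp[j - 1] else 1
            cands = [
                (dp[i - 1][j - 1][0] + s, dp[i - 1][j - 1], (0, 0, s)),  # match/sub
                (dp[i][j - 1][0] + 1, dp[i][j - 1], (1, 0, 0)),          # insertion
                (dp[i - 1][j][0] + 1, dp[i - 1][j], (0, 1, 0)),          # deletion
            ]
            cost, prev, (di, dd, ds) = min(cands, key=lambda t: t[0])
            dp[i][j] = (cost, prev[1] + di, prev[2] + dd, prev[3] + ds)
    return dp[n][m]

ref = "the quick brown fox".split()
hyp = "the brown box".split()
errs, ins, dels, subs = wer_counts(ref, hyp)
print(f"%WER {100.0 * errs / len(ref):.2f}% [{errs} / {len(ref)}, {ins} ins, {dels} del, {subs} sub ]")
# -> %WER 50.00% [2 / 4, 0 ins, 1 del, 1 sub ]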
decoding-results/log-whole-lattice-rescoring/log-decode-2022-06-26-23-21-46
ADDED
The diff for this file is too large to render. See raw diff.
decoding-results/log-whole-lattice-rescoring/log-decode-2022-06-27-18-46-45
ADDED
@@ -0,0 +1,299 @@
+2022-06-27 18:46:45,715 INFO [decode.py:548] Decoding started
+2022-06-27 18:46:45,716 INFO [decode.py:549] {'subsampling_factor': 4, 'vgg_frontend': False, 'use_feat_batchnorm': True, 'feature_dim': 80, 'nhead': 8, 'attention_dim': 512, 'num_decoder_layers': 6, 'search_beam': 20, 'output_beam': 8, 'min_active_states': 30, 'max_active_states': 5000, 'use_double_scores': True, 'env_info': {'k2-version': '1.16', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '3c606c27045750bbbb7a289d8b2b09825dea521a', 'k2-git-date': 'Mon Jun 27 03:06:58 2022', 'lhotse-version': '1.3.0.dev+git.a07121a.clean', 'torch-version': '1.7.1', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'test', 'icefall-git-sha1': 'e24e6ac-dirty', 'icefall-git-date': 'Mon Jun 27 01:23:06 2022', 'icefall-path': '/alt-arabic/speech/amir/k2/tmp/icefall', 'k2-path': '/alt-arabic/speech/amir/k2/tmp/k2/k2/python/k2/__init__.py', 'lhotse-path': '/alt-arabic/speech/amir/k2/tmp/lhotse/lhotse/__init__.py', 'hostname': 'crimv3mgpu016', 'IP address': '10.141.0.3'}, 'epoch': 45, 'avg': 5, 'method': 'whole-lattice-rescoring', 'num_paths': 1000, 'nbest_scale': 0.5, 'exp_dir': PosixPath('conformer_ctc/exp_5000_att0.8'), 'lang_dir': PosixPath('data/lang_bpe_5000'), 'lm_dir': PosixPath('data/lm'), 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 30, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 20, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True}
+2022-06-27 18:46:46,037 INFO [lexicon.py:177] Loading pre-compiled data/lang_bpe_5000/Linv.pt
+2022-06-27 18:46:46,339 INFO [decode.py:559] device: cuda:0
+2022-06-27 18:47:36,089 INFO [decode.py:621] Loading pre-compiled G_4_gram.pt
+2022-06-27 18:47:39,038 INFO [decode.py:657] averaging ['conformer_ctc/exp_5000_att0.8/epoch-41.pt', 'conformer_ctc/exp_5000_att0.8/epoch-42.pt', 'conformer_ctc/exp_5000_att0.8/epoch-43.pt', 'conformer_ctc/exp_5000_att0.8/epoch-44.pt', 'conformer_ctc/exp_5000_att0.8/epoch-45.pt']
+2022-06-27 18:48:49,442 INFO [decode.py:664] Number of model parameters: 90786736
+2022-06-27 18:48:49,442 INFO [asr_datamodule.py:362] About to get test cuts
+2022-06-27 18:48:49,491 INFO [asr_datamodule.py:357] About to get dev cuts
+2022-06-27 18:48:51,979 INFO [decode.py:483] batch 0/?, cuts processed until now is 4
+2022-06-27 18:49:23,121 INFO [decode.py:783] Caught exception:
+CUDA out of memory. Tried to allocate 1.70 GiB (GPU 0; 31.75 GiB total capacity; 27.32 GiB already allocated; 400.50 MiB free; 30.16 GiB reserved in total by PyTorch)
+Exception raised from malloc at /pytorch/c10/cuda/CUDACachingAllocator.cpp:272 (most recent call first):
+frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x42 (0x2aab0258d8b2 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10.so)
+frame #1: <unknown function> + 0x2021b (0x2aab0232721b in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #2: <unknown function> + 0x21034 (0x2aab02328034 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #3: <unknown function> + 0x2167d (0x2aab0232867d in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
+frame #4: k2::PytorchCudaContext::Allocate(unsigned long, void**) + 0x3a (0x2aab1173401a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #5: k2::NewRegion(std::shared_ptr<k2::Context>, unsigned long) + 0x112 (0x2aab11465b72 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #6: k2::Array1<int>::Init(std::shared_ptr<k2::Context>, int, k2::Dtype) + 0x71 (0x2aab11432f51 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #7: <unknown function> + 0x2472bd (0x2aab115c32bd in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #8: k2::RaggedShapeFromTotSizes(std::shared_ptr<k2::Context>, int, int const*) + 0x213 (0x2aab115c3b83 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #9: k2::IndexAxis0(k2::RaggedShape&, k2::Array1<int> const&, k2::Array1<int>*) + 0x32c (0x2aab115d77ec in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #10: k2::Index(k2::RaggedShape&, int, k2::Array1<int> const&, k2::Array1<int>*) + 0x353 (0x2aab115dc943 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #11: k2::Ragged<k2::Arc> k2::DeviceIntersector::FormatOutputTpl<k2::Hash::PackedAccessor>(k2::Array1<int>*, k2::Array1<int>*) + 0x407 (0x2aab11552327 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #12: k2::IntersectDevice(k2::Ragged<k2::Arc>&, int, k2::Ragged<k2::Arc>&, int, k2::Array1<int> const&, k2::Array1<int>*, k2::Array1<int>*, bool) + 0x3a2 (0x2aab11545682 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
+frame #13: <unknown function> + 0x8eb5a (0x2aab1032cb5a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
+frame #14: <unknown function> + 0x3628c (0x2aab102d428c in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
+<omitting python frames>
+frame #44: __libc_start_main + 0xf5 (0x2aaaab616555 in /lib64/libc.so.6)
+
+
+2022-06-27 18:49:23,122 INFO [decode.py:789] num_arcs before pruning: 940457
+2022-06-27 18:49:23,123 INFO [decode.py:792] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+2022-06-27 18:49:23,136 INFO [decode.py:803] num_arcs after pruning: 8198
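The block above shows the recovery path in decode.py: the GPU intersection of the decoding lattice with the 4-gram G runs out of CUDA memory, the exception is caught, the lattice is pruned (940457 arcs down to 8198) and the intersection is retried, which is why the log explicitly says the OOM is not an error. A minimal sketch of that control flow follows; the function is illustrative rather than icefall's actual code, and the intersect/prune callables stand in for what icefall does with k2 (device intersection and pruning on arc posteriors):

import logging

def intersect_with_oom_retry(lattice, G, intersect, prune,
                             thresholds=(1e-7, 1e-6, 1e-5)):
    # Try the GPU intersection; on CUDA OOM, prune the lattice with
    # progressively stronger thresholds and retry. `lattice` is assumed
    # to be a k2.Fsa-like object exposing .num_arcs.
    for th in thresholds:
        try:
            return intersect(lattice, G)
        except RuntimeError as e:
            if "CUDA out of memory" not in str(e):
                raise
            logging.info(f"Caught exception:\n{e}\n")
            logging.info(f"num_arcs before pruning: {lattice.num_arcs}")
            lattice = prune(lattice, th)  # e.g. pruning on arc posteriors in k2
            logging.info(f"num_arcs after pruning: {lattice.num_arcs}")
    return intersect(lattice, G)  # final attempt; let a remaining OOM propagate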
36 |
+
2022-06-27 18:49:32,702 INFO [decode.py:483] batch 100/?, cuts processed until now is 407
|
37 |
+
2022-06-27 18:50:09,988 INFO [decode.py:483] batch 200/?, cuts processed until now is 839
|
38 |
+
2022-06-27 18:50:47,777 INFO [decode.py:483] batch 300/?, cuts processed until now is 1272
|
39 |
+
2022-06-27 18:51:25,388 INFO [decode.py:483] batch 400/?, cuts processed until now is 1702
|
40 |
+
2022-06-27 18:52:00,700 INFO [decode.py:483] batch 500/?, cuts processed until now is 2109
|
41 |
+
2022-06-27 18:52:39,254 INFO [decode.py:483] batch 600/?, cuts processed until now is 2544
|
42 |
+
2022-06-27 18:53:16,422 INFO [decode.py:483] batch 700/?, cuts processed until now is 2978
|
43 |
+
2022-06-27 18:53:54,612 INFO [decode.py:483] batch 800/?, cuts processed until now is 3384
|
44 |
+
2022-06-27 18:54:31,824 INFO [decode.py:483] batch 900/?, cuts processed until now is 3811
|
45 |
+
2022-06-27 18:55:06,114 INFO [decode.py:783] Caught exception:
|
46 |
+
CUDA out of memory. Tried to allocate 1.67 GiB (GPU 0; 31.75 GiB total capacity; 27.27 GiB already allocated; 1.06 GiB free; 29.49 GiB reserved in total by PyTorch)
|
47 |
+
Exception raised from malloc at /pytorch/c10/cuda/CUDACachingAllocator.cpp:272 (most recent call first):
|
48 |
+
frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x42 (0x2aab0258d8b2 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10.so)
|
49 |
+
frame #1: <unknown function> + 0x2021b (0x2aab0232721b in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
|
50 |
+
frame #2: <unknown function> + 0x21034 (0x2aab02328034 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
|
51 |
+
frame #3: <unknown function> + 0x2167d (0x2aab0232867d in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
|
52 |
+
frame #4: k2::PytorchCudaContext::Allocate(unsigned long, void**) + 0x3a (0x2aab1173401a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
53 |
+
frame #5: k2::NewRegion(std::shared_ptr<k2::Context>, unsigned long) + 0x112 (0x2aab11465b72 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
54 |
+
frame #6: k2::Array1<int>::Init(std::shared_ptr<k2::Context>, int, k2::Dtype) + 0x71 (0x2aab11432f51 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
55 |
+
frame #7: <unknown function> + 0x2472bd (0x2aab115c32bd in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
56 |
+
frame #8: k2::RaggedShapeFromTotSizes(std::shared_ptr<k2::Context>, int, int const*) + 0x213 (0x2aab115c3b83 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
57 |
+
frame #9: k2::IndexAxis0(k2::RaggedShape&, k2::Array1<int> const&, k2::Array1<int>*) + 0x32c (0x2aab115d77ec in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
58 |
+
frame #10: k2::Index(k2::RaggedShape&, int, k2::Array1<int> const&, k2::Array1<int>*) + 0x353 (0x2aab115dc943 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
59 |
+
frame #11: k2::Ragged<k2::Arc> k2::DeviceIntersector::FormatOutputTpl<k2::Hash::PackedAccessor>(k2::Array1<int>*, k2::Array1<int>*) + 0x407 (0x2aab11552327 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
60 |
+
frame #12: k2::IntersectDevice(k2::Ragged<k2::Arc>&, int, k2::Ragged<k2::Arc>&, int, k2::Array1<int> const&, k2::Array1<int>*, k2::Array1<int>*, bool) + 0x3a2 (0x2aab11545682 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
61 |
+
frame #13: <unknown function> + 0x8eb5a (0x2aab1032cb5a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
|
62 |
+
frame #14: <unknown function> + 0x3628c (0x2aab102d428c in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
|
63 |
+
<omitting python frames>
|
64 |
+
frame #44: __libc_start_main + 0xf5 (0x2aaaab616555 in /lib64/libc.so.6)
|
65 |
+
|
66 |
+
|
67 |
+
2022-06-27 18:55:06,115 INFO [decode.py:789] num_arcs before pruning: 1034414
|
68 |
+
2022-06-27 18:55:06,115 INFO [decode.py:792] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
|
69 |
+
2022-06-27 18:55:06,128 INFO [decode.py:803] num_arcs after pruning: 5251
|
70 |
+
2022-06-27 18:55:09,967 INFO [decode.py:483] batch 1000/?, cuts processed until now is 4220
|
71 |
+
2022-06-27 18:55:48,299 INFO [decode.py:483] batch 1100/?, cuts processed until now is 4631
|
72 |
+
2022-06-27 18:56:25,270 INFO [decode.py:483] batch 1200/?, cuts processed until now is 5033
|
73 |
+
2022-06-27 18:56:56,411 INFO [decode.py:783] Caught exception:
|
74 |
+
CUDA out of memory. Tried to allocate 1.81 GiB (GPU 0; 31.75 GiB total capacity; 27.76 GiB already allocated; 1.06 GiB free; 29.49 GiB reserved in total by PyTorch)
|
75 |
+
Exception raised from malloc at /pytorch/c10/cuda/CUDACachingAllocator.cpp:272 (most recent call first):
|
76 |
+
frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x42 (0x2aab0258d8b2 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10.so)
|
77 |
+
frame #1: <unknown function> + 0x2021b (0x2aab0232721b in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
|
78 |
+
frame #2: <unknown function> + 0x21034 (0x2aab02328034 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
|
79 |
+
frame #3: <unknown function> + 0x2167d (0x2aab0232867d in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
|
80 |
+
frame #4: k2::PytorchCudaContext::Allocate(unsigned long, void**) + 0x3a (0x2aab1173401a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
81 |
+
frame #5: k2::NewRegion(std::shared_ptr<k2::Context>, unsigned long) + 0x112 (0x2aab11465b72 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
82 |
+
frame #6: k2::Array1<int>::Init(std::shared_ptr<k2::Context>, int, k2::Dtype) + 0x71 (0x2aab11432f51 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
83 |
+
frame #7: <unknown function> + 0x2472bd (0x2aab115c32bd in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
84 |
+
frame #8: k2::RaggedShapeFromTotSizes(std::shared_ptr<k2::Context>, int, int const*) + 0x213 (0x2aab115c3b83 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
85 |
+
frame #9: k2::IndexAxis0(k2::RaggedShape&, k2::Array1<int> const&, k2::Array1<int>*) + 0x32c (0x2aab115d77ec in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
86 |
+
frame #10: k2::Index(k2::RaggedShape&, int, k2::Array1<int> const&, k2::Array1<int>*) + 0x353 (0x2aab115dc943 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
87 |
+
frame #11: k2::Ragged<k2::Arc> k2::DeviceIntersector::FormatOutputTpl<k2::Hash::PackedAccessor>(k2::Array1<int>*, k2::Array1<int>*) + 0x407 (0x2aab11552327 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
88 |
+
frame #12: k2::IntersectDevice(k2::Ragged<k2::Arc>&, int, k2::Ragged<k2::Arc>&, int, k2::Array1<int> const&, k2::Array1<int>*, k2::Array1<int>*, bool) + 0x3a2 (0x2aab11545682 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
|
89 |
+
frame #13: <unknown function> + 0x8eb5a (0x2aab1032cb5a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
|
90 |
+
frame #14: <unknown function> + 0x3628c (0x2aab102d428c in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
|
91 |
+
<omitting python frames>
|
92 |
+
frame #44: __libc_start_main + 0xf5 (0x2aaaab616555 in /lib64/libc.so.6)
|
93 |
+
|
94 |
+
|
95 |
+
2022-06-27 18:56:56,411 INFO [decode.py:789] num_arcs before pruning: 1081951
|
96 |
+
2022-06-27 18:56:56,411 INFO [decode.py:792] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
|
97 |
+
2022-06-27 18:56:56,424 INFO [decode.py:803] num_arcs after pruning: 6154
|
98 |
+
2022-06-27 18:57:04,430 INFO [decode.py:483] batch 1300/?, cuts processed until now is 5355
|
99 |
+
2022-06-27 18:57:05,509 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.1.txt
|
100 |
+
2022-06-27 18:57:05,613 INFO [utils.py:418] [test-lm_scale_0.1] %WER 15.01% [9667 / 64388, 371 ins, 3551 del, 5745 sub ]
|
101 |
+
2022-06-27 18:57:06,082 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.1.txt
|
102 |
+
2022-06-27 18:57:06,131 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.2.txt
|
103 |
+
2022-06-27 18:57:06,226 INFO [utils.py:418] [test-lm_scale_0.2] %WER 15.29% [9842 / 64388, 323 ins, 3881 del, 5638 sub ]
|
104 |
+
2022-06-27 18:57:06,450 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.2.txt
|
105 |
+
2022-06-27 18:57:06,495 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.3.txt
|
106 |
+
2022-06-27 18:57:06,590 INFO [utils.py:418] [test-lm_scale_0.3] %WER 15.70% [10106 / 64388, 262 ins, 4365 del, 5479 sub ]
|
107 |
+
2022-06-27 18:57:06,814 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.3.txt
|
108 |
+
2022-06-27 18:57:06,858 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.4.txt
|
109 |
+
2022-06-27 18:57:06,951 INFO [utils.py:418] [test-lm_scale_0.4] %WER 16.37% [10541 / 64388, 219 ins, 5051 del, 5271 sub ]
|
110 |
+
2022-06-27 18:57:07,175 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.4.txt
|
111 |
+
2022-06-27 18:57:07,468 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.5.txt
|
112 |
+
2022-06-27 18:57:07,703 INFO [utils.py:418] [test-lm_scale_0.5] %WER 17.47% [11250 / 64388, 176 ins, 6024 del, 5050 sub ]
2022-06-27 18:57:07,927 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.5.txt
2022-06-27 18:57:07,972 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.6.txt
2022-06-27 18:57:08,066 INFO [utils.py:418] [test-lm_scale_0.6] %WER 19.07% [12282 / 64388, 152 ins, 7369 del, 4761 sub ]
2022-06-27 18:57:08,290 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.6.txt
2022-06-27 18:57:08,334 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.7.txt
2022-06-27 18:57:08,426 INFO [utils.py:418] [test-lm_scale_0.7] %WER 21.32% [13725 / 64388, 136 ins, 9100 del, 4489 sub ]
2022-06-27 18:57:08,649 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.7.txt
2022-06-27 18:57:08,692 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.8.txt
2022-06-27 18:57:08,783 INFO [utils.py:418] [test-lm_scale_0.8] %WER 24.17% [15565 / 64388, 105 ins, 11228 del, 4232 sub ]
2022-06-27 18:57:09,153 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.8.txt
2022-06-27 18:57:09,198 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_0.9.txt
2022-06-27 18:57:09,290 INFO [utils.py:418] [test-lm_scale_0.9] %WER 27.59% [17766 / 64388, 91 ins, 13706 del, 3969 sub ]
2022-06-27 18:57:09,516 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_0.9.txt
2022-06-27 18:57:09,560 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.0.txt
2022-06-27 18:57:09,651 INFO [utils.py:418] [test-lm_scale_1.0] %WER 31.58% [20336 / 64388, 80 ins, 16475 del, 3781 sub ]
2022-06-27 18:57:09,879 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.0.txt
2022-06-27 18:57:09,920 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.1.txt
2022-06-27 18:57:10,011 INFO [utils.py:418] [test-lm_scale_1.1] %WER 35.69% [22983 / 64388, 65 ins, 19363 del, 3555 sub ]
2022-06-27 18:57:10,240 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.1.txt
2022-06-27 18:57:10,280 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.2.txt
2022-06-27 18:57:10,510 INFO [utils.py:418] [test-lm_scale_1.2] %WER 39.91% [25700 / 64388, 54 ins, 22286 del, 3360 sub ]
2022-06-27 18:57:10,744 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.2.txt
2022-06-27 18:57:10,790 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.3.txt
2022-06-27 18:57:10,880 INFO [utils.py:418] [test-lm_scale_1.3] %WER 44.32% [28534 / 64388, 47 ins, 25350 del, 3137 sub ]
2022-06-27 18:57:11,109 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.3.txt
2022-06-27 18:57:11,149 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.4.txt
2022-06-27 18:57:11,236 INFO [utils.py:418] [test-lm_scale_1.4] %WER 48.37% [31142 / 64388, 40 ins, 28142 del, 2960 sub ]
2022-06-27 18:57:11,468 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.4.txt
2022-06-27 18:57:11,505 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.5.txt
2022-06-27 18:57:11,592 INFO [utils.py:418] [test-lm_scale_1.5] %WER 52.01% [33491 / 64388, 38 ins, 30627 del, 2826 sub ]
2022-06-27 18:57:11,825 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.5.txt
2022-06-27 18:57:11,863 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.6.txt
2022-06-27 18:57:12,093 INFO [utils.py:418] [test-lm_scale_1.6] %WER 55.07% [35457 / 64388, 35 ins, 32740 del, 2682 sub ]
2022-06-27 18:57:12,325 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.6.txt
2022-06-27 18:57:12,363 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.7.txt
2022-06-27 18:57:12,450 INFO [utils.py:418] [test-lm_scale_1.7] %WER 57.74% [37177 / 64388, 33 ins, 34549 del, 2595 sub ]
2022-06-27 18:57:12,683 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.7.txt
2022-06-27 18:57:12,720 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.8.txt
2022-06-27 18:57:12,807 INFO [utils.py:418] [test-lm_scale_1.8] %WER 59.81% [38508 / 64388, 28 ins, 35972 del, 2508 sub ]
2022-06-27 18:57:13,040 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.8.txt
2022-06-27 18:57:13,077 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_1.9.txt
2022-06-27 18:57:13,162 INFO [utils.py:418] [test-lm_scale_1.9] %WER 61.70% [39730 / 64388, 22 ins, 37267 del, 2441 sub ]
2022-06-27 18:57:13,539 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_1.9.txt
2022-06-27 18:57:13,577 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-test-lm_scale_2.0.txt
2022-06-27 18:57:13,663 INFO [utils.py:418] [test-lm_scale_2.0] %WER 63.24% [40719 / 64388, 20 ins, 38322 del, 2377 sub ]
2022-06-27 18:57:13,897 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-test-lm_scale_2.0.txt
2022-06-27 18:57:13,904 INFO [decode.py:532]
For test, WER of different settings are:
lm_scale_0.1	15.01	best for test
lm_scale_0.2	15.29
lm_scale_0.3	15.7
lm_scale_0.4	16.37
lm_scale_0.5	17.47
lm_scale_0.6	19.07
lm_scale_0.7	21.32
lm_scale_0.8	24.17
lm_scale_0.9	27.59
lm_scale_1.0	31.58
lm_scale_1.1	35.69
lm_scale_1.2	39.91
lm_scale_1.3	44.32
lm_scale_1.4	48.37
lm_scale_1.5	52.01
lm_scale_1.6	55.07
lm_scale_1.7	57.74
lm_scale_1.8	59.81
lm_scale_1.9	61.7
lm_scale_2.0 63.24
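Every setting in this sweep shares the same lattice; only lm_scale, the weight on the LM score when it is combined with the acoustic/attention score before picking the best path, changes. The monotonic WER increase past lm_scale 0.1 is driven almost entirely by deletions (6024 del at scale 0.5 versus 38322 del at 2.0 on test). A minimal sketch of the grid search with per-hypothesis scores in plain Python; icefall's actual rescoring operates on k2 lattices, so the tuple shape here is purely illustrative:

    # Hedged sketch: how an lm_scale sweep picks a best path per utterance.
    # Hypotheses are (text, am_score, lm_score) tuples, illustration only.
    def best_hyp(hyps, lm_scale):
        return max(hyps, key=lambda h: h[1] + lm_scale * h[2])[0]

    hyps = [("hyp a", -10.0, -4.0), ("hyp b", -11.0, -2.0)]
    for lm_scale in [round(0.1 * i, 1) for i in range(1, 21)]:  # 0.1 .. 2.0
        print(lm_scale, best_hyp(hyps, lm_scale))  # winner flips as the scale grows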
2022-06-27 18:57:15,579 INFO [decode.py:483] batch 0/?, cuts processed until now is 4
2022-06-27 18:57:52,780 INFO [decode.py:483] batch 100/?, cuts processed until now is 428
2022-06-27 18:58:29,361 INFO [decode.py:483] batch 200/?, cuts processed until now is 888
2022-06-27 18:59:07,002 INFO [decode.py:483] batch 300/?, cuts processed until now is 1363
2022-06-27 18:59:42,509 INFO [decode.py:483] batch 400/?, cuts processed until now is 1815
2022-06-27 19:00:20,896 INFO [decode.py:483] batch 500/?, cuts processed until now is 2243
2022-06-27 19:00:55,648 INFO [decode.py:483] batch 600/?, cuts processed until now is 2717
2022-06-27 19:01:30,559 INFO [decode.py:483] batch 700/?, cuts processed until now is 3192
2022-06-27 19:02:03,949 INFO [decode.py:783] Caught exception:
CUDA out of memory. Tried to allocate 8.00 GiB (GPU 0; 31.75 GiB total capacity; 16.89 GiB already allocated; 7.38 GiB free; 23.17 GiB reserved in total by PyTorch)
Exception raised from malloc at /pytorch/c10/cuda/CUDACachingAllocator.cpp:272 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x42 (0x2aab0258d8b2 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10.so)
frame #1: <unknown function> + 0x2021b (0x2aab0232721b in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
frame #2: <unknown function> + 0x21034 (0x2aab02328034 in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
frame #3: <unknown function> + 0x2167d (0x2aab0232867d in /home/local/QCRI/ahussein/anaconda3/envs/k2/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
frame #4: k2::PytorchCudaContext::Allocate(unsigned long, void**) + 0x3a (0x2aab1173401a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
frame #5: k2::NewRegion(std::shared_ptr<k2::Context>, unsigned long) + 0x112 (0x2aab11465b72 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
frame #6: k2::Hash::Hash(std::shared_ptr<k2::Context>, int, int, int) + 0x2f7 (0x2aab11538957 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
frame #7: k2::Hash::Resize(int, int, int, bool) + 0x1b4 (0x2aab1152e464 in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
frame #8: k2::DeviceIntersector::ForwardSortedA() + 0x53e (0x2aab1156355e in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
frame #9: k2::IntersectDevice(k2::Ragged<k2::Arc>&, int, k2::Ragged<k2::Arc>&, int, k2::Array1<int> const&, k2::Array1<int>*, k2::Array1<int>*, bool) + 0x4cd (0x2aab115457ad in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/libk2context.so)
frame #10: <unknown function> + 0x8eb5a (0x2aab1032cb5a in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
frame #11: <unknown function> + 0x3628c (0x2aab102d428c in /alt-arabic/speech/amir/k2/tmp/k2/build/lib/_k2.cpython-38-x86_64-linux-gnu.so)
<omitting python frames>
frame #41: __libc_start_main + 0xf5 (0x2aaaab616555 in /lib64/libc.so.6)

2022-06-27 19:02:03,950 INFO [decode.py:789] num_arcs before pruning: 1001553
2022-06-27 19:02:03,950 INFO [decode.py:792] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
2022-06-27 19:02:03,962 INFO [decode.py:803] num_arcs after pruning: 5815
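The exception above is caught rather than fatal: decode.py logs the lattice size, prunes it, and retries the rescoring on the smaller lattice (1001553 arcs down to 5815 here), after which decoding resumes at batch 800. A hedged sketch of that catch-and-prune pattern; the function names are illustrative stand-ins, not the exact icefall/k2 API:

    import logging

    import torch

    def rescore_with_retry(rescore, lattice, prune_lattice):
        # Hedged sketch of the recovery behind the log lines above:
        # try once, and on CUDA OOM prune the lattice and retry.
        try:
            return rescore(lattice)
        except RuntimeError as e:
            if "CUDA out of memory" not in str(e):
                raise
            logging.info(f"Caught exception:\n{e}")
            logging.info(f"num_arcs before pruning: {lattice.num_arcs}")
            torch.cuda.empty_cache()
            lattice = prune_lattice(lattice)  # e.g. prune on arc posteriors
            logging.info(f"num_arcs after pruning: {lattice.num_arcs}")
            return rescore(lattice)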
2022-06-27 19:02:07,376 INFO [decode.py:483] batch 800/?, cuts processed until now is 3635
2022-06-27 19:02:44,139 INFO [decode.py:483] batch 900/?, cuts processed until now is 4082
2022-06-27 19:03:28,596 INFO [decode.py:483] batch 1000/?, cuts processed until now is 4500
2022-06-27 19:04:15,771 INFO [decode.py:483] batch 1100/?, cuts processed until now is 4869
2022-06-27 19:04:33,221 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.1.txt
2022-06-27 19:04:33,318 INFO [utils.py:418] [dev-lm_scale_0.1] %WER 15.62% [9398 / 60169, 338 ins, 3783 del, 5277 sub ]
2022-06-27 19:04:33,545 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.1.txt
2022-06-27 19:04:33,593 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.2.txt
2022-06-27 19:04:33,688 INFO [utils.py:418] [dev-lm_scale_0.2] %WER 15.85% [9536 / 60169, 295 ins, 4101 del, 5140 sub ]
2022-06-27 19:04:33,920 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.2.txt
2022-06-27 19:04:33,966 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.3.txt
2022-06-27 19:04:34,061 INFO [utils.py:418] [dev-lm_scale_0.3] %WER 16.38% [9856 / 60169, 249 ins, 4624 del, 4983 sub ]
2022-06-27 19:04:34,291 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.3.txt
2022-06-27 19:04:34,335 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.4.txt
2022-06-27 19:04:34,426 INFO [utils.py:418] [dev-lm_scale_0.4] %WER 17.22% [10364 / 60169, 219 ins, 5377 del, 4768 sub ]
2022-06-27 19:04:34,650 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.4.txt
2022-06-27 19:04:34,695 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.5.txt
2022-06-27 19:04:34,935 INFO [utils.py:418] [dev-lm_scale_0.5] %WER 18.58% [11178 / 60169, 176 ins, 6457 del, 4545 sub ]
2022-06-27 19:04:35,152 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.5.txt
2022-06-27 19:04:35,193 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.6.txt
2022-06-27 19:04:35,283 INFO [utils.py:418] [dev-lm_scale_0.6] %WER 20.50% [12337 / 60169, 142 ins, 7911 del, 4284 sub ]
2022-06-27 19:04:35,500 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.6.txt
2022-06-27 19:04:35,542 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.7.txt
2022-06-27 19:04:35,632 INFO [utils.py:418] [dev-lm_scale_0.7] %WER 22.99% [13830 / 60169, 113 ins, 9743 del, 3974 sub ]
2022-06-27 19:04:35,863 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.7.txt
2022-06-27 19:04:35,902 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.8.txt
2022-06-27 19:04:35,993 INFO [utils.py:418] [dev-lm_scale_0.8] %WER 25.88% [15571 / 60169, 91 ins, 11766 del, 3714 sub ]
2022-06-27 19:04:36,344 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.8.txt
2022-06-27 19:04:36,387 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_0.9.txt
2022-06-27 19:04:36,477 INFO [utils.py:418] [dev-lm_scale_0.9] %WER 29.42% [17700 / 60169, 78 ins, 14199 del, 3423 sub ]
2022-06-27 19:04:36,711 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_0.9.txt
2022-06-27 19:04:36,752 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.0.txt
2022-06-27 19:04:36,842 INFO [utils.py:418] [dev-lm_scale_1.0] %WER 33.12% [19926 / 60169, 69 ins, 16663 del, 3194 sub ]
2022-06-27 19:04:37,077 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.0.txt
2022-06-27 19:04:37,117 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.1.txt
2022-06-27 19:04:37,207 INFO [utils.py:418] [dev-lm_scale_1.1] %WER 37.31% [22452 / 60169, 62 ins, 19379 del, 3011 sub ]
2022-06-27 19:04:37,443 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.1.txt
2022-06-27 19:04:37,482 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.2.txt
2022-06-27 19:04:37,692 INFO [utils.py:418] [dev-lm_scale_1.2] %WER 41.59% [25025 / 60169, 51 ins, 22214 del, 2760 sub ]
2022-06-27 19:04:37,932 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.2.txt
2022-06-27 19:04:37,971 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.3.txt
2022-06-27 19:04:38,057 INFO [utils.py:418] [dev-lm_scale_1.3] %WER 45.83% [27574 / 60169, 45 ins, 24966 del, 2563 sub ]
2022-06-27 19:04:38,282 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.3.txt
2022-06-27 19:04:38,351 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.4.txt
2022-06-27 19:04:38,443 INFO [utils.py:418] [dev-lm_scale_1.4] %WER 49.77% [29947 / 60169, 41 ins, 27519 del, 2387 sub ]
2022-06-27 19:04:38,682 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.4.txt
2022-06-27 19:04:38,762 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.5.txt
2022-06-27 19:04:38,844 INFO [utils.py:418] [dev-lm_scale_1.5] %WER 53.28% [32057 / 60169, 34 ins, 29848 del, 2175 sub ]
2022-06-27 19:04:39,063 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.5.txt
2022-06-27 19:04:39,099 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.6.txt
2022-06-27 19:04:39,299 INFO [utils.py:418] [dev-lm_scale_1.6] %WER 56.26% [33853 / 60169, 33 ins, 31754 del, 2066 sub ]
2022-06-27 19:04:39,517 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.6.txt
2022-06-27 19:04:39,551 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.7.txt
2022-06-27 19:04:39,631 INFO [utils.py:418] [dev-lm_scale_1.7] %WER 58.66% [35296 / 60169, 30 ins, 33295 del, 1971 sub ]
2022-06-27 19:04:39,851 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.7.txt
2022-06-27 19:04:39,886 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.8.txt
2022-06-27 19:04:39,966 INFO [utils.py:418] [dev-lm_scale_1.8] %WER 60.66% [36496 / 60169, 31 ins, 34535 del, 1930 sub ]
2022-06-27 19:04:40,187 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.8.txt
2022-06-27 19:04:40,222 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_1.9.txt
2022-06-27 19:04:40,305 INFO [utils.py:418] [dev-lm_scale_1.9] %WER 62.48% [37591 / 60169, 28 ins, 35674 del, 1889 sub ]
2022-06-27 19:04:40,664 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_1.9.txt
2022-06-27 19:04:40,700 INFO [decode.py:504] The transcripts are stored in conformer_ctc/exp_5000_att0.8/recogs-dev-lm_scale_2.0.txt
2022-06-27 19:04:40,786 INFO [utils.py:418] [dev-lm_scale_2.0] %WER 63.74% [38350 / 60169, 27 ins, 36440 del, 1883 sub ]
2022-06-27 19:04:41,020 INFO [decode.py:516] Wrote detailed error stats to conformer_ctc/exp_5000_att0.8/errs-dev-lm_scale_2.0.txt
2022-06-27 19:04:41,025 INFO [decode.py:532]
For dev, WER of different settings are:
lm_scale_0.1	15.62	best for dev
lm_scale_0.2	15.85
lm_scale_0.3	16.38
lm_scale_0.4	17.22
lm_scale_0.5	18.58
lm_scale_0.6	20.5
lm_scale_0.7	22.99
lm_scale_0.8	25.88
lm_scale_0.9	29.42
lm_scale_1.0	33.12
lm_scale_1.1	37.31
lm_scale_1.2	41.59
lm_scale_1.3	45.83
lm_scale_1.4	49.77
lm_scale_1.5	53.28
lm_scale_1.6	56.26
lm_scale_1.7	58.66
lm_scale_1.8	60.66
lm_scale_1.9	62.48
lm_scale_2.0	63.74
2022-06-27 19:04:41,025 INFO [decode.py:695] Done!
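Both splits end up preferring the smallest LM weight: 15.01% WER on test and 15.62% on dev at lm_scale 0.1, with deletions dominating as the weight grows. To tabulate such a sweep from logs like this one, a small hedged helper; the regex targets the [utils.py:418] lines above, and the file path in the usage comment is illustrative:

    import re

    # Hedged sketch: extract WER vs lm_scale per split from a decode log.
    PATTERN = re.compile(r"\[(dev|test)-lm_scale_([\d.]+)\] %WER ([\d.]+)%")

    def wer_table(log_text: str):
        table = {}
        for split, scale, wer in PATTERN.findall(log_text):
            table.setdefault(split, {})[float(scale)] = float(wer)
        return table

    # Usage (path illustrative):
    # with open("log-decode-2022-06-27-18-54-02") as f:
    #     table = wer_table(f.read())
    # min(table["dev"].items(), key=lambda kv: kv[1])  # -> (0.1, 15.62)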
|