Commit d9201e1
Parent(s): 787c970
Update README.md
README.md (changed):
This model is a reworked CTC attention model trained with the code from github.com/k2-fsa/icefall/tree/master/egs/librispeech/ASR/conformer_ctc2. It has 12 reworked Conformer encoder layers and 6 reworked Transformer decoder layers, for a total of 103,071,035 parameters. On the full LibriSpeech dataset it was trained for only 30 epochs, since the reworked model converges much faster.

For detailed information, please refer to <https://github.com/k2-fsa/icefall/blob/master/egs/librispeech/ASR/RESULTS.md> or <https://github.com/k2-fsa/icefall/pull/462>.

For data/lang_bpe_500, data/lm, etc., please refer to <https://huggingface.co/csukuangfj/icefall-asr-librispeech-conformer-ctc-jit-bpe-500-2021-11-09/>.
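Since the lang/LM resources above live in a separate Hugging Face repo, one way to fetch them is with the `huggingface_hub` package. This is a minimal sketch, assuming that package is installed and that the repo layout matches the link above; the `resource_url` helper is only for illustration.

```python
# Hypothetical helper + download sketch; assumes `huggingface_hub` is installed.
REPO_ID = "csukuangfj/icefall-asr-librispeech-conformer-ctc-jit-bpe-500-2021-11-09"

def resource_url(repo_id: str) -> str:
    """Build the huggingface.co URL for a repo id (matches the link above)."""
    return f"https://huggingface.co/{repo_id}/"

if __name__ == "__main__":
    # Downloads a full snapshot of the repo (requires network access).
    from huggingface_hub import snapshot_download
    local_dir = snapshot_download(repo_id=REPO_ID)
    print("data/lang_bpe_500 and data/lm available under:", local_dir)
```

After the snapshot is downloaded, the `data/lang_bpe_500` and `data/lm` directories can be pointed at directly by the icefall decoding scripts.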