|
This is a reworked hybrid CTC/attention model trained with the code from github.com/k2-fsa/icefall/tree/master/egs/librispeech/ASR/conformer_ctc2. The model has 12 reworked Conformer encoder layers and 6 reworked Transformer decoder layers, for a total of 103,071,035 parameters. It was trained on the full LibriSpeech dataset for only 30 epochs, since the reworked model converges much faster.
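As a rough illustration of what the CTC branch of such a model computes (not code from the icefall recipe), the sketch below implements the standard CTC forward algorithm in pure Python: it sums the probabilities of all frame-level alignments (with blanks) that collapse to the target label sequence. The function name and the toy inputs are hypothetical.

```python
def ctc_forward(probs, labels, blank=0):
    """Total CTC probability of `labels` given per-frame symbol probabilities.

    probs:  list of T lists, probs[t][k] = P(symbol k at frame t)
    labels: target label sequence, without blanks
    """
    # Extended label sequence with blanks interleaved: b, l1, b, l2, b, ...
    ext = [blank]
    for l in labels:
        ext += [l, blank]
    S, T = len(ext), len(probs)

    # alpha[t][s] = probability of all alignment prefixes ending at ext[s] at frame t
    alpha = [[0.0] * S for _ in range(T)]
    alpha[0][0] = probs[0][blank]
    if S > 1:
        alpha[0][1] = probs[0][ext[1]]

    for t in range(1, T):
        for s in range(S):
            a = alpha[t - 1][s]                      # stay on the same state
            if s > 0:
                a += alpha[t - 1][s - 1]             # advance by one state
            if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                a += alpha[t - 1][s - 2]             # skip a blank between distinct labels
            alpha[t][s] = a * probs[t][ext[s]]

    # Valid alignments end on the final label or the final blank.
    total = alpha[T - 1][S - 1]
    if S > 1:
        total += alpha[T - 1][S - 2]
    return total
```

For example, with two frames, a vocabulary of {blank, 1} at probability 0.5 each, and target `[1]`, the three valid alignments (`1 1`, `blank 1`, `1 blank`) each have probability 0.25, so the total is 0.75. In training, the loss from this branch is interpolated with the attention decoder's cross-entropy loss.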
|
|
|
For detailed information, please refer to <https://github.com/k2-fsa/icefall/blob/master/egs/librispeech/ASR/RESULTS.md> or <https://github.com/k2-fsa/icefall/pull/462>.
|
|
|
For `data/lang_bpe_500`, `data/lm`, etc., please refer to <https://huggingface.co/csukuangfj/icefall-asr-librispeech-conformer-ctc-jit-bpe-500-2021-11-09/>.