This model is a reworked CTC attention model trained with the code from github.com/k2-fsa/icefall/tree/master/egs/librispeech/ASR/conformer_ctc2. The model has 12 reworked Conformer encoder layers and 6 reworked Transformer decoder layers, for a total of 103,071,035 parameters. It was trained on the full LibriSpeech dataset for only 30 epochs, because the reworked model converges much faster. For detailed information, please refer to the icefall recipe linked above; for data/lang_bpe_500, data/lm, etc., please refer to that recipe's data preparation scripts.
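
The parameter count can be checked with a few lines of PyTorch. The sketch below uses a plain `nn.Transformer` as a hypothetical stand-in (the actual reworked Conformer is defined in the recipe's `conformer.py`, and its constructor differs); substituting the real model object built by the recipe's training script should reproduce the 103,071,035 figure.

```python
import torch.nn as nn

# Hypothetical stand-in for the recipe's encoder/decoder stack; the real
# reworked Conformer lives in conformer_ctc2/conformer.py in icefall and
# has a different (larger) parameterization.
model = nn.Transformer(
    d_model=512,
    nhead=8,
    num_encoder_layers=12,  # matches the 12 reworked Conformer encoder layers
    num_decoder_layers=6,   # matches the 6 reworked Transformer decoder layers
)

# Count trainable parameters. Run against the actual recipe model,
# this should report 103,071,035.
num_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"{num_params:,}")
```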