This model is a reworked CTC attention model trained with the code from <https://github.com/k2-fsa/icefall/tree/master/egs/librispeech/ASR/conformer_ctc2>. The model has 12 reworked Conformer encoder layers and 6 reworked Transformer decoder layers, for a total of 103,071,035 parameters. It was trained on the full LibriSpeech dataset for only 30 epochs, since the reworked model converges much faster.
For detailed information, please refer to <https://github.com/k2-fsa/icefall/blob/master/egs/librispeech/ASR/RESULTS.md> or <https://github.com/k2-fsa/icefall/pull/462>.
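
If you want to experiment with the checkpoint locally, the minimal sketch below downloads this repository's files from the Hugging Face Hub so they can be passed to the icefall conformer_ctc2 recipe linked above. The `repo_id` value is a placeholder for this repository's id, and the decoding step itself is only referenced in comments; consult the recipe's `decode.py` for the exact command-line options.

```python
# Minimal sketch (not part of the icefall recipe): fetch this model's files
# from the Hugging Face Hub. Requires `pip install huggingface_hub`.
from huggingface_hub import snapshot_download

# NOTE: "<owner>/<this-repo>" is a placeholder -- substitute this repository's actual id.
local_dir = snapshot_download(repo_id="<owner>/<this-repo>")

# `local_dir` now contains the downloaded checkpoint and related files.
# Decoding is done with the conformer_ctc2 recipe's decode.py; see the
# recipe and RESULTS.md links above for the supported options.
print(local_dir)
```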