Commit 71b0bb5 (parent: fce9586): Create README.md

README.md (added):
This model is a reworked CTC attention model trained with the code from github.com/k2-fsa/icefall/tree/master/egs/librispeech/ASR/conformer_ctc2. The model has 12 reworked Conformer encoder layers and 6 reworked Transformer decoder layers, for a total of 103,071,035 parameters. It was trained on the full LibriSpeech dataset for only 30 epochs, since the reworked model converges much faster.
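As an aside, a parameter count like the 103,071,035 figure above is typically obtained by summing tensor sizes over the model's parameters. A minimal PyTorch sketch (the `count_parameters` helper is illustrative, not part of the icefall recipe, and the `nn.Linear` stand-in is not the actual Conformer):

```python
import torch.nn as nn

def count_parameters(model: nn.Module) -> int:
    """Return the total number of trainable parameters in a model."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# Small stand-in model for illustration: a Linear(10, 5) layer has
# 10 * 5 weight entries plus 5 bias entries, i.e. 55 parameters.
model = nn.Linear(10, 5)
print(count_parameters(model))
```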
For detailed information, please refer to github.com/k2-fsa/icefall/blob/master/egs/librispeech/ASR/RESULTS.md or github.com/k2-fsa/icefall/pull/462.