The Mandarin-wav2vec2.0 model is pre-trained on 1000 hours of the AISHELL-2 dataset. Pre-training details can be found at https://github.com/kehanlu/mandarin-wav2vec2. This model is fine-tuned on 178 hours of the AISHELL-1 dataset and serves as the baseline model in the paper "A context-aware knowledge transferring strategy for CTC-based ASR" ([preprint](https://arxiv.org/abs/2210.06244)).

## Results on AISHELL-1

| CER | dev | test |
| --- | --- | ---- |
| vanilla w2v2-CTC | 4.85 | 5.13 |
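The CER figures above are character error rates: the character-level edit distance between the hypothesis and reference transcripts, divided by the reference length. A minimal sketch of the metric (illustrative only, not the evaluation script used to produce the numbers above):

```python
def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: Levenshtein distance over characters,
    normalized by the reference length."""
    m, n = len(reference), len(hypothesis)
    # prev[j] holds the edit distance between reference[:i-1] and hypothesis[:j]
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if reference[i - 1] == hypothesis[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution
        prev = curr
    return prev[n] / m if m else 0.0
```

For Mandarin ASR the metric is computed over individual characters rather than words, which is why CER (not WER) is reported.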