shuaijiang committed on
Commit
b538b13
1 Parent(s): 30fe15d

Update README.md

Files changed (1)
  1. README.md +2 -1
README.md CHANGED
@@ -11,7 +11,7 @@ Fine tune [distilwhisper-large-v2](https://huggingface.co/distil-whisper/distil-
 
 Similar to distilwhisper-large-v2, Belle-distilwhisper-large-v2-zh is **5.8 times faster** and has **51% fewer parameters** compared to whisper-large-v2.
 
-Despite having 51% fewer parameters, Belle-distilwhisper-large-v2-zh achieves a relative improvement of **-3% to 35%** over whisper-large-v2
+Despite having 51% fewer parameters, Belle-distilwhisper-large-v2-zh achieves a relative improvement of **-3% to 35%** over whisper-large-v2.
 
 It's important to note that distilwhisper-large-v2 cannot transcribe Chinese (it only outputs English) in the Chinese ASR benchmarks AISHELL1, AISHELL2, WENETSPEECH, and HKUST.
 
@@ -51,6 +51,7 @@ If you want to fine-tune the model on your datasets, please refer to the
 | distilwhisper-large-v2 | 756 | Chinese | - | - | - | - | - |
 | Belle-distilwhisper-large-v2-zh | 756 | Chinese | 5.958 | 6.477 | 12.786 | 17.039 | 20.771 |
 
+
 ## Citation
 
 Please cite our paper and github when using our code, data or model.
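
The "relative improvement of -3% to 35%" in the changed line is presumably a relative reduction in character error rate (CER), where lower is better. A minimal sketch of that arithmetic, using the Belle model's AISHELL1 CER from the table and a purely illustrative (not actual) whisper-large-v2 baseline:

```python
def relative_improvement(baseline_cer: float, model_cer: float) -> float:
    """Relative CER reduction vs. the baseline, as a percentage.

    Positive means the model is better (lower CER); negative means worse,
    which is consistent with the "-3% to 35%" range quoted in the diff.
    """
    return (baseline_cer - model_cer) / baseline_cer * 100

# Belle-distilwhisper-large-v2-zh AISHELL1 CER from the table: 5.958.
# The 8.0 baseline below is hypothetical, not a reported whisper-large-v2 result.
print(round(relative_improvement(8.0, 5.958), 1))  # prints 25.5
```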