OthmaneJ committed
Commit 4f9e34a
1 Parent(s): deb8145

Update README.md

Files changed (1)
  1. README.md +7 -5
README.md CHANGED
@@ -10,14 +10,16 @@ license: apache-2.0
---

# Distil-wav2vec2
- This model is a distilled version of the wav2vec2 model (https://arxiv.org/pdf/2006.11477.pdf). This model is 4 times smaller and 3 times faster than the original wav2vec2 large model.
+ This model is a distilled version of the wav2vec2 model (https://arxiv.org/pdf/2006.11477.pdf). It is 45% smaller and 3 times faster than the original wav2vec2 base model.

# Evaluation results
- When used with a light tri-gram language model head, this model achieves the following results :
- | Dataset | WER |
- | ------------- |-------------|
- | Librispeech-clean| 0.127|
+ This model achieves the following results:
+
+ | Model | Size | WER Librispeech-test-clean | WER Librispeech-test-other |
+ |-----------------|----------|----------------------------|----------------------------|
+ | Distil-wav2vec2 | 197.9 MB | 0.0983 | 0.2266 |
+ | wav2vec2-base | 360 MB | 0.0389 | 0.1047 |

# Usage
notebook (google colab) at https://github.com/OthmaneJ/distil-wav2vec2
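
Below is a minimal usage sketch with the Hugging Face `transformers` pipeline API. It assumes the checkpoint is published on the Hub under the repo owner and model name above (`OthmaneJ/distil-wav2vec2`) and uses a placeholder audio file; the Colab notebook linked in the Usage section remains the reference walkthrough.

```python
# Minimal sketch: load the distilled checkpoint and transcribe one file.
# The model id "OthmaneJ/distil-wav2vec2" and "sample.wav" are assumptions.
from transformers import pipeline

# Build an automatic-speech-recognition pipeline around the checkpoint.
asr = pipeline("automatic-speech-recognition", model="OthmaneJ/distil-wav2vec2")

# Transcribe a local 16 kHz mono WAV file (placeholder path).
transcription = asr("sample.wav")
print(transcription["text"])
```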