OthmaneJ committed
Commit 14e61e6
1 parent: b50e58d

Update README.md

Files changed (1): README.md (+2, -2)
README.md CHANGED
@@ -10,7 +10,7 @@ license: apache-2.0
 ---
 
 # Distil-wav2vec2
-This model is a distilled version of the wav2vec2 model (https://arxiv.org/pdf/2006.11477.pdf). This model is 45% times smaller and 3 times faster than the original wav2vec2 base model.
+This model is a distilled version of the wav2vec2 model (https://arxiv.org/pdf/2006.11477.pdf). This model is 45% times smaller and twice as fast as the original wav2vec2 base model.
 
 # Evaluation results
 This model achieves the following results (speed is mesured for a batch size of 64):
@@ -18,7 +18,7 @@ This model achieves the following results (speed is mesured for a batch size of
 |Model| Size| WER Librispeech-test-clean |WER Librispeech-test-other|Speed on cpu|speed on gpu|
 |----------| ------------- |-------------|-----------| ------|----|
 |Distil-wav2vec2| 197.9 Mb | 0.0983 | 0.2266|0.4006s| 0.0082s|
-|wav2vec2-base| 360 Mb | 0.0389 | 0.1047| 0.4919s|0.0046s |
+|wav2vec2-base| 360 Mb | 0.0389 | 0.1047|\t0.4919s|0.0046s |
 
 
 # Usage
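
The body of the `# Usage` section is not shown in this diff. As a minimal sketch of how such a distilled wav2vec2 checkpoint is typically loaded for inference (the repository id `OthmaneJ/distil-wav2vec2`, the placeholder file `sample.wav`, and the presence of a bundled `Wav2Vec2Processor` are assumptions, not confirmed by this commit):

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Assumed repository id; swap in the actual model id if it differs.
MODEL_ID = "OthmaneJ/distil-wav2vec2"

# If the repo does not ship a processor, load one from the base wav2vec2 checkpoint instead.
processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# "sample.wav" is a placeholder; wav2vec2 expects 16 kHz mono audio.
waveform, sample_rate = torchaudio.load("sample.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze(0), sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most likely token per frame, then collapse with the tokenizer.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```

Greedy decoding keeps the example short; a language-model-backed decoder would typically lower the WER figures reported in the table above at the cost of extra latency.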