OthmaneJ committed
Commit b50e58d
1 Parent(s): 4f9e34a

Update README.md

Files changed (1): README.md (+6 −6)
README.md CHANGED
@@ -13,13 +13,13 @@ license: apache-2.0
  This model is a distilled version of the wav2vec2 model (https://arxiv.org/pdf/2006.11477.pdf). It is 45% smaller and 3 times faster than the original wav2vec2 base model.
 
  # Evaluation results
- This model achieves the following results:
+ This model achieves the following results (speed is measured for a batch size of 64):
 
- |Model|Size|WER LibriSpeech test-clean|WER LibriSpeech test-other|
- |---|---|---|---|
- |Distil-wav2vec2|197.9 MB|0.0983|0.2266|
- |wav2vec2-base|360 MB|0.0389|0.1047|
+ |Model|Size|WER LibriSpeech test-clean|WER LibriSpeech test-other|Speed on CPU|Speed on GPU|
+ |---|---|---|---|---|---|
+ |Distil-wav2vec2|197.9 MB|0.0983|0.2266|0.4006 s|0.0082 s|
+ |wav2vec2-base|360 MB|0.0389|0.1047|0.4919 s|0.0046 s|
 
 
  # Usage
- Notebook (Google Colab): https://github.com/OthmaneJ/distil-wav2vec2
+ Notebook (runs seamlessly on Google Colab): https://github.com/OthmaneJ/distil-wav2vec2
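The evaluation table reports word error rate (WER) on LibriSpeech. For reference, WER is the word-level edit distance (substitutions, insertions, deletions) between the reference and hypothesis transcripts, normalized by the reference length. A minimal stdlib-only sketch of the metric (this `wer` helper is illustrative, not the project's evaluation code):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # DP table: d[i][j] = edit distance between ref[:i] and hyp[:j].
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # delete all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # substitution / match
    return d[len(ref)][len(hyp)] / max(len(ref), 1)


print(wer("the cat sat on the mat", "the bat sat on the mat"))  # 1 error / 6 words
```

A WER of 0.0983 on test-clean therefore means roughly one word in ten is transcribed incorrectly.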