Adds README to the Distil-Whisper Org

#1
by reach-vb - opened
Files changed (1)
  1. README.md +26 -1
README.md CHANGED
@@ -7,4 +7,29 @@ sdk: static
  pinned: false
  ---
 
- Edit this `README.md` markdown file to author your organization card.
+ # Distil-Whisper
+
+ [[Paper]](https://arxiv.org/abs/2311.00430)
+ [[Models]](https://huggingface.co/collections/distil-whisper/distil-whisper-models-65411987e6727569748d2eb6)
+ [[Colab]](https://colab.research.google.com/github/sanchit-gandhi/notebooks/blob/main/Distil_Whisper_Benchmark.ipynb)
+ [[Training Code]](https://github.com/huggingface/distil-whisper)
+
+ Distil-Whisper is a distilled version of Whisper that is **6 times faster**, 49% smaller, and performs **within 1% word
+ error rate (WER)** of Whisper on out-of-distribution evaluation sets:
+
+ | Model | Params / M | Rel. Latency ↑ | Short-Form WER ↓ | Long-Form WER ↓ |
+ |----------------------------------------------------------------------------|------------|----------------|------------------|-----------------|
+ | [large-v3](https://huggingface.co/openai/whisper-large-v3) | 1550 | 1.0 | **8.4** | 11.0 |
+ | | | | | |
+ | [distil-large-v3](https://huggingface.co/distil-whisper/distil-large-v3) | 756 | 6.3 | 9.7 | **10.8** |
+ | [distil-large-v2](https://huggingface.co/distil-whisper/distil-large-v2) | 756 | 5.8 | 10.1 | 11.6 |
+ | [distil-medium.en](https://huggingface.co/distil-whisper/distil-medium.en) | 394 | **6.8** | 11.1 | 12.4 |
+ | [distil-small.en](https://huggingface.co/distil-whisper/distil-small.en) | **166** | 5.6 | 12.1 | 12.8 |
+
+ For most applications, we recommend the latest [distil-large-v3](https://huggingface.co/distil-whisper/distil-large-v3) checkpoint,
+ since it is the most performant distilled checkpoint and is compatible with all Whisper libraries. The only exception is
+ resource-constrained applications with very little memory, such as on-device or mobile applications, where
+ [distil-small.en](https://huggingface.co/distil-whisper/distil-small.en) is a great choice, since it is only 166M
+ parameters and performs within 4% WER of Whisper large-v3.
+
+ **Note:** Distil-Whisper is currently only available for English speech recognition. We are working with the community to distill Whisper in other languages. If you are interested in distilling Whisper in your language, check out the provided [training code](training). We will update the repository with multilingual checkpoints as soon as they are ready!
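
To accompany the recommendation above, here is a minimal usage sketch of the [distil-large-v3](https://huggingface.co/distil-whisper/distil-large-v3) checkpoint with the 🤗 Transformers `pipeline` API. The model ID comes from the table above; the audio path and the dtype/device choices are illustrative assumptions, not part of this PR.

```python
# Minimal sketch: transcribe an audio file with distil-large-v3 via 🤗 Transformers.
# Assumes `torch` and `transformers` are installed; "audio.mp3" is a placeholder path.
import torch
from transformers import pipeline

device = "cuda:0" if torch.cuda.is_available() else "cpu"
torch_dtype = torch.float16 if torch.cuda.is_available() else torch.float32

pipe = pipeline(
    "automatic-speech-recognition",
    model="distil-whisper/distil-large-v3",  # swap in distil-small.en for low-memory settings
    torch_dtype=torch_dtype,
    device=device,
)

result = pipe("audio.mp3")
print(result["text"])
```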