Commit 8d6ff0b (parent: 404c3b9): Update README.md

README.md (changed):
```diff
@@ -14,7 +14,7 @@ This repository contains the model weights for [distil-large-v3](https://hugging
 converted to [CTranslate2](https://github.com/OpenNMT/CTranslate2) format. CTranslate2 is a fast inference engine for
 Transformer models and is the supported backend for the [Faster-Whisper](https://github.com/systran/faster-whisper) package.
 
-Compared to previous Distil-Whisper releases, distil-large-v3 is specifically designed to
+Compared to previous Distil-Whisper releases, distil-large-v3 is specifically designed to be compatible
 with the OpenAI Whisper long-form transcription algorithm. In our benchmark over 4 out-of-distribution datasets, distil-large-v3
 outperformed distil-large-v2 by 5% WER average. Thus, you can expect significant performance gains by switching to this
 latest checkpoint.
```
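Since the README describes these weights as CTranslate2-converted and Faster-Whisper as the supported backend, a minimal usage sketch might look like the following. It assumes `pip install faster-whisper`; the audio filename is a placeholder, and the device/compute-type settings are illustrative choices, not requirements.

```python
# Sketch: loading the CTranslate2-converted distil-large-v3 checkpoint via faster-whisper.
# Assumes the faster-whisper package is installed; "audio.mp3" is a placeholder input file.
from faster_whisper import WhisperModel

# int8 quantization keeps memory usage low on CPU; use device="cuda" if a GPU is available.
model = WhisperModel("distil-large-v3", device="cpu", compute_type="int8")

# transcribe() returns a lazy generator of segments plus detected-language info.
segments, info = model.transcribe("audio.mp3")
for segment in segments:
    print(f"[{segment.start:.2f} -> {segment.end:.2f}] {segment.text}")
```

Because the segments are generated lazily, transcription only runs as the loop consumes them, which is what makes the long-form algorithm mentioned in the diff practical for long audio files.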