NICE, large-v2 runs smoothly on my RTX 2060 with 6 GB VRAM

#1
by kiam001 - opened

Wtf, you got it: large-v2 runs on 6 GB of VRAM.

Yes, this is thanks to faster-whisper (suggested by @FlippFuzz ), which uses CTranslate2.

And yeah, I can finally run large-v2 locally myself, on an RTX 2080 Super. 😀
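A rough back-of-the-envelope calculation shows why this works: Whisper large-v2 has roughly 1550M parameters, and CTranslate2 can load the weights in float16 or int8 instead of float32, cutting the weight memory by 2-4x. The sketch below only estimates weight storage (parameter count times bytes per parameter); activations and the KV cache add overhead on top, but the weights dominate.

```python
# Rough estimate of weight memory for Whisper large-v2 (~1550M parameters)
# under compute types that CTranslate2 / faster-whisper support.
# Assumption: weight memory ~= parameter count * bytes per parameter;
# activations and the decoding cache need extra VRAM on top of this.

PARAMS_LARGE_V2 = 1_550_000_000  # approximate parameter count of large-v2

BYTES_PER_PARAM = {
    "float32": 4,
    "float16": 2,
    "int8": 1,
}

def weight_memory_gib(compute_type: str) -> float:
    """Approximate weight memory in GiB for a given compute type."""
    return PARAMS_LARGE_V2 * BYTES_PER_PARAM[compute_type] / 1024**3

for ct in BYTES_PER_PARAM:
    print(f"{ct}: ~{weight_memory_gib(ct):.1f} GiB")
# float32 barely fits in 6 GiB by itself; float16 (~2.9 GiB) and
# int8 (~1.4 GiB) leave comfortable headroom for everything else.
```

This is why a 6 GB card can hold large-v2 at all: in float16 or int8 the weights take only a fraction of the VRAM that the original float32 checkpoint would.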


This is not Whisper Desktop, but there is an installation guide for "whisper-webui" and Windows 10/11 here:

The installation steps are the same for "whisper-webui" and "faster-whisper-webui"; the only difference is that in the Git step you check out "https://huggingface.co/spaces/aadnk/faster-whisper-webui/" instead of "https://huggingface.co/spaces/aadnk/whisper-webui/".

You can also use Docker, if you prefer.

@aadnk

Your manual install guide is too complicated.
I installed it successfully with Miniconda3: PyTorch with the latest GPU support, choosing the Python (not conda) packages inside a conda environment, then `pip install -r requirements.txt`. GPU support works without installing any of that CUDA toolkit stuff, just the latest gaming driver. That way I can set up faster-whisper on any Windows machine in 5 minutes.
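For reference, the setup described above might look roughly like this. This is a sketch, not the project's official instructions: the environment name is made up, and the repo/file names (`faster-whisper-webui`, `requirements.txt`, `app.py`) follow the usual Hugging Face Space layout, so check the actual repository before running it.

```shell
# Sketch of the Miniconda-based setup described above (names illustrative).
# Assumes a recent NVIDIA gaming driver is already installed; no separate
# CUDA toolkit install, since the PyTorch pip wheels bundle the CUDA runtime.
conda create -n whisper python=3.10
conda activate whisper
git clone https://huggingface.co/spaces/aadnk/faster-whisper-webui/
cd faster-whisper-webui
pip install -r requirements.txt
python app.py
```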
