Use a fine-tuned model from HF?

#12
by Sagicc - opened

It would be good to be able to use a fine-tuned model from the HF Hub?

...and Happy New Year!!! All the best!

I think I found a solution...

As soon as it passes more testing, I will post it on GitHub. Thank you again for this amazing work!

Hi, have you found a solution?
Can you kindly share it?

Hello,
Sorry for the late answer. For the .pt conversion I found this:
https://github.com/bayartsogt-ya/whisper-multiple-hf-datasets
However, it gives me a .pt file of almost the same size as the .bin.
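As far as I can tell, a converter like the one linked above essentially renames the Hugging Face state-dict keys to OpenAI's naming scheme and re-saves the same tensors, which is also why the resulting .pt ends up about the same size as the .bin (the weights themselves are unchanged). A minimal sketch of that key renaming, assuming the substring mapping used by the known HF-to-OpenAI conversion scripts (verify against the repo above before relying on it):

```python
# Sketch (assumption): rename Hugging Face Whisper parameter names
# (as found in pytorch_model.bin) to OpenAI whisper's .pt naming.
# Order matters: dicts preserve insertion order in Python 3.7+.
WHISPER_MAPPING = {
    "layers": "blocks",
    "fc1": "mlp.0",
    "fc2": "mlp.2",
    "final_layer_norm": "mlp_ln",
    ".self_attn.q_proj": ".attn.query",
    ".self_attn.k_proj": ".attn.key",
    ".self_attn.v_proj": ".attn.value",
    ".self_attn_layer_norm": ".attn_ln",
    ".self_attn.out_proj": ".attn.out",
    ".encoder_attn.q_proj": ".cross_attn.query",
    ".encoder_attn.k_proj": ".cross_attn.key",
    ".encoder_attn.v_proj": ".cross_attn.value",
    ".encoder_attn_layer_norm": ".cross_attn_ln",
    ".encoder_attn.out_proj": ".cross_attn.out",
    "decoder.layer_norm.": "decoder.ln.",
    "encoder.layer_norm.": "encoder.ln_post.",
    "embed_tokens": "token_embedding",
    "encoder.embed_positions.weight": "encoder.positional_embedding",
    "decoder.embed_positions.weight": "decoder.positional_embedding",
}

def rename_key(key: str) -> str:
    """Map one HF parameter name to the OpenAI whisper name."""
    # HF prefixes every parameter with "model."; the OpenAI checkpoint does not.
    if key.startswith("model."):
        key = key[len("model."):]
    for old, new in WHISPER_MAPPING.items():
        key = key.replace(old, new)
    return key
```

Applied to every entry of the HF state dict (and combined with the model "dims" metadata that the OpenAI checkpoint format expects), this produces a dict that `torch.save` can write out as a .pt file.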

After that I made a fork of https://github.com/ProjectEGU/whisper-for-low-vram/ (to use the large model with 8 GB of VRAM) and added the model as large-v2-lang (lang stands for the language code) in https://github.com/DigitLib/whisper-for-low-vram-sr/blob/a157d925ccac42bb0dbda3e09ed058ec08363c98/whisper/__init__.py#L29. I also added it to web-vad as a new model.
I tested it and it works :)
TO DO: find a way to compress it better, and avoid using a fork of whisper.
Hope this helps.
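On the TODO about avoiding a fork: openai-whisper's `load_model` also accepts a filesystem path (it checks for a local file before treating the argument as a built-in model name), so a converted fine-tuned checkpoint may be loadable without patching `__init__.py` at all. A hedged sketch, where the file and audio paths are hypothetical:

```python
import whisper  # openai-whisper package

# load_model accepts a filesystem path as well as a built-in model name,
# so the converted fine-tuned .pt can be loaded directly.
# "models/whisper-large-v2-sr.pt" is a hypothetical path to the
# checkpoint produced by the conversion step above.
model = whisper.load_model("models/whisper-large-v2-sr.pt")

# Transcribe with the fine-tuned model (hypothetical input file).
result = model.transcribe("audio.mp3", language="sr")
print(result["text"])
```

Whether this works with the low-VRAM fork's model registry is untested; it is how the upstream openai-whisper API behaves.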

Thanks @aadnk for this great app!
