Whisper fine-tuning question

#115
by spawar - opened

Can I fine-tune the model on two different datasets of the same language, and then save the two fine-tuned models separately, as if they were different languages, and switch between them by changing a key, the way language codes are used?
One dataset contains significantly degraded speech; the other only has added noise, on which Whisper already gives good results. So I am worried that fine-tuning on the new data might hurt the original model.
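Whisper has no built-in mechanism for registering extra "languages", but one common workaround is to save each fine-tuned variant as its own checkpoint (leaving the pre-trained weights untouched) and keep a small registry that maps a key to a checkpoint path, analogous to language codes. A minimal sketch, where the checkpoint paths and registry keys are hypothetical placeholders:

```python
# Sketch: keep each fine-tuned Whisper variant as a separate checkpoint
# and select one by key, similar to how language codes are used.
# The local paths and keys below are illustrative, not an official API.

CHECKPOINTS = {
    "degraded": "./whisper-finetuned-degraded",  # fine-tuned on degraded speech
    "noisy": "./whisper-finetuned-noisy",        # fine-tuned on noise-augmented speech
    "base": "openai/whisper-small",              # untouched pre-trained weights
}

def resolve_checkpoint(key: str) -> str:
    """Map a condition key to its checkpoint path, falling back to the base model."""
    return CHECKPOINTS.get(key, CHECKPOINTS["base"])

# Loading a selected model would then look like (requires `transformers`):
#   from transformers import WhisperForConditionalGeneration, WhisperProcessor
#   path = resolve_checkpoint("degraded")
#   model = WhisperForConditionalGeneration.from_pretrained(path)
#   processor = WhisperProcessor.from_pretrained(path)
```

Because each dataset gets its own saved copy, fine-tuning on the degraded data can never overwrite the weights of the model that already handles the noisy data well.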