finetuning

#83
by dalyaff - opened

Is it possible to fine-tune the model on a new language that it was not trained on during pre-training? I want to fine-tune it on Arabic. How likely is this to succeed?

dalyaff changed discussion title from finetunning to finetuning
Microsoft org

Yes, it is possible.

You would be better off first fine-tuning on a mixture of English and Arabic text, and then, if necessary, fine-tuning again on Arabic-only text. Since the model was pre-trained on English, it helps to give it some context and time to adjust.

Success depends on your goals. It won't be as good as if the model had been pre-trained on Arabic, but it can still work well.
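A minimal sketch of the two-stage idea described above: build a stage-1 training set that mixes English and Arabic text before a later Arabic-only stage. The function name, the mixing fraction, and the plain-list corpora are illustrative assumptions, not part of any library API; in practice you would tokenize these texts and feed them to your usual fine-tuning pipeline.

```python
import random

def mix_corpora(english_texts, arabic_texts, arabic_fraction=0.3, seed=0):
    """Build a stage-1 fine-tuning set mixing English and Arabic text.

    arabic_fraction is roughly the share of Arabic in the mixed set.
    The intent: let a model pre-trained on English adjust gradually,
    before a second, Arabic-only fine-tuning stage.
    NOTE: illustrative sketch -- names and the 0.3 default are assumptions.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible mixture
    # Number of Arabic samples needed so they make up ~arabic_fraction
    # of the combined set, capped by how much Arabic data we have.
    n_arabic = min(
        len(arabic_texts),
        int(len(english_texts) * arabic_fraction / (1.0 - arabic_fraction)),
    )
    mixed = english_texts + rng.sample(arabic_texts, n_arabic)
    rng.shuffle(mixed)  # interleave the two languages
    return mixed
```

Stage 2 would then simply fine-tune the stage-1 checkpoint on `arabic_texts` alone.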

gugarosa changed discussion status to closed
