Fine-tune RoBERTa

#4
by humza-sami - opened

Hi, is there any codebase or guidance I can follow to fine-tune RoBERTa on my own dataset?

Hi @humza-sami - I haven't published one, but there should be some good ones online. Try searching for `AutoModelForSequenceClassification.from_pretrained` with `problem_type="multi_label_classification"` or similar (as that's the method I used to train this one), and I think you'll find guides in Medium articles and notebooks on GitHub that should get you on the right track.

For example https://github.com/NielsRogge/Transformers-Tutorials/blob/master/BERT/Fine_tuning_BERT_(and_friends)_for_multi_label_text_classification.ipynb from @nielsr (which you could fork and modify to use Roberta and your own data)
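The setup mentioned above can be sketched roughly as follows. This is a minimal illustration, not the author's actual training code: the label names are placeholders, and `roberta-base` is assumed as the base checkpoint.

```python
# Sketch: loading RoBERTa for multi-label classification with Transformers.
# Label names below are placeholders for your own dataset's labels.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

labels = ["anger", "joy", "sadness"]  # hypothetical label set

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base",
    # problem_type selects a BCE-with-logits loss, one sigmoid per label,
    # so each example can carry multiple labels at once
    problem_type="multi_label_classification",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)
```

From there, a `Trainer` (or a plain PyTorch loop) over a tokenized dataset with float multi-hot label vectors completes the fine-tuning, as in the notebook linked above.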

Regards,
Sam.

Thanks @SamLowe , I am basically trying to fine-tune it on a three-class problem. I will give it a shot as well.

Hello, can you please tell me which hyperparameters were used?
Did you use the AdamW optimizer? And can you please tell me what the max sequence length was?
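As general context (not the author's actual settings, which aren't stated in this thread): the Transformers `Trainer` uses AdamW by default, and the max sequence length is something you choose at tokenization time. A minimal sketch with illustrative values:

```python
# Illustration only: max sequence length is set when tokenizing, not on
# the model. 512 is RoBERTa's upper limit; shorter values like 128 are
# common for classification. These are not the values used for this model.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
encoding = tokenizer(
    "An example sentence.",
    truncation=True,
    max_length=512,        # upper bound for RoBERTa
    padding="max_length",  # pad every example to the same length
)
```

The optimizer and learning rate would likewise be set via `TrainingArguments` if the `Trainer` API was used.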
