How to add dropout to the xlm-roberta-large model?

#5
by Shaukat - opened

Hi everyone,
I am working on a problem where I need to improve the classification accuracy of the xlm-roberta-large model. I want to add dropout to the model to mitigate overfitting. Is there a way to achieve that?

thanks in advance
Shaukat

Hey @Shaukat ,

Sure, you can add dropout to the model simply by setting hidden_dropout_prob or attention_probs_dropout_prob in the config, e.g.:

from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-large")
model = AutoModelForMaskedLM.from_pretrained(
    "xlm-roberta-large",
    hidden_dropout_prob=0.3,
    attention_probs_dropout_prob=0.25,
)
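For intuition on why raising these probabilities regularizes the model, here is a minimal, library-free sketch of inverted dropout (the scheme PyTorch's nn.Dropout implements): each activation is zeroed with probability p during training, and the survivors are rescaled by 1/(1-p) so the expected activation is unchanged at inference time. The function name and values below are purely illustrative.

```python
import random

def dropout(activations, p, training=True):
    """Inverted dropout: zero each activation with probability p during
    training; rescale survivors by 1/(1-p) so the expected value of each
    unit matches what the next layer sees at inference time."""
    if not training or p == 0.0:
        return list(activations)
    keep = 1.0 - p
    return [a / keep if random.random() < keep else 0.0
            for a in activations]

random.seed(0)
# With p=0.5, each unit is either dropped (0.0) or scaled up to 2.0.
print(dropout([1.0, 1.0, 1.0, 1.0], p=0.5))
```

At evaluation time (`training=False`, what `model.eval()` triggers inside the library) the input passes through unchanged, which is why a higher dropout probability only affects training.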

Hi Patrick,
Thank you so much for the prompt response. I am trying this.
Best Regards
Shaukat


Hi @patrickvonplaten , @Shaukat
Many thanks for your comment. I am fine-tuning GPT-Neo and wondering if I can increase the dropout in the same way. After doing that, is the model still usable as a pretrained model for fine-tuning on a new dataset, or does it need to be retrained before I can fine-tune it on my own data?
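The same from_pretrained override pattern should apply to GPT-Neo, but note that its config names the dropout fields differently from RoBERTa-style models (resid_dropout, embed_dropout and attention_dropout rather than hidden_dropout_prob). A hedged sketch, assuming a recent transformers version; verify the exact field names against GPTNeoConfig in your installed release:

```python
from transformers import AutoModelForCausalLM

# GPT-Neo's config fields differ from RoBERTa's; in recent transformers
# versions they are resid_dropout, embed_dropout and attention_dropout
# (check GPTNeoConfig for your version). The 0.2 values are illustrative.
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-neo-1.3B",
    resid_dropout=0.2,
    embed_dropout=0.2,
    attention_dropout=0.2,
)
```

Changing dropout probabilities does not modify the pretrained weights, only how activations are masked during training, so the model remains perfectly usable for fine-tuning on a new dataset without retraining.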
