[fine-tuning] attention_dropout not defined

#2
by jondurbin - opened

Hi there,

When attempting to fine-tune the model, I ran into an AttributeError here (`self.attention_dropout`):
https://huggingface.co/internlm/internlm2-base-20b/blob/main/modeling_internlm2.py#L483

I can set it manually on my local copy for now, but ideally this attribute would be initialized properly in the modeling code.
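For anyone hitting the same error before it's fixed upstream, here is a minimal sketch of the kind of workaround I mean: patching the missing attribute onto the attention modules after loading. The module class and the default dropout value of `0.0` are illustrative assumptions, not taken from the actual InternLM2 code.

```python
import torch
import torch.nn as nn


class BrokenAttention(nn.Module):
    """Stand-in for an attention module that reads self.attention_dropout
    without ever defining it in __init__ (illustrative, not the real class)."""

    def forward(self, hidden_states):
        # Without the patch below, this raises:
        # AttributeError: 'BrokenAttention' object has no attribute 'attention_dropout'
        return nn.functional.dropout(
            hidden_states, p=self.attention_dropout, training=self.training
        )


model = nn.Sequential(BrokenAttention())

# Workaround: set the missing attribute on each affected module after loading.
for module in model.modules():
    if isinstance(module, BrokenAttention) and not hasattr(module, "attention_dropout"):
        module.attention_dropout = 0.0  # assumed default; adjust to your config

out = model(torch.ones(2, 4))  # no longer raises
```

With `p=0.0` the dropout is a no-op, so this only unblocks fine-tuning; the proper fix is to set `self.attention_dropout` from the config in `__init__`.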

Thanks for the release!
