
Bug on line 212 of modeling_ltgbert.py

#1
by nikitastheo - opened

Hi, I think there is a bug on line 213 of the model file, where self.position_bucket_size is accessed, but it should instead be self.config.position_bucket_size, as the Attention module does not have this attribute.
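A minimal sketch of the issue (class and attribute names follow the thread; the Config stand-in and its default value are hypothetical, not taken from the actual modeling_ltgbert.py):

```python
class Config:
    """Hypothetical stand-in for the model configuration."""
    def __init__(self, position_bucket_size=32):
        self.position_bucket_size = position_bucket_size


class Attention:
    """Sketch of the Attention module: it stores the config object,
    so config fields live on self.config, not on self."""
    def __init__(self, config):
        self.config = config

    def bucket_size_buggy(self):
        # Reproduces the bug: Attention itself has no such attribute,
        # so this raises AttributeError.
        return self.position_bucket_size

    def bucket_size_fixed(self):
        # The fix: read the field from the stored config object.
        return self.config.position_bucket_size
```

Calling bucket_size_buggy() on an Attention instance raises AttributeError, while bucket_size_fixed() returns the configured value.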

Language Technology Group (University of Oslo) org

Yes, you are right. I have now fixed the error.

Thanks for the reply and the fix! Maybe you could also fix it in ltg/ltg-bert-babylm? I think it has the same issue. Thanks again.

Language Technology Group (University of Oslo) org

Done

lgcharpe changed discussion status to closed