Bug on line 212 of modeling_ltgbert.py
#1 opened by nikitastheo
Hi, I think there is a bug on line 213 of the model file, where `self.position_bucket_size` is accessed, but it should instead be `self.config.position_bucket_size`, since the Attention module does not have this attribute.
Yes, you are right. I have now fixed the error.
Thanks for the reply and the fix! Maybe you could also fix it in ltg/ltg-bert-babylm? I think it has the same issue. Thanks again.
Done
lgcharpe changed discussion status to closed