Custom attention masks error comes back
#14
by pvelosipednikov · opened
Getting an error very similar to the one described and resolved three months ago in this discussion:
https://huggingface.co/Deci/DeciCoder-1b/discussions/11

Error message when running `trainer.train()`: `ValueError: For support of custom attention masks please set naive_attention_prefill to True in the config`
Running transformers 4.35.2, datasets 2.15.0, Python 3.9.6
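For reference, a sketch of the workaround the `ValueError` itself suggests, assuming the flag can be passed as a config override at load time (the flag name is taken straight from the error message; I have not verified it against the DeciCoder modeling code):

```python
from transformers import AutoModelForCausalLM

# Sketch: pass the flag named in the ValueError as a config override.
# DeciCoder ships custom modeling code, so trust_remote_code=True is needed.
model = AutoModelForCausalLM.from_pretrained(
    "Deci/DeciCoder-1b",
    trust_remote_code=True,
    naive_attention_prefill=True,  # flag named in the error message
)
```

Keyword arguments not recognized by `from_pretrained` are forwarded to the model config, so this should have the same effect as editing `config.json` directly.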
pvelosipednikov changed discussion status to closed