Use input attention mask instead of causal mask in attention

#101
by CyberZHG - opened

The current implementation does not work with left/leading padding: attention uses only the causal mask and ignores the input attention mask, so padded positions can still be attended to.
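
For context, here is a minimal sketch of what the fix amounts to: intersecting the causal mask with the per-token input attention mask so left-padded positions are masked out. The helper name and tensor shapes are illustrative only, not the actual Falcon modeling code:

```python
import torch

def build_attention_mask(attention_mask: torch.Tensor) -> torch.Tensor:
    """Illustrative sketch: combine the padding mask with the causal mask.

    attention_mask: (batch, seq_len) with 1 for real tokens, 0 for padding.
    Returns a boolean mask of shape (batch, 1, seq_len, seq_len) where True
    means "may attend".
    """
    batch, seq_len = attention_mask.shape
    # Lower-triangular causal mask: each position attends to itself and earlier positions.
    causal = torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool))
    # Broadcast the padding mask over the query dimension so padded keys are excluded.
    padding = attention_mask.bool()[:, None, None, :]  # (batch, 1, 1, seq_len)
    return causal[None, None, :, :] & padding          # (batch, 1, seq_len, seq_len)
```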

Add revision/code_revision = "17f5623" to AutoXxx.from_pretrained if you want to use the modified version.
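
For example, assuming the model is loaded with remote code (the repo id `tiiuae/falcon-7b` is an illustrative guess; the commit hash is the one given above):

```python
from transformers import AutoModelForCausalLM

# Pin both the weights and the remote modeling code to the PR commit.
model = AutoModelForCausalLM.from_pretrained(
    "tiiuae/falcon-7b",          # assumed repo id, for illustration
    revision="17f5623",          # commit for config/weights
    code_revision="17f5623",     # commit for the remote modeling_falcon.py
    trust_remote_code=True,
)
```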

Cannot merge
This branch has merge conflicts in the following files:
  • modeling_falcon.py
