Fix "InternLM2ForCausalLM does not support Flash Attention 2.0 yet"

#3
No description provided.
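Although the PR carries no description, the error quoted in its title is the message transformers raises when a model class has not opted in to Flash Attention 2 via the `_supports_flash_attn_2` class attribute; a fix like this one presumably flips that flag (and wires up the FA2 attention path) in the repo's remote modeling code. A minimal sketch of the gating logic, using a hypothetical stub rather than the actual transformers source:

```python
# Hedged sketch, not the real transformers implementation: models opt in
# to Flash Attention 2 through a class attribute, and requesting FA2 on a
# model that has not opted in raises the error quoted in this PR's title.
class PreTrainedModelStub:
    _supports_flash_attn_2 = False  # transformers' default

    @classmethod
    def check_flash_attn_2(cls):
        if not cls._supports_flash_attn_2:
            raise ValueError(
                f"{cls.__name__} does not support Flash Attention 2.0 yet"
            )


class InternLM2ForCausalLM(PreTrainedModelStub):
    # Hypothetical: the fix flips this flag in the custom modeling file.
    _supports_flash_attn_2 = True


InternLM2ForCausalLM.check_flash_attn_2()  # no longer raises
```

In the real library this check runs when a model is loaded with `attn_implementation="flash_attention_2"` in `from_pretrained`.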
czczup changed pull request status to merged
