Update flash_attn_triton.py

#7
by Skylion007 - opened

Updates the Triton flash attention file (`flash_attn_triton.py`) from llm_foundry.
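For context, a minimal sketch of how the updated kernel is typically called, assuming the file keeps the `flash_attn_func(q, k, v, bias=None, causal=False, softmax_scale=None)` entry point of the upstream Triton flash attention it is derived from; the exact import path and signature in this repo may differ:

```python
# Sketch only (not part of this PR): invoking the Triton flash attention kernel,
# assuming flash_attn_triton.py exposes `flash_attn_func` operating on
# (batch, seqlen, n_heads, head_dim) half-precision CUDA tensors.
import torch
from flash_attn_triton import flash_attn_func  # import path depends on where the repo vendors the file

batch, seqlen, n_heads, head_dim = 2, 128, 8, 64
q = torch.randn(batch, seqlen, n_heads, head_dim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# Causal self-attention with the default 1/sqrt(head_dim) softmax scaling.
out = flash_attn_func(q, k, v, bias=None, causal=True, softmax_scale=None)
print(out.shape)  # (batch, seqlen, n_heads, head_dim)
```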

Ready to merge
This branch is ready to be merged automatically.