zimhe committed
Commit 30fd333 · 1 Parent(s): b719b9b

revise flash attn

Files changed (1):
  1. requirements.txt +1 -1
requirements.txt CHANGED
@@ -11,4 +11,4 @@ xformers
  realesrgan
  py360convert
  gradio
- https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiTRUE-cp310-cp310-linux_x86_64.whl
+ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
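
The only substantive change is the C++11 ABI flag encoded in the flash-attn wheel filename (cxx11abiTRUE to cxx11abiFALSE). The filename also encodes the CUDA series (cu12), the torch version (torch2.4), and the Python tag (cp310), and each of these must match the torch build in the target environment; stock pip wheels of torch 2.4 are typically built without the new C++11 ABI, which is presumably why the cxx11abiFALSE variant is the one pinned here. A minimal sketch for checking which variant matches a given environment (illustration only, not part of the commit):

import sys
import torch

# Properties of the installed torch build that the flash-attn wheel filename must match.
print("python tag:", f"cp{sys.version_info.major}{sys.version_info.minor}")  # wheel expects cp310
print("torch version:", torch.__version__)                                   # wheel expects torch 2.4
print("CUDA version:", torch.version.cuda)                                    # wheel expects CUDA 12.x
print("built with cxx11 ABI:", torch.compiled_with_cxx11_abi())
# False -> use the cxx11abiFALSE wheel (as pinned in this commit); True -> use the cxx11abiTRUE wheel.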