torch
accelerate
huggingface_hub
gradio
transformers
spaces
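# Prebuilt flash-attn wheel: the filename tags pin it to CUDA 11.8 (cu118),
# CPython 3.10 (cp310), and Linux x86_64, and the torch tag (torch1.12 here)
# must match the torch version installed above, or the import will fail.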
https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.9.post1/flash_attn-2.5.9.post1+cu118torch1.12cxx11abiFALSE-cp310-cp310-linux_x86_64.whl