spaces
transformers==4.47.1
numpy==1.24.3
flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
autoawq==0.2.8
torch==2.4.0