orionweller committed
Commit d0e70e9 · verified · 1 Parent(s): 776166c

Update requirements.txt

Files changed (1): requirements.txt (+1 -1)
requirements.txt CHANGED
@@ -2,5 +2,5 @@ spaces
 transformers==4.49.0
 numpy==1.24.3
 flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
-autoawq==0.2.6
+autoawq==0.2.1
 torch==2.2.0
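For context, the file pins exact versions with `==` specifiers, while `flash-attn` is installed as a direct URL reference. Below is a minimal sketch of parsing such pins to check which version a file requests; the `parse_pin` helper is hypothetical and not part of this repo.

```python
import re


def parse_pin(line: str):
    """Parse a `name==version` requirements line into a (name, version) tuple.

    Lines using other forms (e.g. the `flash-attn @ <url>` direct
    reference above) return None. This helper is illustrative only.
    """
    match = re.match(r"^\s*([A-Za-z0-9_.-]+)==([A-Za-z0-9_.+!-]+)\s*$", line)
    return match.groups() if match else None


# The exact pins from the updated requirements.txt in this commit:
pins = dict(
    p
    for p in map(parse_pin, [
        "transformers==4.49.0",
        "numpy==1.24.3",
        "torch==2.2.0",
        "autoawq==0.2.1",  # downgraded from 0.2.6 by this commit
    ])
    if p is not None
)
```

After parsing, `pins["autoawq"]` reflects the new pin, `0.2.1`, and direct-URL lines are simply skipped rather than misread as version pins.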