Umair Khan committed
Commit 6b43a2d · 1 Parent(s): f67e20c

re-bump torch version

Files changed (1)
  1. requirements.txt +1 -1
requirements.txt CHANGED
@@ -1,6 +1,6 @@
  # install pre-built GPU packages
  --extra-index-url https://download.pytorch.org/whl/cu118
- torch==2.4.1+cu118
+ torch==2.5.1+cu118
  https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu118torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
  llm-foundry[gpu]==0.17.1
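
A minimal sanity-check sketch, not part of this commit: assuming the updated requirements were installed with "pip install -r requirements.txt", the following Python snippet confirms which torch build was actually resolved after the bump.

# Sanity check after installing requirements.txt (assumed install step:
# "pip install -r requirements.txt"); this is illustrative, not from the repo.
import torch

print(torch.__version__)          # expected after this bump: 2.5.1+cu118
print(torch.version.cuda)         # expected: 11.8
print(torch.cuda.is_available())  # True on a machine with a CUDA-capable GPU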