DeepBeepMeep committed · be1972e · Parent(s): c001f3b
Added instructions to install on RTX 50xx

README.md CHANGED
@@ -19,6 +19,7 @@ In this repository, we present **Wan2.1**, a comprehensive and open suite of vid


## 🔥 Latest News!!
* Mar 19 2025: 👋 Wan2.1GP v3.2:
    - Added Classifier-Free Guidance Zero Star. The video should better match the text prompt (especially with text2video) at no performance cost: many thanks to the **CFG Zero * Team**:\
      Don't hesitate to give them a star if you appreciate the results: https://github.com/WeichenFan/CFG-Zero-star
@@ -88,7 +89,7 @@ You will find the original Wan2.1 Video repository here: https://github.com/Wan-



-## Installation Guide for Linux and Windows

**If you are looking for a one click installation, just go to the Pinokio App store: https://pinokio.computer/**

@@ -109,15 +110,23 @@ pip install torch==2.6.0 torchvision torchaudio --index-url https://download.pyt
# 2. Install pip dependencies
pip install -r requirements.txt

-# 3.1 optional Sage attention support (30% faster)
pip install sageattention==1.0.6

-
git clone https://github.com/thu-ml/SageAttention
cd SageAttention
pip install -e .

-# 3.
pip install flash-attn==2.7.2.post1

```
@@ -125,17 +134,38 @@ pip install flash-attn==2.7.2.post1
Note pytorch *sdpa attention* is available by default. It is worth installing *Sage attention* (albeit not as simple as it sounds) because it offers a 30% speed boost over *sdpa attention* at a small quality cost.
In order to install Sage, you will also need to install Triton. If Triton is installed, you can turn on *Pytorch Compilation*, which will give you an additional 20% speed boost and reduced VRAM consumption.

-
-
-
-```
-pip install https://github.com/woct0rdho/triton-windows/releases/download/v3.2.0-windows.post9/triton-3.2.0-cp310-cp310-win_amd64.whl # triton for pytorch 2.6.0
```

-
-
-
-
```

## Run the application


## 🔥 Latest News!!
+* Mar 20 2025: 👋 Good news! Official support for RTX 50xx: please check the installation instructions below.
* Mar 19 2025: 👋 Wan2.1GP v3.2:
    - Added Classifier-Free Guidance Zero Star. The video should better match the text prompt (especially with text2video) at no performance cost: many thanks to the **CFG Zero * Team**:\
      Don't hesitate to give them a star if you appreciate the results: https://github.com/WeichenFan/CFG-Zero-star



+## Installation Guide for Linux and Windows for GPUs up to RTX40xx

**If you are looking for a one click installation, just go to the Pinokio App store: https://pinokio.computer/**

# 2. Install pip dependencies
pip install -r requirements.txt

+# 3.1 optional Sage attention support (30% faster)
+# Windows only: extra step needed only on Windows, as Triton is already included in the Linux build of pytorch
+pip install triton-windows
+# For both Windows and Linux
pip install sageattention==1.0.6

+
+# 3.2 optional Sage 2 attention support (40% faster)
+# Windows only
+pip install triton-windows
+pip install https://github.com/woct0rdho/SageAttention/releases/download/v2.1.1-windows/sageattention-2.1.1+cu126torch2.6.0-cp310-cp310-win_amd64.whl
+# Linux only (sorry, only manual compilation for the moment, but it is straightforward on Linux)
git clone https://github.com/thu-ml/SageAttention
cd SageAttention
pip install -e .

+# 3.3 optional Flash attention support (easy to install on Linux but may be complex on Windows as it will try to compile the cuda kernels)
pip install flash-attn==2.7.2.post1

```
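Since the backends in steps 3.1-3.3 are all optional, it can be useful to check which ones actually ended up importable in the environment. A minimal sketch (illustrative only, not part of the repo; pytorch's built-in *sdpa* needs no extra package):

```python
from importlib import util

# Optional attention packages from steps 3.1-3.3 above.
OPTIONAL_BACKENDS = ["sageattention", "flash_attn"]

def available_backends() -> list[str]:
    # find_spec() checks importability without importing the package
    # (and so without paying any CUDA initialization cost).
    return [m for m in OPTIONAL_BACKENDS if util.find_spec(m) is not None]

print(available_backends())
```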

Note pytorch *sdpa attention* is available by default. It is worth installing *Sage attention* (albeit not as simple as it sounds) because it offers a 30% speed boost over *sdpa attention* at a small quality cost.
In order to install Sage, you will also need to install Triton. If Triton is installed, you can turn on *Pytorch Compilation*, which will give you an additional 20% speed boost and reduced VRAM consumption.

+## Installation Guide for Linux and Windows for GPUs up to RTX50xx
+RTX 50xx GPUs are only supported by pytorch starting from pytorch 2.7.0, which is still in beta. Therefore this version may be less stable.\
+It is important to use Python 3.10, otherwise the pip wheels may not be compatible.

```
+# 0 Download the source and create a Python 3.10.9 environment using conda, or create a venv using python
+git clone https://github.com/deepbeepmeep/Wan2GP.git
+cd Wan2GP
+conda create -n wan2gp python=3.10.9
+conda activate wan2gp

+# 1 Install pytorch 2.7.0:
+pip install torch==2.7.0 torchvision torchaudio --index-url https://download.pytorch.org/whl/test/cu128
+
+# 2. Install pip dependencies
+pip install -r requirements.txt
+
+# 3.1 optional Sage attention support (30% faster)
+# Windows only: extra step needed only on Windows, as Triton is already included in the Linux build of pytorch
+pip install triton-windows
+# For both Windows and Linux
+pip install sageattention==1.0.6
+
+# 3.2 optional Sage 2 attention support (40% faster)
+# Windows only
+pip install triton-windows
+pip install https://github.com/woct0rdho/SageAttention/releases/download/v2.1.1-windows/sageattention-2.1.1+cu128torch2.7.0-cp310-cp310-win_amd64.whl
+
+# Linux only (sorry, only manual compilation for the moment, but it is straightforward on Linux)
+git clone https://github.com/thu-ml/SageAttention
+cd SageAttention
+pip install -e .
```
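The Python 3.10 requirement above comes from the prebuilt wheels: a wheel filename embeds a CPython tag such as `cp310`, and pip refuses wheels whose tag does not match the running interpreter. A minimal sketch of that matching logic (hypothetical helper, not part of the repo):

```python
import sys

def wheel_matches_interpreter(wheel_name: str) -> bool:
    """Check whether a wheel's CPython tag (e.g. 'cp310') matches the
    running interpreter. Wheel filenames follow the pattern
    name-version(-build)?-pythontag-abitag-platform.whl"""
    parts = wheel_name.removesuffix(".whl").split("-")
    py_tag = parts[-3]  # third field from the end is the python tag
    ours = f"cp{sys.version_info.major}{sys.version_info.minor}"
    return py_tag == ours

wheel = "sageattention-2.1.1+cu128torch2.7.0-cp310-cp310-win_amd64.whl"
# True only on Python 3.10; on any other version pip would reject the wheel.
print(wheel_matches_interpreter(wheel))
```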

## Run the application