marcorez8 committed
Commit e1480e1 · verified
1 Parent(s): a9ebc1b

Update README.md

Files changed (1)
  1. README.md +6 -3
README.md CHANGED
@@ -16,11 +16,14 @@ tags:

  # Flash-Attention 2.7.4 Prebuilt Wheels for NVIDIA Blackwell (cu128) on Windows

- This repository provides prebuilt wheels for **Flash-Attention 2.7.4** optimized for NVIDIA Blackwell GPUs (cu128) on Windows systems. These wheels are compatible with Python 3.10 and 3.11, enabling seamless integration for high-performance attention mechanisms in deep learning workflows.
+ This repository provides prebuilt wheels for **Flash-Attention 2.7.4** optimized for NVIDIA Blackwell GPUs (cu128 and cu129) on Windows systems.
+ These wheels are compatible with Python 3.10 and 3.11, enabling seamless integration for high-performance attention mechanisms in deep learning workflows.

  ## Available Wheels
- - `flash_attn-2.7.4.post1-cp310-cp310-win_amd64.whl` (Python 3.10)
- - `flash_attn-2.7.4.post1-cp311-cp311-win_amd64.whl` (Python 3.11)
+ - `flash_attn-2.7.4.post1-cp310-cp310-win_amd64.whl` (Python 3.10, PyTorch 2.7, cu128)
+ - `flash_attn-2.7.4.post1-cp311-cp311-win_amd64.whl` (Python 3.11, PyTorch 2.7, cu128)
+
+ - `flash_attn-2.7.4.post1-cp310-cp310-win_amd64.whl` (Python 3.10, PyTorch 2.8, cu129)

  ## Compatibility
  The prebuilt wheels are designed for NVIDIA Blackwell GPUs but have been tested and confirmed compatible with previous-generation NVIDIA GPUs, including:
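For context, a minimal install-and-import sketch for one of the wheels listed in the diff above. It assumes the cp310 / PyTorch 2.7 / cu128 pairing from that list and a locally downloaded wheel file; the smoke test and tensor shapes are illustrative, not part of this repository.

```python
# Install the matching wheel first (path is illustrative):
#   pip install flash_attn-2.7.4.post1-cp310-cp310-win_amd64.whl
# Assumes Python 3.10, a PyTorch 2.7 + cu128 build, and a CUDA-capable GPU.
import sys

import torch
import flash_attn
from flash_attn import flash_attn_func

# Check that the environment matches the wheel tags (cp310, cu128).
print("Python:", sys.version_info[:2])        # expect (3, 10) for the cp310 wheel
print("PyTorch:", torch.__version__)          # expect a 2.7.x build for the cu128 wheels
print("CUDA (torch):", torch.version.cuda)    # expect "12.8"
print("flash-attn:", flash_attn.__version__)  # expect "2.7.4.post1"

# Tiny forward pass: flash-attn takes (batch, seqlen, nheads, headdim) tensors
# in fp16 or bf16 on the GPU.
q = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)
out = flash_attn_func(q, k, v, causal=True)
print("Output shape:", tuple(out.shape))      # (1, 128, 8, 64)
```

If the import succeeds and the forward pass runs without a kernel error, the wheel matches the installed PyTorch/CUDA build; a mismatch typically surfaces as an `ImportError` or an undefined-symbol error at import time.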