Flash Attention wheels for 2.4.1+cu124

#1
by ryg81 - opened

Hey, if possible, please provide wheels for Python 3.11.9 with PyTorch 2.4.1+cu124.

Owner

There are some here that may work for you; I haven't tried them myself:

https://github.com/oobabooga/flash-attention/releases
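Not from the thread, but as a quick sanity check before picking one of those prebuilt wheels, the wheel's filename tags (e.g. cp311, torch2.4, cu124) have to match your local environment. A minimal sketch of how to print those values:

```python
# Print the values that must match a prebuilt flash-attn wheel's filename tags
# (e.g. cp311 / torch2.4.1 / cu124).
import sys
import torch

print(f"Python : {sys.version_info.major}.{sys.version_info.minor}")  # e.g. 3.11
print(f"PyTorch: {torch.__version__}")                                # e.g. 2.4.1+cu124
print(f"CUDA   : {torch.version.cuda}")                               # e.g. 12.4
```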

Thank you @Kijai

Owner

Also, Flash Attention is no longer required to run mochi with the nodes.
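A typical way this kind of optional dependency is handled (this is an illustrative sketch, not necessarily how the nodes implement it) is to use flash-attn when it is importable and otherwise fall back to PyTorch's built-in scaled_dot_product_attention:

```python
# Hypothetical fallback pattern: prefer flash-attn if installed,
# otherwise use PyTorch's native SDPA.
import torch
import torch.nn.functional as F

try:
    from flash_attn import flash_attn_func  # optional dependency
    HAS_FLASH = True
except ImportError:
    HAS_FLASH = False

def attention(q, k, v):
    # q, k, v: (batch, heads, seq_len, head_dim)
    if HAS_FLASH:
        # flash_attn_func expects (batch, seq_len, heads, head_dim), fp16/bf16 tensors
        out = flash_attn_func(q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2))
        return out.transpose(1, 2)
    return F.scaled_dot_product_attention(q, k, v)
```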

ryg81 changed discussion status to closed


Just one small question: can I now run this on an RTX 4070?
