Flash Attention wheels for 2.4.1+cu124
#1 by ryg81 - opened
Hey, if possible, please provide wheels for Python 3.11.9 / PyTorch 2.4.1+cu124.
There are some here that may work for you; I haven't tried them myself:
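If it helps, here is a quick, generic way to confirm that your environment matches a wheel's tags before installing (prebuilt flash-attn wheels are built against a specific Python / PyTorch / CUDA combination, e.g. cp311 / torch 2.4.1 / cu124, and all three must line up):

```python
# Sketch: print the version info that has to match a wheel's filename tags.
import sys
import torch

print("Python:", sys.version.split()[0])           # expect 3.11.x for cp311 wheels
print("PyTorch:", torch.__version__)               # expect 2.4.1+cu124
print("CUDA (torch build):", torch.version.cuda)   # expect 12.4
print("CUDA available:", torch.cuda.is_available())
```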
Also, Flash Attention is no longer required to run Mochi with the nodes.
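For reference, a minimal sketch of how that kind of optional dependency usually works, assuming the nodes fall back to PyTorch's built-in scaled_dot_product_attention when flash-attn is absent; the `attention` function below is illustrative, not the nodes' actual code:

```python
# Sketch: optional flash-attn import with a PyTorch SDPA fallback.
import torch
import torch.nn.functional as F

try:
    from flash_attn import flash_attn_func  # optional fast path
    HAS_FLASH_ATTN = True
except ImportError:
    HAS_FLASH_ATTN = False

def attention(q, k, v):
    """q, k, v: (batch, heads, seq, head_dim) tensors."""
    if HAS_FLASH_ATTN and q.is_cuda and q.dtype in (torch.float16, torch.bfloat16):
        # flash_attn_func expects (batch, seq, heads, head_dim) layout
        out = flash_attn_func(q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2))
        return out.transpose(1, 2)
    # Fallback: PyTorch's built-in scaled dot-product attention
    return F.scaled_dot_product_attention(q, k, v)
```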
ryg81 changed discussion status to closed
> Also, Flash Attention is no longer required to run Mochi with the nodes.
Just one small question: can I now run this on an RTX 4070?
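For anyone wanting to check their own card: a small sketch, assuming flash-attn 2's documented Ampere-or-newer requirement (compute capability 8.0+). An RTX 4070 is Ada Lovelace at compute capability 8.9, so it qualifies; and since Flash Attention is optional here anyway, VRAM is the more likely constraint:

```python
# Sketch: check whether the local GPU meets flash-attn 2's
# Ampere-or-newer requirement (compute capability >= 8.0).
import torch

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability()
    name = torch.cuda.get_device_name()
    print(f"{name}: compute capability {major}.{minor}")
    print("flash-attn 2 supported:", (major, minor) >= (8, 0))
else:
    print("No CUDA device detected.")
```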