The original model is here and on Civitai. The author is here. This model was created by Raelina.

Notice

This is an experimental conversion made in Spaces using a homebrew script. The serverless Inference API does not currently support torch.float8_e4m3fn, so this model will not run there. I have not been able to confirm whether the conversion works properly, so please treat this as a test run only.

Model tree for John6666/raemu-flux-v10-fp8-flux

Finetuned from Raelina/Raemu-Flux.