Original model: sd-1.5-lcm-openvino

This model can be used with FastSD on an Intel AI PC NPU.
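
FastSD provides its own GUI/CLI front end for running this model, so the snippet below is not FastSD's code. It is only a rough sketch of how an OpenVINO LCM model of this kind might be loaded on an Intel NPU using `optimum-intel`; the package choice, the `NPU` device target, and the placeholder repository id are assumptions, not part of this card.

```python
# Minimal sketch, not taken from this card: assumes optimum-intel with the
# OpenVINO extra (pip install "optimum[openvino]") and an Intel NPU driver.
# "your-namespace/this-model" is a placeholder repo id, not the real one.
from optimum.intel import OVLatentConsistencyModelPipeline

# Load the OpenVINO-converted LCM pipeline from the Hub.
pipeline = OVLatentConsistencyModelPipeline.from_pretrained("your-namespace/this-model")

# Target the Intel NPU (assumption); use "CPU" or "GPU" if no NPU is available.
pipeline.to("NPU")
pipeline.compile()

# LCM models need only a handful of steps and little to no classifier-free guidance.
image = pipeline(
    prompt="a cozy cabin in a snowy forest, photorealistic",
    num_inference_steps=4,
    guidance_scale=1.0,
).images[0]
image.save("result.png")
```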
