Inclusion of a model_index.json file for ONNX conversion

#4
by Tyler-S - opened

Hello, I was curious whether there is a place or a plan to include a model_index.json file. The reason is that I use an AMD GPU; while not as great for AI work as Nvidia, it's what I have. To use models with my AMD webui Stable Diffusion setup, I have to convert them to ONNX as one of the steps.

If someone walks me through it, I'll be happy to create one!

Sorry for hijacking your comment. TL;DR: do yourself a favor and install an Ubuntu dual boot for SD.

I also used an AMD (6700xtx) card on Windows and had to go through all the shenanigans of ONNX conversion (with all the problems it creates), AND not having access to Automatic1111's interface, AND it being slower, AND using more VRAM...

Install an Ubuntu/Windows dual boot; it's quite easy, and installing SD on Linux is no more complicated than on Windows.
It renders twice as fast (for me at least), goes higher in resolution because ONNX uses more VRAM, and you get an easy-to-use, complete interface.
Really, it was the biggest breakthrough for me since I started running SD locally.

Bringing this back up. It would be great if you could include all the configuration files so that your model can be imported with the diffusers library. You could probably follow a similar pattern to other models that support this (e.g. https://huggingface.co/Lykon/DreamShaper/tree/main) and/or follow the instructions here: https://huggingface.co/docs/diffusers/using-diffusers/loading. @patrickvonplaten may be able to point you in the right direction.
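For reference, a `model_index.json` mainly records which pipeline class to load and which library/class each subfolder component comes from. A typical Stable Diffusion 1.x repo has something like the following (an illustrative sketch; the exact class names and `_diffusers_version` depend on the model, so check an existing repo such as the DreamShaper one linked above):

```json
{
  "_class_name": "StableDiffusionPipeline",
  "_diffusers_version": "0.6.0",
  "feature_extractor": ["transformers", "CLIPImageProcessor"],
  "safety_checker": ["stable_diffusion", "StableDiffusionSafetyChecker"],
  "scheduler": ["diffusers", "PNDMScheduler"],
  "text_encoder": ["transformers", "CLIPTextModel"],
  "tokenizer": ["transformers", "CLIPTokenizer"],
  "unet": ["diffusers", "UNet2DConditionModel"],
  "vae": ["diffusers", "AutoencoderKL"]
}
```

Each key names a subfolder in the repo (`unet/`, `vae/`, etc.) that holds that component's weights and config, which is what lets `DiffusionPipeline.from_pretrained` (and the ONNX conversion scripts built on it) assemble the pipeline automatically.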
