Hardware requirements?

#6 opened by tintwotin

What are the hardware requirements to run this locally?

Is it possible on an RTX 2060 with 6 GB of VRAM?

In my test it needs about 10 GB of VRAM. It is possible with less VRAM if you offload to system RAM, but it is much slower.

See my attempt: https://github.com/phineas-pta/SDXL-trt-win
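If you are short on VRAM, here is a minimal sketch of the "offload to RAM" trade-off using the diffusers library (assuming diffusers with SDXL support, accelerate, and a CUDA build of PyTorch are installed; the prompt and step count are just placeholders):

```python
import torch
from diffusers import StableDiffusionXLPipeline

# Load SDXL base in half precision to roughly halve the VRAM footprint.
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
    variant="fp16",
    use_safetensors=True,
)

# Keep submodules in system RAM and move each one to the GPU only while it runs.
# This is the "RAM swap" trade-off: much lower VRAM use, noticeably slower inference.
pipe.enable_model_cpu_offload()

image = pipe("a photo of an astronaut riding a horse", num_inference_steps=30).images[0]
image.save("astronaut.png")
```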

Can this be installed on AUTOMATIC1111? I have an RTX 3060 12 GB, and I installed the TensorRT extension for AUTOMATIC1111. I was able to convert 1.5 models to the .trt format. How can I convert this one? I saw multiple files; what is that 5 GB file about? I tried downloading the model.onnx for both the base model and the refiner, but I can't convert them. Please help, I'm a noob. BTW: I'm using Linux Mint.

This repo doesn't seem to be beginner-friendly.

For A1111, use the new extension: https://github.com/NVIDIA/Stable-Diffusion-WebUI-TensorRT
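If you are curious what the conversion step does under the hood, here is a rough sketch using the TensorRT Python API (assuming TensorRT 8.6+; the file names are placeholders, and for SDXL's UNet you would also need an optimization profile for its dynamic input shapes, which the extension handles for you, so treat this as illustrative only):

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(onnx_path: str, engine_path: str) -> None:
    """Parse an ONNX file and serialize a TensorRT engine to disk."""
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    )
    parser = trt.OnnxParser(network, TRT_LOGGER)

    # The large file next to model.onnx is most likely the externally stored
    # weights; keep it in the same directory so the parser can find it.
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("Failed to parse the ONNX model")

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.FP16)  # build a half-precision engine

    # NOTE: a network with dynamic input shapes (like the SDXL UNet) also needs
    # config.add_optimization_profile(...) with min/opt/max shapes before building.
    engine = builder.build_serialized_network(network, config)
    with open(engine_path, "wb") as f:
        f.write(engine)

build_engine("model.onnx", "model.trt")
```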
