How much GPU memory does SD3.5-Large-IP-Adapter use?

#2
by SOT1k - opened


I get an out of memory error with 2x3090 (48GB total)

I saw peak memory utilization of about 31 GB on an A100 40GB using bfloat16.

You can use pipe.enable_sequential_cpu_offload() to greatly reduce VRAM requirements: parameters are moved to the GPU only right before they are used and are offloaded back to CPU immediately after. Getting the output takes a bit longer, but it's still orders of magnitude faster than running on CPU (and infinitely faster than an OOM'd GPU :) ). With 32 GB of system RAM it should work without issues, even on 8 GB GPUs. If you need help getting the inference code running, let me know and I can help you out!
