Inference with a smaller GPU

#1
by nebi - opened

Hi,
How is inference possible with a smaller GPU, e.g. 4 GB of VRAM?
I couldn't get it to work with the proposed code.
Cheers
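
For context, a minimal sketch of one common way to fit inference into roughly 4 GB of VRAM is to load the weights with 4-bit quantization (bitsandbytes) and let accelerate offload anything that doesn't fit. The model ID, prompt, and generation settings below are placeholders, not the repository's actual code:

```python
# Sketch: 4-bit quantized inference on a small GPU (assumptions marked below).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "your-org/your-model"  # placeholder; substitute the actual checkpoint

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # quantize weights to 4 bit
    bnb_4bit_compute_dtype=torch.float16,  # run the matmuls in fp16
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # offload layers that don't fit on the GPU to CPU
)

inputs = tokenizer("Hello, world!", return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Whether this fits in 4 GB depends on the model size; larger checkpoints will spill onto the CPU and run slowly even with 4-bit weights.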