How to run on a GPU with 8 GB of memory?

#93
by TauseefAhmed - opened

I have an RTX 2080 (via TrainML), and I am trying to use stable-diffusion-xl-base-1.0, but CUDA runs out of memory right after loading the model. Apparently I only need about 20 more MB, but even that is not available. Is there any way I can load only the parts I need and skip the other tensors?
My use case is only txt2img.

Maybe try float16 or int8.
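
For what it's worth, here is a minimal sketch of what the float16 suggestion could look like with the diffusers library, assuming the stabilityai/stable-diffusion-xl-base-1.0 checkpoint and that accelerate is installed. enable_model_cpu_offload keeps only the sub-model currently in use on the GPU, which is often enough to fit txt2img into 8 GB:

```python
import torch
from diffusers import StableDiffusionXLPipeline

# Load the weights in half precision; the fp16 variant avoids
# downloading the full float32 checkpoint.
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
    variant="fp16",
    use_safetensors=True,
)

# Keep sub-models (text encoders, UNet, VAE) in system RAM and move
# each one onto the GPU only while it is needed.
pipe.enable_model_cpu_offload()

image = pipe(
    "a photo of an astronaut riding a horse",
    num_inference_steps=30,
).images[0]
image.save("astronaut.png")
```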

Adjust your virtual memory. I am using a GT 1030 with 4 GB, and my system RAM is 16 GB; after adjusting the virtual memory, my GPU now shows 11 GB. SDXL 1.0 is working fine for me, but in slow motion.

How do I adjust my server's virtual RAM and then add it to the GPU RAM?

Same question: I am using a Linux machine for my training, and I can find instructions online for adjusting virtual memory on a Windows machine, but do you have any idea how to do it on Linux or on cloud services?
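
In case it helps, a software-side alternative to the virtual-memory trick (which works the same on Linux and in the cloud) is sequential CPU offload in diffusers, which streams weights from system RAM to the GPU one module at a time. This is only a sketch, again assuming the stabilityai/stable-diffusion-xl-base-1.0 checkpoint and accelerate installed; it is slow but keeps the VRAM footprint very small:

```python
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
    variant="fp16",
    use_safetensors=True,
)

# Stream individual layers to the GPU instead of whole sub-models;
# slower than enable_model_cpu_offload, but needs only a few GB of VRAM.
pipe.enable_sequential_cpu_offload()

# Decode the latents in slices so the VAE does not spike memory usage.
pipe.enable_vae_slicing()

image = pipe("a watercolor landscape", num_inference_steps=25).images[0]
image.save("landscape.png")
```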

Just buy it.
