Memory issue with RTX 3060, 12GB

#11 · opened by spkprav

Exception occurred: CUDA out of memory. Tried to allocate 3.71 GiB (GPU 0; 11.75 GiB total capacity; 4.80 GiB already allocated; 1.75 GiB free; 8.56 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

It's a fresh install, and I tried again after restarting the system. Is this supported with the above configuration?
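
The traceback suggests tuning max_split_size_mb through PYTORCH_CUDA_ALLOC_CONF. For reference, here is a minimal sketch of setting that variable from Python before the first CUDA allocation; the 512 MB split size is only an illustrative value, and exporting the variable in the shell that launches the webui works just as well:

```python
import os

# PYTORCH_CUDA_ALLOC_CONF is read when PyTorch's CUDA caching allocator
# initializes, so it must be in the environment before the first CUDA
# allocation (or exported in the shell before launching the webui).
# The 512 MB split size below is only an illustrative value.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:512"

import torch  # imported after the variable is set

x = torch.zeros(1, device="cuda")  # the first allocation picks up the setting
```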

Did you try running the webui with the --xformers command-line argument?
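
For anyone trying this, here is a minimal sketch of passing that flag to the webui launcher; the launch.py entry point and the extra --medvram flag are assumptions based on a typical AUTOMATIC1111-style setup, not something confirmed in this thread:

```python
import subprocess
import sys

# Hypothetical wrapper around the webui launcher. --xformers enables the
# xformers attention optimization; --medvram is an additional assumption
# here for GPUs that still run out of memory.
subprocess.run(
    [sys.executable, "launch.py", "--xformers", "--medvram"],
    check=True,
)
```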

I have the same error while using an Nvidia GeForce 1960 Ti (4 GB). Is there a way to resolve it? If so, please let me know how.

Thank you.

Error:
OutOfMemoryError: CUDA out of memory. Tried to allocate 170.00 MiB (GPU 0; 4.00 GiB total capacity; 9.71 GiB already allocated; 0 bytes free; 9.99 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
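
As a quick sanity check, here is a minimal sketch of comparing the card's capacity with what PyTorch has already allocated and reserved, using standard torch.cuda calls; the report_gpu_memory helper name is just for illustration:

```python
import torch

def report_gpu_memory(device: int = 0) -> None:
    """Print the card's capacity next to what PyTorch has allocated/reserved."""
    props = torch.cuda.get_device_properties(device)
    allocated = torch.cuda.memory_allocated(device)
    reserved = torch.cuda.memory_reserved(device)
    gib = 2**30
    print(f"{props.name}: {props.total_memory / gib:.2f} GiB total")
    print(f"allocated by tensors: {allocated / gib:.2f} GiB")
    print(f"reserved by the caching allocator: {reserved / gib:.2f} GiB")

report_gpu_memory()
```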
