Running out of memory with my 4090 (CUDA out of memory error)

#15
by shuriken200 - opened

I'm probably missing something obvious — I'm still very new to these things.

Why do I get the error "torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 136.00 MiB (GPU 0; 23.99 GiB total capacity; 22.82 GiB already allocated; 0 bytes free; 23.00 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF" ?

Any help to fix this would be appreciated! Thank ya all!

"try setting max_split_size_mb to avoid fragmentation" - try that
