I'm getting this error locally; I tried adjusting the setting but it doesn't seem to help. Is the memory just full?

#1
by uncletang - opened

CUDA out of memory. Tried to allocate 108.00 MiB (GPU 0; 5.79 GiB total capacity; 4.96 GiB already allocated; 145.19 MiB free; 4.96 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
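For reference, the "setting" the error message points at is the PYTORCH_CUDA_ALLOC_CONF environment variable. A minimal sketch of applying it is below; the 128 MiB value is only an example, not a recommendation from this thread, and it does not help if the model simply does not fit in VRAM:

```python
import os

# Must be set before the first CUDA allocation, so set it before importing torch.
# max_split_size_mb caps how large a cached block the allocator will split,
# which can reduce fragmentation-related OOMs. 128 here is just an example value.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch
print(torch.cuda.is_available())
```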

Mine also runs out of memory, but isn't the model supposed to need only about 5.3 GB? My 8 GB card still overflows.
return torch.empty_strided(
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 108.00 MiB (GPU 0; 8.00 GiB total capacity; 7.25 GiB already allocated; 0 bytes free; 7.25 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

change to trust_remote_code=True).half().to("cuda") instead of trust_remote_code=True, device='cuda') fixed my problem.

change to trust_remote_code=True).half().to("cuda") instead of trust_remote_code=True, device='cuda') fixed my problem.

Thanks, that solved it for me.

uncletang changed discussion status to closed