Would there be a workable way to modify this and fit it on a 3090?

#32 opened by MotorCityCobra

What does the community suggest?

The NVIDIA RTX 3090 has 24 GB of GDDR6X, so it should work.

This model seems to use 32 GB of RAM on my machine, at least with the provided example. That is because it is not using torch.float16. Add this at the start of your script:

import torch
# Make newly created floating-point tensors default to half precision,
# which roughly halves memory use compared to float32
torch.set_default_dtype(torch.float16)
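
If the model is loaded through the transformers library, another option is to request half precision at load time instead of changing the global default dtype. A minimal sketch, assuming a causal language model; the model id and prompt below are placeholders, not taken from this repo:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your/model-id"  # placeholder: replace with the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # load weights in half precision (~2 bytes per parameter)
    device_map="auto",          # requires the accelerate package; places layers on the GPU automatically
)

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

In float16, the weights alone take about 2 bytes per parameter, so whether this fits in the 3090's 24 GB depends on the model's parameter count plus activation and KV-cache overhead.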
