How much RAM and GPU do you recommend?

#1
by areumtecnologia - opened

I tried to run it on a T4 with 50GB of RAM, but I couldn't.

How are you running it? I use diffusers scripts and have no issues running V0.3 on a free-tier Colab T4 (15 GB VRAM, 12 GB system RAM).
Just use fp16 precision.

I'm trying to run it on Google Colab as well, but it crashes by exhausting system RAM (usage climbs past the 12 GB available), even with fp16, on both V0.4 and V0.3. Could you please share your Colab notebook so I can check what I'm missing?

What you probably missed is that I had to create an fp16 version of the model for V0.3: https://huggingface.co/Vargol/ProteusV0.3/tree/main

The good news is that the conversion can be done on the CPU. The more memory the better, but swap will just slow things down rather than grind them to a halt.

You'd have thought by now there would be a way to avoid loading the whole model into system RAM first.
