VRAM requirements

#5
by joujiboi - opened

How much VRAM and RAM is required to run gptq_model-4bit--1g?

With ExLlama I'm able to just barely load the model into 36 GB of VRAM; RAM usage stays under 8 GB.
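As a rough back-of-the-envelope check (not an exact figure for any loader), a 4-bit quantized model needs about half a byte per weight, plus overhead for activations, the KV cache, and the CUDA context. A minimal sketch, assuming weights-only math and a hypothetical flat overhead term:

```python
def estimate_vram_gb(n_params_billion: float, bits: int = 4, overhead_gb: float = 2.0) -> float:
    """Rough VRAM estimate: weights at `bits` per parameter plus a flat overhead.

    Ignores KV cache growth with context length, which can add several GB.
    """
    weight_gb = n_params_billion * bits / 8  # 1e9 params * (bits/8) bytes ~= GB
    return weight_gb + overhead_gb

# e.g. a hypothetical 70B-parameter model at 4-bit:
print(estimate_vram_gb(70))  # 37.0
```

Actual usage varies by loader and context length, so treat this as a lower bound rather than a guarantee.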
