How do you think it could be launched on a 3090?

by nickolaygrad

With oobabooga (text-generation-webui)...

To run this implementation of the model you need 26 GB of video memory. You can try converting it to FP8 to run it with less memory.
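The 26 GB figure matches the 13B parameters stored in FP16 (2 bytes each), so roughly halving the weight size is what lets the model fit into a 3090's 24 GB. The post mentions FP8, but the more common route in transformers is bitsandbytes int8 quantization at load time. A minimal sketch, assuming the base checkpoint is Gaivoronsky/ruGPT-3.5-13B (not confirmed in this thread) and that transformers, accelerate and bitsandbytes are installed:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Assumption: the base repo this discussion is attached to.
model_id = "Gaivoronsky/ruGPT-3.5-13B"

# Quantize the weights to 8-bit while loading, so the 13B model
# (~26 GB in FP16) fits into the 3090's 24 GB of VRAM.
quant_config = BitsAndBytesConfig(load_in_8bit=True)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",
)

prompt = "Привет! Расскажи о себе."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```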

Gaivoronsky changed discussion status to closed

Will this model run on a 3090?

There may be problems with memory; I'd recommend looking at the 8-bit implementation: https://huggingface.co/Gaivoronsky/ruGPT-3.5-13B-8bit
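If you go with the pre-quantized checkpoint, the exact loading code depends on how it was exported, so follow its model card; as a rough sketch, if it is a standard transformers checkpoint with an embedded quantization config, something like this should work:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# The 8-bit repo linked above; how it was quantized is not stated here,
# so treat this as a sketch and defer to the repo's model card if it differs.
model_id = "Gaivoronsky/ruGPT-3.5-13B-8bit"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
```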
