Minimum gpu specs to run the large model?

#2
by mbug90 - opened

I know the medium model requires 16gb of vram, but what does the large model require? 24gb?

I ran all of the models with this webui on a T4 GPU in Google Colab, which is the free tier.

Yes, a T4 GPU on Colab (16 GB of VRAM) is enough.
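As a rough rule of thumb (an assumption, not a figure from this thread), the VRAM needed just for the weights is parameter count times bytes per parameter; actual usage is higher once activations and framework overhead are included. A minimal sketch, assuming fp16 weights and a hypothetical 1.5B-parameter model:

```python
def weight_vram_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Rough VRAM needed for model weights alone (fp16 = 2 bytes/param).

    Real usage is higher: activations, KV caches, and framework
    overhead typically add a significant margin on top of this.
    """
    return n_params * bytes_per_param / 1024**3

# e.g. a hypothetical 1.5B-parameter model in fp16:
print(f"{weight_vram_gb(1.5e9):.1f} GB")  # ~2.8 GB for weights only
```

This is why a 16 GB card like the T4 has comfortable headroom for weights but can still run out of memory at inference time with long inputs or large batch sizes.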
