What specifications do this model need?

#2
by hieuxinhe - opened

Hi, great work!

Could you please provide the basic specifications needed to run inference with this model?
(I ran into problems running it on a T4 with 18GB.)

Owner
•
edited Apr 26

Hello, you need more than 18GB of VRAM for single-image inference. Are you getting an OOM error?
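For anyone sizing a GPU for this: a rough back-of-envelope way to estimate the VRAM the model weights alone will occupy (assuming fp16, i.e. 2 bytes per parameter). This is a generic sketch, not this model's actual footprint, and the `2e9` parameter count below is purely illustrative:

```python
def weights_vram_gib(n_params: float, bytes_per_param: int = 2) -> float:
    """Rough VRAM needed for the weights alone (fp16 = 2 bytes per parameter).

    Activations, attention buffers, and the CUDA context add several more
    GiB on top, which is why ">18GB" can be required even when the weights
    themselves would fit on a smaller card.
    """
    return n_params * bytes_per_param / 1024**3

# Illustrative only -- 2e9 is NOT this model's actual parameter count.
print(f"{weights_vram_gib(2e9):.2f} GiB for weights in fp16")
```

This also explains why a 16GB T4 can OOM: the headroom left after weights is consumed by activations during the diffusion/inference steps.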

😀😀

@yisol could you please state the memory and GPU specifications? Can we clone this model and run it locally? I'm using 16GB RAM and a 24GB GPU and I'm still unable to launch the app.
