Help to run the model locally

#14
by Salzani - opened

I'm trying to run inference with this model locally. How can I do that for multiple images, e.g. within a for loop? Is it possible to use llama.cpp for that? If not, how would you recommend I set up this loop?
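For the loop itself, one common pattern is to wrap the model call in a single function and iterate over the image files. Here is a minimal sketch in plain Python; `run_inference` is a hypothetical placeholder for whatever backend you end up using (a llama.cpp binding, a transformers pipeline, etc.), not an actual API of this model:

```python
from pathlib import Path

def run_inference(image_path: Path) -> str:
    # Placeholder: swap in your real model call here (e.g. a
    # llama-cpp-python completion or a transformers pipeline).
    # For illustration it just returns a dummy string.
    return f"caption for {image_path.name}"

def caption_directory(image_dir: str) -> dict[str, str]:
    # Loop over every .jpg in the directory and collect one result
    # per image, keyed by file name.
    results = {}
    for path in sorted(Path(image_dir).glob("*.jpg")):
        results[path.name] = run_inference(path)
    return results
```

Keeping the model loaded once outside the loop (rather than reloading it per image) is the main thing to get right for speed.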