Serving this with llama.cpp

#1
by qiisziilbash - opened

Hi,

Is there an example of how to use this model, in particular with llama.cpp?

Thanks
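
In case it helps while waiting for an answer: assuming the repository provides a GGUF file (the filename below is a placeholder), llama.cpp's bundled binaries can typically run it like this:

```shell
# One-off generation with llama-cli (model path is a placeholder)
./llama-cli -m ./model.gguf -p "Hello" -n 128

# Or serve an OpenAI-compatible HTTP API with llama-server
./llama-server -m ./model.gguf --port 8080
```

This is a generic sketch, not instructions specific to this model; check the model card for the exact GGUF filename and any recommended chat template or sampling settings.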
