How to run this?

#2
by baktrian - opened

I wanted to test it out. How do I go about running this model?

I can't see any support for it in Ollama yet. Is there a guide I could follow?

Thanks

I tested it with the latest llama.cpp.
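
For anyone else landing here, below is a minimal sketch of loading a GGUF quantization through the llama-cpp-python bindings. The model path, context size, and prompt are placeholders, so point them at whichever GGUF file you actually downloaded.

```python
# Rough sketch: running a GGUF quantization via llama-cpp-python.
# The model path below is a placeholder for the file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="./model-Q4_K_M.gguf",  # placeholder: your local GGUF file
    n_ctx=4096,                        # context window; adjust to your hardware
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello, can you introduce yourself?"}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```

The stock llama.cpp CLI tools (llama-cli / llama-server) work the same way once you have a GGUF file, if you prefer not to use the Python bindings.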
