
Hi Guys! What is the prompt format or what do we have to put in modelfile.txt to run with OLLAMA?

#1
by NeevrajKB - opened

Hi Guys! What is the prompt format, or what do we have to put in modelfile.txt to run this with Ollama? I am running it with gemma:2b's prompt format, but it is outputting gibberish. Is it an issue with the .gguf models or a prompt format/template issue? Can you please provide a solution?
Thanks!
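
For anyone hitting the same problem, here is a minimal Modelfile sketch assuming this model follows the standard Gemma instruct template (`<start_of_turn>` / `<end_of_turn>` markers); the .gguf filename below is a placeholder, and whether this particular GGUF actually uses the Gemma template is an assumption, so check the model card:

```
# Minimal sketch of an Ollama Modelfile, assuming the standard Gemma
# instruct template; the .gguf filename is a placeholder.
FROM ./model.Q4_K_M.gguf

TEMPLATE """<start_of_turn>user
{{ if .System }}{{ .System }} {{ end }}{{ .Prompt }}<end_of_turn>
<start_of_turn>model
{{ .Response }}<end_of_turn>
"""

# Stop sequences keep generation from running past the turn markers.
PARAMETER stop "<start_of_turn>"
PARAMETER stop "<end_of_turn>"
```

You would then build and run it with `ollama create my-model -f Modelfile` followed by `ollama run my-model`. If the output is still gibberish with the correct template, the GGUF quantization itself may be the problem rather than the prompt format.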
