Help with the prompt for this model (and probably all other models), please.

#1
by supercharge19 - opened

So the model card (README.md) has the following for the main executable:
./main -ngl 35 -m CatPPT-base-Mistral-7B-Instruct-v0.1-GGUF.Q4_K_M.gguf --color -c 32768 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant"

But I don't understand what to fill in on the server's web UI. There are multiple fields where information can be entered.
[Screenshot: llama.cpp server web UI (llama cpp 1.JPG)]

So, what should be written in (1) prompt, (2) prompt template, (3) chat history template, and (4) grammar for optimal output?
What I understood from the model card is that the following should go in the prompt template, which, as you can see in the image, currently looks like this:
{{prompt}}

{{history}}
{{char}}:

and should be changed to:
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant

And just write anything about the model or how it's going to be used in the first field (prompt). However, I still don't know what to put in the chat history template and grammar fields, or whether to just leave them as they are, for this model and for other models.
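
My best guess so far is that the prompt template wraps the whole conversation, while the chat history template is applied once per message and substituted for {{history}}, so with ChatML the user turn would move into the history template rather than the prompt template. If that reading of the UI is right (it is an assumption on my part, not something from the model card), the two templates would compose roughly like this; Python is only used here to simulate the expansion:

```python
# Assumption: the chat history template uses {{name}} and {{message}} placeholders,
# is applied once per turn, and the concatenated result replaces {{history}}.

PROMPT_TEMPLATE = (
    "<|im_start|>system\n{system_message}<|im_end|>\n"
    "{history}"
    "<|im_start|>assistant\n"
)

# Candidate chat history template: one ChatML block per turn.
HISTORY_TEMPLATE = "<|im_start|>{name}\n{message}<|im_end|>\n"

def build_prompt(system_message, turns):
    history = "".join(
        HISTORY_TEMPLATE.format(name=name, message=message) for name, message in turns
    )
    return PROMPT_TEMPLATE.format(system_message=system_message, history=history)

print(build_prompt(
    "You are a helpful assistant.",
    [("user", "Hello"), ("assistant", "Hi! How can I help?"), ("user", "Tell me a joke.")],
))
```

Is that roughly what those fields expect, and should grammar just be left empty unless I want to constrain the output format?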

I also want to use this with LangChain, so I have started api_like_OAI.py as well, but I don't know how to give prompts to models so that they follow instructions faithfully. I have heard that many models do well, but none have worked for me, and I suspect it is due to prompting issues, especially when they are used with LangChain. So, help please.
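
In case it helps to see what I mean on the LangChain side, this is roughly how I am pointing it at api_like_OAI.py. The port and the older langchain.chat_models import path are assumptions on my part, and I am not sure whether api_like_OAI.py's own prompt formatting also has to be adjusted to match ChatML:

```python
from langchain.chat_models import ChatOpenAI   # older LangChain import path (assumed)
from langchain.schema import HumanMessage, SystemMessage

llm = ChatOpenAI(
    openai_api_base="http://127.0.0.1:8081/v1",  # assumed address of api_like_OAI.py
    openai_api_key="sk-no-key-needed",           # the local proxy should ignore the key
    temperature=0.7,
)

messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="Summarize what a GGUF file is in two sentences."),
]
print(llm(messages).content)
```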

Unfortunately, I am not familiar with this UI. I do use llama.cpp, but simply via the CLI in the terminal for testing or serving.
This supports GGUF models as well: https://lmstudio.ai

No problem, and thank you for returning to answer. I will formulate my question better later.
