What chat template should we use with this model?

I used the Modelfile below with Ollama, but I'm getting {response} with every query.

FROM "~/.cache/huggingface/hub/models--NousResearch--Nous-Hermes-2-Yi-34B-GGUF/snapshots/37c3438c25d73017d0207b35dc84042b86094eb5/Nous-Hermes-2-Yi-34B.Q5_K_M.gguf"

PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"

TEMPLATE """
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
"""

I'm assuming this means there's something wrong with the template and/or parameters I used. Does anyone know why? I assumed the standard ChatML template would work with this model.

Your template is using placeholders that Ollama doesn't substitute. Ollama templates use Go template syntax ({{ .System }}, {{ .Prompt }}), so {system_message} and {prompt} get passed to the model as literal text. I would try this:

TEMPLATE """{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
"""
PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"
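
Once that Modelfile is built into a model (e.g. ollama create nous-hermes-2-yi -f Modelfile), a quick sanity check is a single chat call through the ollama Python client. This is only a sketch; the model name nous-hermes-2-yi is an assumption and should match whatever name you passed to ollama create:

import ollama  # pip install ollama; assumes a local Ollama server is running

# "nous-hermes-2-yi" is an example name -- use the name given to `ollama create`
response = ollama.chat(
    model="nous-hermes-2-yi",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello in one sentence."},
    ],
)

# With a working ChatML template the reply should be plain text, with no stray
# <|im_start|>/<|im_end|> tokens and no unsubstituted placeholders echoed back.
print(response["message"]["content"])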

I am using this for Hermes-2-Pro-Mistral-7B:

FROM ./models/Hermes-2-Pro-Mistral-7B.Q4_K_M.gguf
TEMPLATE """{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
"""
PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"

Chat works in Ollama using the ChatML template and parameters, but I haven't tested this for function calling or structured output yet.
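
If you want a cheap check of structured output before wiring up full function calling, Ollama's JSON mode on the REST API is one option. A minimal sketch, assuming the model was created as hermes-2-pro-mistral (adjust the name) and the server is on the default port:

import json
import requests  # plain HTTP against the local Ollama REST API

payload = {
    "model": "hermes-2-pro-mistral",  # hypothetical name from `ollama create`
    "messages": [
        {"role": "system", "content": "Reply only with a JSON object."},
        {"role": "user", "content": "Give the capital of France as {\"capital\": \"...\"}."},
    ],
    "format": "json",  # ask Ollama to constrain the output to valid JSON
    "stream": False,
}

r = requests.post("http://localhost:11434/api/chat", json=payload, timeout=120)
r.raise_for_status()
content = r.json()["message"]["content"]
print(json.loads(content))  # parses cleanly if JSON mode is respected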

Thanks, up and running now :)
