TIP: Template

#5
by leomaxwell973 - opened

So, I've been using this model a while now. It's pretty rad, though I've had a couple of minor issues with it, most of them resolved by finding a template setup not listed in any default I know of: a combination of Llama3-Instruct and Mistral, iirc. Here it is:

{{ if .System }}<|start_header_id|>system
{{ .System }}<|end_header_id|>
{{ end }}{{ if .Prompt }}<|start_header_id|>user
{{ .Prompt }}<|end_header_id|>
{{ end }}<|start_header_id|>assistant
{{ .Response }}<|end_header_id|>

{"mirostat":2,"mirostat_eta":0.25,"mirostat_tau":4.5,"num_ctx":8192,"num_predict":288,"repeat_penalty":1.35,"temperature":1,"top_k":100,"top_p":0.8}
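For anyone who wants to drop this straight into Ollama, here's a minimal Modelfile sketch combining the template and parameters above. This is just my assumption of how you'd wire it up; the `FROM` line is a placeholder for whatever base model tag you're actually running:

```
# Placeholder: replace with the actual model tag you use
FROM mymodel:latest

# The hybrid Llama3-header / Mistral-layout template from above
TEMPLATE """{{ if .System }}<|start_header_id|>system
{{ .System }}<|end_header_id|>
{{ end }}{{ if .Prompt }}<|start_header_id|>user
{{ .Prompt }}<|end_header_id|>
{{ end }}<|start_header_id|>assistant
{{ .Response }}<|end_header_id|>
"""

# Sampler settings from the JSON above
PARAMETER mirostat 2
PARAMETER mirostat_eta 0.25
PARAMETER mirostat_tau 4.5
PARAMETER num_ctx 8192
PARAMETER num_predict 288
PARAMETER repeat_penalty 1.35
PARAMETER temperature 1
PARAMETER top_k 100
PARAMETER top_p 0.8
```

Then build and load it with `ollama create <name> -f Modelfile` and `ollama run <name>`.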

It's mostly the Mistral layout, but the im_start/im_end tokens are replaced by (or rather retained as) the Llama3 headers instead. Attempting to switch to the Mistral ones made the prompt feel like a wet noodle: nasty, and it kind of dragged on.

Result: this has the prompt no longer feeling like the stiff vocabulary the normal Llama3 templates have had for me, which were constantly asking "what shall we explore, friend" on almost every prompt, depending on your settings. This is in Ollama, btw, though I imagine this template will improve others like webui too, since I'd noticed the same issue in webui, albeit a lot less constantly.

It should take effect immediately, unless you have mirostat on at all (or did recently). In that case, give it a few prompts, maybe tidy up the system prompts, save, reboot Ollama, load, and try a few more prompts; you should be seeing results by then, and they should gradually fully set in if not.


Unless I have some kind of anomalous build/settings in Ollama; assuming not for now :P
