The chat template doesn't support a system prompt

#114
by sam-kap - opened

https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1/blob/main/tokenizer_config.json#L42

This prevents open source deployments from including a system prompt and could be fixed.

#115 should fix this if merged

@ArthurZ Yup #115 is me from my work account :D

@sam-kap and @ArthurZ even if the template is corrected, will the model actually interpret whatever is written inside the SYS tags as part of the system prompt?

I would assume so, since Mistral uses the Llama template and supports system prompts in their API calls

I've seen mention that it doesn't follow the Llama chat format, and instead uses [INST] system prompt [/INST]</s> - but it would be nice to get actual clarification from Mistral on this point.

I found this on an older version of the docs site: https://web.archive.org/web/20231030013339/https://docs.mistral.ai/usage/guardrailing/#appendix, it seems the `<<SYS>>` tokens are not supported
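If that archived guardrailing doc is right, a deployment could work around the missing system-prompt support by folding the system prompt into the first user turn rather than using Llama-style `<<SYS>>` tags. A minimal sketch (the `build_prompt` helper and the exact spacing are my own assumptions, not an official Mistral format):

```python
def build_prompt(system_prompt, messages):
    """Sketch only: fold a system prompt into the first [INST] block.

    messages: list of {"role": "user" | "assistant", "content": str}.
    Assumes the Mixtral-style format: <s>[INST] ... [/INST]answer</s>
    """
    prompt = "<s>"
    first_user = True
    for msg in messages:
        if msg["role"] == "user":
            content = msg["content"]
            if first_user and system_prompt:
                # Prepend the system prompt to the first instruction only,
                # per the archived guardrailing appendix.
                content = f"{system_prompt}\n{content}"
                first_user = False
            prompt += f"[INST] {content} [/INST]"
        else:
            prompt += f"{msg['content']}</s>"
    return prompt
```

Later user turns are left untouched, so the system prompt is sent once per conversation rather than repeated in every instruction.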
