The chat template doesn't support a system prompt
#114
by sam-kap, opened
https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1/blob/main/tokenizer_config.json#L42
This prevents open-source deployments from passing a system prompt and could be fixed in the chat template.
I would assume so, since Mistral uses the Llama template and supports system prompts in their API calls.
I've seen mention that it doesn't follow the Llama chat format, and is instead `[INST] system prompt [/INST]</s>`, but it would be nice to get actual clarification from Mistral on this point.
I found this on an older version of the docs site: https://web.archive.org/web/20231030013339/https://docs.mistral.ai/usage/guardrailing/#appendix. It seems the `<<SYS>>`/`<</SYS>>` tokens are not supported.
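Until Mistral clarifies the official template, a common workaround is to fold the system prompt into the first user turn rather than using Llama-style `<<SYS>>` tokens. The sketch below is a minimal, unofficial rendering function based on the `[INST] ... [/INST]` format discussed in this thread; the function name and the choice to join system and user text with a blank line are assumptions, not confirmed Mistral behavior.

```python
def build_prompt(messages, bos="<s>", eos="</s>"):
    """Render chat messages into Mixtral-style [INST] ... [/INST] text.

    Workaround sketch (unofficial): since the template has no system slot,
    any system message is prepended to the first user message. The
    blank-line separator between system and user text is an assumption.
    """
    system = ""
    turns = []
    for m in messages:
        if m["role"] == "system":
            system = m["content"]
        else:
            turns.append(m)

    out = bos
    first_user = True
    for m in turns:
        if m["role"] == "user":
            content = m["content"]
            if system and first_user:
                content = f"{system}\n\n{content}"
                first_user = False
            out += f"[INST] {content} [/INST]"
        else:  # assistant turn, closed with the EOS token
            out += f" {m['content']}{eos}"
    return out


msgs = [
    {"role": "system", "content": "Answer briefly."},
    {"role": "user", "content": "What is the capital of France?"},
]
print(build_prompt(msgs))
# <s>[INST] Answer briefly.

# What is the capital of France? [/INST]
```

The same folding could be expressed directly in the Jinja `chat_template` field of `tokenizer_config.json`, which is how most open-source servers would consume it.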