Request: chat template with system message

#23
by nadiaOcean - opened

Can we update the tokenizer_config.json to allow a system prompt?
Many agent frameworks use the system prompt for agent definition.

like this (from the Mixtral 8x7B discussion):
"chat_template": "{% if messages[0]['role'] == 'system' %}{% set contains_sys_prompt = 1 %}{% else %}{% set contains_sys_prompt = 0 %}{% endif %}{{ bos_token }}{% for message in messages %}{% if (message['role'] == 'user') != ((loop.index0 + contains_sys_prompt) % 2 == 0) %}{{ raise_exception('Conversation roles must alternate (system/)user/assistant/user/assistant/...') }}{% endif %}{% if message['role'] == 'system' %}{{ '[INST] <>' + message['content'].strip() + '<>' }}{% elif message['role'] == 'user' %}{{ (' ' if contains_sys_prompt == 1 and loop.index0 == 1 else '[INST] ') + message['content'].strip() + ' [/INST] ' }}{% elif message['role'] == 'assistant' %}{{ message['content'].strip() + eos_token}}{% else %}{{ raise_exception('Only system, user and assistant roles are supported!') }}{% endif %}{% endfor %}"

https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1/discussions/115/files
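For anyone who wants to experiment: chat templates like this are plain Jinja2, so the behavior can be checked without loading a tokenizer. Here is a minimal sketch with a simplified illustrative template (it folds an optional system message into the first [INST] block; it is not the exact template proposed above, and `bos_token`/`eos_token` are passed in manually since we render outside the tokenizer):

```python
from jinja2 import Environment

# Simplified, illustrative template: prepend an optional system message
# to the first user turn inside a single [INST] ... [/INST] block.
template_str = (
    "{{ bos_token }}"
    "{% for message in messages %}"
    "{% if message['role'] == 'system' %}"
    "{{ '[INST] ' + message['content'].strip() + '\\n\\n' }}"
    "{% elif message['role'] == 'user' %}"
    "{{ ('' if loop.index0 == 1 and messages[0]['role'] == 'system'"
    " else '[INST] ') + message['content'].strip() + ' [/INST]' }}"
    "{% elif message['role'] == 'assistant' %}"
    "{{ ' ' + message['content'].strip() + eos_token }}"
    "{% endif %}"
    "{% endfor %}"
)

template = Environment().from_string(template_str)
prompt = template.render(
    bos_token="<s>",
    eos_token="</s>",
    messages=[
        {"role": "system", "content": "You are a helpful agent."},
        {"role": "user", "content": "Hello!"},
    ],
)
print(prompt)
# → <s>[INST] You are a helpful agent.
#
#   Hello! [/INST]
```

Rendering the template by hand like this makes it easy to compare candidate templates against whatever mistral-common produces before opening a PR.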

Mistral AI org

mistral-common allows system prompt. If you are using HF tokenizer, could you help make a PR? Or maybe @Jofthomas can help! Thanks!

Mistral models don't have special tokens for system messages, so I should prepend it to the first user message, right?
Is it supported the same way in mistral-common, @sophiamyang?

I have the same question. Is the instruct model trained with a system prompt?

Hi there, if I remember correctly they concatenate the contents. Here is the relevant snippet, after a quick check of the source code of mistral-common:

if is_first and system_prompt:
    content = system_prompt + "\n\n" + message.content
else:
    content = message.content


As you might have guessed, the system prompt is concatenated with the first user message, from my understanding.

My understanding is that we can just adjust the chat template in the tokenizer config. Can we get an update on this?
