Incorrect system prompt in `tokenizer.chat_template`?

#45
by kaizau

I'm in the process of adding support for OpenChat's chat template to llama.cpp, but noticed a conflict between the official Jinja template in `tokenizer.chat_template` and this comment, which states that the system prompt should be unprefixed:

```
You are a helpful assistant.<|end_of_turn|>GPT4 Correct User: Hello<|end_of_turn|>GPT4 Correct Assistant:
```

Running the model's included template, however, produces a prefixed system message:

```
GPT4 Correct System: You are a helpful assistant.<|end_of_turn|>GPT4 Correct User: Hello<|end_of_turn|>GPT4 Correct Assistant:
```
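For reference, this is a minimal sketch of how I'm rendering the bundled template with `transformers` (the `openchat/openchat_3.5` repo id is my assumption; substitute whichever checkpoint you're using):

```python
# Sketch: render the bundled chat template as a plain string.
# Assumes transformers >= 4.34 (apply_chat_template); the repo id below is an assumption.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("openchat/openchat_3.5")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello"},
]

# Render to text instead of token ids, appending the assistant prefix for generation.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
# With the template shipped in the tokenizer config this prints the prefixed form above
# (possibly preceded by a BOS token, depending on the template).
```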

Can I get confirmation that the unprefixed form is correct?
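For comparison, a template that leaves the system turn unprefixed might look something like the sketch below. This is only my own illustration of the behaviour the comment describes, not the official template; I pass it explicitly via the `chat_template` argument so it overrides the bundled one:

```python
# Sketch only: a hypothetical Jinja template that leaves the system message unprefixed.
# This is my illustration of the comment's behaviour, not the official template.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("openchat/openchat_3.5")  # repo id assumed

unprefixed_template = (
    "{% for message in messages %}"
    "{% if message['role'] == 'system' %}"
    "{{ message['content'] + '<|end_of_turn|>' }}"
    "{% else %}"
    "{{ 'GPT4 Correct ' + message['role'].title() + ': ' + message['content'] + '<|end_of_turn|>' }}"
    "{% endif %}"
    "{% endfor %}"
    "{% if add_generation_prompt %}{{ 'GPT4 Correct Assistant:' }}{% endif %}"
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello"},
]

prompt = tokenizer.apply_chat_template(
    messages, chat_template=unprefixed_template, tokenize=False, add_generation_prompt=True
)
print(prompt)
# You are a helpful assistant.<|end_of_turn|>GPT4 Correct User: Hello<|end_of_turn|>GPT4 Correct Assistant:
```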
