fix chat template
#6 by ehartford · opened
The chat template should be ChatML.
When I run it as-is I get this:
When I manually add ChatML I get proper output.
When I use Dolphin 8x7b without manually adding ChatML, it's added (probably because of the tokenizer config)
I am guessing (not being an expert in mlx) that what you need to do is add it to tokenizer_config.json:
```json
"chat_template": "{% if not add_generation_prompt is defined %}{% set add_generation_prompt = false %}{% endif %}{% for message in messages %}{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n'}}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}",
```
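For reference, here is a minimal plain-Python sketch of what that Jinja template renders (the function name `apply_chatml` is just for illustration, not part of any library):

```python
def apply_chatml(messages, add_generation_prompt=False):
    """Mimic the ChatML Jinja template above: wrap each message in
    <|im_start|>role ... <|im_end|> markers, optionally opening an
    assistant turn at the end."""
    prompt = ""
    for message in messages:
        prompt += "<|im_start|>" + message["role"] + "\n" + message["content"] + "<|im_end|>" + "\n"
    if add_generation_prompt:
        prompt += "<|im_start|>assistant\n"
    return prompt


messages = [
    {"role": "system", "content": "You are Dolphin."},
    {"role": "user", "content": "Hello!"},
]
print(apply_chatml(messages, add_generation_prompt=True))
# <|im_start|>system
# You are Dolphin.<|im_end|>
# <|im_start|>user
# Hello!<|im_end|>
# <|im_start|>assistant
```

With the template in tokenizer_config.json, `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` should produce the same string automatically, which is why Dolphin 8x7b works without manual prompting.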
I've answered here: https://huggingface.co/mlx-community/dbrx-instruct-4bit/discussions/7#66150cb1ef2015c1b35a9486
The chat template is baked into mlx. I will add it to the Model Card.
eek
changed discussion status to closed