Does this also use ChatML prompt template?

#1
by jukofyork - opened

Should this model use the same ChatML prompt template as the smaller internlm2-math-plus-20b and internlm2-math-plus-7b models, e.g.:

  "chat_template": "{{ bos_token }}{% for message in messages %}{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n'}}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}"

The internlm2-math-plus-mixtral8x22b model doesn't have any chat_template defined in its tokenizer_config.json:

https://huggingface.co/internlm/internlm2-math-plus-mixtral8x22b/blob/main/tokenizer_config.json
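If the template really is missing from that config, one untested workaround sketch (assuming the standard transformers tokenizer API, and assuming this model is meant to use the same ChatML format as the smaller ones) would be to set `tokenizer.chat_template` manually before formatting prompts:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("internlm/internlm2-math-plus-mixtral8x22b")

# Copy the ChatML template from the smaller internlm2-math-plus models,
# since this tokenizer_config.json ships no chat_template of its own.
tokenizer.chat_template = (
    "{{ bos_token }}{% for message in messages %}"
    "{{ '<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n' }}"
    "{% endfor %}"
    "{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}"
)

prompt = tokenizer.apply_chat_template(
    [{"role": "user", "content": "What is the derivative of x^3?"}],
    tokenize=False,
    add_generation_prompt=True,
)
print(prompt)
```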

InternLM org

Yes, they all use the same template.
