Adding chat template to test the model and doing inference

#4
by mlinmg - opened

Hi, first of all, amazing work.
I've recently made a 55B Yi model based on Tess, and to test it on the Open LLM Leaderboard I needed a chat template. Since I used your prompt format, I figured you could also benefit from it :)
The jinja script is:

```jinja
{% for message in messages %}
{% if message.role == "system" %}
SYSTEM: {{ message.content }}
{% elif message.role == "user" %}
USER: {{ message.content }}
{% elif message.role == "assistant" %}
ASSISTANT: {{ message.content }}
{% endif %}
{% endfor %}
```
You assign it like this: `tokenizer.chat_template = """[jinja script]"""`, and then save the tokenizer (e.g. with `tokenizer.save_pretrained(...)`) so the template is written into `tokenizer_config.json`.
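To preview what the template produces before attaching it to a tokenizer, you can render it standalone with Jinja2 (a sketch; the message list and contents here are made up for illustration, and `tokenizer.apply_chat_template` uses the same templating engine with the same `messages` structure):

```python
from jinja2 import Template

# The chat template from the post, as a single string. Jinja2's
# attribute lookup (message.role) falls back to dict item lookup,
# so plain dicts work here just like in apply_chat_template.
CHAT_TEMPLATE = (
    "{% for message in messages %}"
    '{% if message.role == "system" %}'
    "SYSTEM: {{ message.content }}\n"
    '{% elif message.role == "user" %}'
    "USER: {{ message.content }}\n"
    '{% elif message.role == "assistant" %}'
    "ASSISTANT: {{ message.content }}\n"
    "{% endif %}"
    "{% endfor %}"
)

# Hypothetical conversation just to exercise all three branches.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi there."},
]

rendered = Template(CHAT_TEMPLATE).render(messages=messages)
print(rendered)
# SYSTEM: You are a helpful assistant.
# USER: Hello!
# ASSISTANT: Hi there.
```

Once the output looks right, the same string can be assigned to `tokenizer.chat_template` and saved.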

Thanks man, can you share your model link? I’ll give it a go!

Sure man! https://huggingface.co/mlinmg/SG-Raccoon-Yi-55B-200k, thanks.
Currently I’m having repetition issues though; I’m actively trying to fix those.
