Chat template difference with 32B
#2 by nbroad - opened
Hi @LG-AI-EXAONE,
I noticed that the tokenizer chat template for 2.4B is not the same as for 32B. Was this intentional or should they have the same chat template?
Hello, @nbroad !
We checked the tokenizer configuration and found that it uses '\\n' instead of '\n', which was not intentional. However, we've tested both versions of the chat template and haven't observed any difference in behavior.
I believe you can either update the chat template to match the 32B version or leave it unchanged - both options should work fine.
For consistency and to avoid confusion, we have updated the chat template of 2.4B to align with our 32B model and other models.
This update does not change the model's behavior or functionality.
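For anyone who wants to confirm this locally, here is a minimal sketch (the repo IDs below are assumed, not quoted from this thread) that renders the same conversation with both tokenizers and checks that the prompts now match:

```python
from transformers import AutoTokenizer

messages = [
    {"role": "user", "content": "Hello, how are you?"},
]

# Assumed repository IDs for the 2.4B and 32B instruct models
small = AutoTokenizer.from_pretrained("LGAI-EXAONE/EXAONE-3.5-2.4B-Instruct")
large = AutoTokenizer.from_pretrained("LGAI-EXAONE/EXAONE-3.5-32B-Instruct")

# Render the chat template as plain text, including the generation prompt
prompt_small = small.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
prompt_large = large.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Expected to print True after the 2.4B template update
print(prompt_small == prompt_large)
```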
Great! Thank you 🙏
nbroad changed discussion status to closed