Can someone point to the chat template used for this model? Would be awesome.

#7 opened by hrishbhdalal

When I use .apply_chat_template, I get this warning: "No chat template is defined for this tokenizer - using the default template for the LlamaTokenizerFast class. If the default is not appropriate for your model, please set tokenizer.chat_template to an appropriate template. See https://huggingface.co/docs/transformers/main/chat_templating for more information."
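
As the warning suggests, you can set tokenizer.chat_template yourself before calling apply_chat_template. Here is a minimal sketch of the mechanics; the Jinja template string below is only a generic placeholder, not the template OpenCodeInterpreter was actually trained with:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("m-a-p/OpenCodeInterpreter-DS-33B")

# Placeholder Jinja template -- replace this with the format the model was
# actually trained on; it is only here to show how the attribute is set.
tokenizer.chat_template = (
    "{% for message in messages %}"
    "{{ '### ' + message['role'] + ':\\n' + message['content'] + '\\n' }}"
    "{% endfor %}"
    "{% if add_generation_prompt %}{{ '### assistant:\\n' }}{% endif %}"
)

messages = [{"role": "user", "content": "Write a Python hello world."}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```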

If you call tokenizer.apply_chat_template and there is no "chat_template" entry in the model's tokenizer_config.json, the default template of the tokenizer class (from the transformers package) is used.
My question is why the chat template is provided in m-a-p/OpenCodeInterpreter-DS-6.7B but not in m-a-p/OpenCodeInterpreter-DS-33B.
And if I apply the 6.7B chat template to 33B, the 33B model returns responses in a weird format, which weakens its accuracy on the HumanEval test.
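
For anyone who wants to reproduce this comparison, here is a rough sketch of reusing the 6.7B template on the 33B tokenizer. Whether that prompt format matches what the 33B model actually expects is exactly the open question here:

```python
from transformers import AutoTokenizer

tok_6_7b = AutoTokenizer.from_pretrained("m-a-p/OpenCodeInterpreter-DS-6.7B")
tok_33b = AutoTokenizer.from_pretrained("m-a-p/OpenCodeInterpreter-DS-33B")

# Copy the 6.7B chat template onto the 33B tokenizer. As noted above, the
# resulting prompt format may not be the one the 33B model was trained on.
tok_33b.chat_template = tok_6_7b.chat_template

messages = [{"role": "user", "content": "Implement binary search in Python."}]
print(tok_33b.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
))
```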

Thanks, I figured it out later, but I was confused since it did not seem to have a system prompt at the time.
