Add default chat template to tokenizer_config.json
[Automated] This PR adds the default chat template to the tokenizer config, allowing the model to be used with the new conversational widget (see PR).
If the default is not appropriate for your model, please set tokenizer.chat_template
to an appropriate template. See https://huggingface.co/docs/transformers/main/chat_templating for more information.
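If you do want to attach a template, a minimal sketch of setting and saving one could look like the following (the repository id and the toy template string are placeholders, not part of this PR):

```python
from transformers import AutoTokenizer

# Placeholder repository id; substitute the actual model repo.
tokenizer = AutoTokenizer.from_pretrained("your-username/your-model")

# Assign a Jinja chat template. This toy template simply concatenates the
# message contents, separating turns with the tokenizer's EOS token.
tokenizer.chat_template = (
    "{% for message in messages %}"
    "{{ message['content'] }}{{ eos_token }}"
    "{% endfor %}"
)

# save_pretrained writes the template into tokenizer_config.json.
tokenizer.save_pretrained("path/to/local/checkpoint")
```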
Hi Xenova,
Thank you very much for your Pull Request and for your interest in improving my model "Conversational_Spanish_GPT". I have carefully reviewed your suggestion to add a "chat_template" to the "tokenizer_config.json" file.
While I appreciate your time and effort, I would like to inform you that my model is single-turn, meaning it was not trained to maintain conversational context. For this reason, the "chat_template" you propose would not be suitable: using it could produce undesired or meaningless results, since the model cannot interpret conversation history.
I have considered alternatives, such as leaving out the "chat_template" entirely or creating a custom one for single-turn models. However, at this time, I believe the best option for my model is not to use a "chat_template" at all.
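For reference, my understanding is that a custom single-turn template would roughly amount to something like the sketch below, which keeps only the latest message and discards earlier turns (illustrative only, with a placeholder repository id; not something I plan to ship):

```python
from transformers import AutoTokenizer

# Placeholder repository id; illustrative only.
tokenizer = AutoTokenizer.from_pretrained("your-username/your-model")

# A single-turn template: render only the final message and ignore any
# earlier turns, since the model cannot make use of them.
tokenizer.chat_template = "{{ messages[-1]['content'] }}{{ eos_token }}"

# Both calls produce the same prompt text, because earlier turns are dropped.
print(tokenizer.apply_chat_template(
    [{"role": "user", "content": "Hola, ¿cómo estás?"}], tokenize=False
))
print(tokenizer.apply_chat_template(
    [
        {"role": "user", "content": "turno anterior (se descarta)"},
        {"role": "user", "content": "Hola, ¿cómo estás?"},
    ],
    tokenize=False,
))
```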
I appreciate your understanding and cooperation. If you have any other suggestions or comments, please do not hesitate to let me know.