Update chat template

#2
by CISCai - opened

I know it's a bit of a pain, but could you update the chat template to the latest version now that llama.cpp supports it?

At least you won't have to requantize everything as I made a handy script that lets you create a new GGUF using the updated tokenizer_config.json file, see the details in the PR. :)

PS: You only have to update the first file in a split GGUF.

:O The PR is a bit lacking in details, do you have more info? If not, I'll get to it when I have a chance to dig through the code they changed :)

With the new script you can create a new GGUF after downloading the latest tokenizer_config.json, like this:

python gguf-new-metadata.py input.gguf output.gguf --chat-template-config tokenizer_config.json

Then reupload the new GGUF file.
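For context, the field the script reads from tokenizer_config.json is `chat_template`. Here is a minimal sketch of extracting it, assuming a hypothetical, heavily trimmed config file (real files contain many more keys, and `read_chat_template` is just an illustrative helper, not part of the script):

```python
import json
import os
import tempfile

# Hypothetical minimal tokenizer_config.json content; real files carry
# many additional fields (special tokens, tokenizer class, etc.).
sample_config = {
    "chat_template": (
        "{% for message in messages %}"
        "{{ message['role'] }}: {{ message['content'] }}\n"
        "{% endfor %}"
    )
}

def read_chat_template(path):
    """Return the chat_template string from a tokenizer_config.json file."""
    with open(path, "r", encoding="utf-8") as f:
        config = json.load(f)
    return config.get("chat_template")

# Write the sample config to a temp file and read the template back,
# the same way a metadata-updating script would pick it up.
with tempfile.TemporaryDirectory() as tmp:
    cfg_path = os.path.join(tmp, "tokenizer_config.json")
    with open(cfg_path, "w", encoding="utf-8") as f:
        json.dump(sample_config, f)
    template = read_chat_template(cfg_path)

print(template)
```

The template itself is a Jinja2 string, which is what llama.cpp expects in the `tokenizer.chat_template` GGUF metadata key.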
