Update chat template

#1
by CISCai - opened

I know it's a bit of a pain, but could you update the chat template to the latest version now that llama.cpp supports it?

At least you won't have to requantize everything as I made a handy script that lets you create a new GGUF using the updated tokenizer_config.json file, see the details in the PR. :)
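The core of such an update is just copying the Jinja template out of `tokenizer_config.json` into the GGUF metadata key `tokenizer.chat_template`. Here's a minimal sketch of that extraction step, assuming the common config layouts (a plain string, or a list of named templates where we pick the one named `default`); the actual script in the PR may handle more cases:

```python
import json

# Metadata key under which llama.cpp stores the chat template in a GGUF file.
CHAT_TEMPLATE_KEY = "tokenizer.chat_template"

def extract_chat_template(tokenizer_config: dict) -> str:
    """Pull the Jinja chat template out of a parsed tokenizer_config.json.

    Some configs store a single string; newer ones may store a list of
    named templates, in which case we take the "default" entry (a
    simplifying assumption here).
    """
    template = tokenizer_config["chat_template"]
    if isinstance(template, list):
        for entry in template:
            if entry.get("name") == "default":
                return entry["template"]
        raise KeyError("no 'default' chat template found in list")
    return template

# Example: load the config and grab the template to write into the GGUF.
# with open("tokenizer_config.json") as f:
#     template = extract_chat_template(json.load(f))
```

Writing the extracted template back into an existing GGUF without requantizing is what the script from the PR automates (the `gguf` Python package's reader/writer classes can do the metadata rewrite).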

The current template occasionally causes problems, such as the model repeating its answer in a loop.

BTW, Llama3's GGUF files seem to be FIXED 👉🏻 https://huggingface.co/lmstudio-community/Meta-Llama-3-8B-Instruct-BPE-fix-GGUF

LM Studio Community org

Yeah, and Command R has become more broken in the meantime 😅. If they provide any updates that improve it, I'll see about fixing this one.
