Updated with a working chat_template

#4

After the recent README.md changes and some experimentation, I came up with the following chat_template, which I'm currently using in my GGUFs.

Minor fix for #3: the template now only renders the function definitions in the last message, so they don't waste multi-turn context.
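The idea behind that fix can be sketched in plain Python: inject the function definitions into the final user turn only, so earlier turns in a multi-turn conversation don't repeat them. This is an illustrative helper, not the repo's actual chat_template or the official `get_prompt(.)`; the `<<function>>`/role markers are assumptions about the prompt format.

```python
def build_prompt(messages, functions):
    """Hypothetical sketch: render function definitions only in the
    last user message to avoid repeating them every turn."""
    # Index of the final user message in the conversation.
    last_user = max(i for i, m in enumerate(messages) if m["role"] == "user")
    parts = []
    for i, m in enumerate(messages):
        if m["role"] == "user":
            if i == last_user:
                # Only the last user turn carries the function definitions.
                parts.append(f"USER: <<function>>{functions}\n{m['content']}")
            else:
                parts.append(f"USER: {m['content']}")
        else:
            parts.append(f"ASSISTANT: {m['content']}")
    parts.append("ASSISTANT:")  # generation prompt for the next reply
    return "\n".join(parts)
```

In an actual chat_template, the same effect would come from a Jinja condition such as `loop.last` guarding the function block.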

Fixes #2.

Gorilla LLM (UC Berkeley) org

Thanks for contributing! We really appreciate it, and sorry for the late response! We found that different packages use different chat templates, so we haven't updated the official chat_template; instead, we recommend developers use our official get_prompt(.) function as the ground truth to avoid inconsistencies. We'll update the system prompt in the chat_template from Deepseek to Gorilla LLM. Thanks for catching this again!

We'll close this issue. Let us know if you have additional questions or concerns and we'll reopen this thread. Thanks again!

CharlieJi changed pull request status to closed
