can this model be used with langchain llamacpp?

#4
by Juanreatrepo77 - opened

First, I want to thank you for your work. Second, can this model be used with LangChain's LlamaCpp? When I tried it, I ran into problems due to tokenization.

MeetKai org

Hi, yes, this model can be used with LangChain through a local OpenAI-compatible server. We recently integrated Functionary v2 into llama-cpp-python's OpenAI-compatible server in v0.2.50. You can find more details about setting it up here and here, in the Function Calling section.
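For reference, here is a sketch of the kind of OpenAI-style function-calling request a client such as LangChain's `ChatOpenAI` (pointed at the local server via `base_url="http://localhost:8000/v1"`) would POST to `/v1/chat/completions`. The `get_weather` tool schema, the model name, and the port are illustrative placeholders, not details from this thread:

```python
import json

# Hypothetical request body for the local OpenAI-compatible server.
# A LangChain ChatOpenAI client configured with
#   base_url="http://localhost:8000/v1"
# would serialize something shaped like this when tools are bound.
payload = {
    "model": "functionary",  # placeholder model name (assumption)
    "messages": [
        {"role": "user", "content": "What is the weather in Hanoi?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # illustrative tool, not from the thread
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {"type": "string", "description": "City name"}
                    },
                    "required": ["city"],
                },
            },
        }
    ],
    "tool_choice": "auto",  # let the model decide whether to call the tool
}

# Serialize to the JSON body that would be sent over HTTP.
body = json.dumps(payload)
```

The server parses the `tools` list into Functionary's prompt format, so the client never needs to know the model's special tokens.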

The tokenization issue is due to llama.cpp's tokenizer being unable to handle newly added tokens. We fixed this in the llama-cpp-python integration by using the HF AutoTokenizer instead.
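As a sketch of that setup (flag names as documented for llama-cpp-python's server; the GGUF path and HF repo ID below are placeholders you would replace with your own):

```shell
# Install the server extras at a version with Functionary v2 support
pip install "llama-cpp-python[server]>=0.2.50"

# Launch the OpenAI-compatible server. Passing
# --hf_pretrained_model_name_or_path makes it tokenize with the HF
# AutoTokenizer rather than llama.cpp's tokenizer, which sidesteps the
# added-token issue described above.
python -m llama_cpp.server \
  --model ./functionary-small-v2.2.q4_0.gguf \
  --chat_format functionary-v2 \
  --hf_pretrained_model_name_or_path meetkai/functionary-small-v2.2
```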
