The model can be started with vLLM, but chat does not work.

#2 opened by SongXiaoMao

{
"object": "error",
"message": "Cannot use apply_chat_template() because tokenizer.chat_template is not set and no template argument was passed! For information about writing templates and setting the tokenizer.chat_template attribute, please see the documentation at https://huggingface.co/docs/transformers/main/en/chat_templating",
"type": "BadRequestError",
"param": null,
"code": 400
}

The tokenizer files provided by Mistral do not include a chat template. Please download the corresponding Jinja template file and specify it explicitly, as in the sketch below.
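For reference, here is a minimal sketch of one way to do that: load the downloaded Jinja template and attach it to the tokenizer so that apply_chat_template() no longer raises. The file and model paths are placeholders, not taken from this thread; alternatively, recent vLLM versions let the OpenAI-compatible server take the template directly via a --chat-template option.

```python
# Minimal sketch, assuming the chat template was downloaded locally as
# chat_template.jinja and MODEL_DIR points at the local Mistral checkpoint
# (both paths are hypothetical placeholders).
from transformers import AutoTokenizer

MODEL_DIR = "./mistral-model"          # hypothetical local checkpoint path
TEMPLATE_PATH = "chat_template.jinja"  # hypothetical downloaded template

tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)

# Attach the Jinja template so apply_chat_template() works.
with open(TEMPLATE_PATH) as f:
    tokenizer.chat_template = f.read()

# save_pretrained() writes the template back into tokenizer_config.json,
# so a vLLM server started from MODEL_DIR afterwards can serve
# /v1/chat/completions without the BadRequestError above.
tokenizer.save_pretrained(MODEL_DIR)

# Quick check that the template now renders.
messages = [{"role": "user", "content": "Hello"}]
print(tokenizer.apply_chat_template(messages, tokenize=False))
```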

It's already available, thanks.
