Chat template error and prompt problems

#4
by epishchik - opened

Hello!

I'm facing issues running this model using the default code from the model card. Neither the pure transformers version nor the pipeline version works for me.

It gives me an error on the following line.

prompt = processor.apply_chat_template(conversation, add_generation_prompt=True)
ValueError: No chat template is set for this processor. Please either set the `chat_template` attribute, or provide a chat template as an argument.
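
For context, conversation here is the default message list from the model card, roughly this structure (the question text is just a placeholder):

```python
# Message structure passed to apply_chat_template, as in the model card's default snippet;
# the question text is a placeholder.
conversation = [
    {
        "role": "user",
        "content": [
            {"type": "image"},
            {"type": "text", "text": "What is shown in this image?"},
        ],
    },
]
```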

I also tried the default LLaVA prompt format mentioned in this guide, "USER: <image>\nPROMPT\nASSISTANT:", but unlike with the "llava-hf/llava-1.5-7b-hf" model, it doesn't produce any output.
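
For comparison, here is roughly the manual-prompt path that does work with llava-1.5-7b-hf (the image URL and question are just examples):

```python
# Manual-prompt fallback that works with llava-hf/llava-1.5-7b-hf (no chat template needed).
import requests
import torch
from PIL import Image
from transformers import AutoProcessor, LlavaForConditionalGeneration

model_id = "llava-hf/llava-1.5-7b-hf"
model = LlavaForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)
processor = AutoProcessor.from_pretrained(model_id)

# Example image and question.
image = Image.open(
    requests.get("http://images.cocodataset.org/val2017/000000039769.jpg", stream=True).raw
)
prompt = "USER: <image>\nWhat is shown in this image?\nASSISTANT:"

inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=100)
print(processor.decode(output[0], skip_special_tokens=True))
```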

Can you help me with this issue?

Llava Hugging Face org

Hey @epishchik! Chat templates were introduced recently. Update your transformers version to at least 4.43 so that the chat template is loaded from the Hub :)
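
A rough sketch of the flow once transformers is up to date (the checkpoint id and question below are placeholders, not necessarily this exact model):

```python
# With transformers >= 4.43 the processor picks up the chat template stored on the Hub,
# so apply_chat_template no longer raises. The checkpoint id below is a placeholder.
import requests
from PIL import Image
from transformers import AutoProcessor

processor = AutoProcessor.from_pretrained("llava-hf/llava-1.5-7b-hf")

conversation = [
    {
        "role": "user",
        "content": [
            {"type": "image"},
            {"type": "text", "text": "What is shown in this image?"},
        ],
    },
]
prompt = processor.apply_chat_template(conversation, add_generation_prompt=True)

# The expanded prompt is then passed to the processor together with the image.
image = Image.open(
    requests.get("http://images.cocodataset.org/val2017/000000039769.jpg", stream=True).raw
)
inputs = processor(images=image, text=prompt, return_tensors="pt")
```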

epishchik changed discussion status to closed
