Prompt Template?

#2
by amgadhasan - opened

Hi

Thanks for sharing this model.

What is the correct prompt template for chatting with this model?

I tried using tokenizer.apply_chat_template, but it warns that no chat template is bundled with the tokenizer:

```python
from transformers import AutoTokenizer

model = "mlabonne/NeuralDaredevil-7B"
messages = [{"role": "user", "content": "What is Deep Learning?"}]

tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
```

```
No chat template is set for this tokenizer, falling back to a default class-level template. This is very error-prone, because models are often trained with templates different from the class default! Default chat templates are a legacy feature and will be removed in Transformers v4.43, at which point any code depending on them will stop working. We recommend setting a valid chat template before then to ensure that this model continues working without issues.
```

Thanks!

Oops sorry about that! Mistral Instruct is the correct prompt template for this model.
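For reference, Mistral Instruct wraps each user turn in `[INST] ... [/INST]`, with assistant turns closed by `</s>`. A minimal sketch of formatting messages by hand (the helper name is hypothetical; once the chat template is merged into the repo, `tokenizer.apply_chat_template` does this for you):

```python
# Sketch of the Mistral Instruct prompt layout: [INST] ... [/INST] around
# user turns, assistant turns terminated with </s>. Helper name is made up.
def mistral_instruct_prompt(messages):
    """Format a list of {"role", "content"} dicts as a Mistral Instruct prompt."""
    prompt = "<s>"
    for message in messages:
        if message["role"] == "user":
            prompt += f"[INST] {message['content']} [/INST]"
        elif message["role"] == "assistant":
            prompt += f"{message['content']}</s>"
    return prompt

print(mistral_instruct_prompt([{"role": "user", "content": "What is Deep Learning?"}]))
# <s>[INST] What is Deep Learning? [/INST]
```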

Thanks for answering.
I've opened two PRs to fix this.

Thanks a lot @amgadhasan , it's now merged!

amgadhasan changed discussion status to closed