Does instruct need add_generation_prompt?

#33
by bdambrosio - opened

I see it in the chat template in tokenizer_config.
I also see that, without it, there is no way to build a prompt whose final assistant message doesn't end in <|eot_id|>.
But I haven't personally seen any difference with or without add_generation_prompt set, even with the newer tokenizer_config and generation_config.
??

Hello, I am using this example and it works fine for me: https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct#transformers-automodelforcausallm

However, I don't understand why add_generation_prompt = True excludes the assistant keyword (and a newline) before the model's actual answer. Seeking help to understand.
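It might help to see what the template actually renders. Below is a minimal sketch in plain Python that mimics the Llama-3 chat-template logic (it is not the real tokenizer, just an illustration of the string it produces). The key point: with add_generation_prompt=True the assistant header is appended to the *prompt*, so the model's generation starts right after it, and the header never appears inside the generated text itself.

```python
# Minimal sketch mimicking the Llama-3 chat template's handling of
# add_generation_prompt (an illustration, not the actual tokenizer code).

def apply_chat_template(messages, add_generation_prompt=False):
    text = "<|begin_of_text|>"
    for m in messages:
        # Each message is wrapped in a header block and closed with <|eot_id|>.
        text += (f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
                 f"{m['content']}<|eot_id|>")
    if add_generation_prompt:
        # Opens an assistant turn so the model continues from here.
        # Because this header is part of the prompt (and consists of
        # special tokens), it is not part of the generated answer.
        text += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return text

msgs = [{"role": "user", "content": "Hi"}]
print(apply_chat_template(msgs, add_generation_prompt=True))
```

Without add_generation_prompt the rendered prompt ends at the user's <|eot_id|>, so the model may not realize it should start an assistant turn at all.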
