Chat template ignores add_generation_prompt

#5

The current chat_template unconditionally appends <|start_header_id|>assistant<|end_header_id|>\n\n to the end of the prompt, even when add_generation_prompt is set to False. This has major implications for fine-tuning when apply_chat_template is used to format training data.
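The intended behavior can be sketched in plain Python (a hypothetical re-implementation for illustration only; the real template is written in Jinja, and the special tokens shown are Llama 3 style):

```python
def apply_chat_template(messages, add_generation_prompt=False):
    """Illustrative sketch of the expected template logic, not the actual Jinja template."""
    out = "<|begin_of_text|>"
    for m in messages:
        out += (
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
            f"{m['content']}<|eot_id|>"
        )
    # The fix: the assistant header is appended only when explicitly requested,
    # so training examples formatted with add_generation_prompt=False end
    # cleanly after the final message.
    if add_generation_prompt:
        out += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return out

msgs = [{"role": "user", "content": "Hello"}]
print(apply_chat_template(msgs, add_generation_prompt=True).endswith(
    "<|start_header_id|>assistant<|end_header_id|>\n\n"))   # True
print(apply_chat_template(msgs, add_generation_prompt=False).endswith(
    "<|eot_id|>"))                                           # True
```

With the flag off, no dangling assistant header leaks into the training text.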

This PR fixes the template to conform to the examples given in Templates for Chat Models.

Thanks @caleb-artifact , I agree this should be merged.

pcuenq changed pull request status to merged
