What is the prompt template for it in multi-turn conversation?

#1
by tridungduong16 - opened

Can you provide an example of the prompt template for multi-turn conversations with this model?

tridungduong16 changed discussion title from What is the prompt template for it? to What is the prompt template for it in multi-turn conversation?
OpenOrca org

We've only tested with the template from the OpenOrcaxOpenChat-Preview2-13B base model.

# Multi-turn V1 Llama 2
tokenize("User: Hello<|end_of_turn|>Assistant: Hi<|end_of_turn|>User: How are you today?<|end_of_turn|>Assistant:")
# Result: [1, 4911, 29901, 15043, 32000, 4007, 22137, 29901, 6324, 32000, 4911, 29901, 1128, 526, 366, 9826, 29973, 32000, 4007, 22137, 29901]
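For reference, a minimal sketch of reproducing that call with the Hugging Face tokenizer. The repo id below assumes the base model named above; swap in whichever checkpoint you are actually serving.

from transformers import AutoTokenizer

# Assumption: the tokenizer of the base model mentioned above;
# replace the repo id with the checkpoint you are actually running.
tokenizer = AutoTokenizer.from_pretrained("Open-Orca/OpenOrcaxOpenChat-Preview2-13B")

prompt = (
    "User: Hello<|end_of_turn|>"
    "Assistant: Hi<|end_of_turn|>"
    "User: How are you today?<|end_of_turn|>"
    "Assistant:"
)

# encode() prepends the BOS token (id 1); <|end_of_turn|> is an added
# special token in this tokenizer (id 32000 in the output above).
print(tokenizer.encode(prompt))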

The system prompt is simply prepended to the first message, with a space before "User: ":

"You are OrcaPlaty, an LLM trained by Alignment Lab AI and garage-bAInd. Write out your thinking step by step before coming to a conclusion to be sure you get the right answer! User: Hello there<|end_of_turn|>Assistant: Hi, nice to meet you.<|end_of_turn|>User: What's new?<|end_of_turn|>Assistant: "
