System Prompt

#2
by jphme - opened

As Orca relies on different system prompts, I was wondering why there is no system prompt in your given prompt template?

Is that an error? For the webui you mention a "context"; where should that be placed in the prompt format?

OpenOrca org

You can give the context immediately preceding the first "User: " section. It is analogous to a system prompt.
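For example, a minimal sketch of that layout (placeholder names and strings, not an official template; the <|end_of_turn|> separator matches the example code further down):

# Illustrative sketch only; names and strings are placeholders
context = "You are a helpful assistant."          # plays the role of a system prompt
user_message = "What is the capital of France?"
prompt = context + "\nUser: " + user_message + "<|end_of_turn|>\nAssistant: "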

bleysg changed discussion status to closed

Hi @bleysg, thanks for your answer.
I suppose there should be two line breaks after the context, like in other similar prompt templates?

It would be nice if you could add an example (string and/or tokenization) for this, as you yourself write "The model is heavily conditioned to work using this format only", and I have also experienced degradation just from missing or wrong separators.
Many thanks!

Edit: I found this in the example Gradio Space, so I guess this is the correct format, with 2 line breaks as expected? It would make sense to add that to the model card imho:

# From the Gradio Space: system text, then the turns, each separated by a single newline
messages = BASE_SYSTEM_MESSAGE + system_message.strip() + "\n" + \
    "\n".join(["\n".join(["User: " + item[0] + "<|end_of_turn|>",
                          "Assistant: " + item[1] + "<|end_of_turn|>"])
               for item in history])

Edit 2: Actually this is just one line break and no blank line after the system prompt; I misread the two nested joins... So it is a deviation from the prompt format used by many other models.
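
For concreteness, plugging hypothetical values into the Space code above shows the resulting string (single newline after the system message):

# Hypothetical inputs, just to show the resulting string
BASE_SYSTEM_MESSAGE = ""
system_message = "You are a helpful assistant."
history = [("Hello", "Hi there!")]
messages = BASE_SYSTEM_MESSAGE + system_message.strip() + "\n" + \
    "\n".join(["\n".join(["User: " + item[0] + "<|end_of_turn|>",
                          "Assistant: " + item[1] + "<|end_of_turn|>"])
               for item in history])
# messages == "You are a helpful assistant.\nUser: Hello<|end_of_turn|>\nAssistant: Hi there!<|end_of_turn|>"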

jphme changed discussion status to open
