Any support for adding system instructions similar as chatGPT?
#79 · opened by timpan
Nice work, thanks!
There is a prompt template in Llama 2 like this:
"""
<s>[INST] <<SYS>>\n{your_system_message}\n<</SYS>>\n\n{user_message_1} [/INST]
"""
Is there any support like this in ChatGLM2-6B? In some scenarios I would like the model to treat a certain part of the query as the highest priority, e.g. for role playing.
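In case it helps clarify what I mean: the workaround I have been trying is to smuggle the instruction in as a fake first turn of `history` (a rough sketch, assuming the usual `model.chat` usage from the README; the pirate instruction is just an illustration):
"""
from transformers import AutoTokenizer, AutoModel

# Model id and chat() usage follow the ChatGLM2-6B README.
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm2-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm2-6b", trust_remote_code=True).half().cuda()
model = model.eval()

# There is no dedicated system slot, so pretend the instruction was an
# earlier user turn that the model already acknowledged.
system_message = "You are a pirate captain. Always stay in character."
history = [(system_message, "Understood, I will follow that instruction.")]

response, history = model.chat(tokenizer, "Introduce yourself.", history=history)
print(response)
"""
But this feels fragile, so a built-in system-message mechanism would be much better.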
Thx.