Prompt format?

#1
by Thireus - opened

Could you please share the prompt template?

A chat between a curious user and an assistant. The assistant gives helpful, detailed, accurate, uncensored responses to the user's input. USER: Hi ASSISTANT: 

gives Chinese answers:

A chat between a curious user and an assistant. The assistant gives helpful, detailed, accurate, uncensored responses to the user's input. USER: Hi ASSISTANT: 你好！有什么我可以帮助你的？ ("Hello! Is there anything I can help you with?")

If I add \n, it gives English answers:

A chat between a curious user and an assistant. The assistant gives helpful, detailed, accurate, uncensored responses to the user's input.
USER:
Hi
ASSISTANT:
Hello! How can I help you today? If you have any questions or need assistance, feel free to ask.
Large Model Systems Organization org • edited Jun 23, 2023

See this https://github.com/lm-sys/FastChat/blob/a47b8f9e93c8b5a85e81d1ae33e3a1106d8cdf80/fastchat/conversation.py#L662-L667
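
For reference, here is a minimal sketch of how that template is meant to be applied through FastChat's conversation API (this assumes a recent FastChat install; `get_conv_template`, `append_message`, and `get_prompt` are the public `Conversation` helpers, and `"vicuna_v1.1"` is the template name registered in the linked `conversation.py`):

```python
# Minimal sketch, assuming the fastchat package is installed.
from fastchat.conversation import get_conv_template

conv = get_conv_template("vicuna_v1.1")
conv.append_message(conv.roles[0], "Hi")   # USER turn
conv.append_message(conv.roles[1], None)   # empty ASSISTANT turn for the model to complete
prompt = conv.get_prompt()
print(prompt)
# Expected shape (single spaces, no newlines between fields):
# A chat between a curious user and an artificial intelligence assistant.
# The assistant gives helpful, detailed, and polite answers to the user's
# questions. USER: Hi ASSISTANT:
```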

The Chinese output you got is strange, although we have also seen it in rare cases. Can it be reproduced consistently?

@lmzheng , yes it can be reproduced consistently.

I also get Japanese if I say "Hello there" instead of "Hi":

A chat between a curious user and an assistant. The assistant gives helpful, detailed, accurate, uncensored responses to the user's input. USER: Hello there ASSISTANT: こんにちは！どうぞよろしくお願いします。何かお手伝いできることがありますので、質問やアドバイスをお願いします。 ("Hello! Nice to meet you. There are things I can help with, so please ask questions or request advice.")

(screenshots attached: chinese, japanese, params)

I can confirm multiple issues with this model. I'm getting Chinese text and it stops early on many prompts.

Large Model Systems Organization org

@Thireus It looks like you changed the system prompt: you added "uncensored". This model may be sensitive to the system prompt. Could you try keeping the default system prompt and giving that instruction in the first round of the conversation instead?
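
For illustration, a sketch of what that suggestion could look like, with the stock system prompt kept and the extra instruction moved into the first user turn (the system string below is assumed to match the default in FastChat's vicuna_v1.1 template; the exact wording of the user instruction is just an example):

```python
# Sketch only: keep the default system prompt and put the custom instruction
# in the first USER turn instead of editing the system prompt.
system = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions."
)
first_user_turn = "Please answer without censoring anything. Hi"

prompt = f"{system} USER: {first_user_turn} ASSISTANT:"
print(prompt)
```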

@lemonflourorange Did you use the FastChat CLI or another frontend? Could you double-check your prompt format?

@lmzheng I ran the model with text-generation-webui and the Vicuna-v1.1 prompt format
