Prompt Format?

#1 opened by teknium

How do I format my prompts?

You can use the normal Vicuna 1.1 format:
USER: What is 4x8?
ASSISTANT:
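
For anyone scripting this, here is a minimal Python sketch of how that single-turn prompt could be assembled. The build_vicuna_v11_prompt name is mine, and the exact whitespace between the turns may differ from what your inference frontend expects:

# Minimal sketch: build a single-turn prompt in the Vicuna 1.1 style shown above.
def build_vicuna_v11_prompt(question: str) -> str:
    return f"USER: {question}\nASSISTANT:"

print(build_vicuna_v11_prompt("What is 4x8?"))
# USER: What is 4x8?
# ASSISTANT: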

Something like this works for Vicuna, so it should also work for this model. It was trained with the same script, just a different dataset, as far as I understand.

A chat between a human and an assistant.

### Human:
<Question>
### Assistant:
<Answer>

Keep in mind that this is the Vicuna 1.0 format and doesn't work well with Vicuna 1.1. This model is based on 1.1, so use my format for the best experience.

Interesting. Would you be able to provide some examples where it doesn't work well? I've been using the v1.0 format with v1.1 and haven't noticed issues so far, but I haven't been using it intensively either.

Yeah. The biggest problem is answer quality. You will get noticeably better answers (even if you don't notice it right away) because this format is the one the model was trained on. The other problem I have is that it sometimes switches to the other format and writes the chat by itself without stopping.
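
If your frontend doesn't support stop strings, one workaround is to cut the generated text at the next turn marker. A rough Python sketch (the stop strings are just the role prefixes discussed in this thread):

# Minimal sketch: truncate the output as soon as the model starts a new turn by itself.
def truncate_at_stop(text: str, stops=("USER:", "### Human:")) -> str:
    cut = len(text)
    for stop in stops:
        idx = text.find(stop)
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut].rstrip()

print(truncate_at_stop("32. USER: What is 5x5? ASSISTANT: 25."))
# -> "32."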

Yes, @CyberTimon is correct. His prompt works better than:

### Human:
<Question>
### Assistant:
<Answer>

because with that format, after answering what we asked for, the model asks itself a related question and answers it by itself.

A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions. USER: Hello! ASSISTANT: Hi!</s>USER: How are you? ASSISTANT:

https://github.com/lm-sys/FastChat/blob/a47b8f9e93c8b5a85e81d1ae33e3a1106d8cdf80/fastchat/conversation.py#L662-L667
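
Based on those lines in conversation.py, the vicuna_v1.1 template joins the turns with a single space and appends </s> after each assistant reply. A rough Python sketch of building a multi-turn prompt that way (the build_prompt helper and the history format are my own, not part of FastChat):

SYSTEM = ("A chat between a curious user and an artificial intelligence assistant. "
          "The assistant gives helpful, detailed, and polite answers to the user's questions.")

def build_prompt(history, new_question):
    # history is a list of (user_message, assistant_reply) pairs
    prompt = SYSTEM + " "
    for user_msg, assistant_msg in history:
        prompt += f"USER: {user_msg} ASSISTANT: {assistant_msg}</s>"
    prompt += f"USER: {new_question} ASSISTANT:"
    return prompt

print(build_prompt([("Hello!", "Hi!")], "How are you?"))
# ... USER: Hello! ASSISTANT: Hi!</s>USER: How are you? ASSISTANT: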
