Few-shot prompting best practices/examples

#34
by bharven - opened

Hey @sam-mosaic!

I'm looking for some guidance on few-shot prompting with the MPT-7B models. Is there a specific prompt format your team has found works well? Some other models (Jurassic-2, etc.) do well with few-shot prompts where each example is separated by a special separator token; is there a similar separator token here?

Example:

```
do task a on input below:
input: abcd
output: x
##
do task a on input below:
input: abcdef
output: xyz
##
do task a on input below:
input: abcdefghi
output:
```
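
For concreteness, here's roughly how I'm assembling prompts in that format. The `build_prompt` helper, the instruction wording, and the `##` separator are just my own convention from the example above, not anything I've seen documented for MPT:

```python
# Rough sketch: assemble a few-shot prompt with "##" separators,
# mirroring the example format above. The helper name and instruction
# text are my own convention, not an MPT-specific format.
def build_prompt(examples, query, instruction="do task a on input below:", sep="##"):
    blocks = [f"{instruction}\ninput: {inp}\noutput: {out}" for inp, out in examples]
    blocks.append(f"{instruction}\ninput: {query}\noutput:")
    return f"\n{sep}\n".join(blocks)

prompt = build_prompt(examples=[("abcd", "x"), ("abcdef", "xyz")], query="abcdefghi")
print(prompt)
```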

Additionally, do you recommend using the base model or the instruction-tuned model for few-shot prompting?

For context, I'm trying to use these LLMs to solve some basic NLP tasks and want to ensure consistent and accurate output.
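
In case it's useful context, here's a minimal sketch of how I'm currently running these prompts. The loading path follows the mosaicml/mpt-7b model card (`trust_remote_code=True`); the EleutherAI/gpt-neox-20b tokenizer, greedy decoding, and truncating at the `##` separator are my own assumptions for getting consistent output, not an official recipe:

```python
# Minimal sketch (my own setup, not an official recipe): run a few-shot
# prompt against the base model with greedy decoding for consistent output.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
model = AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-7b",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,  # MPT uses custom model code on the Hub
).to("cuda")

prompt = (
    "do task a on input below:\ninput: abcd\noutput: x\n##\n"
    "do task a on input below:\ninput: abcdef\noutput: xyz\n##\n"
    "do task a on input below:\ninput: abcdefghi\noutput:"
)

inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
with torch.no_grad():
    out = model.generate(
        **inputs,
        max_new_tokens=32,
        do_sample=False,  # greedy decoding for deterministic output
        pad_token_id=tokenizer.eos_token_id,
    )

# Decode only the newly generated tokens and cut at the separator so any
# continuation into a "new example" is discarded.
completion = tokenizer.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
answer = completion.split("##")[0].strip()
print(answer)
```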

Thanks for the guidance!!

@bharven Initial experiments show the base model being slightly better at few-shot prompting than the instruct model. However, we didn't do extensive testing on the prompts. @kartikmosaicml, can you explain what format we used?
