Issues instructing MPT to answer specifically when given a condition

#10
by dsmithcentric - opened

Hey guys, I'm having a bit of an issue logically constraining MPT-30B's responses. A simple scenario: given a question and a context, if the answer is not contained within the context, the model should ask for the question to be rephrased. Smaller and simpler models can do this without issue, but MPT fairly consistently answers even when the context does not contain the answer, rather than following the instruction. Even including that "all previous knowledge should be forgotten" doesn't seem to keep the model on task. Is there a trick to prompting MPT this way? I'd like to progress further and use the model for more complicated deductions, but I haven't had much luck with this first hurdle yet.
Cheers

What prompts work with other models that aren't working here?

Here's a prompt which works in the huggingface MPT-30B-Chat demo, but doesn't work locally with MPT-30B-Instruct (template included here):
```
Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n###Instruction\nThoroughly answer as much of the question as truthfully as possible using only information from the provided context below. Forget everything you knew about the world. The answer MUST ONLY include information contained within the context below. You MUST NOT provide any information unless it is in the context below. If the context below does not include any information to answer the question you must apologize and inform the user that the context does not contain enough relevant information.\n\nContext:\nThe sky is blue. The grass is green. Fire is like water but hotter.\n\nQuestion:\nWhat is glue?\n\nAnswer:\n\n### Response\n
```
I'm using a topP value of 1, topK value of -1 and temperature set to 0, but adjusting them doesn't seem to change much. MPT-30B-Instruct consistently tells me what glue is.
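As an aside, at temperature 0 most decoding stacks collapse to pure greedy decoding, which is why adjusting top-p/top-k changes nothing in that setting. A minimal, illustrative sketch of why (plain Python, not the actual inference code):

```python
import math
import random

def sample_token(logits, temperature=1.0):
    """Pick a token id from raw logits.

    At temperature=0 this degenerates to greedy decoding (argmax), so
    settings like top-p/top-k no longer influence the output at all.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    weights = [math.exp(x - m) for x in scaled]
    return random.choices(range(len(logits)), weights=weights)[0]
```

So with temperature fixed at 0, reruns are deterministic and the only lever left is the prompt itself.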

I think I found the problem, "inform the user" isn't a clear enough instruction. Being more specific there has helped solve the issue.

dsmithcentric changed discussion status to closed

I have noticed that once the prompt starts to get more complex, the answers do come back, which the Chat model doesn't seem as inclined to do. I'll keep experimenting.

dsmithcentric changed discussion status to open

Here's another example, very similar to the last one, but slightly improved. It seems like any little tweak can easily break the attempted constraint on output. I feel like I'm fundamentally misunderstanding how to instruct the model, because Chat seemingly always follows the instructions, while Instruct seems to always just want to answer the question.

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n###Instruction\nUsing only information provided in the context below thoroughly answer as much of the question as possible. The answer MUST ONLY include information contained within the context below. You MUST NOT provide any information unless it is in the context below. If the context below does not include any information to answer the question you must answer by stating that the context does not contain enough information to answer the question.\n\nContext:\nThe sky is blue. The grass is green. Fire is like water but hotter.\n\nQuestion:\nWhat is the sun?\n\nAnswer:\n\n### Response\n
```
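For what it's worth, one way to avoid copy/paste drift between attempts is to build the template programmatically. A sketch assuming that prompt structure (the helper name is illustrative; the instruction text mirrors the prompt quoted above):

```python
PREAMBLE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request."
)

INSTRUCTION = (
    "Using only information provided in the context below thoroughly answer "
    "as much of the question as possible. The answer MUST ONLY include "
    "information contained within the context below. You MUST NOT provide "
    "any information unless it is in the context below. If the context below "
    "does not include any information to answer the question you must answer "
    "by stating that the context does not contain enough information to "
    "answer the question."
)

def build_prompt(context: str, question: str) -> str:
    """Assemble the Instruct-style prompt used in the examples above."""
    return (
        f"{PREAMBLE}\n\n###Instruction\n{INSTRUCTION}\n\n"
        f"Context:\n{context}\n\nQuestion:\n{question}\n\n"
        f"Answer:\n\n### Response\n"
    )
```

That keeps each tweak limited to one string, so it's easier to see which change broke the constraint.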

The Chat model saw many more samples, and they were higher quality as well. It is not licensed for commercial use, though, because of OpenAI's terms of service regarding how data from their models may be used. It is entirely possible that Chat saw examples of this task (refusing when the retrieved data in the prompt doesn't support answering the question) and Instruct did not.

If fine-tuning isn't an option, try few-shot prompting (give it multiple worked examples), with some of the examples being refusals:

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n###Instruction\nUsing only information provided in the context below thoroughly answer as much of the question as possible. The answer MUST ONLY include information contained within the context below. You MUST NOT provide any information unless it is in the context below. If the context below does not include any information to answer the question you must answer by stating that the context does not contain enough information to answer the question.\n\nContext:\nThe sky is blue. The grass is green. Fire is like water but hotter.\n\nQuestion:\nWhat is the sun?\n\nAnswer:\nThe context does not contain enough information to answer the question.\n\nContext:\nSOME OTHER CONTEXT.\n\nQuestion:\nANOTHER EXAMPLE QUESTION?\n\nAnswer:\nTHIS ANSWER SHOULD BE SUPPORTED BY THE CONTEXT.\n\nContext:\n{Context}.\n\nQuestion:\n{Question}\n\nAnswer:\n\n### Response\n
```

So that is a 2-shot prompt. If you have the space, doing 2-4 should help a lot.
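A sketch of assembling that kind of n-shot prompt programmatically (the function and variable names are illustrative; the refusal wording matches the example above):

```python
REFUSAL = "The context does not contain enough information to answer the question."

def build_few_shot_prompt(preamble, instruction, shots, context, question):
    """Assemble an n-shot prompt: worked examples (including at least one
    refusal) come first, then the real context/question with a blank answer.

    `shots` is a list of (context, question, answer) tuples.
    """
    blocks = [
        f"Context:\n{c}\n\nQuestion:\n{q}\n\nAnswer:\n{a}"
        for c, q, a in shots
    ]
    blocks.append(f"Context:\n{context}\n\nQuestion:\n{question}\n\nAnswer:\n")
    body = "\n\n".join(blocks)
    return f"{preamble}\n\n###Instruction\n{instruction}\n\n{body}\n\n### Response\n"

# Example: one refusal shot plus one answerable shot, then the real query.
shots = [
    ("The sky is blue.", "What is the sun?", REFUSAL),
    ("Fire is like water but hotter.", "What is fire like?",
     "Fire is like water but hotter."),
]
prompt = build_few_shot_prompt(
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.",
    "Answer using only the context below.",
    shots,
    "The grass is green.",
    "What color is the grass?",
)
```

Keeping the shots in a list also makes it easy to experiment with 2 vs. 4 examples without rewriting the prompt by hand.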
