Can anyone help me with a prompt template for a question-answering model?

#54
by Iamexperimenting - opened

Hi Team,

I'm currently building a question-answering model using an open-source LLM, and I'm using the template below to generate answers:

"""
{context}\n\n{question}
"""

This template only gives me a one-word answer.

Example
Question: What is the maximum number of kilometers Andrew covered during his cycling practice?
Answer: 28 kilometers

Is it possible to generate answers like a conversational AI (a human-like reply)?
Example
Question: What is the maximum number of kilometers Andrew covered during his cycling practice?
Answer: The maximum distance covered by Andrew is 28 kilometers.

Can anyone please help me here?
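In case it helps while you wait for answers: one common trick is to add an explicit instruction to the template asking for a full-sentence reply, and to make sure the generation length is not capped too low. A minimal sketch with the transformers pipeline (the model name google/flan-t5-large, the instruction wording, and the sample context are all assumptions for illustration, not from your setup):

```python
from transformers import pipeline

# Illustrative model choice; substitute the open-source LLM you are actually using.
qa = pipeline("text2text-generation", model="google/flan-t5-large")

context = "Andrew cycled 12 km on Monday, 28 km on Wednesday and 19 km on Friday."
question = "What is the maximum number of kilometers Andrew covered during his cycling practice?"

# Instead of the bare "{context}\n\n{question}" template, spell out the style of answer you want.
prompt = (
    "Answer the question using only the context below. "
    "Reply in a complete, conversational sentence.\n\n"
    f"Context: {context}\n\n"
    f"Question: {question}\n\n"
    "Answer:"
)

result = qa(prompt, max_new_tokens=64)
print(result[0]["generated_text"])
```

With an instruction-tuned model, that extra sentence in the prompt is often enough to turn "28 kilometers" into a full-sentence reply; models that are not instruction-tuned tend to ignore it.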

If you are willing to share a bigger part of the code, I can take a look.

I am doing a completely different task but only getting a single token of generated text as well, which strikes me as odd.
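For what it's worth, a single-token output is often a generation-length issue rather than a prompting one: if max_new_tokens (or max_length) is left at its default, the model may stop almost immediately. A rough sketch of where I would check first (the model name and prompt are placeholders):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder model; swap in whatever checkpoint you are actually loading.
model_name = "google/flan-t5-large"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

prompt = "Context: Andrew cycled 28 km on Wednesday.\n\nQuestion: How far did Andrew cycle?\n\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")

# Explicitly allow a longer completion; the default length limit (max_length=20)
# can otherwise cut the reply short.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```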
