falcon-7b-instruct is answering out of context

#66
by kvmukilan

I have trained an LLM on my PDF file, and I am now asking questions about it. When a question is asked that is out of context, I want the answer to be "I don't know" or "out of context".

Right now it answers even when the question is out of context.

I have used the following embedding models:

  1. sentence-transformers/all-mpnet-base-v2
  2. hkunlp/instructor-xl

and tried the following LLMs:

  1. lmsys/fastchat-t5-3b-v1.0
  2. falcon-7b-instruct
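
For reference, this is roughly how such an embedding model is typically loaded in LangChain (a sketch assuming the langchain and sentence-transformers packages; the variable name is mine):

    from langchain.embeddings import HuggingFaceEmbeddings

    # Wraps the sentence-transformers model used to embed the PDF chunks.
    embeddings = HuggingFaceEmbeddings(
        model_name="sentence-transformers/all-mpnet-base-v2"
    )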

Here is the prompt template:

    from langchain.prompts import PromptTemplate

    question_t5_template = """context: {context}
    question: {question}
    answer: """

    QUESTION_T5_PROMPT = PromptTemplate(
        template=question_t5_template, input_variables=["context", "question"]
    )
    qa.combine_documents_chain.llm_chain.prompt = QUESTION_T5_PROMPT
    qa.combine_documents_chain.verbose = True
    qa.return_source_documents = True
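
The template above never tells the model what to do when the answer is not in the context. One common fix is to state that rule in the template itself. A minimal sketch, assuming the same PromptTemplate API; the instruction wording and the strict_template name are my own:

    from langchain.prompts import PromptTemplate

    # Hypothetical variant of the template above; the refusal instruction
    # is an assumption, not something from the original post.
    strict_template = """Answer the question using only the context below.
    If the context does not contain the answer, reply exactly: I don't know.

    context: {context}
    question: {question}
    answer: """

    STRICT_PROMPT = PromptTemplate(
        template=strict_template, input_variables=["context", "question"]
    )
    qa.combine_documents_chain.llm_chain.prompt = STRICT_PROMPT

Instruction-tuned models such as falcon-7b-instruct follow this kind of rule only imperfectly, so the wording usually needs some trial and error.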

Here is the function that runs the query:

    def answer_query(self, question: str) -> str:
        """
        Answer the question.
        """
        # The chain returns a dict with "result" and "source_documents".
        answer_dict = self.qa({"query": question})
        print(answer_dict)
        answer = answer_dict["result"]
        return answer
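
Since the chain already sets return_source_documents = True, another option is to gate the answer on retrieval quality. A rough sketch, assuming the vector store is reachable as self.vectorstore and supports similarity_search_with_score; the attribute name and the threshold are assumptions, and score semantics differ between stores:

    def answer_query_guarded(self, question: str, max_distance: float = 0.7) -> str:
        """Refuse to answer when even the best retrieved chunk is a poor match."""
        # Hypothetical: self.vectorstore is assumed to be the store behind self.qa.
        hits = self.vectorstore.similarity_search_with_score(question, k=4)
        # For distance-based stores (e.g. FAISS) lower scores are better;
        # for similarity-based stores the comparison must be reversed.
        if not hits or hits[0][1] > max_distance:
            return "out of context"
        return self.qa({"query": question})["result"]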

You can access the full code here.
