Any success with in-context question answering?

#34
by beejay - opened

Has anyone been able to successfully use any of the Falcon models to build a question-answering system where the answer must come from a given document? If the document contains no answer, the application should respond with something like "No answer in the document" or "I don't know". I have been trying, rather unsuccessfully: everything I try returns an answer that is clearly not in the document and is essentially what the model saw in its training data. If anyone has had success, it would be great if you could share the code that achieves it. My requirement is to run this on my own GPU, without relying on a hosted inference API such as the HF API.

I am facing the same issue. The workaround I have is a cosine-similarity cutoff between the query and the available snippets: when an out-of-context question is asked, the cosine similarity is low, so I don't even trigger the LLM.
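A minimal sketch of that gating idea, assuming you have some way to embed text. Here a toy bag-of-words vector stands in for a real sentence-embedding model (in practice you would use something like sentence-transformers), and the LLM call is stubbed out; the `answer` function, the 0.3 threshold, and the snippet texts are all illustrative assumptions, not from the thread:

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding". In a real system, replace this with
    # a proper sentence-embedding model; this only illustrates the gate.
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def answer(query, snippets, threshold=0.3):
    # Find the snippet most similar to the query.
    q = embed(query)
    best_snippet, best_score = None, 0.0
    for s in snippets:
        score = cosine(q, embed(s))
        if score > best_score:
            best_snippet, best_score = s, score
    # Gate: if nothing is similar enough, refuse without calling the LLM.
    if best_score < threshold:
        return "No answer in the document."
    # Otherwise this is where you would call the LLM with the snippet
    # as context; stubbed here for illustration.
    return f"[LLM called with context: {best_snippet!r}]"

snippets = [
    "Falcon is a family of large language models.",
    "The capital of France is Paris.",
]
print(answer("What is Falcon?", snippets))          # triggers the LLM
print(answer("How tall is Mount Everest?", snippets))  # refused at the gate
```

The threshold needs tuning for whatever embedding model you actually use; with real sentence embeddings the useful cutoff will be quite different from this toy example.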
