Any hints on prompts to reduce or stop hallucinations?

#82
by dnovak232

I am using the model with ChromaDB as a RAG.
In the instructions I tell it: "If the provided additional information is not sufficient to generate a final answer, then reply that an answer cannot be created based on the input provided."
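For concreteness, the instruction sits in a prompt template roughly like the sketch below (the wording and variable names are simplified placeholders, not my exact production prompt):

```python
# Simplified sketch of the prompt template; wording and names are placeholders.
PROMPT_TEMPLATE = """Answer the question using ONLY the additional information below.
If the provided additional information is not sufficient to generate a final answer,
reply that an answer cannot be created based on the input provided.

Additional information:
{context}

Question: {question}
Answer:"""

def build_prompt(context: str, question: str) -> str:
    # Fill the template with the retrieved chunks and the user question.
    return PROMPT_TEMPLATE.format(context=context, question=question)
```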

But when irrelevant information, or no information at all, is provided in the additional-information section, the model still tends to hallucinate and dig out data it was trained on.
Specifically, when I ask about Wi-Fi products that are not present in the RAG results, it hallucinates about Wi-Fi access points and cameras and invents non-existent products, which it then returns as the answer.
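For context, my retrieval step looks roughly like the sketch below (the store path, collection name, and the 1.0 distance cutoff are placeholders I made up for illustration). I am wondering whether gating on the query distance and short-circuiting to the refusal, so the model never sees an empty or irrelevant context, is the right direction:

```python
# Sketch of a retrieval step with a distance gate; the path, collection name,
# and cutoff value are placeholders, not tuned values.
import chromadb

client = chromadb.PersistentClient(path="./chroma")        # assumed local store
collection = client.get_or_create_collection("products")   # placeholder name

def retrieve(question: str, max_distance: float = 1.0) -> list[str]:
    results = collection.query(
        query_texts=[question],
        n_results=5,
        include=["documents", "distances"],
    )
    # Keep only chunks whose distance is below the cutoff (lower = closer
    # with Chroma's default L2 metric); drop everything else.
    return [
        doc
        for doc, dist in zip(results["documents"][0], results["distances"][0])
        if dist <= max_distance
    ]

chunks = retrieve("Which Wi-Fi access points do you sell?")
if not chunks:
    # Short-circuit: answer with the refusal directly instead of letting
    # the model free-generate over an empty context.
    print("An answer cannot be created based on the input provided.")
```

(The useful cutoff depends heavily on the embedding function and corpus, so the 1.0 above is only a starting point, not a recommendation.)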

Hi @dnovak232, that seems a bit challenging. Were you able to solve this issue?
