**Chatrag-Deberta** is a small, lightweight classifier that predicts whether a question should trigger retrieval of additional information with RAG or can be answered directly. Chatrag-Deberta is based on DeBERTa-v3, a 304M-parameter encoder-only model. Its initial version was fine-tuned on 20,000 example questions annotated by Mistral 7B.
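
Below is a minimal usage sketch, assuming the model is distributed as a standard Hugging Face sequence-classification checkpoint; the repo id and the meaning of the label indices are placeholders, so check the checkpoint's `id2label` mapping before relying on them.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical repo id -- replace with the actual Chatrag-Deberta checkpoint.
model_id = "your-org/chatrag-deberta"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

question = "Who won the 2022 FIFA World Cup?"
inputs = tokenizer(question, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Assumption: index 1 means "retrieve with RAG", index 0 means "answer directly".
needs_retrieval = logits.argmax(dim=-1).item() == 1
print("Retrieve with RAG" if needs_retrieval else "Answer directly")
```

In a chat pipeline, this check runs before generation: only questions classified as needing retrieval are sent to the retriever, which saves latency and cost on questions the chat model can answer on its own.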