**Chatrag-Deberta** is a small, lightweight classifier that predicts whether a question should trigger retrieval of additional information with RAG or can be answered directly.
Chatrag-Deberta is based on DeBERTa-v3-large, a 304M-parameter encoder. Its initial version was fine-tuned on 20,000 questions annotated by Mistral 7B.
## Use
A typical example of inference with Chatrag-Deberta is provided in the [Google Colab demo](https://colab.research.google.com/drive/1nTLFJXopFOEJldaCPzjQ2g5-j0NnpLdz?usp=sharing) or in `inference_chatrag.py`.
For every submitted text, Chatrag-Deberta outputs the probability that the query requires RAG.
This makes it possible to adjust the activation threshold depending on whether more or less retrieval is desirable in the system.
| Query | RAG probability | Result |
|----------------------------------------------------------|:---------:|--------:|
| Comment puis-je renouveler un passeport ? *(How do I renew a passport?)* | 0.988455 | RAG |
| Combien font deux et deux ? *(What is two plus two?)* | 0.041475 | No-RAG |
| Écris un début de lettre de recommandation pour la Dinum *(Write the opening of a letter of recommendation for Dinum)* | 0.103086 | No-RAG |
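Below is a minimal inference sketch using the 🤗 Transformers library, assuming the checkpoint loads as a standard sequence-classification model. The repository id, the label index used for "RAG", and the 0.5 threshold are illustrative assumptions; see `inference_chatrag.py` or the Colab demo for the exact setup.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical repository id; replace with the actual model path.
model_id = "AgentPublic/chatrag-deberta"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

def rag_probability(query: str) -> float:
    """Return the probability that the query should trigger RAG retrieval."""
    inputs = tokenizer(query, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Assumes a two-class head where index 1 corresponds to "RAG".
    probs = torch.softmax(logits, dim=-1)
    return probs[0, 1].item()

# Illustrative activation threshold: lower it to retrieve more often.
THRESHOLD = 0.5

query = "Comment puis-je renouveler un passeport ?"
prob = rag_probability(query)
print(f"{query} -> {prob:.3f} -> {'RAG' if prob >= THRESHOLD else 'No-RAG'}")
```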