Optimized embedding method in RAG

#37
by Sushanta007 - opened
Social Post Explorers org

Hi Team,
I want to understand whether there are other approaches for picking the best embedding model for RAG, apart from choosing GTE- or E5-based embeddings.
I need a few suggestions in this regard.

Social Post Explorers org

Hello!

Generally, the MTEB benchmark is used for picking embedding models. The retrieval task might be particularly interesting.
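For example, a minimal sketch of running a single MTEB retrieval task against a candidate model with the `mteb` package and `sentence-transformers` could look like this (the task and model names here are only examples, not recommendations):

```python
from mteb import MTEB
from sentence_transformers import SentenceTransformer

# Any Sentence Transformers-compatible embedding model can be plugged in here;
# "intfloat/e5-base-v2" is just an example candidate.
model = SentenceTransformer("intfloat/e5-base-v2")

# SciFact is one of the smaller MTEB retrieval tasks, handy for a quick check
# before committing to a full leaderboard-style evaluation.
evaluation = MTEB(tasks=["SciFact"])
results = evaluation.run(model, output_folder="results/e5-base-v2")
print(results)
```

Scores on the retrieval tasks closest to your domain are usually a better guide than the overall average.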

  • Tom Aarsen
Social Post Explorers org

Hello Tom,
That's correct, but here is what I am trying to understand.
Suppose I picked 'e5-mistral-7b-instruct' and it did not give the desired results. How do I select the next model? If I keep running different models one after another, it could take a long time. Is there another way to pick a model, for example based on the dataset?
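To illustrate, the per-candidate check I would rather not repeat for every model looks roughly like the sketch below (the model names and the toy data are just placeholders for my own RAG corpus):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

# Tiny labelled sample standing in for queries and documents from the real corpus.
queries = {"q1": "How do I reset my password?"}
corpus = {
    "d1": "To reset your password, open Settings and choose 'Reset password'.",
    "d2": "Our refund policy allows returns within 30 days of purchase.",
}
relevant_docs = {"q1": {"d1"}}  # which corpus ids answer each query

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="sample-ir")

# Shortlist of candidate embedding models (names are examples only).
for model_name in ["intfloat/e5-base-v2", "thenlper/gte-base"]:
    model = SentenceTransformer(model_name)
    scores = evaluator(model)  # retrieval metrics; exact format depends on the library version
    print(model_name, scores)
```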
