This vs bge-large-en-v1.5?

#5 opened by chongcy

Hello,
I'm currently using bge-large-en-v1.5 to embed documents into a vector database and to embed queries for retrieval.
Is there a performance or quality difference if I switch to this model, or can I just keep using my current one?
Also, is the embedding dimension still 1024, or is it 768?
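For context, my current pipeline looks roughly like this (a minimal sketch with sentence-transformers; the document and query strings are placeholders, and the query prefix is the instruction recommended for bge-v1.5 retrieval):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("BAAI/bge-large-en-v1.5")

# Documents are embedded as-is and stored in the vector database (1024-dim).
doc_embeddings = model.encode(
    ["Example passage stored in the vector database."],
    normalize_embeddings=True,
)

# Queries are embedded with the instruction prefix recommended for bge-v1.5 retrieval.
query = "Represent this sentence for searching relevant passages: example question"
query_embedding = model.encode(query, normalize_embeddings=True)

print(doc_embeddings.shape[-1])  # 1024 for bge-large-en-v1.5
```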

Beijing Academy of Artificial Intelligence org

llm-embedder is the same size as bge-base-en-v1.5, so its embedding dimension is 768. It is fine-tuned from the bge-base model and improves performance on tasks such as example retrieval, tool retrieval, and conversation retrieval. You can select the model based on your scenario.
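A quick way to confirm the dimensions (a minimal sketch with sentence-transformers; if llm-embedder is not packaged for it, the FlagEmbedding loader shown on the model card is the alternative):

```python
from sentence_transformers import SentenceTransformer

bge_large = SentenceTransformer("BAAI/bge-large-en-v1.5")
llm_embedder = SentenceTransformer("BAAI/llm-embedder")

print(bge_large.get_sentence_embedding_dimension())     # 1024
print(llm_embedder.get_sentence_embedding_dimension())  # 768

# Note: llm-embedder uses task-specific instruction prefixes for queries and keys
# (see the model card), so switching also means re-embedding the corpus and
# rebuilding the index with 768-dim vectors.
```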
