Massive Text Embedding Benchmark org

Hello @pengql .
You're supposed to add this metadata to the README.md of your own model. Then, the MTEB leaderboard will automatically fetch it. See e.g. https://huggingface.co/BAAI/bge-large-en-v1.5.
This repository is primarily for external models that e.g. are not on Hugging Face.
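For reference, the metadata goes in the YAML front matter at the top of the model's README.md. A minimal sketch of the shape it takes (the model name, dataset, and metric values below are placeholders, not real results; the `mteb` package has tooling to generate the full block from your evaluation output files):

```yaml
---
tags:
- mteb
model-index:
- name: your-model-name          # placeholder: your model's repo name
  results:
  - task:
      type: STS
    dataset:
      type: mteb/sts12-sts       # placeholder dataset entry
      name: MTEB STS12
      config: default
      split: test
    metrics:
    - type: cos_sim_spearman
      value: 70.00               # placeholder score
---
```

One such `results` entry is listed per task/dataset; the leaderboard reads these values directly from the model card.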

  • Tom Aarsen


Do you mean that I should create an empty model repository myself, add this metadata to its README.md, and the MTEB leaderboard will then fetch it automatically? Could you explain in more detail?

Massive Text Embedding Benchmark org

Exactly. However, there are a few more informal requirements: we need the model to be usable, either via an API or by having the weights available. Otherwise your model will be removed from the leaderboard.
