Librarian Bot: Add moe tag to model

#2

This pull request aims to enrich the metadata of your model by adding a `moe` (Mixture of Experts) tag to the YAML metadata block of your model's README.md.
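For illustration, the change might look like this in the README's YAML front matter (a hypothetical sketch; the `license` field is a placeholder for whatever metadata your model card already contains):

```yaml
---
license: apache-2.0
tags:
  - moe
---
```

Only the `tags` entry is added; existing metadata is left untouched.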

How did we find this information? We inferred that this is a Mixture of Experts model because at least one of the following criteria was met:

  • The model's name contains the string `moe`.
  • The model indicates that it uses a Mixture of Experts architecture.
  • The model's base model is a Mixture of Experts model.

Why add this? Enhancing your model's metadata in this way:

  • Boosts discoverability - it becomes easier to find Mixture of Experts models on the Hub.
  • Improves ecosystem understanding - it becomes easier to see which Mixture of Experts models exist on the Hub and how they are used.

This PR comes courtesy of Librarian Bot. If you have any feedback, queries, or need assistance, please don't hesitate to reach out to @davanstrien.

Cannot merge
This branch has merge conflicts in the following files:
  • README.md
