librarian-bot committed on
Commit 8aac9db
1 parent: f88f8f9

Librarian Bot: Add moe tag to model


This pull request aims to enrich your model's metadata by adding a `moe` (Mixture of Experts) tag to the YAML block of your model's `README.md`.

How did we find this information? We inferred that this is a `moe` model based on the following criteria:

- The model's name contains the string `moe`.
- The model indicates that it uses a `moe` architecture.
- The model's base model is a `moe` model.
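The three criteria above amount to a simple string-matching heuristic. As a rough illustration (this is not Librarian Bot's actual code; the function name and inputs are hypothetical), such a check could look like:

```python
# Illustrative sketch of a moe-detection heuristic. Not the actual
# Librarian Bot implementation; inputs are hypothetical.

def looks_like_moe(model_name: str, architectures: list[str], base_model_tags: list[str]) -> bool:
    """Return True if any of the three criteria suggests a Mixture of Experts model."""
    # Criterion 1: the model's name contains the string "moe".
    if "moe" in model_name.lower():
        return True
    # Criterion 2: the model declares a moe-style architecture (e.g. "MixtralForCausalLM").
    if any("mixtral" in a.lower() or "moe" in a.lower() for a in architectures):
        return True
    # Criterion 3: the base model is itself tagged as moe.
    if "moe" in base_model_tags:
        return True
    return False

print(looks_like_moe("Mixtral-8x7B", ["MixtralForCausalLM"], []))  # True (architecture match)
```

Any one matching criterion is enough to propose the tag, which is why the PR invites feedback in case the inference was wrong.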


**Why add this?** Enhancing your model's metadata in this way:
- **Boosts Discoverability** - It becomes easier to find mixture of experts models on the Hub.
- **Improves Ecosystem Understanding** - It becomes easier to understand the ecosystem of mixture of experts models on the Hub and how they are used.


This PR comes courtesy of [Librarian Bot](https://huggingface.co/librarian-bot). If you have any feedback, queries, or need assistance, please don't hesitate to reach out to @davanstrien.

Files changed (1): README.md (+3 −1)
README.md CHANGED

```diff
@@ -1,11 +1,13 @@
 ---
-license: apache-2.0
 language:
 - fr
 - it
 - de
 - es
 - en
+license: apache-2.0
+tags:
+- moe
 inference: false
 ---
 # Model Card for Mixtral-Fusion-4x7B-Instruct-v0.1
```
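For readers who want to make an equivalent edit by hand or in a script, here is a minimal stdlib-only sketch (a hypothetical helper, not the bot's implementation) that inserts a `tags:` entry with `- moe` into a README's YAML front matter. Unlike the diff above, it does not reorder existing keys:

```python
# Hypothetical helper: add "tags:\n- moe" to a README.md YAML front matter.
# Stdlib only; assumes the file starts with a "---" ... "---" front matter block.

def add_moe_tag(readme: str) -> str:
    """Insert `tags:` / `- moe` just before the closing `---` of the front matter."""
    lines = readme.splitlines()
    assert lines[0] == "---", "expected YAML front matter"
    end = lines.index("---", 1)  # closing fence of the front matter
    if "tags:" not in lines[:end]:  # only add if no tags key exists yet
        lines[end:end] = ["tags:", "- moe"]
    return "\n".join(lines) + "\n"

readme = """---
license: apache-2.0
inference: false
---
# Model Card
"""
print(add_moe_tag(readme))
```

A real-world version would more likely use a YAML parser (or the `huggingface_hub` model-card utilities) to handle existing `tags:` lists rather than plain string insertion.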