librarian-bot committed
Commit a78d6f9
1 Parent(s): cde21ae

Librarian Bot: Add moe tag to model


This pull request aims to enrich the metadata of your model by adding a `moe` (Mixture of Experts) tag to the YAML metadata block of your model's `README.md`.

How did we find this information? We inferred that this model is a `moe` model based on the following criteria (a minimal sketch of these checks follows the list):

- The model's name contains the string `moe`.
- The model indicates that it uses a `moe` architecture.
- The model's base model is a `moe` model.
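
For illustration, here is a minimal sketch of how such checks could be implemented with the `huggingface_hub` client. The helper name and the architecture list are assumptions for this example, not Librarian Bot's actual implementation:

```python
from huggingface_hub import HfApi, ModelCard

api = HfApi()

def looks_like_moe(repo_id: str) -> bool:
    """Hypothetical helper approximating the three criteria above."""
    # Criterion 1: the model's name contains the string "moe".
    if "moe" in repo_id.lower():
        return True

    # Criterion 2: the model's config declares a known MoE architecture
    # (example list only; the real check may cover more architectures).
    info = api.model_info(repo_id)
    moe_architectures = {"MixtralForCausalLM"}
    if moe_architectures.intersection((info.config or {}).get("architectures", [])):
        return True

    # Criterion 3: the base model declared in the card metadata is itself
    # tagged `moe` on the Hub.
    card = ModelCard.load(repo_id)
    base_model = card.data.base_model
    if isinstance(base_model, list):
        base_model = base_model[0] if base_model else None
    if base_model and "moe" in (api.model_info(base_model).tags or []):
        return True

    return False
```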


**Why add this?** Enhancing your model's metadata in this way:
- **Boosts Discoverability** - It becomes easier to find mixture of experts models on the Hub
- **Helps understand the ecosystem** - It becomes easier to understand the ecosystem of mixture of experts models on the Hub and how they are used


This PR comes courtesy of [Librarian Bot](https://huggingface.co/librarian-bot). If you have any feedback or queries, or need assistance, please don't hesitate to reach out to @davanstrien.

Files changed (1)
1. README.md +5 -4
README.md CHANGED

```diff
@@ -1,5 +1,7 @@
 ---
-base_model: mistralai/Mixtral-8x7B-Instruct-v0.1
+language:
+- en
+license: apache-2.0
 tags:
 - mixtral
 - instruct
@@ -8,12 +10,11 @@ tags:
 - gpt4
 - synthetic data
 - distillation
+- moe
+base_model: mistralai/Mixtral-8x7B-Instruct-v0.1
 model-index:
 - name: OpenHermes-Mixtral-8x7B
   results: []
-license: apache-2.0
-language:
-- en
 ---
 
 # OpenHermes - Mixtral 8x7B
```
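
If you'd prefer to apply this kind of metadata change yourself rather than merging a bot-authored PR, a minimal sketch using `huggingface_hub` might look like this (the repo id is a placeholder; substitute your own model):

```python
from huggingface_hub import ModelCard

repo_id = "your-username/your-moe-model"  # placeholder repo id

# ModelCard.load parses the YAML metadata block at the top of README.md.
card = ModelCard.load(repo_id)

# Append the `moe` tag only if it is not already present.
tags = card.data.tags or []
if "moe" not in tags:
    card.data.tags = tags + ["moe"]
    card.push_to_hub(repo_id)  # requires write access to the repo
```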