librarian-bot committed on
Commit
1fd9d2f
1 Parent(s): ba25e82

Librarian Bot: Add moe tag to model


This pull request enriches your model's metadata by adding an `moe` (Mixture of Experts) tag to the `YAML` block of your model's `README.md`.
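
For reference, the same metadata change can also be applied programmatically. The sketch below assumes a recent `huggingface_hub` release and uses a placeholder repository id; it is not necessarily how Librarian Bot itself performs the update.

```python
# Minimal sketch of the same metadata change via huggingface_hub.
# "your-username/your-moe-model" is a placeholder, not a real repository.
from huggingface_hub import metadata_update

metadata_update(
    repo_id="your-username/your-moe-model",    # hypothetical repo id
    metadata={"tags": ["moe"]},                # tag to add to the YAML block
    commit_message="Add moe tag to model card",
    create_pr=True,                            # open a PR rather than committing directly
)
```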

How did we find this information? We inferred that this model is a `moe` model because it meets one or more of the following criteria (a rough sketch of these checks follows the list):

- The model's name contains the string `moe`.
- The model indicates it uses a `moe` architecture.
- The model's base model is a `moe` model.
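
As a rough illustration (not Librarian Bot's actual implementation), these heuristics could be expressed as a simple check over hypothetical inputs: the model id, its declared architectures, and the tags of its base model.

```python
# Illustrative sketch of the three heuristics above. The inputs are hypothetical
# stand-ins, not the data structures Librarian Bot actually uses.
def looks_like_moe(model_id: str, architectures: list[str], base_model_tags: list[str]) -> bool:
    name_hit = "moe" in model_id.lower()                     # 1. name contains "moe"
    arch_hit = any(                                          # 2. declared MoE-style architecture
        "moe" in arch.lower() or "mixtral" in arch.lower()   #    (substring match is an assumption)
        for arch in architectures
    )
    base_hit = "moe" in base_model_tags                      # 3. base model is already tagged moe
    return name_hit or arch_hit or base_hit


# Example: Mixtral-8x7B matches on the architecture heuristic.
print(looks_like_moe("mistralai/Mixtral-8x7B-v0.1", ["MixtralForCausalLM"], []))  # True
```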


**Why add this?** Enhancing your model's metadata in this way:
- **Boosts discoverability** - it becomes easier to find Mixture of Experts models on the Hub (see the sketch after this list).
- **Helps understand the ecosystem** - it becomes easier to understand the ecosystem of Mixture of Experts models on the Hub and how they are used.
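
Once the tag is in place, the discoverability benefit is concrete: tagged models can be listed with a simple filter. A small sketch, assuming a recent `huggingface_hub` release:

```python
# List a few models on the Hub that carry the moe tag.
from huggingface_hub import HfApi

api = HfApi()
for model in api.list_models(filter="moe", limit=5):
    print(model.id)
```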


This PR comes courtesy of [Librarian Bot](https://huggingface.co/librarian-bot). If you have any feedback, queries, or need assistance, please don't hesitate to reach out to @davanstrien.

Files changed (1)
  1. README.md +26 -4
README.md CHANGED
@@ -1,15 +1,37 @@
 ---
-license: apache-2.0
 language:
 - fr
 - it
 - de
 - es
 - en
+license: apache-2.0
+tags:
+- moe
 widget:
-- text: "[INST] You are a pirate chatbot who always responds with Arr and pirate speak!\nThere's a llama on my lawn, how can I get rid of him? [/INST]"
-  output:
-    text: "Arr, shiver me timbers! Ye have a llama on yer lawn, ye say? Well, that be a new one for me! Here's what I'd suggest, arr:\n\n1. Firstly, ensure yer safety. Llamas may look gentle, but they can be protective if they feel threatened.\n2. Try to make the area less appealing to the llama. Remove any food sources or water that might be attracting it.\n3. Contact local animal control or a wildlife rescue organization. They be the experts and can provide humane ways to remove the llama from yer property.\n4. If ye have any experience with animals, you could try to gently herd the llama towards a nearby field or open space. But be careful, arr!\n\nRemember, arr, it be important to treat the llama with respect and care. It be a creature just trying to survive, like the rest of us."
+- text: '[INST] You are a pirate chatbot who always responds with Arr and pirate speak!
+
+    There''s a llama on my lawn, how can I get rid of him? [/INST]'
+  output:
+    text: 'Arr, shiver me timbers! Ye have a llama on yer lawn, ye say? Well, that
+      be a new one for me! Here''s what I''d suggest, arr:
+
+
+      1. Firstly, ensure yer safety. Llamas may look gentle, but they can be protective
+      if they feel threatened.
+
+      2. Try to make the area less appealing to the llama. Remove any food sources
+      or water that might be attracting it.
+
+      3. Contact local animal control or a wildlife rescue organization. They be the
+      experts and can provide humane ways to remove the llama from yer property.
+
+      4. If ye have any experience with animals, you could try to gently herd the
+      llama towards a nearby field or open space. But be careful, arr!
+
+
+      Remember, arr, it be important to treat the llama with respect and care. It
+      be a creature just trying to survive, like the rest of us.'
 ---
 # Model Card for Mixtral-8x7B
 The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts. The Mixtral-8x7B outperforms Llama 2 70B on most benchmarks we tested.