pstock, clefourrier (HF Staff) committed
Commit 985aa05 · verified · 1 Parent(s): 5830144

Add MoE tag to Mixtral (#29)


- Add MoE tag to Mixtral (560530cd80ab72072ea739d0d6093d847ce21f54)


Co-authored-by: Clémentine Fourrier <clefourrier@users.noreply.huggingface.co>

Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -6,6 +6,8 @@ language:
 - de
 - es
 - en
+tags:
+- moe
 ---
 # Model Card for Mixtral-8x7B
 The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts. The Mistral-8x7B outperforms Llama 2 70B on most benchmarks we tested.
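The two added lines extend the README's YAML front matter with a `tags` list, which is what makes the model discoverable by tag on the Hub. As a minimal sketch (assuming the `huggingface_hub` Python client; not part of this commit), models carrying the new `moe` tag can be listed like this:

```python
from itertools import islice

from huggingface_hub import HfApi

api = HfApi()

# Models whose README front matter declares the "moe" tag, as this commit
# does for Mixtral, are returned when filtering the Hub by that tag.
for model in islice(api.list_models(filter="moe"), 5):
    print(model.id, model.tags)
```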