reach-vb (HF staff) committed
Commit ae92664
1 Parent(s): c31215f

Update README.md

Files changed (1)
  1. README.md +4 -0
README.md CHANGED
@@ -10,6 +10,10 @@ tags:
  - moe
  ---
  # Model Card for Mixtral-8x22B
+
+ > [!TIP]
+ > Kudos to [@v2ray](https://huggingface.co/v2ray) for converting the checkpoints and uploading them in `transformers` compatible format. Go give them a follow!
+
  Converted to HuggingFace Transformers format using the script [here](https://huggingface.co/v2ray/Mixtral-8x22B-v0.1/blob/main/convert.py).

  The Mixtral-8x22B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts.
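For reference, a checkpoint in `transformers`-compatible format like this one can typically be loaded with the standard auto classes. Below is a minimal sketch, assuming the repo ID `v2ray/Mixtral-8x22B-v0.1` (taken from the conversion-script link above) and enough memory for the 8x22B weights; it is illustrative, not part of the original card.

```python
# Minimal loading sketch (assumption: repo ID taken from the conversion-script link).
# bfloat16 + device_map="auto" spread the very large 8x22B weights across available devices.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "v2ray/Mixtral-8x22B-v0.1"  # assumed location of the converted checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```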