mlabonne committed on
Commit
30d5943
1 Parent(s): 1647d6e

Update README.md

Files changed (1): README.md (+2, -0)
README.md CHANGED
@@ -19,6 +19,8 @@ tags:
 
 phixtral-2x2_8 is the first Mixture of Experts (MoE) made with two [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) models, inspired by the [mistralai/Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) architecture. It performs better than each individual expert.
 
+You can try it out using this [Space](https://huggingface.co/spaces/mlabonne/phixtral-chat).
+
 ## 🏆 Evaluation
 
 | Model |AGIEval|GPT4All|TruthfulQA|Bigbench|Average|
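
The paragraph above describes phixtral-2x2_8 as an MoE built from two phi-2 experts. For readers who want to go beyond the linked Space, here is a minimal sketch of how such a model could be loaded with 🤗 Transformers. This is an assumption rather than part of the commit: the dtype and generation parameters are illustrative, and `trust_remote_code=True` is assumed because the model relies on custom MoE modeling code.

```python
# Hypothetical sketch (not from this commit): loading phixtral-2x2_8 with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mlabonne/phixtral-2x2_8"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumed dtype; use torch.float32 on CPU
    trust_remote_code=True,     # assumed to be required for the custom MoE code
)

prompt = "Explain what a Mixture of Experts model is."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```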