mlabonne committed on
Commit
83d8a3c
1 Parent(s): db1d42d

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -25,7 +25,7 @@ phixtral-4x2_8 is the first Mixure of Experts (MoE) made with four [microsoft/ph
 
  | Model |AGIEval|GPT4All|TruthfulQA|Bigbench|Average|
  |----------------------------------------------------------------|------:|------:|---------:|-------:|------:|
- |[**phixtral-4x2_8**](https://huggingface.co/mlabonne/phixtral-2x2_8)| TBD| TBD| TBD| TBD| TBD|
+ |[**phixtral-4x2_8**](https://huggingface.co/mlabonne/phixtral-4x2_8)| **33.91**| **70.44**| **48.78**| **37.68**| **47.7**|
  |[dolphin-2_6-phi-2](https://huggingface.co/cognitivecomputations/dolphin-2_6-phi-2)| 33.12| 69.85| 47.39| 37.2| 46.89|
  |[phi-2-dpo](https://huggingface.co/lxuechen/phi-2-dpo)| 30.39| 71.68| 50.75| 34.9| 46.93|
  |[phi-2-sft-dpo-gpt4_en-ep1](https://huggingface.co/Yhyu13/phi-2-sft-dpo-gpt4_en-ep1)| 30.61| 71.13| 48.74| 35.23| 46.43|