mlabonne committed
Commit d032502
1 Parent(s): 9a67aa9

Update README.md

Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -27,6 +27,8 @@ phixtral-4x2_8 is the first Mixture of Experts (MoE) made with four [microsoft/phi

  ## 🏆 Evaluation

+ The evaluation was performed using [LLM AutoEval](https://github.com/mlabonne/llm-autoeval) on the Nous suite.
+
  | Model |AGIEval|GPT4All|TruthfulQA|Bigbench|Average|
  |----------------------------------------------------------------|------:|------:|---------:|-------:|------:|
  |[**phixtral-4x2_8**](https://huggingface.co/mlabonne/phixtral-4x2_8)| **33.91**| **70.44**| **48.78**| **37.68**| **47.7**|
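The Average column in the table is consistent with a simple arithmetic mean of the four Nous-suite benchmark scores. A minimal sketch to verify it, using only the values from the table above:

```python
# Recompute the reported Average for phixtral-4x2_8 from the Nous-suite scores in the table.
scores = {
    "AGIEval": 33.91,
    "GPT4All": 70.44,
    "TruthfulQA": 48.78,
    "Bigbench": 37.68,
}

average = sum(scores.values()) / len(scores)
print(f"Average: {average:.2f}")  # 47.70, matching the reported 47.7
```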