Update README.md #52
opened by Dee-Pac

README.md CHANGED
@@ -367,7 +367,7 @@ Overall, the Phi-3.5 MoE model with just 6.6B active params outperforms GPT-3.5-
 | STEM | 28.18 | 26.91 | 24.64 | 39.82 | 26.36 | 32.18 | 20.91 |
 | **Overall** | 25.34 | 25.68 | 24.03 | 39.62 | 24.56 | 30.56 | 20.97 |

-#### KMMLU-HARD (5-shot)
+#### KMMLU-HARD (5-shot)

 | supercategory | Phi-3.5-MoE-Instruct | Phi-3.0-Mini-128k-Instruct (June2024) | Llama-3.1-8B-Instruct | GPT-4o | GPT-4o-mini | GPT-4-turbo | GPT-3.5-turbo |
 |:----------------|-----------------------:|--------------------------------:|------------------------:|---------:|--------------:|--------------:|----------------:|