mgoin committed on
Commit
7f1a735
1 Parent(s): 3c35e7b

Update README.md

Files changed (1): README.md +8 -8
README.md CHANGED
```diff
@@ -17,12 +17,12 @@ Produced using [AutoFP8 with calibration samples from ultrachat](https://github.
 ### Open LLM Leaderboard evaluation scores
 | | Mixtral-8x22B-Instruct-v0.1 | Mixtral-8x22B-Instruct-v0.1-FP8<br>(this model) |
 | :------------------: | :----------------------: | :------------------------------------------------: |
-| arc-c<br>25-shot | 72.70 | 69.19 |
-| hellaswag<br>10-shot | 89.08 | 82.49 |
-| mmlu<br>5-shot | 77.77 | 70.61 |
-| truthfulqa<br>0-shot | 68.14 | 65.73 |
-| winogrande<br>5-shot | 85.16 | 82.63 |
-| gsm8k<br>5-shot | 82.03 | 76.57 |
-| **Average<br>Accuracy** | **79.15** | **74.53** |
-| **Recovery** | **100%** | **94.17%** |
+| arc-c<br>25-shot (acc_norm) | 72.70 | 72.53 |
+| hellaswag<br>10-shot (acc_norm) | 89.08 | 88.10 |
+| mmlu<br>5-shot | 77.77 | 76.08 |
+| truthfulqa<br>0-shot (acc) | 68.14 | 66.32 |
+| winogrande<br>5-shot (acc) | 85.16 | 84.37 |
+| gsm8k<br>5-shot (strict-match) | 82.03 | 83.40 |
+| **Average<br>Accuracy** | **79.15** | **78.47** |
+| **Recovery** | **100%** | **99.14%** |
```
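The updated **Average Accuracy** and **Recovery** rows follow directly from the six per-task scores in the new table. A minimal sketch of that arithmetic (variable names are illustrative, not from the repository):

```python
# Per-task scores copied from the updated table.
baseline_scores = [72.70, 89.08, 77.77, 68.14, 85.16, 82.03]  # Mixtral-8x22B-Instruct-v0.1
fp8_scores = [72.53, 88.10, 76.08, 66.32, 84.37, 83.40]       # FP8 quantized model

# Average accuracy is the unweighted mean over the six tasks.
baseline_avg = sum(baseline_scores) / len(baseline_scores)
fp8_avg = sum(fp8_scores) / len(fp8_scores)

# Recovery is the FP8 average expressed as a percentage of the baseline average.
recovery = 100 * fp8_avg / baseline_avg

print(f"baseline avg: {baseline_avg:.2f}")  # 79.15
print(f"fp8 avg:      {fp8_avg:.2f}")       # 78.47
print(f"recovery:     {recovery:.2f}%")     # 99.14%
```

Rounding each figure to two decimals reproduces the table exactly, which confirms the edited rows are internally consistent.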