Q-bert committed on
Commit 3f4d59c
1 parent: d9dd63b

Update README.md

Files changed (1)
README.md  +8 −8
README.md CHANGED
@@ -18,15 +18,15 @@ Fine-tuned On [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistr
 You can use ChatML format.
 
 # [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
-Detailed results can be found [Coming soon]()
+Detailed results can be found [Here](https://huggingface.co/datasets/open-llm-leaderboard/results/blob/main/Q-bert/Optimus-7B/results_2023-12-04T18-59-49.207215.json)
 
 | Metric | Value |
 |-----------------------|---------------------------|
-| Avg. | Coming soon |
-| ARC (25-shot) | Coming soon |
-| HellaSwag (10-shot) | Coming soon |
-| MMLU (5-shot) | Coming soon |
-| TruthfulQA (0-shot) | Coming soon |
-| Winogrande (5-shot) | Coming soon |
-| GSM8K (5-shot) | Coming soon |
+| Avg. | 69.09 |
+| ARC (25-shot) | 65.44 |
+| HellaSwag (10-shot) | 85.41 |
+| MMLU (5-shot) | 63.61 |
+| TruthfulQA (0-shot) | 55.79 |
+| Winogrande (5-shot) | 78.77 |
+| GSM8K (5-shot) | 65.50 |
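The README being updated states "You can use ChatML format." As a reference, here is a minimal sketch of assembling a ChatML prompt by hand; the helper name `format_chatml` and the example messages are illustrative, not part of the model card:

```python
# ChatML wraps each turn as <|im_start|>{role}\n{content}<|im_end|>,
# and the prompt ends with an opened assistant turn for the model
# to complete during generation.
def format_chatml(messages):
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    parts.append("<|im_start|>assistant\n")  # model generates from here
    return "\n".join(parts)

prompt = format_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the capital of France?"},
])
print(prompt)
```

In practice, `tokenizer.apply_chat_template(...)` from the `transformers` library produces the same layout when the model's tokenizer ships a ChatML chat template.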