Update README.md
README.md (CHANGED)
@@ -17,14 +17,14 @@ Merge [Q-bert/MetaMath-Cybertron](https://huggingface.co/Q-bert/MetaMath-Cybertr
 You can use ChatML format.
 
 # [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
-Detailed results can be found [
+Detailed results can be found [Here](https://huggingface.co/datasets/open-llm-leaderboard/results/blob/main/Q-bert/MetaMath-Cybertron-Starling/results_2023-12-07T21-59-56.458563.json)
 
 | Metric                | Value                     |
 |-----------------------|---------------------------|
-| Avg.                  |
-| ARC (25-shot)         |
-| HellaSwag (10-shot)   |
-| MMLU (5-shot)         |
-| TruthfulQA (0-shot)   |
-| Winogrande (5-shot)   |
-| GSM8K (5-shot)        |
+| Avg.                  | 71.35                     |
+| ARC (25-shot)         | 67.75                     |
+| HellaSwag (10-shot)   | 86.23                     |
+| MMLU (5-shot)         | 65.24                     |
+| TruthfulQA (0-shot)   | 55.94                     |
+| Winogrande (5-shot)   | 81.45                     |
+| GSM8K (5-shot)        | 71.49                     |
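The README states that ChatML format can be used with this model. As a non-authoritative illustration, the snippet below sketches one way to prompt `Q-bert/MetaMath-Cybertron-Starling` with ChatML-style messages via `transformers`; it assumes the tokenizer ships a ChatML chat template, and the system/user messages are placeholders, not part of the model card.

```python
# Minimal sketch: prompting the model with ChatML-style turns.
# Assumption: the tokenizer's chat_template renders ChatML markers
# (<|im_start|> ... <|im_end|>); model id taken from the results URL above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Q-bert/MetaMath-Cybertron-Starling"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "system", "content": "You are a helpful math assistant."},
    {"role": "user", "content": "What is 12 * 17?"},
]

# Render the chat into a single prompt tensor and generate a reply.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```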