Experimental merge. Details to come if successful.

# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)

Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MelangeA-70b).
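The detailed results can also be pulled programmatically with the `datasets` library. The sketch below is a minimal example; the config listing is generic, and the `latest` split name is an assumption based on the typical layout of Open LLM Leaderboard details datasets, so it may need adjusting.

```python
# Sketch: load the detailed leaderboard results for this model.
# Assumes the `datasets` library is installed; the "latest" split name
# follows the usual Open LLM Leaderboard details layout and may differ.
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_chargoddard__MelangeA-70b"

# Each benchmark run is stored under its own config; list them first.
configs = get_dataset_config_names(repo)
print(configs)

# Load one benchmark's per-example results.
details = load_dataset(repo, configs[0], split="latest")
print(details[0])
```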
| Metric               | Value |
|----------------------|-------|
| Avg.                 | 55.92 |
| ARC (25-shot)        | 71.25 |
| HellaSwag (10-shot)  | 87.30 |
| MMLU (5-shot)        | 70.56 |
| TruthfulQA (0-shot)  | 60.61 |
| Winogrande (5-shot)  | 81.53 |
| GSM8K (5-shot)       | 5.69  |
| DROP (3-shot)        | 14.53 |