Adding Evaluation Results

#1
Files changed (1)
  1. README.md +14 -1
README.md CHANGED
@@ -216,4 +216,17 @@ Please cite the use of `orca_mini_v2_ger_7b` using the following BibTeX:
  archivePrefix={arXiv},
  primaryClass={cs.CL}
  }
- ```
+ ```
+ # [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
+ Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_jphme__orca_mini_v2_ger_7b).
+
+ | Metric               | Value |
+ |----------------------|-------|
+ | Avg.                 | 42.33 |
+ | ARC (25-shot)        | 49.83 |
+ | HellaSwag (10-shot)  | 75.5  |
+ | MMLU (5-shot)        | 39.1  |
+ | TruthfulQA (0-shot)  | 45.74 |
+ | Winogrande (5-shot)  | 71.59 |
+ | GSM8K (5-shot)       | 4.17  |
+ | DROP (3-shot)        | 10.42 |
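As a quick sanity check on the table being added, the reported Avg. is (up to rounding) the arithmetic mean of the seven per-task scores. A minimal sketch in Python, using only the values copied verbatim from the table:

```python
# Sanity-check: the leaderboard "Avg." should be the arithmetic mean
# of the per-task scores listed in the table above.
scores = {
    "ARC (25-shot)": 49.83,
    "HellaSwag (10-shot)": 75.5,
    "MMLU (5-shot)": 39.1,
    "TruthfulQA (0-shot)": 45.74,
    "Winogrande (5-shot)": 71.59,
    "GSM8K (5-shot)": 4.17,
    "DROP (3-shot)": 10.42,
}

avg = sum(scores.values()) / len(scores)
print(f"{avg:.2f}")  # 42.34 -- agrees with the reported 42.33 up to rounding
                     # (the leaderboard presumably averages unrounded scores)
```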
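To pull the linked detailed results programmatically, here is a minimal sketch using the Hugging Face `datasets` library. The per-benchmark configuration names and split layout of the details repository are not shown in this PR, so the sketch queries them at runtime rather than assuming any:

```python
# Minimal sketch for loading the detailed results dataset linked above.
# Assumes `pip install datasets`; configuration names are listed at
# runtime rather than hard-coded.
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_jphme__orca_mini_v2_ger_7b"

# Each benchmark run is (assumed to be) exposed as its own configuration.
configs = get_dataset_config_names(repo)
print(configs)

# Load the first configuration as an example; this returns a
# DatasetDict keyed by split.
details = load_dataset(repo, configs[0])
print(details)
```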