---
license: llama2
language:
- en
---
See https://huggingface.co/The-Face-Of-Goonery/Huginn-19b-prototype for the technique this is based on.
Stheno-20B is even more stupid: it uses the same technique as above with slightly different params, a 64-layer splice of Stheno P1 and P2.
Hey, it works... decently well.
Meme model that somehow isn't as bad as I thought.
Thanks to Chargoddard for mergekit.
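For reference, a layer splice like this is usually expressed as a mergekit "passthrough" config. Below is a minimal sketch of what such a config could look like; the repo names (stand-ins for Stheno P1 and P2) and the exact layer ranges are assumptions, chosen only so that two 40-layer Llama-2-13B stacks overlap into 64 layers. It is not the actual recipe used for this model.

```yaml
# Hypothetical mergekit passthrough config for a 64-layer splice.
# Model paths and layer ranges are illustrative placeholders.
slices:
  - sources:
      - model: Sao10K/Stheno-P1      # placeholder for Stheno P1
        layer_range: [0, 32]         # first 32 of its 40 layers
  - sources:
      - model: Sao10K/Stheno-P2      # placeholder for Stheno P2
        layer_range: [8, 40]         # last 32 of its 40 layers
merge_method: passthrough
dtype: float16
```

A config like this would be run with mergekit's standard CLI, e.g. `mergekit-yaml config.yaml ./output-model`; passthrough simply stacks the selected layer ranges, which is how a 13B-class base ends up with 64 decoder layers and roughly 20B parameters.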
*Stheno v2 on the way* ***soon***, *Euryale-70B progress stalled for now*, *Medusa-7B soonTM*
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-Mix-L2-20B).
| Metric               | Value |
|----------------------|-------|
| Avg.                 | 45.76 |
| ARC (25-shot)        | 57.76 |
| HellaSwag (10-shot)  | 79.63 |
| MMLU (5-shot)        | 52.51 |
| TruthfulQA (0-shot)  | 51.8  |
| Winogrande (5-shot)  | 68.98 |
| GSM8K (5-shot)       | 0.08  |
| DROP (3-shot)        | 9.53  |