---
license: llama2
language:
- en
---

See https://huggingface.co/The-Face-Of-Goonery/Huginn-19b-prototype

Stheno-20B is even sillier: it uses the same technique as above, just with slightly different params. It is a 64-layer splice of Stheno P1 and P2.

Hey, it works... decently well. A meme model that somehow isn't as bad as I thought.

Thanks to Chargoddard for mergekit.
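For reference, a layer splice like this can be written as a mergekit passthrough config. The sketch below is only illustrative: the repo names and layer ranges are my assumptions (chosen so the result has 64 layers), not the actual recipe used here.

```yaml
# Hypothetical mergekit passthrough config for a 64-layer splice.
# Model names and layer_range values are guesses, not the real recipe.
slices:
  - sources:
      - model: Sao10K/Stheno-P1   # assumed repo name
        layer_range: [0, 32]      # first 32 layers from P1
  - sources:
      - model: Sao10K/Stheno-P2   # assumed repo name
        layer_range: [8, 40]      # 32 more layers from P2
merge_method: passthrough
dtype: float16
```

Passthrough merges like this just stack the selected layer ranges back to back, which is how two Llama-2-13B-based models can yield a ~20B frankenmerge.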


*Stheno v2 on the way* ***soon***. *Euryale-70B progress stalled for now; Medusa-7B soon™.*

# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-Mix-L2-20B)

| Metric              | Value |
|---------------------|-------|
| Avg.                | 45.76 |
| ARC (25-shot)       | 57.76 |
| HellaSwag (10-shot) | 79.63 |
| MMLU (5-shot)       | 52.51 |
| TruthfulQA (0-shot) | 51.80 |
| Winogrande (5-shot) | 68.98 |
| GSM8K (5-shot)      | 0.08  |
| DROP (3-shot)       | 9.53  |