---
license: llama2
tags:
- moe
- merge
---
<img src="https://huggingface.co/lodrick-the-lafted/Grafted-Llama2-2x70B/resolve/main/gl.png">
The two Llamas are WinterGoddess and AuroraNights.

This is yet another mergekit abomination.

It is probably more of a "dense" MoE than a sparse one, since both experts are full 70B models grafted together rather than trained with a sparse router.
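For reference, merges of this shape are typically produced with `mergekit-moe`. The actual recipe is not published here, so the sketch below is hypothetical: the model paths are placeholders, and `gate_mode` and the prompts are illustrative assumptions, not the settings used for this merge.

```yaml
# Hypothetical mergekit-moe config sketch; paths and prompts are placeholders.
base_model: path/to/WinterGoddess-70B        # placeholder, not the real repo id
gate_mode: hidden                            # route via hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: path/to/WinterGoddess-70B  # placeholder
    positive_prompts:
      - "Write a story"
  - source_model: path/to/AuroraNights-70B   # placeholder
    positive_prompts:
      - "Continue the roleplay"
```

With only two experts and broad routing prompts, both experts tend to fire on most tokens, which is one reason a merge like this behaves more like a dense model than a sparse MoE.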

Unfortunately, in most of my testing the model works well for a couple of sentences and then starts spouting gibberish. Don't waste your bandwidth.

<br/>
<br/>


# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Llama2-2x70B)

|             Metric              |Value|
|---------------------------------|----:|
|Avg.                             |73.77|
|AI2 Reasoning Challenge (25-Shot)|72.61|
|HellaSwag (10-Shot)              |89.57|
|MMLU (5-Shot)                    |71.67|
|TruthfulQA (0-shot)              |66.49|
|Winogrande (5-shot)              |84.37|
|GSM8k (5-shot)                   |57.92|