|
--- |
|
license: llama2 |
|
tags: |
|
- moe |
|
- merge |
|
--- |
|
<img src="https://huggingface.co/lodrick-the-lafted/Grafted-Llama2-2x70B/resolve/main/gl.png" alt="Grafted-Llama2-2x70B">
|
The two Llamas are WinterGoddess and AuroraNights.
|
|
|
This is yet another mergekit abomination. |
|
|
|
This is probably more of a "dense" MoE than a sparse one. |
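For reference, a merge like this is typically produced with a `mergekit-moe` YAML config along these lines. This is only a sketch, not the actual config used here; the model paths, gate mode, and prompts are all assumptions:

```yaml
# Illustrative mergekit-moe config for a 2x70B graft.
# Paths and prompts are placeholders, not the real config.
base_model: path/to/AuroraNights-70B
gate_mode: hidden        # route tokens by hidden-state similarity to the prompts below
dtype: bfloat16
experts:
  - source_model: path/to/AuroraNights-70B
    positive_prompts:
      - "Write a story"
  - source_model: path/to/WinterGoddess-70B
    positive_prompts:
      - "Continue the roleplay"
```

With only two experts and broad prompt sets, routing tends to activate both experts most of the time, which is why the result behaves more like a dense model than a sparse MoE.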
|
|
|
Unfortunately, most of the testing I have done with this model shows that it works well for a couple of sentences, then starts spouting gibberish. Don't waste your bandwidth.
|
|
|
<br/> |
|
<br/> |
|
|
|
|
|
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) |
|
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Llama2-2x70B).
|
|
|
| Metric |Value| |
|
|---------------------------------|----:| |
|
|Avg. |73.77| |
|
|AI2 Reasoning Challenge (25-Shot)|72.61| |
|
|HellaSwag (10-Shot) |89.57| |
|
|MMLU (5-Shot) |71.67| |
|
|TruthfulQA (0-shot) |66.49| |
|
|Winogrande (5-shot) |84.37| |
|
|GSM8k (5-shot) |57.92| |
|
|
|
|