---
license: llama2
tags:
- merge
- mergekit
---
This is a merge of [Xwin](https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1) and [WinterGoddess](https://huggingface.co/Sao10K/WinterGoddess-1.4x-70B-L2), both extended to 32k context using the method discussed [here](https://huggingface.co/grimulkan/aurelian-v0.5-70b-rope8-32K-fp16/discussions/2).
Smarter than [Goliath-32k](https://huggingface.co/grimulkan/Goliath-longLORA-120b-rope8-32k-fp16), but has some quirks.
A bit damaged compared to the original [WinterGoliath](https://huggingface.co/ChuckMcSneed/WinterGoliath-123b).
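
For reference, a minimal sketch of loading a Llama-2-based model with linear RoPE scaling to 32k (factor 8, i.e. 4096 × 8 = 32768, matching the rope8-32K models linked above) via `transformers`. The repo id below is a placeholder, not necessarily this model's exact id:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ChuckMcSneed/WinterGoliath-123b-32k"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    rope_scaling={"type": "linear", "factor": 8.0},  # linear RoPE scaling to 32k
    torch_dtype="auto",
    device_map="auto",
)
```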
# Prompt format
Vicuna or Alpaca.
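
For illustration, a sketch of the two common templates; the exact system prompt wording is an assumption, adjust to taste:

```python
def vicuna_prompt(user_msg: str) -> str:
    # Vicuna v1.1-style template (assumed default system prompt)
    system = ("A chat between a curious user and an artificial intelligence "
              "assistant. The assistant gives helpful, detailed, and polite "
              "answers to the user's questions.")
    return f"{system} USER: {user_msg} ASSISTANT:"

def alpaca_prompt(instruction: str) -> str:
    # Alpaca-style instruction template
    return ("Below is an instruction that describes a task. "
            "Write a response that appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n### Response:\n")
```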
# Benchmarks
### NeoEvalPlusN_benchmark
[My meme benchmark.](https://huggingface.co/datasets/ChuckMcSneed/NeoEvalPlusN_benchmark)
| Test name | WinterGoliath-32k | WinterGoliath |
| ---------- | ---------- | ------- |
| B | 3 | 3 |
| C | 2 | 2 |
| D | 1 | 2 |
| S | 3.75 | 5.5 |
| P | 3.75 | 6 |
| Total | 13.5 | 18.5 |