This is a merge of pre-trained language models created using mergekit.
V0.1 was spitting out nonsense, so I have attempted a different merge method and parameters.
This model was merged using the TIES merge method, with Gryphe/Pantheon-RP-1.6.1-12b-Nemo as the base.
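For intuition: TIES-merging treats each fine-tune as a "task vector" (its parameter delta from the base), trims each vector to its largest-magnitude entries (the `density` parameter), elects a per-parameter sign by weighted majority, and adds back onto the base only the values that agree with the elected sign. Below is a minimal PyTorch sketch of that idea for a single tensor; the helper name and demo tensors are illustrative, not mergekit's actual API.

```python
import torch

def ties_merge_tensor(base, tuned, densities, weights):
    """Toy TIES merge for one parameter tensor (illustrative, not mergekit's code)."""
    deltas = []
    for ft, density, weight in zip(tuned, densities, weights):
        delta = ft - base                                   # task vector
        k = max(1, int(density * delta.numel()))
        thresh = delta.abs().flatten().topk(k).values.min()
        trimmed = torch.where(delta.abs() >= thresh, delta, torch.zeros_like(delta))
        deltas.append(weight * trimmed)                     # trim, then scale by weight
    stacked = torch.stack(deltas)
    elected = torch.sign(stacked.sum(dim=0))                # elect sign per parameter
    agree = torch.sign(stacked) == elected                  # drop disagreeing entries
    # normalize: false in the config below => plain weighted sum, no renormalization
    merged_delta = (stacked * agree).sum(dim=0)
    return base + merged_delta

# Demo with random tensors standing in for real model weights
base = torch.randn(4096)
tuned = [base + 0.01 * torch.randn(4096), base + 0.01 * torch.randn(4096)]
merged = ties_merge_tensor(base, tuned, densities=[0.25, 0.5], weights=[0.25, 0.5])
```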
The following models were included in the merge:

* spow12/ChatWaifu_12B_v2.0
* Gryphe/Pantheon-RP-1.6.1-12b-Nemo
The following YAML configuration was used to produce this model:

```yaml
models:
  - model: spow12/ChatWaifu_12B_v2.0
    parameters:
      density: 0.25
      weight: 0.25
  - model: Gryphe/Pantheon-RP-1.6.1-12b-Nemo
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: Gryphe/Pantheon-RP-1.6.1-12b-Nemo
parameters:
  normalize: false
  int8_mask: true
dtype: float16
```
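For reference, a config like this is typically saved to a YAML file and passed to mergekit's `mergekit-yaml` command-line entry point; the file name and output directory below are placeholders:

```sh
pip install mergekit
mergekit-yaml config.yaml ./output-model --cuda
```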
Detailed results from the Open LLM Leaderboard can be found here:
| Metric | Value |
|---|---|
| Avg. | 20.81 |
| IFEval (0-shot) | 26.83 |
| BBH (3-shot) | 36.74 |
| MATH Lvl 5 (4-shot) | 5.44 |
| GPQA (0-shot) | 9.06 |
| MuSR (0-shot) | 19.64 |
| MMLU-PRO (5-shot) | 27.14 |