---
base_model:
- IntervitensInc/Mistral-Nemo-Base-2407-chatml
- nbeerbower/mistral-nemo-bophades-12B
- nbeerbower/mistral-nemo-wissenschaft-12B
- elinas/Chronos-Gold-12B-1.0
- Fizzarolli/MN-12b-Sunrose
- nbeerbower/mistral-nemo-gutenberg-12B-v4
- anthracite-org/magnum-12b-v2.5-kto
library_name: transformers
tags:
- mergekit
- merge
---
![Made with NovelAI](https://huggingface.co/inflatebot/MN-12B-Mag-Mell-R1/resolve/main/magmell.png)
*[Welcome, brave one; you've come a long mile.](https://www.youtube.com/watch?v=dgGEuC1F3oE)*
# MN-12B-Mag-Mell-R1
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
[Official Q4_K_M, Q6_K and Q8_0 GGUFs by me](https://huggingface.co/inflatebot/MN-12B-Mag-Mell-R1-GGUF)
[More available from mradermacher](https://huggingface.co/mradermacher/MN-12B-Mag-Mell-R1-GGUF/tree/main)
[Official EXL2 by toastypigeon](https://huggingface.co/Alfitaria/MN-12B-Mag-Mell-R1-exl2)
## Usage Details
### Sampler Settings
Mag Mell R1 was tested with Temperature 1.25 and MinP 0.2. These settings were fairly stable up to 10K tokens of context, but may run too "hot" for some use cases.
If coherency issues occur, try *in*creasing MinP or *de*creasing Temperature.
Other samplers shouldn't be necessary. XTC was shown to break outputs. DRY should be okay if used sparingly; other penalty-type samplers should probably be avoided.
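For reference, a minimal sketch of these settings with `transformers` (the prompt is illustrative, and `min_p` requires a reasonably recent `transformers` release):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "inflatebot/MN-12B-Mag-Mell-R1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

# Build a ChatML prompt from the bundled chat template.
inputs = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Describe a city on the edge of the Otherworld."}],
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

output = model.generate(
    inputs,
    do_sample=True,
    temperature=1.25,  # tested value; decrease if coherency degrades
    min_p=0.2,         # tested value; increase if coherency degrades
    max_new_tokens=512,
)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```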
### Formatting
The base model for Mag Mell is [Mistral-Nemo-Base-2407-chatml](https://huggingface.co/IntervitensInc/Mistral-Nemo-Base-2407-chatml), and as such ChatML formatting is recommended.
Early testing versions had a tendency to leak tokens, but this should be more or less hammered out. It recently (12-18-2024) came to our attention that cache quantization may either cause or exacerbate this issue.
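For frontends that take a raw prompt string rather than a message list, the expected ChatML layout looks like this (the system message here is purely illustrative):
```python
# ChatML delimits each turn with <|im_start|> and <|im_end|> tokens.
prompt = (
    "<|im_start|>system\n"
    "You are a creative writing partner.<|im_end|>\n"
    "<|im_start|>user\n"
    "Continue the scene.<|im_end|>\n"
    "<|im_start|>assistant\n"  # generation begins here
)
```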
## Merge Details
Mag Mell is a multi-stage merge, inspired by hyper-merges like [Tiefighter](https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter) and [Umbral Mind](https://huggingface.co/Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B).
It is intended to be a general-purpose "Best of Nemo" model for any fictional, creative use case.
Six models were chosen across three categories; they were then paired up and merged via layer-weighted SLERP to create intermediate "specialists", each of which was evaluated in its domain (a sketch of the SLERP step follows the list below).
The specialists were then merged into the base via DARE-TIES, with hyperparameters chosen to reduce interference caused by the overlap of the three domains.
The idea with this approach is to extract the best qualities of each component and produce a model whose task vectors represent more than the sum of their parts.
The three specialists are as follows:
- Hero (RP, kink/trope coverage): [Chronos Gold](https://huggingface.co/elinas/Chronos-Gold-12B-1.0), [Sunrose](https://huggingface.co/Fizzarolli/MN-12b-Sunrose).
- Monk (Intelligence, groundedness): [Bophades](https://huggingface.co/nbeerbower/mistral-nemo-bophades-12B), [Wissenschaft](https://huggingface.co/nbeerbower/mistral-nemo-wissenschaft-12B).
- Deity (Prose, flair): [Gutenberg v4](https://huggingface.co/nbeerbower/mistral-nemo-gutenberg-12B-v4), [Magnum 2.5 KTO](https://huggingface.co/anthracite-org/magnum-12b-v2.5-kto).
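As a rough illustration of the SLERP step, here is a toy sketch of spherical interpolation between two weight tensors (this is not mergekit's actual implementation; mergekit works tensor by tensor and interpolates the `t` curve across layer blocks):
```python
import numpy as np

def slerp(a: np.ndarray, b: np.ndarray, t: float, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors."""
    a_dir = a.ravel() / (np.linalg.norm(a) + eps)
    b_dir = b.ravel() / (np.linalg.norm(b) + eps)
    theta = np.arccos(np.clip(np.dot(a_dir, b_dir), -1.0, 1.0))
    if theta < eps:  # nearly parallel: plain linear interpolation is fine
        return (1 - t) * a + t * b
    return (np.sin((1 - t) * theta) * a + np.sin(t * theta) * b) / np.sin(theta)
```
In the configs below, `t` rises toward the middle layers, so the second model contributes most mid-stack while the `base_model`'s outer layers stay largely intact.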
I've been dreaming about this merge since Nemo tunes started coming out in earnest. In our testing, Mag Mell demonstrated worldbuilding capabilities unlike any model in its class, comparable to old adventuring models like Tiefighter, and prose with minimal "slop" (not bad for no finetuning), frequently devising electrifying metaphors that left us consistently astonished.
I don't want to toot my own bugle, though; I'm really proud of how this came out, but please leave your feedback, good or bad.
Special thanks as usual to Toaster for his feedback and Fizz for helping fund compute, as well as the KoboldAI Discord for their resources.
### Merge Method
This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method using [IntervitensInc/Mistral-Nemo-Base-2407-chatml](https://huggingface.co/IntervitensInc/Mistral-Nemo-Base-2407-chatml) as a base.
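Conceptually, DARE randomly drops a fraction of each specialist's task vector (its delta from the base) and rescales the survivors, while TIES resolves sign conflicts between specialists before summing. A toy sketch, assuming flat weight arrays (mergekit's real implementation works per tensor and differs in details):
```python
import numpy as np

def dare(delta: np.ndarray, density: float, rng: np.random.Generator) -> np.ndarray:
    """Drop And REscale: keep a random `density` fraction of the delta,
    rescaling the survivors so the expected magnitude is preserved."""
    mask = rng.random(delta.shape) < density
    return delta * mask / density

def dare_ties(base, specialists, densities, weights, seed=0):
    rng = np.random.default_rng(seed)
    deltas = [w * dare(s - base, d, rng)
              for s, d, w in zip(specialists, densities, weights)]
    stacked = np.stack(deltas)
    # TIES sign election: keep only contributions that agree with the
    # dominant sign at each parameter, zeroing out interfering updates.
    elected = np.sign(stacked.sum(axis=0))
    agreeing = np.where(np.sign(stacked) == elected, stacked, 0.0)
    return base + agreeing.sum(axis=0)
```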
### Models Merged
The following models were included in the merge:
* IntervitensInc/Mistral-Nemo-Base-2407-chatml
* nbeerbower/mistral-nemo-bophades-12B
* nbeerbower/mistral-nemo-wissenschaft-12B
* elinas/Chronos-Gold-12B-1.0
* Fizzarolli/MN-12b-Sunrose
* nbeerbower/mistral-nemo-gutenberg-12B-v4
* anthracite-org/magnum-12b-v2.5-kto
### Configuration
The following YAML configurations were used to produce this model:
#### Monk:
```yaml
models:
  - model: nbeerbower/mistral-nemo-bophades-12B
  - model: nbeerbower/mistral-nemo-wissenschaft-12B
merge_method: slerp
base_model: nbeerbower/mistral-nemo-bophades-12B
parameters:
  t: [0.1, 0.2, 0.4, 0.6, 0.6, 0.4, 0.2, 0.1]
dtype: bfloat16
tokenizer_source: base
```
#### Hero:
```yaml
models:
  - model: elinas/Chronos-Gold-12B-1.0
  - model: Fizzarolli/MN-12b-Sunrose
merge_method: slerp
base_model: elinas/Chronos-Gold-12B-1.0
parameters:
  t: [0.1, 0.2, 0.4, 0.6, 0.6, 0.4, 0.2, 0.1]
dtype: bfloat16
tokenizer_source: base
```
#### Deity:
```yaml
models:
  - model: nbeerbower/mistral-nemo-gutenberg-12B-v4
  - model: anthracite-org/magnum-12b-v2.5-kto
merge_method: slerp
base_model: nbeerbower/mistral-nemo-gutenberg-12B-v4
parameters:
  t: [0, 0.1, 0.2, 0.25, 0.25, 0.2, 0.1, 0]
dtype: bfloat16
tokenizer_source: base
```
#### Mag Mell:
```yaml
models:
  - model: monk   # intermediate Monk SLERP merge above
    parameters:
      density: 0.7
      weight: 0.5
  - model: hero   # intermediate Hero SLERP merge above
    parameters:
      density: 0.9
      weight: 1
  - model: deity  # intermediate Deity SLERP merge above
    parameters:
      density: 0.5
      weight: 0.7
merge_method: dare_ties
base_model: IntervitensInc/Mistral-Nemo-Base-2407-chatml
tokenizer_source: base
```
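To reproduce the merge, each config can be run through mergekit in sequence. A sketch using mergekit's Python API (file names and output paths are illustrative; the API shown matches recent mergekit releases, so check the mergekit README for your version):
```python
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Run the three specialist configs first, then point the final Mag Mell
# config's monk/hero/deity model entries at the three output directories.
for config_path, out_path in [
    ("monk.yml", "./monk"),
    ("hero.yml", "./hero"),
    ("deity.yml", "./deity"),
    ("mag-mell.yml", "./MN-12B-Mag-Mell-R1"),
]:
    with open(config_path, encoding="utf-8") as f:
        config = MergeConfiguration.model_validate(yaml.safe_load(f))
    run_merge(config, out_path=out_path, options=MergeOptions(copy_tokenizer=True))
```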
`In Irish mythology, Mag Mell (modern spelling: Magh Meall, meaning 'delightful plain') is one of the names for the Celtic Otherworld, a mythical realm achievable through death and/or glory... Never explicitly stated in any surviving mythological account to be an afterlife; rather, it is usually portrayed as a paradise populated by deities, which is occasionally visited by some adventurous mortals. In its island guise, it was visited by various legendary Irish heroes and monks, forming the basis of the adventure myth or echtrae...`