---
base_model:
- aifeifei798/llama3-8B-DarkIdol-2.1-Uncensored-32K
- ChaoticNeutrals/Poppy_Porpoise-1.0-L3-8B
- Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
- Casual-Autopsy/Umbral-Mind-6
- ResplendentAI/Nymph_8B
library_name: transformers
tags:
- mergekit
- merge
---
To-do: Model card (insomnia sapped my motivation; if I haven't written it by 7/12 8:00 AM EST, start a discussion and I'll get the notification.)
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method, with [Casual-Autopsy/Umbral-Mind-6](https://huggingface.co/Casual-Autopsy/Umbral-Mind-6) as the base.
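Task arithmetic builds "task vectors" (the parameter deltas between each fine-tuned model and the base) and adds a weighted sum of them back onto the base. A minimal per-tensor sketch follows; `task_arithmetic_merge` is a hypothetical helper, not mergekit's API, and it uses one scalar weight per model for simplicity (mergekit itself also accepts per-layer weight lists, as in the config below):

```python
import torch

def task_arithmetic_merge(base, finetuned, weights):
    """Merge state dicts via task arithmetic:
    merged = base + sum_i w_i * (finetuned_i - base).

    base:      dict[str, torch.Tensor], the base model's state dict
    finetuned: list of state dicts, one per merged model
    weights:   list of floats, one scalar weight per merged model
    """
    merged = {}
    for name, base_t in base.items():
        # Task vector for each model is its delta from the base;
        # sum the weighted deltas and add them back onto the base tensor.
        delta = sum(w * (ft[name] - base_t) for ft, w in zip(finetuned, weights))
        merged[name] = base_t + delta
    return merged
```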
### Models Merged
The following models were included in the merge:
* [aifeifei798/llama3-8B-DarkIdol-2.1-Uncensored-32K](https://huggingface.co/aifeifei798/llama3-8B-DarkIdol-2.1-Uncensored-32K)
* [ChaoticNeutrals/Poppy_Porpoise-1.0-L3-8B](https://huggingface.co/ChaoticNeutrals/Poppy_Porpoise-1.0-L3-8B)
* [Nitral-AI/Hathor_Tahsin-L3-8B-v0.85](https://huggingface.co/Nitral-AI/Hathor_Tahsin-L3-8B-v0.85)
* [ResplendentAI/Nymph_8B](https://huggingface.co/ResplendentAI/Nymph_8B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Casual-Autopsy/Umbral-Mind-6
  - model: aifeifei798/llama3-8B-DarkIdol-2.1-Uncensored-32K
    parameters:
      weight: [0.02, -0.01, -0.01, 0.02]
  - model: ResplendentAI/Nymph_8B
    parameters:
      weight: [-0.01, 0.02, 0.02, -0.01]
  - model: ChaoticNeutrals/Poppy_Porpoise-1.0-L3-8B
    parameters:
      weight: [-0.01, 0.02, 0.02, -0.01]
  - model: Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
    parameters:
      weight: [0.02, -0.01, -0.01, 0.02]
merge_method: task_arithmetic
base_model: Casual-Autopsy/Umbral-Mind-6
parameters:
  normalize: false
dtype: bfloat16
```
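Note that a list-valued `weight` in mergekit is interpolated across the model's layers (a weight gradient), so each model's contribution varies with depth rather than being uniform. Assuming mergekit is installed, saving this configuration to a file (e.g. `config.yaml`) and running `mergekit-yaml config.yaml ./output-model-directory` should reproduce the merge.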