# merge
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method

This model was merged using the DARE TIES merge method, with Aratako/SniffyOtter-7B-Novel-Writing-NSFW as the base model.
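DARE TIES combines two ideas: DARE randomly drops a fraction of each finetuned model's delta weights (the difference from the base model) and rescales the survivors so the expected delta is unchanged, while TIES resolves sign conflicts between the remaining deltas before merging. The DARE step can be sketched as follows (a simplified, per-tensor illustration; mergekit's actual implementation differs in detail):

```python
import numpy as np

def dare_drop_and_rescale(delta: np.ndarray, density: float, rng) -> np.ndarray:
    """Drop delta weights with probability 1 - density; rescale survivors by 1/density."""
    mask = rng.random(delta.shape) < density   # keep each weight with prob = density
    return np.where(mask, delta / density, 0.0)

rng = np.random.default_rng(0)
delta = rng.normal(size=100_000)               # task vector: finetuned minus base
sparse = dare_drop_and_rescale(delta, density=0.8, rng=rng)

print(f"kept {np.count_nonzero(sparse) / sparse.size:.0%} of weights")
print(f"mean delta before: {delta.mean():+.4f}, after: {sparse.mean():+.4f}")
```

The `density` values in the configuration below control this keep probability per contributing model.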
### Models Merged
The following models were included in the merge:
- Severian/Nexus-IKM-RolePlay-StoryWriter-Hermes-2-Pro-7B
- Lexic0n/Mistral_7B-Open_Hermes-NSFWV1
- Undi95/LewdMistral-7B-0.2
- OmnicromsBrain/StoryFusion-7B
- diffnamehard/Mistral-CatMacaroni-slerp-uncensored-7B
- sayhan/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA
- Local-Novel-LLM-project/Ninja-v1-NSFW
- cognitivecomputations/samantha-mistral-7b
- teknium/Hermes-Trismegistus-Mistral-7B
- Local-Novel-LLM-project/Vecteus-Poet
- stmackcat/zepb-books-6-merged
- Norquinal/Mistral-7B-storywriter
### Configuration

The following YAML configuration was used to produce this model:
```yaml
models:
  # Personality & Pretuning 20%
  - model: cognitivecomputations/samantha-mistral-7b
    parameters:
      weight: 0.0667
      density: 0.6
  - model: stmackcat/zepb-books-6-merged
    parameters:
      weight: 0.0666
      density: 0.6
  - model: sayhan/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA
    parameters:
      weight: 0.0667
      density: 0.7
  # Additional Features & Uncensored Weights 30%
  - model: Local-Novel-LLM-project/Vecteus-Poet
    parameters:
      weight: 0.05
      density: 0.7
  - model: Local-Novel-LLM-project/Ninja-v1-NSFW
    parameters:
      weight: 0.05
      density: 0.7
  - model: Severian/Nexus-IKM-RolePlay-StoryWriter-Hermes-2-Pro-7B
    parameters:
      weight: 0.05
      density: 0.7
  - model: Lexic0n/Mistral_7B-Open_Hermes-NSFWV1
    parameters:
      weight: 0.05
      density: 0.7
  - model: Undi95/LewdMistral-7B-0.2
    parameters:
      weight: 0.05
      density: 0.8
  - model: teknium/Hermes-Trismegistus-Mistral-7B
    parameters:
      weight: 0.05
      density: 0.8
  # Model Base 50%
  - model: Norquinal/Mistral-7B-storywriter
    parameters:
      weight: 0.125
      density: 0.8
  - model: Aratako/SniffyOtter-7B-Novel-Writing-NSFW
    parameters:
      weight: 0.125
      density: 0.8
  - model: OmnicromsBrain/StoryFusion-7B
    parameters:
      weight: 0.125
      density: 0.8
  - model: diffnamehard/Mistral-CatMacaroni-slerp-uncensored-7B
    parameters:
      weight: 0.125
      density: 0.8
base_model: Aratako/SniffyOtter-7B-Novel-Writing-NSFW
merge_method: dare_ties
dtype: bfloat16
```
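The comment groups in the configuration (20% / 30% / 50%) can be sanity-checked with quick arithmetic on the per-model weights, which sum to 1.0 overall:

```python
# Weights copied from the config above, grouped as in its comments.
personality = [0.0667, 0.0666, 0.0667]   # Personality & Pretuning: 20%
features = [0.05] * 6                    # Additional Features & Uncensored Weights: 30%
base = [0.125] * 4                       # Model Base: 50%

for name, group in [("personality", personality), ("features", features), ("base", base)]:
    print(f"{name}: {sum(group):.4f}")
print(f"total: {sum(personality + features + base):.4f}")
```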