This... will either be my magnum opus... or terrible. No in-betweens!
Post-test verdict: It's mostly brain-damaged. Might be my settings or something, idk.
The `./output` model referenced below is my own merge, produced with the same recipe as Envoid/Mixtral-Instruct-ITR-8x7B.
This is a merge of pre-trained language models created using mergekit.
This model was merged using the DARE TIES merge method, with Envoid/Mixtral-Instruct-ITR-8x7B as the base.

The following models were included in the merge:

* ./output/+/ai/LLM/tmp/pefts/daybreak-peft/mixtral-8x7b
* Envoid/Mixtral-Instruct-ITR-8x7B+retrieval-bar/Mixtral-8x7B-v0.1_case-briefs
* Envoid/Mixtral-Instruct-ITR-8x7B+Doctor-Shotgun/limarp-zloss-mixtral-8x7b-qlora
* NeverSleep/Noromaid-v0.4-Mixtral-Instruct-8x7b-Zloss
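For context on the parameters in the config below: `density` is the DARE retention rate, i.e. the fraction of each model's delta (fine-tuned minus base) parameters that survive random pruning, with survivors rescaled by `1/density`; `weight` scales each model's contribution before the TIES sign election. Here is a minimal per-tensor sketch of the idea; this is a conceptual illustration under those assumptions, not mergekit's actual implementation, and it omits mergekit's optional weight renormalization:

```python
import torch

def dare_ties_merge(base, deltas, densities, weights):
    """Conceptual per-tensor sketch of DARE TIES.

    base:      the base model's parameter tensor
    deltas:    list of (finetuned - base) tensors, one per model
    densities: per-model fraction of delta entries to keep (DARE)
    weights:   per-model merge weights
    """
    pruned = []
    for delta, density, weight in zip(deltas, densities, weights):
        # DARE: randomly drop (1 - density) of the delta entries,
        # then rescale survivors by 1/density so the expected
        # magnitude of the update is preserved.
        mask = torch.rand_like(delta) < density
        pruned.append(weight * mask * delta / density)

    # TIES: elect a per-parameter sign from the weighted sum,
    # then keep only the contributions that agree with it.
    elected_sign = torch.stack(pruned).sum(dim=0).sign()
    agreed = [p * (p.sign() == elected_sign) for p in pruned]
    return base + torch.stack(agreed).sum(dim=0)
```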
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: ./output/+/ai/LLM/tmp/pefts/daybreak-peft/mixtral-8x7b
    parameters:
      density: 0.66
      weight: 1.0
  - model: Envoid/Mixtral-Instruct-ITR-8x7B+retrieval-bar/Mixtral-8x7B-v0.1_case-briefs
    parameters:
      density: 0.1
      weight: 0.25
  - model: Envoid/Mixtral-Instruct-ITR-8x7B+Doctor-Shotgun/limarp-zloss-mixtral-8x7b-qlora
    parameters:
      density: 0.66
      weight: 0.5
  - model: NeverSleep/Noromaid-v0.4-Mixtral-Instruct-8x7b-Zloss
    parameters:
      density: 0.15
      weight: 0.3
merge_method: dare_ties
base_model: Envoid/Mixtral-Instruct-ITR-8x7B
dtype: float16
```
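Assuming this config is saved as `config.yml` (a hypothetical filename), the merge can be reproduced with mergekit's CLI via `mergekit-yaml config.yml ./merged-model`, provided the local `./output` model and the referenced LoRA adapters are available at those paths.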