Athlon-8B-0.1 / mergekit_config.yml
models:
  - model: Sao10K/Llama-3.1-8B-Stheno-v3.4
    parameters:
      weight: 0.25
      density: 0.84
  - model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.4
      density: 0.88
  - model: SicariusSicariiStuff/Dusk_Rainbow
    parameters:
      weight: 0.6
      density: 0.9
merge_method: della_linear
base_model: aifeifei798/DarkIdol-Llama-3.1-8B-Instruct-1.2-Uncensored
parameters:
  int8_mask: true
  epsilon: 0.07
  lambda: 1
dtype: bfloat16
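
Below is a minimal sketch of how a config like this is applied with mergekit's Python API (MergeConfiguration, MergeOptions, run_merge), assuming the interface shown in the mergekit README; the CLI equivalent would be the mergekit-yaml command pointed at this file. The output path and option values are illustrative assumptions, not part of this repository.

# Sketch: load mergekit_config.yml and run the della_linear merge locally.
# Paths and option values are assumptions for illustration only.
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("mergekit_config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Athlon-8B-0.1",          # hypothetical output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is present
        copy_tokenizer=True,             # copy the base model's tokenizer to the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)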