Llama-3-8B-combo-merge / mergekit_config.yml
models:
  - model: DavidAhn/Llama-3-8B-slerp-262k-SauerkrautLM
    # no parameters necessary for base model
  - model: pankajmathur/orca_mini_v5_8b_dpo
    parameters:
      density: 0.5
      weight: 0.5
  - model: mlabonne/Daredevil-8B-abliterated
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: DavidAhn/Llama-3-8B-slerp-262k-SauerkrautLM
parameters:
  normalize: true
dtype: float16
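
A config like this is consumed by mergekit's `mergekit-yaml` command. The sketch below assumes mergekit is installed (e.g. via `pip install mergekit`) and that the output directory name is arbitrary; the `--cuda` flag is optional and only useful on a GPU machine.

```shell
# Install mergekit (assumed environment; a virtualenv is recommended)
pip install mergekit

# Run the TIES merge described by this config.
# ./merged-model is a hypothetical output path; any writable directory works.
mergekit-yaml mergekit_config.yml ./merged-model --cuda --copy-tokenizer
```

The three listed models are merged with the TIES method: each non-base model keeps `density` (here 50%) of its delta weights relative to the base model, scaled by its `weight`, and `normalize: true` rescales the combined task vectors so the weights need not sum to 1.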