# Marcoro14-7B-slerp
Marcoro14-7B-slerp is a merge of the following models using [mergekit](https://github.com/arcee-ai/mergekit):

* [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B)
* [mistralai/Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1)
## 🧩 Configuration
```yaml
models:
  - model: meta-llama/Meta-Llama-3-8B
    # no parameters necessary for base model
  - model: mistralai/Mistral-7B-Instruct-v0.1
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: meta-llama/Meta-Llama-3-8B
parameters:
  normalize: true
dtype: float16
```
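
To reproduce the merge locally, the configuration above can be saved to a file and passed to mergekit's `mergekit-yaml` command. The sketch below assumes mergekit is installed and that the config has been saved as `config.yaml`; the output directory name is a placeholder. Note that TIES builds the merged weights from task vectors relative to `base_model`, so the listed models are expected to share that base architecture.

```python
import subprocess

# Assumes the configuration above has been saved as config.yaml and that
# mergekit is installed, which provides the mergekit-yaml entry point.
output_dir = "./Marcoro14-7B-merge"  # hypothetical output path

# --copy-tokenizer copies the base model's tokenizer into the output directory.
subprocess.run(
    ["mergekit-yaml", "config.yaml", output_dir, "--copy-tokenizer"],
    check=True,
)
```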
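
Once the merge has been written out, the resulting checkpoint loads like any other causal language model. The snippet below is a minimal sketch using the `transformers` text-generation pipeline; the local path and sampling settings are illustrative placeholders, not part of the original card.

```python
import torch
from transformers import pipeline

# Hypothetical path: wherever mergekit wrote the merged model above,
# or a Hugging Face Hub repo id if the merge has been uploaded.
model_path = "./Marcoro14-7B-merge"

generator = pipeline(
    "text-generation",
    model=model_path,
    torch_dtype=torch.float16,  # matches the dtype in the merge config
    device_map="auto",
)

prompt = "What is a large language model?"
output = generator(prompt, max_new_tokens=128, do_sample=True, temperature=0.7)
print(output[0]["generated_text"])
```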