models:
  - model: cognitivecomputations/dolphin-2.8-mistral-7b-v02
  - model: NousResearch/Meta-Llama-3-8B-Instruct
merge_method: slerp
base_model: NousResearch/Meta-Llama-3-8B-Instruct
dtype: bfloat16
parameters:
  t: [0, 0.5, 1, 0.5, 0] # V-shaped curve: the Llama-3 base at the input & output layers, Dolphin in the middle layers
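To make the `t` schedule concrete, here is a minimal Python sketch, not mergekit's actual implementation, of how a five-anchor V-shaped curve could be stretched across a model's layers and fed into spherical linear interpolation (slerp). The `interpolate_t` helper and the 32-layer count are illustrative assumptions, not part of the config above.

```python
# Sketch only: illustrates the V-shaped t schedule, not mergekit's real code.
import numpy as np

def interpolate_t(anchors, num_layers):
    """Stretch the anchor values across all layers, linearly
    interpolating between anchors so each layer gets its own t."""
    anchor_pos = np.linspace(0.0, 1.0, num=len(anchors))
    layer_pos = np.linspace(0.0, 1.0, num=num_layers)
    return np.interp(layer_pos, anchor_pos, anchors)

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two flattened weight
    tensors. t=0 returns a (the base model), t=1 returns b."""
    a_n = a / (np.linalg.norm(a) + eps)
    b_n = b / (np.linalg.norm(b) + eps)
    omega = np.arccos(np.clip(np.dot(a_n, b_n), -1.0, 1.0))
    if omega < eps:  # nearly parallel: fall back to plain lerp
        return (1 - t) * a + t * b
    so = np.sin(omega)
    return (np.sin((1 - t) * omega) / so) * a + (np.sin(t * omega) / so) * b

# Per-layer t for an assumed 32-layer model using the config's anchors:
ts = interpolate_t([0, 0.5, 1, 0.5, 0], num_layers=32)
print(ts[0], ts[15], ts[31])  # ~0 at the ends, ~1 near the middle
```

With `t = 0` at both ends and `t = 1` in the middle, the merged model keeps the base model's weights at the input and output layers and takes the other model's weights in the middle layers. To actually run a merge from a config file like this one, mergekit provides the `mergekit-yaml` command, e.g. `mergekit-yaml config.yaml ./merged-model --copy-tokenizer`.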