models:
  - model: Gille/StrangeMerges_46-7B-dare_ties
    parameters:
      weight: 0.4
      density: 0.53
  - model: AurelPx/Percival_01-7b-slerp
    parameters:
      weight: 0.4
      density: 0.53
  - model: kaist-ai/mistral-orpo-beta
    parameters:
      weight: 0.2
      density: 0.53
base_model: kettleguts/zephyr-7b-beta_sparse05
merge_method: dare_ties
dtype: bfloat16