models:
  - model: togethercomputer/LLaMA-2-7B-32K
    parameters:
      weight: 1.0
  - model: vibhorag101/llama-2-7b-chat-hf-phr_mental_therapy
    parameters:
      weight: 0.3
  - model: princeton-nlp/SWE-Llama-7b
    parameters:
      weight: 0.5
merge_method: linear
dtype: float16