slumber-7b / mergekit_config.yml
models:
  - model: togethercomputer/LLaMA-2-7B-32K
    parameters:
      weight: 1.0
  - model: vibhorag101/llama-2-7b-chat-hf-phr_mental_therapy
    parameters:
      weight: 0.3
  - model: princeton-nlp/SWE-Llama-7b
    parameters:
      weight: 0.5
merge_method: linear
dtype: float16
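For context, the linear merge method averages the three checkpoints' parameters using the weights above. The sketch below is a conceptual illustration of that weighted average on plain PyTorch state dicts, not mergekit's actual implementation; the normalize flag and the float16 cast mirror common mergekit defaults and the dtype in this config, and the function name is hypothetical.

import torch

def linear_merge(state_dicts, weights, normalize=True):
    """Weighted average of matching tensors across several state dicts.

    Conceptual sketch of a linear merge; mergekit itself also handles
    tokenizer/vocab mismatches, sharded checkpoints, and lazy loading.
    """
    total = sum(weights)
    # With normalization the effective weights sum to 1.0.
    scale = [w / total for w in weights] if normalize else list(weights)
    merged = {}
    for key in state_dicts[0]:
        # Accumulate in float32 for stability, then cast to float16
        # to match "dtype: float16" in the config above.
        acc = sum(s * sd[key].to(torch.float32)
                  for s, sd in zip(scale, state_dicts))
        merged[key] = acc.to(torch.float16)
    return merged

In practice the config is run with mergekit's command-line tooling rather than code like this; the sketch only shows what "merge_method: linear" with weights 1.0, 0.3, and 0.5 computes per tensor.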