Llama-3-8B-Ultra-Instruct-exl2 / mergekit_config.yml
Quant for 4.25
base_model: Undi95/Meta-Llama-3-8B-Instruct-hf
dtype: bfloat16
merge_method: dare_ties
slices:
- sources:
  - layer_range: [0, 32]
    model: llama-3-8B-ultra-instruct/RPPart
    parameters:
      weight: 0.39
  - layer_range: [0, 32]
    model: llama-3-8B-ultra-instruct/InstructPart
    parameters:
      weight: 0.26
  - layer_range: [0, 32]
    model: Undi95/Meta-Llama-3-8B-Instruct-hf
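
A config like this is applied with mergekit's `mergekit-yaml` entry point. The sketch below assumes mergekit is installed (`pip install mergekit`) and that the config above is saved as `mergekit_config.yml`; the output directory name is illustrative.

```shell
# Run the DARE-TIES merge described by the config above.
# --cuda uses a GPU if available; --lazy-unpickle reduces peak RAM.
mergekit-yaml mergekit_config.yml ./Llama-3-8B-Ultra-Instruct \
  --cuda --lazy-unpickle
```

With `merge_method: dare_ties`, the `weight` parameters (0.39 and 0.26 here) scale each donor model's sparsified task vector relative to the `base_model` before the sign-consensus merge.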