Rombos-LLM-70b-Llama-3.3-exl2 / mergekit_config.yml
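# TIES merge config: Llama-3.3-70B-Instruct is merged onto a Llama-3.1-70B base,
# keeping the full task vector (weight 1, density 1), with normalization enabled
# and the merged weights emitted in bfloat16.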
models:
  - model: ./mergekit/models/3.3-70B-Instruct
    parameters:
      weight: 1
      density: 1
merge_method: ties
base_model: ./mergekit/models/3.1-70b-base
parameters:
  weight: 1
  density: 1
  normalize: true
  int8_mask: false
dtype: bfloat16
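# A config like this is typically applied with mergekit's CLI; the output
# directory below is illustrative, not part of this repository:
#   mergekit-yaml mergekit_config.yml ./merged-model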