Llama-3.1-70B-Instruct-lorablated / mergekit_config.yml
# Apply the abliteration LoRA on top of Llama 3.1 70B Instruct (the "+" in the
# path tells mergekit to attach the adapter to the base checkpoint) and
# materialize the result as a full model via a task arithmetic merge.
base_model: ./meta-llama/Meta-Llama-3.1-70B-Instruct+Llama-3-70B-Instruct-abliterated-LORA
dtype: bfloat16
merge_method: task_arithmetic
parameters:
  normalize: false
slices:
  - sources:
      # All 80 decoder layers of the 70B model, taken from the LoRA-patched
      # checkpoint at full weight.
      - layer_range: [0, 80]
        model: ./meta-llama/Meta-Llama-3.1-70B-Instruct+Llama-3-70B-Instruct-abliterated-LORA
        parameters:
          weight: 1.0
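
For context, a config like this is normally executed with mergekit. The sketch below is a minimal, non-authoritative example of driving the merge from Python, assuming mergekit's Python entry points (MergeConfiguration, MergeOptions, run_merge); the output directory and LoRA cache path are placeholders, and the mergekit-yaml command-line tool can be used instead of this script.

import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "mergekit_config.yml"   # this file
OUTPUT_PATH = "./merged"             # placeholder output directory

# Parse the YAML above into mergekit's configuration object.
with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the task arithmetic merge; the LoRA referenced with "+" is applied to
# the base checkpoint before merging.
run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        lora_merge_cache="/tmp",   # placeholder cache for the merged LoRA
        cuda=True,                 # set False to merge on CPU
        copy_tokenizer=True,
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)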