llama-3-8b-slow-DUS-method-1 / mergekit_config.yml
slices:
  - sources:
      - model: ryan0712/llama-3-8b-DUS-initialized
        layer_range: [0, 21]
  - sources:
      - model: ryan0712/llama-3-8b-slow-DUS-layer-SLERP
        layer_range: [0, 1]
  - sources:
      - model: ryan0712/llama-3-8b-DUS-initialized
        layer_range: [21, 48]
merge_method: passthrough
dtype: bfloat16
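For context, a `passthrough` merge simply concatenates the listed layer ranges into one deeper model. Assuming mergekit's usual half-open `layer_range` convention (`[start, end)`), a quick sketch of the resulting depth:

```python
# Layer ranges copied from the config above, treated as half-open [start, end).
spans = [(0, 21), (0, 1), (21, 48)]

# Passthrough stacks the slices in order, so depths simply add up.
total_layers = sum(end - start for start, end in spans)
print(total_layers)  # 21 + 1 + 27 = 49
```

So this config expands the 48-layer base stack to 49 layers by splicing in a single SLERP-merged layer at position 21.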