Overthinker-Eileithyia-13B / mergekit_config.yml
base_model: athirdpath/Eileithyia-13B   # t=0 endpoint of the interpolation
dtype: float16
merge_method: slerp                     # spherical linear interpolation of weights
parameters:
  t:                                    # interpolation factor: 0 = base model, 1 = the other model
  - filter: self_attn                   # gradient across the layer range for attention tensors
    value: [0.22, 0.61, 0.46, 0.77, 1.0]
  - filter: mlp                         # mirror of the self_attn schedule, for MLP tensors
    value: [0.78, 0.39, 0.54, 0.23, 0.0]
  - value: 0.5                          # default t for all remaining tensors
slices:
- sources:
  - layer_range: [0, 32]                # layers 0-31
    model: FPHam/Sydney_Overthinker_13b_HF
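
For reference, merge_method: slerp spherically interpolates each pair of weight tensors, and the five-element value lists act as anchor points spread linearly across the 32-layer range so each layer gets its own t. The sketch below illustrates both ideas under those assumptions; slerp and t_for_layer are helpers written for this note, not mergekit's actual implementation or API.

import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0 (the base model); t=1 returns v1 (the other model).
    """
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.sum(v0_n * v1_n), -1.0, 1.0)
    if abs(dot) > 0.9995:        # nearly colinear: plain lerp is numerically safer
        return (1 - t) * v0 + t * v1
    omega = np.arccos(dot)       # angle between the two weight directions
    s0 = np.sin((1 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1

def t_for_layer(layer, num_layers, anchors):
    """Linearly interpolate the anchor list to a per-layer t (hypothetical helper)."""
    pos = layer / (num_layers - 1) * (len(anchors) - 1)
    lo = int(np.floor(pos))
    hi = min(lo + 1, len(anchors) - 1)
    frac = pos - lo
    return (1 - frac) * anchors[lo] + frac * anchors[hi]

print(t_for_layer(0, 32, [0.22, 0.61, 0.46, 0.77, 1.0]))   # 0.22
print(t_for_layer(31, 32, [0.22, 0.61, 0.46, 0.77, 1.0]))  # 1.0

Under this schedule the earliest self_attn layers stay close to the Eileithyia base (t = 0.22) while the final layers take their attention weights almost entirely from Sydney_Overthinker (t = 1.0); the mlp schedule runs in the opposite direction, so neither donor dominates the whole stack.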