
# Forbin_13B_M1_SLERP

Forbin_13B_M1_SLERP is a SLERP merge of the following models using [mergekit](https://github.com/arcee-ai/mergekit):
* [yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B](https://huggingface.co/yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B)
* [zhengr/MixTAO-7Bx2-MoE-v8.1](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1)

## 🧩 Configuration

```yaml
slices:
  - sources:
      - model: yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
        layer_range: [0, 32]
      - model: zhengr/MixTAO-7Bx2-MoE-v8.1
        layer_range: [0, 32]
merge_method: slerp
base_model: zhengr/MixTAO-7Bx2-MoE-v8.1
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
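For context, `merge_method: slerp` blends each pair of tensors along the unit hypersphere rather than along a straight line, which preserves weight magnitudes better than plain averaging. The `t` parameter controls the mix (`t: 0` keeps the base model's weights, `t: 1` the other model's), and each `value` list gives anchor points that mergekit interpolates into a per-layer gradient, so here the attention and MLP blocks follow opposite curves. A minimal sketch of the underlying operation on two PyTorch tensors (illustrative only, not mergekit's exact implementation):

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors (sketch)."""
    a_f, b_f = a.flatten().float(), b.flatten().float()
    # Angle between the two weight vectors on the unit hypersphere
    cos_omega = torch.clamp((a_f @ b_f) / (a_f.norm() * b_f.norm() + eps), -1.0, 1.0)
    omega = torch.acos(cos_omega)
    so = torch.sin(omega)
    if so.abs() < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        mixed = (1.0 - t) * a_f + t * b_f
    else:
        mixed = (torch.sin((1.0 - t) * omega) / so) * a_f \
              + (torch.sin(t * omega) / so) * b_f
    return mixed.reshape(a.shape).to(a.dtype)
```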
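## 💻 Usage

A minimal inference sketch with 🤗 Transformers; the `model_id` below is a placeholder, so substitute the actual repository path for this merge:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Forbin_13B_M1_SLERP"  # placeholder; replace with the real repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge's dtype
    device_map="auto",
)

prompt = "Explain spherical linear interpolation in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```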