---
license: apache-2.0
tags:
  - merge
  - mergekit
---

# Marcoro14-7B-slerp

Marcoro14-7B-slerp is a merge of the following models, made with [mergekit](https://github.com/cg123/mergekit):

* [AIDC-ai-business/Marcoroni-7B-v3](https://huggingface.co/AIDC-ai-business/Marcoroni-7B-v3)
* [EmbeddedLLM/Mistral-7B-Merge-14-v0.1](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.1)

## 🧩 Configuration

```yaml
slices:
  - sources:
      - model: AIDC-ai-business/Marcoroni-7B-v3
        layer_range: [0, 32]
      - model: EmbeddedLLM/Mistral-7B-Merge-14-v0.1
        layer_range: [0, 32]
merge_method: slerp
base_model: AIDC-ai-business/Marcoroni-7B-v3
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```

Here `t` is the spherical interpolation factor between the two models (`t = 0` keeps the base model's weights, `t = 1` the other model's), with separate per-layer schedules for the self-attention and MLP weights and a default of 0.5 for everything else. The merge can be reproduced by saving this configuration as `config.yaml` and running mergekit's CLI, e.g. `mergekit-yaml config.yaml ./merged` (assuming mergekit is installed).
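
## 💻 Usage

A minimal sketch for loading the merged model with 🤗 Transformers. The repo id `shadowml/Marcoro14-7B-slerp` and the prompt are illustrative assumptions; `device_map="auto"` additionally requires the `accelerate` package.

```python
# Minimal usage sketch: load the merged model and generate a completion.
# The repo id below is an assumption based on this card's name; adjust it
# if the weights live elsewhere on the Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "shadowml/Marcoro14-7B-slerp"  # assumed Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype above
    device_map="auto",           # requires accelerate
)

prompt = "Explain what a SLERP model merge is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```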