---
license: apache-2.0
tags:
  - merge
  - mergekit
  - mlabonne/AlphaMonarch-7B
  - bardsai/jaskier-7b-dpo-v5.6
---

# neurotic-crown-clown-7B-ties

neurotic-crown-clown-7B-ties is a TRIM, ELECT SIGN & MERGE (TIES) merge of the following models using [mergekit](https://github.com/cg123/mergekit), with [mlabonne/NeuralMonarch-7B](https://huggingface.co/mlabonne/NeuralMonarch-7B) as the base model:

* [mlabonne/AlphaMonarch-7B](https://huggingface.co/mlabonne/AlphaMonarch-7B)
* [bardsai/jaskier-7b-dpo-v5.6](https://huggingface.co/bardsai/jaskier-7b-dpo-v5.6)

See the paper [TIES-Merging: Resolving Interference When Merging Models](https://arxiv.org/abs/2306.01708) for more on the method.
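To make the three TIES steps concrete, here is a toy 1-D numpy sketch of the idea from the paper (this is an illustrative simplification, not mergekit's actual implementation): each fine-tuned model's delta from the base is trimmed to its top-`density` fraction of entries by magnitude, a per-parameter sign is elected by weighted majority, and only the entries agreeing with that sign are averaged back onto the base.

```python
import numpy as np

def ties_merge(base, finetuned, densities, weights, normalize=True):
    """Toy 1-D sketch of TIES: trim, elect sign, disjoint merge."""
    # Task vectors: each fine-tuned model's delta from the base
    taus = [ft - base for ft in finetuned]

    # TRIM: keep only the top `density` fraction of entries by magnitude
    trimmed = []
    for tau, density in zip(taus, densities):
        k = int(round(density * tau.size))
        if k == 0:
            trimmed.append(np.zeros_like(tau))
            continue
        thresh = np.sort(np.abs(tau))[-k]
        trimmed.append(np.where(np.abs(tau) >= thresh, tau, 0.0))

    # ELECT SIGN: per-parameter majority sign, weighted by contribution size
    stacked = np.stack([w * t for w, t in zip(weights, trimmed)])
    elected = np.sign(stacked.sum(axis=0))

    # MERGE: combine only the entries that agree with the elected sign
    agree = (np.sign(stacked) == elected) & (stacked != 0)
    total = np.where(agree, stacked, 0.0).sum(axis=0)
    if normalize:
        # normalize: true -> rescale by the summed weights of agreeing models
        denom = (agree * np.asarray(weights)[:, None]).sum(axis=0)
        delta = np.divide(total, denom, out=np.zeros_like(total),
                          where=denom > 0)
    else:
        delta = total
    return base + delta
```

With `density: 0.5` and weights of 0.5 and 0.3 (as in the configuration below), a parameter where the two trimmed deltas disagree in sign keeps only the majority-sign contribution, which is what lets TIES reduce interference between merged models.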

## 🧩 Configuration

```yaml
models:
  - model: mlabonne/NeuralMonarch-7B
    # no parameters necessary for base model
  - model: mlabonne/AlphaMonarch-7B
    parameters:
      density: 0.5
      weight: 0.5
  - model: bardsai/jaskier-7b-dpo-v5.6
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: mlabonne/NeuralMonarch-7B
parameters:
  normalize: true
dtype: float16
```
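A merge like this can be reproduced by saving the configuration to a file and invoking mergekit's CLI. A minimal sketch (the file name, output directory, and `--copy-tokenizer` flag are illustrative assumptions; the merge step is left commented out because it downloads all three models):

```shell
# Save the merge configuration (same as the YAML above) to a file
cat > ties-config.yml <<'EOF'
models:
  - model: mlabonne/NeuralMonarch-7B
    # no parameters necessary for base model
  - model: mlabonne/AlphaMonarch-7B
    parameters:
      density: 0.5
      weight: 0.5
  - model: bardsai/jaskier-7b-dpo-v5.6
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: mlabonne/NeuralMonarch-7B
parameters:
  normalize: true
dtype: float16
EOF

# Run the merge (requires `pip install mergekit`); uncomment to execute:
# mergekit-yaml ties-config.yml ./neurotic-crown-clown-7B-ties --copy-tokenizer
```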