---
base_model: []
library_name: transformers
tags:
  - mergekit
  - merge
---

# mergeduucsxwdd

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with `/workspace/mergekit/evol_storage/input_models/SeQwence-14Bv1_904318297` as the base model.
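DARE sparsifies each source model's delta (its difference from the base model) by randomly dropping a fraction of entries, controlled by `density`, and rescaling the survivors so the expected update is unchanged; TIES-style sign election then combines the sparsified deltas. A minimal NumPy sketch of the drop-and-rescale step (a conceptual illustration, not mergekit's actual implementation):

```python
import numpy as np

def dare_drop_and_rescale(delta, density, rng):
    """Keep each entry of `delta` with probability `density`, zero the rest,
    and rescale survivors by 1/density so E[output] == delta."""
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

rng = np.random.default_rng(0)
delta = np.array([1.0, -2.0, 3.0, 4.0])

kept = dare_drop_and_rescale(delta, 1.0, rng)    # density 1.0 keeps everything
sparse = dare_drop_and_rescale(delta, 0.5, rng)  # ~half the entries survive, doubled
```

In the configuration below, several models get `density: 1.0` in some slices, meaning their deltas pass through this step unmodified.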

### Models Merged

The following models were included in the merge:

* /workspace/mergekit/evol_storage/input_models/Qwen2.5-14B-Wernicke_200578629
* /workspace/mergekit/evol_storage/input_models/Qwen2.5-14B-MegaMerge-pt2_1305659297

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: /workspace/mergekit/evol_storage/input_models/SeQwence-14Bv1_904318297
dtype: bfloat16
merge_method: dare_ties
parameters:
  int8_mask: 1.0
  normalize: 1.0
slices:
- sources:
  - layer_range: [0, 8]
    model: /workspace/mergekit/evol_storage/input_models/SeQwence-14Bv1_904318297
    parameters:
      density: 0.9161252112124398
      weight: 0.42684262949748975
  - layer_range: [0, 8]
    model: /workspace/mergekit/evol_storage/input_models/Qwen2.5-14B-MegaMerge-pt2_1305659297
    parameters:
      density: 1.0
      weight: -0.07053838122379857
  - layer_range: [0, 8]
    model: /workspace/mergekit/evol_storage/input_models/Qwen2.5-14B-Wernicke_200578629
    parameters:
      density: 1.0
      weight: 0.4849672857861172
- sources:
  - layer_range: [8, 16]
    model: /workspace/mergekit/evol_storage/input_models/SeQwence-14Bv1_904318297
    parameters:
      density: 0.9761877279283722
      weight: 0.3012299838842449
  - layer_range: [8, 16]
    model: /workspace/mergekit/evol_storage/input_models/Qwen2.5-14B-MegaMerge-pt2_1305659297
    parameters:
      density: 0.7543807288499863
      weight: 0.1659536938628454
  - layer_range: [8, 16]
    model: /workspace/mergekit/evol_storage/input_models/Qwen2.5-14B-Wernicke_200578629
    parameters:
      density: 0.9492212435976857
      weight: 0.18099531810782998
- sources:
  - layer_range: [16, 24]
    model: /workspace/mergekit/evol_storage/input_models/SeQwence-14Bv1_904318297
    parameters:
      density: 1.0
      weight: 0.41688362670746804
  - layer_range: [16, 24]
    model: /workspace/mergekit/evol_storage/input_models/Qwen2.5-14B-MegaMerge-pt2_1305659297
    parameters:
      density: 1.0
      weight: 0.6151231122002006
  - layer_range: [16, 24]
    model: /workspace/mergekit/evol_storage/input_models/Qwen2.5-14B-Wernicke_200578629
    parameters:
      density: 0.8807907336263985
      weight: 0.16825198717141238
- sources:
  - layer_range: [24, 32]
    model: /workspace/mergekit/evol_storage/input_models/SeQwence-14Bv1_904318297
    parameters:
      density: 0.9156909340084007
      weight: 0.5326292576896499
  - layer_range: [24, 32]
    model: /workspace/mergekit/evol_storage/input_models/Qwen2.5-14B-MegaMerge-pt2_1305659297
    parameters:
      density: 1.0
      weight: 0.30104879006559454
  - layer_range: [24, 32]
    model: /workspace/mergekit/evol_storage/input_models/Qwen2.5-14B-Wernicke_200578629
    parameters:
      density: 0.7478687683572947
      weight: 0.24780060115002012
- sources:
  - layer_range: [32, 40]
    model: /workspace/mergekit/evol_storage/input_models/SeQwence-14Bv1_904318297
    parameters:
      density: 0.7970703509952921
      weight: 0.36719110728431165
  - layer_range: [32, 40]
    model: /workspace/mergekit/evol_storage/input_models/Qwen2.5-14B-MegaMerge-pt2_1305659297
    parameters:
      density: 0.9371578433786855
      weight: 0.3915882680301583
  - layer_range: [32, 40]
    model: /workspace/mergekit/evol_storage/input_models/Qwen2.5-14B-Wernicke_200578629
    parameters:
      density: 0.625508665655407
      weight: 0.5087374040978708
- sources:
  - layer_range: [40, 48]
    model: /workspace/mergekit/evol_storage/input_models/SeQwence-14Bv1_904318297
    parameters:
      density: 0.844183840774721
      weight: 0.5421089451732982
  - layer_range: [40, 48]
    model: /workspace/mergekit/evol_storage/input_models/Qwen2.5-14B-MegaMerge-pt2_1305659297
    parameters:
      density: 0.961154402903748
      weight: 0.30023482616059033
  - layer_range: [40, 48]
    model: /workspace/mergekit/evol_storage/input_models/Qwen2.5-14B-Wernicke_200578629
    parameters:
      density: 0.8135033850596236
      weight: 0.3232815470411797
```
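Because `normalize: 1.0` is set, mergekit rescales the per-model weights within each slice (to my understanding, by dividing through by their sum) before combining deltas. The raw weight totals per slice can be summarized with a short script; the numbers are copied from the configuration above, and `slice_weights` is just an illustrative variable name:

```python
# Raw per-model weights (SeQwence, MegaMerge-pt2, Wernicke) for each layer slice,
# copied from the YAML configuration above.
slice_weights = {
    "layers 0-8":   [0.42684262949748975, -0.07053838122379857, 0.4849672857861172],
    "layers 8-16":  [0.3012299838842449, 0.1659536938628454, 0.18099531810782998],
    "layers 16-24": [0.41688362670746804, 0.6151231122002006, 0.16825198717141238],
    "layers 24-32": [0.5326292576896499, 0.30104879006559454, 0.24780060115002012],
    "layers 32-40": [0.36719110728431165, 0.3915882680301583, 0.5087374040978708],
    "layers 40-48": [0.5421089451732982, 0.30023482616059033, 0.3232815470411797],
}

# With normalize enabled, each slice's weights are rescaled relative to this total,
# so the sums show how the raw weight mass varies across depth.
totals = {name: sum(ws) for name, ws in slice_weights.items()}
for name, total in totals.items():
    print(f"{name}: raw weight sum = {total:.3f}")
```

Note the small negative weight on MegaMerge-pt2 in the first slice, which subtracts a fraction of that model's delta rather than adding it; this kind of configuration is typical of evolutionary merge searches, where weights are optimized numerically rather than chosen by hand.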