---
base_model:
  - ycros/BagelMIsteryTour-v2-8x7B
  - smelborp/MixtralOrochi8x7B
  - cognitivecomputations/dolphin-2.7-mixtral-8x7b
library_name: transformers
tags:
  - mergekit
  - merge
---

# maid-yuzu-v7

This is a merge of pre-trained language models created using mergekit.

I don't know much about merges, so this may be a naive method, but I was curious how the models would turn out if merged this way.

## Merge Details

### Merge Method

This model was merged using the SLERP merge method.

This model was built in two steps: smelborp/MixtralOrochi8x7B was first merged with cognitivecomputations/dolphin-2.7-mixtral-8x7b using SLERP with t = 0.15 (producing maid-yuzu-v7-base), and that intermediate model was then merged with ycros/BagelMIsteryTour-v2-8x7B using SLERP with t = 0.2.
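To make the two-step procedure concrete, here is a minimal sketch of SLERP (spherical linear interpolation) applied to flat weight vectors. The toy arrays and variable names are illustrative only; mergekit applies the interpolation per tensor across the real model weights.

```python
import numpy as np

def slerp(v0, v1, t, eps=1e-8):
    """Spherically interpolate between two weight vectors with factor t in [0, 1]."""
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    omega = np.arccos(dot)          # angle between the two directions
    if abs(np.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1.0 - t) * v0 + t * v1
    so = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# Toy stand-ins for model weights (purely illustrative)
orochi  = np.array([1.0, 0.0])
dolphin = np.array([0.0, 1.0])
bagel   = np.array([1.0, 1.0])

base  = slerp(orochi, dolphin, 0.15)  # step 1: t = 0.15 -> maid-yuzu-v7-base
final = slerp(base, bagel, 0.2)       # step 2: t = 0.2  -> maid-yuzu-v7
```

At t = 0 the result is exactly the first model and at t = 1 exactly the second, so a small t like 0.15 keeps the merge close to the first model while blending in a fraction of the second.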

### Models Merged

The following models were included in the merge:

- ../maid-yuzu-v7-base (a SLERP merge of smelborp/MixtralOrochi8x7B and cognitivecomputations/dolphin-2.7-mixtral-8x7b)
- ycros/BagelMIsteryTour-v2-8x7B

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model:
  model:
    path: ../maid-yuzu-v7-base
dtype: bfloat16
merge_method: slerp
parameters:
  t:
  - value: 0.2
slices:
- sources:
  - layer_range: [0, 32]
    model:
      model:
        path: ../maid-yuzu-v7-base
  - layer_range: [0, 32]
    model:
      model:
        path: ycros/BagelMIsteryTour-v2-8x7B
```
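As a quick sanity check before running a merge, the configuration can be parsed and validated with PyYAML (assumed installed; the config is embedded as a string here for a self-contained sketch):

```python
import yaml  # PyYAML, assumed available

CONFIG = """\
base_model:
  model:
    path: ../maid-yuzu-v7-base
dtype: bfloat16
merge_method: slerp
parameters:
  t:
  - value: 0.2
slices:
- sources:
  - layer_range: [0, 32]
    model:
      model:
        path: ../maid-yuzu-v7-base
  - layer_range: [0, 32]
    model:
      model:
        path: ycros/BagelMIsteryTour-v2-8x7B
"""

cfg = yaml.safe_load(CONFIG)

# Basic structural checks on the merge recipe
method = cfg["merge_method"]                      # "slerp"
t_value = cfg["parameters"]["t"][0]["value"]      # 0.2
sources = cfg["slices"][0]["sources"]             # the two source models
```

With a config file like this saved to disk, the merge itself is typically run through mergekit's command-line entry point (e.g. `mergekit-yaml config.yml ./output-dir`); check the mergekit documentation for the options matching your installed version.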