---
license: apache-2.0
tags:
  - merge
  - mergekit
  - lazymergekit
  - OpenPipe/mistral-ft-optimized-1218
  - mlabonne/NeuralHermes-2.5-Mistral-7B
---

# Marcoro14-7B-slerp

Marcoro14-7B-slerp is a merge of the following models using [mergekit](https://github.com/arcee-ai/mergekit):

* [OpenPipe/mistral-ft-optimized-1218](https://huggingface.co/OpenPipe/mistral-ft-optimized-1218)
* [mlabonne/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/mlabonne/NeuralHermes-2.5-Mistral-7B)

## 🧩 Configuration

```yaml
# slices:
#   - sources:
#       - model: AIDC-ai-business/Marcoroni-7B-v3
#         layer_range: [0, 32]
#       - model: EmbeddedLLM/Mistral-7B-Merge-14-v0.1
#         layer_range: [0, 32]
# merge_method: slerp
# base_model: AIDC-ai-business/Marcoroni-7B-v3
slices:
  - sources:
      - model: OpenPipe/mistral-ft-optimized-1218
        layer_range: [0, 32]
      - model: mlabonne/NeuralHermes-2.5-Mistral-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: OpenPipe/mistral-ft-optimized-1218
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
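
## 💻 Usage

The configuration above can be passed to mergekit's `mergekit-yaml` command to reproduce the merge. Once the merged weights are on the Hugging Face Hub, the model can be used like any other Mistral-7B-based causal language model. The sketch below is illustrative and not part of the original card: the `REPO_ID` placeholder is hypothetical and should point to wherever this merge is actually hosted, and it assumes `transformers`, `accelerate`, and `torch` are installed.

```python
# Illustrative usage sketch; REPO_ID is a hypothetical placeholder for the
# Hub repository that actually hosts this merge.
import torch
import transformers
from transformers import AutoTokenizer

REPO_ID = "your-username/Marcoro14-7B-slerp"  # hypothetical repo id

# Build a chat-style prompt from the tokenizer's chat template.
tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
messages = [{"role": "user", "content": "What is a model merge?"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

# Text-generation pipeline in half precision, spread over available devices.
pipeline = transformers.pipeline(
    "text-generation",
    model=REPO_ID,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = pipeline(
    prompt,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_k=50,
    top_p=0.95,
)
print(outputs[0]["generated_text"])
```

If the merged tokenizer does not ship a chat template, format the prompt manually instead (for example in the ChatML format used by NeuralHermes-2.5-Mistral-7B).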