
# mixed

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
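
The merged model can be loaded like any other Hugging Face checkpoint; a minimal sketch with `transformers`, where the repository id is a hypothetical placeholder:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/mixed"  # hypothetical id; substitute this model's actual repo

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # the merged weights are stored in FP16
    device_map="auto",
)

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```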

## Merge Details

### Merge Method

This model was merged using the SLERP merge method, with hwkwon/S-SOLAR-10.7B-v1.4 as the base model.
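
SLERP interpolates between the two models' weight tensors along the arc of a hypersphere rather than along a straight line, which preserves the norms of the weights better than plain averaging. Below is a minimal sketch of the idea in PyTorch; it is an illustration under simplifying assumptions, not mergekit's exact implementation:

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors (illustrative)."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    # Angle between the two tensors, treated as vectors on a hypersphere.
    cos_omega = (a_flat / (a_flat.norm() + eps)) @ (b_flat / (b_flat.norm() + eps))
    omega = torch.acos(torch.clamp(cos_omega, -1.0, 1.0))
    if omega.abs() < 1e-6:
        # Nearly parallel tensors: fall back to ordinary linear interpolation.
        return (1 - t) * a + t * b
    so = torch.sin(omega)
    mixed = (torch.sin((1 - t) * omega) / so) * a_flat + (torch.sin(t * omega) / so) * b_flat
    return mixed.reshape(a.shape).to(a.dtype)
```

The `t` parameter in the configuration below selects the point on that arc (t = 0 returns the base model's tensor, t = 1 the other model's); the filtered value lists apply a different `t` gradient across layer groups for attention and MLP tensors.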

### Models Merged

The following models were included in the merge:

* [hwkwon/S-SOLAR-10.7B-v1.4](https://huggingface.co/hwkwon/S-SOLAR-10.7B-v1.4)
* [hkss/hk-SOLAR-10.7B-v1.4](https://huggingface.co/hkss/hk-SOLAR-10.7B-v1.4)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
# slices:
#   - sources:
#       - model: hwkwon/S-SOLAR-10.7B-v1.4
#         layer_range: [0, 40]
#       - model: hkss/hk-SOLAR-10.7B-v1.4
#         layer_range: [0, 40]
# or, the equivalent `models:` syntax:
# models:
#   - model: psmathur/orca_mini_v3_13b
#   - model: garage-bAInd/Platypus2-13B
models:
  - model: hwkwon/S-SOLAR-10.7B-v1.4
  - model: hkss/hk-SOLAR-10.7B-v1.4

merge_method: slerp
base_model: hwkwon/S-SOLAR-10.7B-v1.4
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1] # blend per layer group for attention tensors (0 = base model, 1 = other model)
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0] # inverse gradient for MLP tensors
    - value: 0.5 # fallback for rest of tensors
dtype: float16
```
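
To reproduce the merge, this configuration can be fed to mergekit. Here is a minimal sketch using mergekit's Python entry points (`MergeConfiguration`, `run_merge`, `MergeOptions`), assuming the YAML above is saved as `config.yml`; the path and output directory are hypothetical:

```python
import torch
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML configuration shown above (hypothetical local path).
with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the merge and write the merged checkpoint to ./mixed.
run_merge(
    merge_config,
    out_path="./mixed",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU when one is available
        copy_tokenizer=True,             # copy the tokenizer from the base model
    ),
)
```

Equivalently, mergekit's CLI performs the same steps: `mergekit-yaml config.yml ./mixed`.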