---
base_model:
- cognitivecomputations/Dolphin3.0-R1-Mistral-24B
- ArliAI/Mistral-Small-24B-ArliAI-RPMax-v1.4
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---

# Merge
This is a merge of pre-trained language models created using mergekit.
## Merge Details
This model is intended as a base for further fine-tuning.
### Merge Method
This model was merged using the SLERP (spherical linear interpolation) merge method, with cognitivecomputations/Dolphin3.0-R1-Mistral-24B as the base model.
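
As a rough illustration of what SLERP does for a single pair of weight tensors, here is a minimal PyTorch sketch. This is not the code mergekit runs (mergekit also applies the per-layer `t` schedule shown in the configuration below and handles tokenizer/config merging); it only shows the underlying interpolation idea.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between two weight tensors: t=0 -> a, t=1 -> b."""
    a_flat = a.flatten().float()
    b_flat = b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    # Angle between the two normalized weight vectors.
    dot = torch.clamp(torch.dot(a_unit, b_unit), -1.0, 1.0)
    omega = torch.acos(dot)
    if omega < eps:
        # Nearly colinear weights: fall back to plain linear interpolation.
        mixed = (1 - t) * a_flat + t * b_flat
    else:
        sin_omega = torch.sin(omega)
        # Weights follow the great circle connecting the two vectors.
        mixed = (torch.sin((1 - t) * omega) / sin_omega) * a_flat \
              + (torch.sin(t * omega) / sin_omega) * b_flat
    return mixed.reshape(a.shape).to(a.dtype)
```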
### Models Merged
The following models were included in the merge:

- [cognitivecomputations/Dolphin3.0-R1-Mistral-24B](https://huggingface.co/cognitivecomputations/Dolphin3.0-R1-Mistral-24B)
- [ArliAI/Mistral-Small-24B-ArliAI-RPMax-v1.4](https://huggingface.co/ArliAI/Mistral-Small-24B-ArliAI-RPMax-v1.4)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: ArliAI/Mistral-Small-24B-ArliAI-RPMax-v1.4
        layer_range: [0, 32]
      - model: cognitivecomputations/Dolphin3.0-R1-Mistral-24B
        layer_range: [0, 32]
merge_method: slerp
base_model: cognitivecomputations/Dolphin3.0-R1-Mistral-24B
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
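
Assuming the merged weights are published to the Hugging Face Hub, the checkpoint can be loaded like any other Mistral-Small-based model with transformers. The repository id below is a hypothetical placeholder, not the actual model id.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/Dolphin3.0-RPMax-SLERP"  # placeholder, replace with the real repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

prompt = "Explain spherical linear interpolation in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```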