
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

## Merge Details

### Merge Method

This model was merged with the DARE TIES merge method, using mistralai/Mistral-7B-Instruct-v0.2 as the base model.
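
DARE TIES first sparsifies each fine-tuned model's delta from the base (randomly dropping parameters and rescaling the survivors to preserve expected magnitude), then resolves sign conflicts TIES-style before summing. The sketch below illustrates the idea for a single tensor; it is a simplification for intuition, not mergekit's actual implementation, and the function name and signature are our own.

```python
import torch

def dare_ties_tensor(base, finetuned, weights, density):
    """Illustrative single-tensor DARE TIES merge (not mergekit's code).

    base:      tensor from the base model
    finetuned: list of corresponding tensors from the fine-tuned models
    weights:   per-model merge weights (e.g. 0.4, 0.3, 0.3 in the config below)
    density:   fraction of delta parameters to keep (0.53 in the config below)
    """
    deltas = []
    for ft, w in zip(finetuned, weights):
        delta = ft - base
        # DARE: randomly drop (1 - density) of the delta, rescale the rest
        keep = (torch.rand_like(delta) < density).to(delta.dtype)
        deltas.append(w * keep * delta / density)
    stacked = torch.stack(deltas)
    # TIES: elect a sign per parameter, drop contributions that disagree
    elected = torch.sign(stacked.sum(dim=0))
    agrees = (torch.sign(stacked) == elected).to(stacked.dtype)
    return base + (stacked * agrees).sum(dim=0)
```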

### Models Merged

The following models were included in the merge:

* Nexusflow/Starling-LM-7B-beta
* mlabonne/NeuralBeagle14-7B
* CorticalStack/pastiche-crown-clown-7b-dare-dpo

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: mistralai/Mistral-7B-Instruct-v0.2
dtype: bfloat16
merge_method: dare_ties
models:
- model: mistralai/Mistral-7B-Instruct-v0.2
- model: Nexusflow/Starling-LM-7B-beta
  parameters:
    density: 0.53
    weight: 0.4
- model: mlabonne/NeuralBeagle14-7B
  parameters:
    density: 0.53
    weight: 0.3
- model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
  parameters:
    density: 0.53
    weight: 0.3
parameters:
  int8_mask: true
```
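
The configuration above can be fed to mergekit to reproduce the merge. Below is a minimal sketch using mergekit's Python API, assuming a recent mergekit release; the config and output paths are placeholders, and the equivalent one-liner is the `mergekit-yaml` CLI.

```python
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Placeholder path: a file containing the YAML configuration above
with open("merge-config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./merged-model",  # placeholder output directory
    options=MergeOptions(cuda=False, copy_tokenizer=True),
)
```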
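
Since the merge uses mistralai/Mistral-7B-Instruct-v0.2 as its base and is stored in bfloat16, it loads like any transformers causal LM and can use the base model's chat template. A minimal usage sketch; the repository id below is a placeholder for wherever this merge is hosted.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/your-merge"  # placeholder: substitute the actual repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Briefly explain model merging."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```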