---
base_model: []
library_name: transformers
tags:
- mergekit
- merge

---
# MN-12B-Celeste-V1.9-Instruct

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the passthrough merge method, which stacks the selected layer ranges from the source models end to end into a single deeper model rather than averaging their weights.

### Models Merged

The following models were included in the merge:
* g:/11b/Mistral-Nemo-Instruct-2407-12B
* G:/11B/MN-12B-Celeste-V1.9

### Configuration

The following YAML configuration was used to produce this model:

```yaml
# SMB with instruct to help performance.

slices:
 - sources:
   - model: g:/11b/Mistral-Nemo-Instruct-2407-12B
     layer_range: [0, 14]
 - sources:
   - model: G:/11B/MN-12B-Celeste-V1.9
     layer_range: [8, 24]
     parameters:
       scale:
         - filter: o_proj
           value: 1
         - filter: down_proj
           value: 1
         - value: 1
 - sources:
   - model: g:/11b/Mistral-Nemo-Instruct-2407-12B
     layer_range: [14, 22]
     parameters:
       scale:
         - filter: o_proj
           value: .5
         - filter: down_proj
           value: .5
         - value: 1
 - sources:
   - model: g:/11b/Mistral-Nemo-Instruct-2407-12B
     layer_range: [22, 31]
     parameters:
       scale:
         - filter: o_proj
           value: .75
         - filter: down_proj
           value: .75
         - value: 1
 - sources:
   - model: G:/11B/MN-12B-Celeste-V1.9
     layer_range: [24, 40]
     parameters:
       scale:
         - filter: o_proj
           value: 1
         - filter: down_proj
           value: 1
         - value: 1
merge_method: passthrough
dtype: bfloat16
```
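
## Usage

To reproduce the merge, a configuration like the one above is passed to mergekit's `mergekit-yaml` command (for example, `mergekit-yaml config.yaml ./output-model-directory`). The sketch below is a minimal, hypothetical example of loading the resulting checkpoint with transformers; the model path, prompt, and generation settings are placeholders rather than part of this card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder path: point this at the mergekit output directory or a Hub repo id.
model_path = "path/to/MN-12B-Celeste-V1.9-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the merge's dtype setting
    device_map="auto",
)

# If the merged tokenizer carries over a chat template from the source models,
# apply_chat_template is the simplest way to format an instruct-style prompt.
messages = [{"role": "user", "content": "Write a short scene set on a night train."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```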