---
base_model:
  - grimjim/mistralai-Mistral-Nemo-Base-2407
  - grimjim/mistralai-Mistral-Nemo-Instruct-2407
  - grimjim/magnum-consolidatum-v1-12b
library_name: transformers
pipeline_tag: text-generation
tags:
  - mergekit
  - merge
license: apache-2.0
---

# Magnolia-v2-12B

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

This model is essentially a rebuild of v1, using task arithmetic instead of SLERP. Of note, Mistral Nemo Base was used as the base model for task arithmetic, rather than Instruct or another fine-tune. Furthermore, `max_position_embeddings` was reduced from 1000000 to 131072, since the model was only trained on contexts up to 131072 tokens. Tested with temperature 1.0 and minP 0.01; temperature can be lowered (briefly tested at 0.45) if the model is too aggressively creative or hallucinatory.
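
Below is a minimal sketch of those sampling settings using the transformers library. The repository id is an assumption based on this card, prompt formatting is omitted for brevity, and `min_p` sampling requires a reasonably recent transformers release.

```python
# Sketch: load the merged model and sample with the settings noted above
# (temperature 1.0, min_p 0.01). Repo id below is assumed from this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "grimjim/Magnolia-v2-12B"  # assumed repo id for this model card
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = "Write a short scene set in a lighthouse during a storm."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=1.0,  # drop toward ~0.45 if generations feel too untethered
    min_p=0.01,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```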

## Merge Details

### Merge Method

This model was merged using the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method, with grimjim/mistralai-Mistral-Nemo-Base-2407 as the base.
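
As a rough illustration (not mergekit's actual implementation): task arithmetic adds weighted "task vectors", each fine-tune's element-wise difference from the base, back onto the base weights. With `normalize: false`, the weights are applied as given rather than rescaled to sum to 1. A toy sketch with made-up tensors:

```python
# Toy illustration of task arithmetic over a single tensor.
import torch

def task_arithmetic(base: torch.Tensor,
                    finetunes: list[torch.Tensor],
                    weights: list[float]) -> torch.Tensor:
    merged = base.clone()
    for ft, w in zip(finetunes, weights):
        merged += w * (ft - base)  # add the weighted "task vector"
    return merged

# Mirror this merge's weights: 0.9 for Instruct, 0.1 for magnum-consolidatum.
base = torch.randn(4, 4)
instruct = base + 0.05 * torch.randn(4, 4)
magnum = base + 0.05 * torch.randn(4, 4)
merged = task_arithmetic(base, [instruct, magnum], [0.9, 0.1])
```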

### Models Merged

The following models were included in the merge:

* [grimjim/mistralai-Mistral-Nemo-Instruct-2407](https://huggingface.co/grimjim/mistralai-Mistral-Nemo-Instruct-2407)
* [grimjim/magnum-consolidatum-v1-12b](https://huggingface.co/grimjim/magnum-consolidatum-v1-12b)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: grimjim/mistralai-Mistral-Nemo-Base-2407
dtype: bfloat16
merge_method: task_arithmetic
parameters:
  normalize: false
slices:
- sources:
  - layer_range: [0, 40]
    model: grimjim/mistralai-Mistral-Nemo-Base-2407
  - layer_range: [0, 40]
    model: grimjim/mistralai-Mistral-Nemo-Instruct-2407
    parameters:
      weight: 0.9
  - layer_range: [0, 40]
    model: grimjim/magnum-consolidatum-v1-12b
    parameters:
      weight: 0.1
```
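
Assuming a current mergekit installation, a configuration like the above can typically be reproduced by saving the YAML to a file and running mergekit's command-line entry point, e.g. `mergekit-yaml config.yaml ./output-model-directory` (file and directory names here are arbitrary).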