---
base_model:
- unsloth/Llama-3.2-3B-Instruct
- unsloth/Llama-3.2-3B
library_name: transformers
tags:
- mergekit
- merge
---
## Details

This is an experimental merge that I plan to use for future projects; it shows promising results in my limited testing. Further testing should probably be done, but I don't have the time or compute for it right now.
## Configuration

The following YAML configuration was used to produce this model:
```yaml
models:
  - model: unsloth/Llama-3.2-3B
    parameters:
      weight: 0.5
      density: 0.7
  - model: unsloth/Llama-3.2-3B-Instruct
    parameters:
      weight: 0.5
      density: 0.6
merge_method: ties
base_model: unsloth/Llama-3.2-3B
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
tokenizer_source: unsloth/Llama-3.2-3B
```
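For intuition on what the `weight`, `density`, and `normalize` parameters above do, here is a minimal NumPy sketch of a TIES-style merge (trim each task vector to its top-`density` fraction by magnitude, elect a per-parameter sign, then take the normalized weighted average of the deltas that agree with that sign). This is an illustrative simplification, not mergekit's actual implementation, and the function name `ties_merge` is made up for the example.

```python
import numpy as np

def ties_merge(base, tuned_models, weights, densities):
    """Illustrative TIES-style merge: trim, elect sign, disjoint-merge."""
    deltas = []
    for tuned, d in zip(tuned_models, densities):
        delta = tuned - base
        # Trim: zero out all but the top `d` fraction of entries by magnitude.
        k = max(1, int(round(d * delta.size)))
        thresh = np.sort(np.abs(delta), axis=None)[-k]
        deltas.append(np.where(np.abs(delta) >= thresh, delta, 0.0))
    deltas = np.stack(deltas)                        # (n_models, ...)
    w = np.asarray(weights).reshape(-1, *([1] * base.ndim))
    # Sign election: per-parameter majority sign of the weighted deltas.
    elected = np.sign((w * deltas).sum(axis=0))
    # Disjoint merge: keep only nonzero deltas matching the elected sign.
    agree = (np.sign(deltas) == elected) & (deltas != 0)
    merged_delta = (w * np.where(agree, deltas, 0.0)).sum(axis=0)
    # normalize: true -> divide by the total weight of contributing models.
    norm = (w * agree).sum(axis=0)
    merged_delta = np.where(norm > 0,
                            merged_delta / np.maximum(norm, 1e-9), 0.0)
    return base + merged_delta
```

With the config above, the actual merge is produced by running mergekit's CLI (`mergekit-yaml config.yaml ./merged-model`) rather than code like this; the sketch only shows why conflicting parameter updates (opposite signs) cancel out instead of averaging toward a muddled middle.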