---
base_model:
- AXCXEPT/EZO-Common-9B-gemma-2-it
- google/gemma-2-9b-it
- AXCXEPT/EZO-Humanities-9B-gemma-2-it
library_name: transformers
tags:
- mergekit
- merge
---
# final_model
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [google/gemma-2-9b-it](https://huggingface.co/google/gemma-2-9b-it) as the base model.
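For intuition, here is a minimal sketch of what DARE-TIES does to a single weight tensor. This is a simplified illustration, not mergekit's actual implementation: each fine-tuned model's delta against the base is randomly dropped and rescaled according to its `density`, then a TIES-style per-parameter sign election keeps only the deltas that agree with the majority direction. All tensor names and shapes below are illustrative.

```python
# Simplified DARE-TIES sketch for one weight tensor (not mergekit's code).
import torch

def dare_ties(base, tuned, densities, weights):
    """Merge fine-tuned tensors into `base`:
    DARE drop-and-rescale, then TIES-style sign election."""
    deltas = []
    for t, density, w in zip(tuned, densities, weights):
        delta = t - base                                      # task vector
        keep = torch.bernoulli(torch.full_like(delta, density))
        delta = delta * keep / density                        # DARE: drop, rescale
        deltas.append(w * delta)
    stacked = torch.stack(deltas)
    sign = torch.sign(stacked.sum(dim=0))                     # elected majority sign
    agree = torch.sign(stacked) == sign                       # keep agreeing deltas
    return base + (stacked * agree).sum(dim=0)

base = torch.randn(4, 4)
tuned_a = base + 0.1 * torch.randn(4, 4)
tuned_b = base + 0.1 * torch.randn(4, 4)
merged = dare_ties(base, [tuned_a, tuned_b], densities=[0.9, 0.9], weights=[0.6, 0.4])
```

In the actual merge, the `density` and `weight` values vary per layer group, as shown in the configuration below.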
### Models Merged
The following models were included in the merge:
* [AXCXEPT/EZO-Common-9B-gemma-2-it](https://huggingface.co/AXCXEPT/EZO-Common-9B-gemma-2-it)
* [AXCXEPT/EZO-Humanities-9B-gemma-2-it](https://huggingface.co/AXCXEPT/EZO-Humanities-9B-gemma-2-it)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: google/gemma-2-9b-it
dtype: bfloat16
merge_method: dare_ties
parameters:
  int8_mask: 1.0
  normalize: 1.0
slices:
- sources:
  - layer_range: [0, 42]
    model: AXCXEPT/EZO-Common-9B-gemma-2-it
    parameters:
      density: [1.0, 1.0, 1.0, 1.0, 1.0, 0.8609260279433345]
      weight: [0.595007212574231, 0.46180161884966264, 0.5081090754627522, 0.7594430240728707, 0.39626499912268787, 0.7442924351417195]
  - layer_range: [0, 42]
    model: AXCXEPT/EZO-Humanities-9B-gemma-2-it
    parameters:
      density: [1.0, 0.6904504730518576, 0.8514905340703336, 1.0, 0.9636307820050712, 1.0]
      weight: [0.40256314456396064, 0.762089082959327, 0.6352898264013093, 0.0652347269897402, 0.4936558839209752, 0.5672108261487534]
  - layer_range: [0, 42]
    model: google/gemma-2-9b-it
```
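The merge can be reproduced by saving this configuration to a file and passing it to mergekit's `mergekit-yaml` command. Once the merged weights are available, they load like any other Gemma-2 checkpoint. The sketch below assumes a hypothetical local path `path/to/final_model`; substitute the actual output directory or Hub repo id.

```python
# Usage sketch: load the merged model with transformers.
# "path/to/final_model" is a placeholder, not the actual repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("path/to/final_model")
model = AutoModelForCausalLM.from_pretrained(
    "path/to/final_model", torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "What is the capital of Japan?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```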