---
base_model:
- overfit-brothers/gemma-1130
- overfit-brothers/hello_world02
- overfit-brothers/gemma-1129-test123
- overfit-brothers/gemma-2-9b-it-HJ-test-1204-5epoch-safety-p-m
library_name: transformers
tags:
- mergekit
- merge
---
# 2024-12-05(02)

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [overfit-brothers/gemma-1129-test123](https://huggingface.co/overfit-brothers/gemma-1129-test123) as the base model.

### Models Merged

The following models were included in the merge:

* [overfit-brothers/gemma-1130](https://huggingface.co/overfit-brothers/gemma-1130)
* [overfit-brothers/hello_world02](https://huggingface.co/overfit-brothers/hello_world02)
* [overfit-brothers/gemma-2-9b-it-HJ-test-1204-5epoch-safety-p-m](https://huggingface.co/overfit-brothers/gemma-2-9b-it-HJ-test-1204-5epoch-safety-p-m)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: overfit-brothers/gemma-1129-test123
dtype: bfloat16
merge_method: model_stock
slices:
- sources:
  - layer_range: [0, 42]
    model: overfit-brothers/gemma-1129-test123
  - layer_range: [0, 42]
    model: overfit-brothers/gemma-1130
    parameters:
      weight: 1.0
  - layer_range: [0, 42]
    model: overfit-brothers/hello_world02
    parameters:
      weight: 1.0
  - layer_range: [0, 42]
    model: overfit-brothers/gemma-2-9b-it-HJ-test-1204-5epoch-safety-p-m
    parameters:
      weight: 1.0
```
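For intuition, the Model Stock method interpolates between the average of the fine-tuned checkpoints and the base weights, with a ratio derived from the average pairwise angle between the fine-tuned models' task vectors. Below is a rough, illustrative sketch on flat weight vectors — it is not mergekit's actual implementation, and the interpolation formula `t = k·cosθ / ((k−1)·cosθ + 1)` is taken from the paper, applied here globally rather than per-layer:

```python
import math


def model_stock_merge(w0, finetuned):
    """Toy sketch of the Model Stock merge (arXiv:2403.19522).

    w0:        base-model weights as a flat list of floats
    finetuned: list of k fine-tuned weight lists of the same length

    Not mergekit's implementation: mergekit operates per tensor, while
    this sketch treats the whole model as one flat vector.
    """
    k = len(finetuned)

    # Task vectors: displacement of each fine-tuned model from the base.
    deltas = [[w - b for w, b in zip(ws, w0)] for ws in finetuned]

    def cos(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(a * a for a in v))
        return dot / (nu * nv)

    # Average pairwise cosine between task vectors.
    pairs = [(i, j) for i in range(k) for j in range(i + 1, k)]
    avg_cos = sum(cos(deltas[i], deltas[j]) for i, j in pairs) / len(pairs)

    # Interpolation ratio from the paper: t = k*cos / ((k-1)*cos + 1).
    t = k * avg_cos / ((k - 1) * avg_cos + 1)

    # Merged weights: t * (average fine-tuned) + (1 - t) * base.
    avg_ft = [sum(col) / k for col in zip(*finetuned)]
    return [t * a + (1 - t) * b for a, b in zip(avg_ft, w0)]
```

Two sanity checks follow from the formula: if all fine-tuned task vectors point the same way (cosθ = 1), t = 1 and the merge is just their average; if they are orthogonal (cosθ = 0), t = 0 and the merge falls back to the base weights.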