---
base_model:
- google/gemma-2-2b
library_name: transformers
tags:
- mergekit
- merge
---

# gemma-2-2b-ORPO-jpn-it-abliterated-18-merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [google/gemma-2-2b](https://huggingface.co/google/gemma-2-2b) as the base model.

### Models Merged

The following models were included in the merge:

* /home/user/gemma-2-2b-ORPO-jpn-it-abliterated-18
* /home/user/gemma-2-2b-jpn-it-abliterated-18

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: /home/user/gemma-2-2b-ORPO-jpn-it-abliterated-18
    dtype: bfloat16
    parameters:
      density: 1.0
      weight: 1.0
  - model: /home/user/gemma-2-2b-jpn-it-abliterated-18
    dtype: bfloat16
    parameters:
      density: 1.0
      weight: 1.0
merge_method: ties
base_model: google/gemma-2-2b
parameters:
  density: 1.0
  weight: 1.0
  normalize: true
  int8_mask: true
dtype: bfloat16
tokenizer_source: /home/user/gemma-2-2b-ORPO-jpn-it-abliterated-18
```
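
## Usage

The merge itself can be reproduced by saving the configuration above to a file and passing it to mergekit's `mergekit-yaml` entry point. Below is a minimal sketch of loading the merged checkpoint with 🤗 Transformers; the `model_id` is an assumption (the local merge output directory or the Hub repo this merge is uploaded to) and should be adjusted accordingly.

```python
# Minimal sketch: load the merged model and run a short generation.
# NOTE: the model id below is an assumed placeholder for the merge output
# directory or Hub repo; replace it with the actual location.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gemma-2-2b-ORPO-jpn-it-abliterated-18-merge"  # assumed path / repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

prompt = "こんにちは、自己紹介をしてください。"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Loading in bfloat16 matches the `dtype` used for the merge; other precisions should also work if memory is a concern.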