---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- bardsai/jaskier-7b-dpo-v5.6
- eren23/ogno-monarch-jaskier-merge-7b
- liminerity/Omningotex-7b-slerp
- yleo/OgnoMonarch-7B
---

# mergedd

mergedd is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [bardsai/jaskier-7b-dpo-v5.6](https://huggingface.co/bardsai/jaskier-7b-dpo-v5.6)
* [eren23/ogno-monarch-jaskier-merge-7b](https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b)
* [liminerity/Omningotex-7b-slerp](https://huggingface.co/liminerity/Omningotex-7b-slerp)
* [yleo/OgnoMonarch-7B](https://huggingface.co/yleo/OgnoMonarch-7B)

## 🧩 Configuration

```json
{
  "models": [
    {
      "model": "bardsai/jaskier-7b-dpo-v5.6",
      "parameters": {}
    },
    {
      "model": "eren23/ogno-monarch-jaskier-merge-7b",
      "parameters": {
        "density": 0.53,
        "weight": 0.4
      }
    },
    {
      "model": "liminerity/Omningotex-7b-slerp",
      "parameters": {
        "density": 0.53,
        "weight": 0.3
      }
    },
    {
      "model": "yleo/OgnoMonarch-7B",
      "parameters": {
        "density": 0.53,
        "weight": 0.3
      }
    }
  ],
  "merge_method": "dare_ties",
  "base_model": "bardsai/jaskier-7b-dpo-v5.6",
  "parameters": {
    "int8_mask": true,
    "dtype": "bfloat16"
  }
}
```
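
## 💻 Usage

A minimal sketch for loading the merged model with 🤗 Transformers is shown below. The repo id `your-username/mergedd` is a placeholder (this card does not state where the merged weights are published), and the prompt and generation settings are illustrative only.

```python
# Minimal usage sketch for the merged model with 🤗 Transformers.
# NOTE: "your-username/mergedd" is a placeholder repo id; replace it with the
# actual Hugging Face path of the published merge.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/mergedd"  # placeholder, not a confirmed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the bfloat16 dtype used in the merge config
    device_map="auto",
)

prompt = "Explain what a DARE-TIES model merge does in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```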