---
base_model:
- w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored
- Technonia/mistral-7b-dolly5k
- w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored
- jeiku/Luna_LoRA_SOLAR
- w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored
- Anarchist/mistral_7b_lora_smol_pippa
- w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored
- typeof/openhermes-2.5-mistral-lora
- w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored
library_name: transformers
tags:
- mergekit
- merge
---

# SolarBest

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored](https://huggingface.co/w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored) as the base model. A minimal sketch of how DARE-TIES combines the checkpoints appears below the configuration.

### Models Merged

The following models were included in the merge. Each entry uses mergekit's `model+LoRA` syntax, which applies the LoRA adapter to the base model before merging:

* [w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored](https://huggingface.co/w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored) + [Technonia/mistral-7b-dolly5k](https://huggingface.co/Technonia/mistral-7b-dolly5k)
* [w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored](https://huggingface.co/w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored) + [jeiku/Luna_LoRA_SOLAR](https://huggingface.co/jeiku/Luna_LoRA_SOLAR)
* [w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored](https://huggingface.co/w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored) + [Anarchist/mistral_7b_lora_smol_pippa](https://huggingface.co/Anarchist/mistral_7b_lora_smol_pippa)
* [w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored](https://huggingface.co/w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored) + [typeof/openhermes-2.5-mistral-lora](https://huggingface.co/typeof/openhermes-2.5-mistral-lora)

### Configuration

The following YAML configuration was used to produce this model (it can be re-run with mergekit's `mergekit-yaml` command):

```yaml
merge_method: dare_ties
base_model: w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored
parameters:
  normalize: true
models:
  - model: w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored+jeiku/Luna_LoRA_SOLAR
    parameters:
      weight: 0.65
  - model: w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored+typeof/openhermes-2.5-mistral-lora
    parameters:
      weight: 1
  - model: w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored+Technonia/mistral-7b-dolly5k
    parameters:
      weight: 0.8
  - model: w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored+Anarchist/mistral_7b_lora_smol_pippa
    parameters:
      weight: 0.55
dtype: float16
```
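### How DARE-TIES Works (Sketch)

For intuition: DARE-TIES builds a "task vector" (fine-tune minus base) for each merged model, randomly drops most of each vector's entries and rescales the survivors so the expected value is preserved (DARE), resolves per-parameter sign conflicts by electing the dominant sign (TIES), and adds the weighted result back onto the base. The sketch below illustrates this on a single weight tensor. It is a simplification for illustration, not mergekit's implementation; the function names are ours, the `weights` mirror the per-model `weight` values in the YAML above, and the `density` default is purely illustrative since the config does not set one.

```python
import torch

def dare_prune(delta: torch.Tensor, density: float) -> torch.Tensor:
    """DARE: randomly keep a fraction `density` of the delta's entries,
    then rescale survivors by 1/density to preserve the expected value."""
    mask = torch.bernoulli(torch.full_like(delta, density))
    return delta * mask / density

def dare_ties_merge(base: torch.Tensor,
                    finetunes: list[torch.Tensor],
                    weights: list[float],
                    density: float = 0.5) -> torch.Tensor:
    # Task vectors: what each fine-tune changed relative to the base,
    # pruned by DARE and scaled by the per-model merge weight.
    deltas = torch.stack([
        w * dare_prune(ft - base, density)
        for ft, w in zip(finetunes, weights)
    ])
    # TIES sign election: per parameter, keep only contributions that
    # agree with the sign of the total.
    elected = torch.sign(deltas.sum(dim=0))
    agree = torch.sign(deltas) == elected
    merged = (deltas * agree).sum(dim=0)
    # Roughly what `normalize: true` does: divide by the total weight
    # that actually contributed at each position.
    w_t = torch.tensor(weights, dtype=base.dtype).view(-1, *([1] * base.dim()))
    total = (agree * w_t).sum(dim=0).clamp(min=1e-8)
    return base + merged / total

# Toy usage on random tensors, with the weights from the config above.
base = torch.randn(4, 4)
finetunes = [base + 0.1 * torch.randn(4, 4) for _ in range(4)]
merged = dare_ties_merge(base, finetunes, weights=[0.65, 1.0, 0.8, 0.55])
```

In the real merge this procedure is applied independently to every weight tensor in the network.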
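### Usage

The card above doesn't include a usage snippet, so here is a standard 🤗 Transformers loading example. The repo id is a placeholder to be replaced with this model's actual Hub path, and the prompt follows the `### User:` / `### Assistant:` template that SOLAR-Instruct derivatives typically use, which this merge is assumed to inherit.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/SolarBest"  # placeholder: substitute the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge's `dtype: float16`
    device_map="auto",
)

# Assumed SOLAR-Instruct-style prompt template.
prompt = "### User:\nExplain what a model merge is.\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```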