---
base_model:
- NousResearch/Nous-Hermes-2-SOLAR-10.7B
- BlueNipples/SnowLotus-v2-10.7B
tags:
- mergekit
- merge
- solar
license: apache-2.0
language:
- en
pipeline_tag: text-generation
---
These are GGUF quants of https://huggingface.co/saishf/Nous-Lotus-10.7B

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

This model is a SLERP merge of SnowLotus-v2 and Nous-Hermes-2-SOLAR. I found SnowLotus awesome to talk to, but it fell short when prompting with out-there characters. Nous Hermes seemed to handle those characters a lot better, so I decided to merge the two. This is my first merge, so it could perform badly or may not even work.

### Extra Info

Both models are SOLAR based, so context should be 4096.

SnowLotus uses the Alpaca prompt format; Nous Hermes uses ChatML. Both seem to work, but I don't know exactly which performs better.

### Merge Method

This model was merged using the SLERP merge method.

### Models Merged

The following models were included in the merge:
* [NousResearch/Nous-Hermes-2-SOLAR-10.7B](https://huggingface.co/NousResearch/Nous-Hermes-2-SOLAR-10.7B)
* [BlueNipples/SnowLotus-v2-10.7B](https://huggingface.co/BlueNipples/SnowLotus-v2-10.7B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: BlueNipples/SnowLotus-v2-10.7B
        layer_range: [0, 48]
      - model: NousResearch/Nous-Hermes-2-SOLAR-10.7B
        layer_range: [0, 48]
merge_method: slerp
base_model: BlueNipples/SnowLotus-v2-10.7B
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
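
### What SLERP Is Doing

For intuition, here is a minimal sketch of spherical linear interpolation (SLERP) between two weight tensors. It illustrates the general technique, not mergekit's exact implementation. If I read mergekit's convention right, `t = 0` keeps the base model (SnowLotus) and `t = 1` keeps Nous Hermes, and each `value` list in the config above is interpolated across the layer stack, so the self-attention weights drift from SnowLotus toward Nous Hermes as you go deeper, with the MLP schedule mirrored.

```python
import numpy as np

def slerp(t: float, a: np.ndarray, b: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherically interpolate between two weight tensors.

    t = 0 returns `a` (the base model), t = 1 returns `b`.
    Illustrative only; mergekit's implementation differs in details.
    """
    a_flat, b_flat = a.ravel(), b.ravel()
    # Angle between the two tensors, treated as high-dimensional vectors.
    a_unit = a_flat / (np.linalg.norm(a_flat) + eps)
    b_unit = b_flat / (np.linalg.norm(b_flat) + eps)
    dot = np.clip(a_unit @ b_unit, -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return (1 - t) * a + t * b
    weights = np.sin((1 - t) * theta) * a_flat + np.sin(t * theta) * b_flat
    return (weights / np.sin(theta)).reshape(a.shape)
```

Unlike a plain weighted average, SLERP follows the arc between the two weight vectors, which is why it is a popular choice for merging models that share an architecture.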
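
### Prompt Formats

For reference, these are the two prompt templates mentioned under Extra Info, following the commonly documented Alpaca and ChatML conventions. The usage line at the end is just a hypothetical example.

```python
# Standard Alpaca instruction template (used by SnowLotus).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

# Standard ChatML template (used by Nous Hermes 2).
CHATML_TEMPLATE = (
    "<|im_start|>system\n{system}<|im_end|>\n"
    "<|im_start|>user\n{user}<|im_end|>\n"
    "<|im_start|>assistant\n"
)

prompt = CHATML_TEMPLATE.format(
    system="You are a helpful assistant.",
    user="Write a short scene with an out-there character.",
)
```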
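
### Loading the GGUF Quants

A minimal sketch for running one of the quants with llama-cpp-python. The file name here is an assumption; substitute whichever quant you actually download.

```python
from llama_cpp import Llama

# File name is an assumption: point this at the .gguf quant you downloaded.
llm = Llama(model_path="nous-lotus-10.7b.Q4_K_M.gguf", n_ctx=4096)

# ChatML-style prompt; the Alpaca format above should also work.
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nIntroduce yourself in one sentence.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

output = llm(prompt, max_tokens=128, stop=["<|im_end|>"])
print(output["choices"][0]["text"])
```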