---
base_model:
- migtissera/Tess-70B-v1.6
- 152334H/miqu-1-70b-sf
- NeverSleep/MiquMaid-v2-70B
- sophosympatheia/Midnight-Miqu-70B-v1.0
library_name: transformers
tags:
- mergekit
- merge
---
# out

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [152334H/miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) as the base model.

### Models Merged

The following models were included in the merge:

* [migtissera/Tess-70B-v1.6](https://huggingface.co/migtissera/Tess-70B-v1.6)
* [NeverSleep/MiquMaid-v2-70B](https://huggingface.co/NeverSleep/MiquMaid-v2-70B)
* [sophosympatheia/Midnight-Miqu-70B-v1.0](https://huggingface.co/sophosympatheia/Midnight-Miqu-70B-v1.0)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: NeverSleep/MiquMaid-v2-70B
  - model: sophosympatheia/Midnight-Miqu-70B-v1.0
  - model: migtissera/Tess-70B-v1.6
  - model: 152334H/miqu-1-70b-sf
merge_method: model_stock
base_model: 152334H/miqu-1-70b-sf
dtype: bfloat16
```
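For intuition, the Model Stock paper's merge rule interpolates between the base weights and the average of the fine-tuned weights, with an interpolation factor derived from the angle between the fine-tuned models' task vectors. The following is a toy numpy sketch of that closed form (t = k·cosθ / (1 + (k−1)·cosθ)), written for illustration only; it is not mergekit's actual implementation, which operates per-tensor over full checkpoints.

```python
import numpy as np

def model_stock_merge(base, finetuned):
    """Toy Model Stock merge of flat weight arrays.

    base: the pretrained weight vector (w_0).
    finetuned: list of k >= 2 fine-tuned weight vectors.
    Returns (1 - t) * w_0 + t * w_avg, with t from the paper's
    closed form based on the mean pairwise cosine of task vectors.
    """
    k = len(finetuned)
    deltas = [w - base for w in finetuned]  # task vectors w_i - w_0

    # Mean pairwise cosine similarity between task vectors.
    cosines = []
    for i in range(k):
        for j in range(i + 1, k):
            num = float(np.dot(deltas[i].ravel(), deltas[j].ravel()))
            den = np.linalg.norm(deltas[i]) * np.linalg.norm(deltas[j])
            cosines.append(num / den)
    cos_theta = sum(cosines) / len(cosines)

    # Interpolation factor: t = k*cos / (1 + (k-1)*cos).
    t = k * cos_theta / (1 + (k - 1) * cos_theta)

    w_avg = sum(finetuned) / k
    return (1 - t) * base + t * w_avg
```

When the fine-tuned models agree (cosθ → 1), t → 1 and the merge collapses to the plain average; when their task vectors are orthogonal (cosθ → 0), t → 0 and the merge stays at the base weights.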