---
base_model:
- mistralai/Mistral-7B-Instruct-v0.2
- LeroyDyer/Mixtral_AI_Cyber_3.0
- LeroyDyer/Mixtral_AI_MultiToken
- LeroyDyer/Mixtral_AI_Multi_TEST
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [LeroyDyer/Mixtral_AI_Cyber_3.0](https://huggingface.co/LeroyDyer/Mixtral_AI_Cyber_3.0) as the base model.

### Models Merged

The following models were included in the merge:

* [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
* [LeroyDyer/Mixtral_AI_MultiToken](https://huggingface.co/LeroyDyer/Mixtral_AI_MultiToken)
* [LeroyDyer/Mixtral_AI_Multi_TEST](https://huggingface.co/LeroyDyer/Mixtral_AI_Multi_TEST)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: LeroyDyer/Mixtral_AI_Multi_TEST
    parameters:
      density: [0.87, 0.721, 0.451] # density gradient
      weight: 0.876
  - model: LeroyDyer/Mixtral_AI_MultiToken
    parameters:
      density: 0.232
      weight: [0.36, 0.3, 0.437, 0.76] # weight gradient
  - model: mistralai/Mistral-7B-Instruct-v0.2
    parameters:
      density: 0.475
      weight:
        - filter: mlp
          value: 0.5
        - value: 0
merge_method: ties
base_model: LeroyDyer/Mixtral_AI_Cyber_3.0
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
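To reproduce the merge, a config like the one above can be passed to mergekit's `mergekit-yaml` entry point (e.g. `mergekit-yaml config.yml ./merged`), assuming a current mergekit install.

Below is a minimal sketch of loading the merged model with 🤗 Transformers. The repository id is a placeholder (this card does not state where the merged weights are published), and the dtype/device settings are assumptions chosen to match the `float16` merge dtype:

```python
# Minimal usage sketch. Assumptions: "LeroyDyer/merged-model" is a placeholder
# repo id, and float16 + device_map="auto" suit your hardware.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LeroyDyer/merged-model"  # placeholder: replace with the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge config's dtype: float16
    device_map="auto",
)

# Mistral-7B-Instruct-v0.2 ships a chat template, so the [INST] ... [/INST]
# instruction format is applied by the tokenizer.
messages = [{"role": "user", "content": "Briefly explain what a TIES merge does."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```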