---
base_model:
- Nexusflow/Starling-LM-7B-beta
- openchat/openchat-3.5-1210
- openchat/openchat-3.5-0106
- mistral-community/Mistral-7B-v0.2
- berkeley-nest/Starling-LM-7B-alpha
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [mistral-community/Mistral-7B-v0.2](https://huggingface.co/mistral-community/Mistral-7B-v0.2) as the base model.

### Models Merged

The following models were included in the merge:

* [Nexusflow/Starling-LM-7B-beta](https://huggingface.co/Nexusflow/Starling-LM-7B-beta)
* [openchat/openchat-3.5-1210](https://huggingface.co/openchat/openchat-3.5-1210)
* [openchat/openchat-3.5-0106](https://huggingface.co/openchat/openchat-3.5-0106)
* [berkeley-nest/Starling-LM-7B-alpha](https://huggingface.co/berkeley-nest/Starling-LM-7B-alpha)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Nexusflow/Starling-LM-7B-beta
  - model: openchat/openchat-3.5-0106
  - model: openchat/openchat-3.5-1210
  - model: berkeley-nest/Starling-LM-7B-alpha
merge_method: model_stock
base_model: mistral-community/Mistral-7B-v0.2
dtype: bfloat16
```
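
### Usage

The merged weights can be loaded with the standard `transformers` API. Below is a minimal sketch, assuming the merge is published under a placeholder repo id (`your-username/merge`; substitute the actual repo id or a local path) and that, like its OpenChat/Starling parent models, it responds to the "GPT4 Correct" prompt format:

```python
# Minimal usage sketch for the merged model (placeholder repo id, not the official one).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/merge"  # placeholder; replace with the real repo id or local path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

# Assumption: the merge follows the OpenChat/Starling "GPT4 Correct" chat format.
prompt = "GPT4 Correct User: Hello! What can you do?<|end_of_turn|>GPT4 Correct Assistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```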