---
base_model:
- mistralai/Mistral-7B-v0.3
- mistralai/Mistral-7B-Instruct-v0.3
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---
# Mistral-7B-Instruct-demi-merge-v0.3-7B

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). It is an even blend of the base and instruct models (interpolation weight `t: 0.5`), intended as a permissively licensed starting point for fine-tuning and/or merging by anyone.

## Merge Details

### Merge Method

This model was merged using the SLERP (spherical linear interpolation) merge method.

### Models Merged

The following models were included in the merge:

* [mistralai/Mistral-7B-v0.3](https://huggingface.co/mistralai/Mistral-7B-v0.3)
* [mistralai/Mistral-7B-Instruct-v0.3](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: mistralai/Mistral-7B-Instruct-v0.3
        layer_range: [0, 32]
      - model: mistralai/Mistral-7B-v0.3
        layer_range: [0, 32]
merge_method: slerp
base_model: mistralai/Mistral-7B-Instruct-v0.3
parameters:
  t:
    - value: 0.5
dtype: bfloat16
```
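
## Usage

As a quick sanity check, the merged model loads like any other `transformers` causal LM. The sketch below is illustrative only: the repo id is a placeholder for wherever this merge is hosted, and `bfloat16` loading assumes hardware that supports it.

```python
# Minimal sketch of loading and prompting the merged model.
# NOTE: "your-namespace/Mistral-7B-Instruct-demi-merge-v0.3-7B" is a
# placeholder repo id; substitute the actual Hugging Face path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/Mistral-7B-Instruct-demi-merge-v0.3-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype above
    device_map="auto",
)

prompt = "Merging a base and an instruct model is useful because"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

To reproduce the merge itself, save the YAML configuration above to a file and run mergekit's CLI, e.g. `mergekit-yaml config.yaml ./merged-model`.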