---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- OpenPipe/mistral-ft-optimized-1218
- mlabonne/NeuralHermes-2.5-Mistral-7B
---
# Marcoro14-7B-ties

Marcoro14-7B-ties is a TIES merge of the following models, built with [mergekit](https://github.com/cg123/mergekit):
* [OpenPipe/mistral-ft-optimized-1218](https://huggingface.co/OpenPipe/mistral-ft-optimized-1218)
* [mlabonne/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/mlabonne/NeuralHermes-2.5-Mistral-7B)

## 🧩 Configuration
```yaml
models:
  - model: mistralai/Mistral-7B-v0.1
    # no parameters necessary for base model
  - model: OpenPipe/mistral-ft-optimized-1218
    parameters:
      density: 0.5
      weight: 0.5
  - model: mlabonne/NeuralHermes-2.5-Mistral-7B
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  normalize: true
dtype: float16
```
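
## 🔧 Reproducing the merge

To reproduce the merge, save the configuration above as `config.yaml` and run it through mergekit, either with the `mergekit-yaml` CLI (`mergekit-yaml config.yaml ./Marcoro14-7B-ties --copy-tokenizer`) or via its Python API. The snippet below is a minimal sketch of the latter; the available `MergeOptions` fields can vary between mergekit versions, so treat it as a starting point rather than a verbatim recipe.

```python
# Minimal sketch: run the TIES merge from the config above with mergekit's
# Python API. MergeOptions fields may differ across mergekit versions.
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge and write the result to ./Marcoro14-7B-ties
run_merge(
    merge_config,
    out_path="./Marcoro14-7B-ties",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```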
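
## 💻 Usage

Once merged, the model can be used like any other Mistral-7B checkpoint. Below is a minimal inference sketch with 🤗 Transformers; the repo id `mlabonne/Marcoro14-7B-ties` is an assumption here, so substitute the path or Hub id where the merged weights actually live.

```python
# Minimal inference sketch. The repo id below assumes the merged model has
# been pushed to the Hugging Face Hub; use a local path otherwise.
import torch
from transformers import AutoTokenizer, pipeline

model_id = "mlabonne/Marcoro14-7B-ties"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Build a chat-formatted prompt (assumes the tokenizer ships a chat template)
messages = [{"role": "user", "content": "What is a large language model?"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

generator = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.float16,  # matches the dtype used for the merge
    device_map="auto",
)
outputs = generator(prompt, max_new_tokens=256, do_sample=True, temperature=0.7)
print(outputs[0]["generated_text"])
```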