---
license: other
tags:
- merge
- mergekit
- lazymergekit
- deepseek-ai/deepseek-math-7b-rl
- deepseek-ai/deepseek-math-7b-instruct
---

# deep-wizard-7B-slerp

deep-wizard-7B-slerp is a DARE-TIES merge of the following models using [mergekit](https://github.com/cg123/mergekit), with [deepseek-ai/deepseek-math-7b-base](https://huggingface.co/deepseek-ai/deepseek-math-7b-base) as the base model:
* [deepseek-ai/deepseek-math-7b-rl](https://huggingface.co/deepseek-ai/deepseek-math-7b-rl)
* [deepseek-ai/deepseek-math-7b-instruct](https://huggingface.co/deepseek-ai/deepseek-math-7b-instruct)

## 🧩 Configuration

```yaml
models:
  - model: deepseek-ai/deepseek-math-7b-base
    # no parameters necessary for base model
  - model: deepseek-ai/deepseek-math-7b-rl
    parameters:
      density: 0.5
      weight: 0.7
  - model: deepseek-ai/deepseek-math-7b-instruct
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: deepseek-ai/deepseek-math-7b-base
parameters:
  int8_mask: true
dtype: bfloat16
```
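
## 💻 Usage

A minimal inference sketch with 🤗 Transformers. The repo id below is a placeholder for wherever the merged weights are published, and the chat-template call assumes the merged tokenizer ships a chat template (as the DeepSeek-Math instruct/RL tokenizers do); adjust both to your setup.

```python
# Minimal inference sketch for the merged model.
# NOTE: "your-username/deep-wizard-7B-slerp" is a placeholder repo id;
# replace it with wherever the merged weights are actually published.
import torch
from transformers import AutoTokenizer, pipeline

model_id = "your-username/deep-wizard-7B-slerp"

tokenizer = AutoTokenizer.from_pretrained(model_id)
generator = pipeline(
    "text-generation",
    model=model_id,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,  # matches the dtype used in the merge config
    device_map="auto",
)

# Assumes the tokenizer carries a chat template; otherwise format the prompt manually.
messages = [{"role": "user", "content": "Integrate x^2 from 0 to 3. Show your steps."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

output = generator(prompt, max_new_tokens=256, do_sample=False)
print(output[0]["generated_text"])
```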