# Mistmes-slerp

Mistmes-slerp is a merge of the following models using [mergekit](https://github.com/arcee-ai/mergekit):

- [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
- [NousResearch/Hermes-2-Pro-Mistral-7B](https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B)
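Assuming mergekit is installed, a merge like this one can be reproduced by pointing its `mergekit-yaml` entry point at the configuration shown in the section below (the config filename and output directory here are illustrative):

```shell
# Install mergekit, then run the merge config; the output directory
# will contain the merged model weights and tokenizer files.
pip install mergekit
mergekit-yaml config.yaml ./mistmes-slerp
```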

## 🧩 Configuration

```yaml
slices:
  - sources:
      - model: mistralai/Mistral-7B-v0.1
      - model: NousResearch/Hermes-2-Pro-Mistral-7B
merge_method: slerp
base_model: mistralai/Mistral-7B-v0.1
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: float16
```
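The `t` values above control how far each tensor is interpolated away from the base model (0 keeps the base weights, 1 takes the other model's), with separate gradients for attention and MLP layers. As a rough sketch of the underlying operation, not mergekit's actual implementation, spherical linear interpolation (SLERP) between two flattened weight tensors can be written as:

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherically interpolate between flattened weight tensors a and b.

    t=0 returns a, t=1 returns b; intermediate values follow the great
    circle between the two (normalized) directions, which preserves the
    geometry of the weights better than plain linear interpolation.
    """
    a_n = a / (np.linalg.norm(a) + eps)
    b_n = b / (np.linalg.norm(b) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel tensors: fall back to linear interpolation.
        return (1 - t) * a + t * b
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * a + (np.sin(t * theta) / s) * b
```

mergekit applies this per tensor, picking the `t` for each layer from the `value` lists in the config (the final `- value: 0.5` entry is the default for tensors no filter matches).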