
# berke-tr-slerp-merge-7B

berke-tr-slerp-merge-7B is a merge of the following models using mergekit:

* TURKCELL/Turkcell-LLM-7b-v1
* Trendyol/Trendyol-LLM-7b-chat-dpo-v1.0

## Configs

```yaml
slices:
  - sources:
      - model: TURKCELL/Turkcell-LLM-7b-v1
        layer_range: [0, 32]
      - model: Trendyol/Trendyol-LLM-7b-chat-dpo-v1.0
        layer_range: [0, 32]
merge_method: slerp
base_model: mistralai/Mistral-7B-Instruct-v0.2
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.7, 0.7]
    - value: 0.7
dtype: bfloat16
```
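## Usage

The configuration above can be applied with mergekit's `mergekit-yaml` CLI (for example, `mergekit-yaml config.yaml ./berke-tr-slerp-merge-7B`). Below is a minimal sketch of loading the resulting model with 🤗 Transformers; the repository id is a placeholder (the actual Hub id of this merge is not stated here), and the chat template is assumed to be inherited from the instruct-tuned parent models.

```python
# Minimal usage sketch for the merged model with transformers.
# The repository id below is a placeholder, not the confirmed Hub id of this merge.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<user>/berke-tr-slerp-merge-7B"  # replace with the actual repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used in the merge config
    device_map="auto",
)

# Build a chat prompt using the tokenizer's chat template (assumed to be present).
messages = [{"role": "user", "content": "Merhaba, kendini tanıtır mısın?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```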
