---
license: apache-2.0
---
# h2m/mhm-7b-v1.3-DPO-1
This is mhm-7b-v1.3, DPO fine-tuned on Intel/orca_dpo_pairs.
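
The training code is not published here, so the following is only a minimal sketch of how such a DPO run could look with TRL's `DPOTrainer`, assuming a recent TRL release (the trainer's signature has changed across versions). The base repo id `h2m/mhm-7b-v1.3`, the hyperparameters, and the column handling are illustrative assumptions, not the authors' actual setup.

```python
# Hypothetical DPO fine-tuning sketch; not the authors' training script.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

base_id = "h2m/mhm-7b-v1.3"  # assumed repo id of the merged base model
model = AutoModelForCausalLM.from_pretrained(base_id)
tokenizer = AutoTokenizer.from_pretrained(base_id)

# Intel/orca_dpo_pairs ships system/question/chosen/rejected columns;
# DPOTrainer expects prompt/chosen/rejected, so rename and drop accordingly.
dataset = load_dataset("Intel/orca_dpo_pairs", split="train")
dataset = dataset.rename_column("question", "prompt").remove_columns("system")

# beta and batch size are placeholder values.
args = DPOConfig(output_dir="mhm-7b-v1.3-DPO-1", beta=0.1, per_device_train_batch_size=1)
trainer = DPOTrainer(model=model, args=args, train_dataset=dataset, processing_class=tokenizer)
trainer.train()
```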
The model is based on Mistral. It was built with `dare_ties` merging, combining seven models from the Open LLM Leaderboard into a single model over three rounds of merging.
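
The exact merge recipe and source models are not given. As a rough illustration of the `dare_ties` idea only (not mergekit's actual implementation), the toy sketch below drops and rescales each model's delta from the base (DARE), then resolves sign conflicts across models before averaging (TIES); all names, shapes, and the drop rate are hypothetical.

```python
# Toy illustration of DARE-TIES merging on a single weight tensor.
import torch

def dare_sparsify(delta: torch.Tensor, drop_prob: float = 0.9) -> torch.Tensor:
    # DARE: drop each delta element with probability drop_prob,
    # rescale the survivors by 1 / (1 - drop_prob) to preserve expectation.
    mask = (torch.rand_like(delta) >= drop_prob).to(delta.dtype)
    return delta * mask / (1.0 - drop_prob)

def dare_ties_merge(base: torch.Tensor, deltas: list[torch.Tensor]) -> torch.Tensor:
    sparse = torch.stack([dare_sparsify(d) for d in deltas])
    # TIES-style sign election: keep, per parameter, only the deltas whose
    # sign agrees with the sign of the summed deltas, then average them.
    elected = torch.sign(sparse.sum(dim=0))
    agree = (torch.sign(sparse) == elected).to(sparse.dtype)
    merged_delta = (sparse * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1.0)
    return base + merged_delta

# Tiny demo: seven random "model deltas" merged onto a zero base tensor.
base = torch.zeros(4, 4)
deltas = [torch.randn(4, 4) for _ in range(7)]
print(dare_ties_merge(base, deltas))
```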
Just an experiment.
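
A minimal loading-and-generation sketch, assuming the model exposes the standard `transformers` causal-LM interface; the prompt and sampling settings are illustrative, not the authors' recommendations.

```python
# Minimal usage sketch with the Hugging Face transformers API.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "h2m/mhm-7b-v1.3-DPO-1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain what model merging is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```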