---
license: apache-2.0
language:
- en
---
A DPO fine-tune of mhm-7b-v1.3 on Intel/orca_dpo_pairs.

Based on Mistral. Created using the `dare_ties` merge method with models from the Open LLM Leaderboard; this model is the result of three merges involving seven different models.
Just an experiment.
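
A minimal inference sketch, assuming this repo hosts a standard Mistral-architecture causal LM checkpoint usable with `transformers`; the repo id `your-org/mhm-7b-v1.3-dpo` below is a placeholder and should be replaced with this repository's actual id:

```python
# Minimal usage sketch (assumption: standard transformers causal LM checkpoint).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/mhm-7b-v1.3-dpo"  # placeholder: replace with this repo's id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # a 7B model fits on a single ~16 GB GPU in fp16
    device_map="auto",
)

prompt = "Explain DPO fine-tuning in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```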