---
license: apache-2.0
language:
- en
---
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/ORVjYrpzyfKfP4ByOQnpQ.jpeg)
A DPO fine-tune of [mhm-7b-v1.3](https://huggingface.co/h2m/mhm-7b-v1.3) on the [Intel/orca_dpo_pairs](https://huggingface.co/datasets/Intel/orca_dpo_pairs) dataset.

Based on Mistral. The base model was created with [dare_ties](https://github.com/cg123/mergekit) using models from the Open LLM Leaderboard; after more than three merges involving seven different models, this was the result.

Just an experiment.
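
For reference, a dare_ties merge with mergekit is driven by a YAML config. The sketch below is purely illustrative — the model names, densities, and weights are placeholders, not the actual recipe used for this model:

```yaml
# Hypothetical mergekit config for a dare_ties merge (not the recipe used here).
# Each non-base model is pruned to `density` (fraction of delta weights kept)
# and contributes with the given `weight`.
models:
  - model: mistralai/Mistral-7B-v0.1
    # base model: no parameters needed
  - model: example-org/model-a-7b    # placeholder
    parameters:
      density: 0.5
      weight: 0.4
  - model: example-org/model-b-7b    # placeholder
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  int8_mask: true
dtype: bfloat16
```

A config like this would be run with `mergekit-yaml config.yml ./output-model`.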