# mhm-7b-v1.3-DPO-1
---
license: apache-2.0
language:
  - en
---

A DPO fine-tune of mhm-7b-v1.3 on the Intel/orca_dpo_pairs dataset.

Based on Mistral. Created with the dare_ties merge method using models from the Open LLM Leaderboard; this model is the result of three merges involving seven different models.
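For illustration, a dare_ties merge like the one described is typically expressed as a mergekit configuration. This is only a hypothetical sketch: the model names below are placeholders, and the actual densities, weights, and participating models used for this merge are not published here.

```yaml
# Hypothetical mergekit config sketching a dare_ties merge.
# All model names and parameter values are placeholders.
models:
  - model: mistralai/Mistral-7B-v0.1   # base model: no parameters block
  - model: example-org/model-a-7b      # placeholder merge candidate
    parameters:
      density: 0.5    # fraction of delta weights kept (DARE pruning)
      weight: 0.3     # contribution of this model to the merge
  - model: example-org/model-b-7b      # placeholder merge candidate
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

A config in this shape would be run with `mergekit-yaml config.yml ./output-dir`; repeating the process with different candidate models corresponds to the multiple merge rounds mentioned above.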

Just an experiment.