---
license: mit
---
* This is a DPO-improved version of [cloudyu/Mixtral_11Bx2_MoE_19B](https://huggingface.co/cloudyu/Mixtral_11Bx2_MoE_19B)
* Trained with the [DPO Trainer](https://huggingface.co/docs/trl/main/en/dpo_trainer)
* Metrics have not been tested yet.