---
license: mit
tags:
- moe
---
* This is a DPO-improved version of [cloudyu/Mixtral_11Bx2_MoE_19B](https://huggingface.co/cloudyu/Mixtral_11Bx2_MoE_19B).
* Fine-tuned with TRL's [DPO Trainer](https://huggingface.co/docs/trl/main/en/dpo_trainer).
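For reference, the objective that the DPO Trainer optimizes can be sketched in plain Python. This is a minimal illustration with made-up log-probabilities, not the training code used for this model; the function name, `beta` value, and dummy numbers are assumptions for the example.

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO loss for a single preference pair (lower is better).

    The loss rewards the policy for increasing the log-probability of the
    chosen completion relative to the rejected one, measured against a
    frozen reference model.
    """
    # Implicit reward margins relative to the reference model.
    chosen_ratio = policy_chosen_logp - ref_chosen_logp
    rejected_ratio = policy_rejected_logp - ref_rejected_logp
    logits = beta * (chosen_ratio - rejected_ratio)
    # -log(sigmoid(logits)): shrinks as the policy prefers the chosen answer.
    return -math.log(1.0 / (1.0 + math.exp(-logits)))

# Dummy log-probs: the policy already favors the chosen completion slightly.
loss = dpo_loss(-10.0, -14.0, -11.0, -13.0)
print(round(loss, 4))  # → 0.5981
```

When the policy and reference assign identical log-probabilities, the loss sits at `log(2) ≈ 0.693` and decreases as the preference margin grows.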