cloudyu committed
Commit 6749f63
1 Parent(s): 10d00c0

Update README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED
@@ -4,7 +4,7 @@
  * [DPO Trainer](https://huggingface.co/docs/trl/main/en/dpo_trainer)
 
 
- * [cloudyu/Mixtral_7Bx4_MOE_24B](https://huggingface.co/cloudyu/Mixtral_7Bx4_MOE_24B)
+ * [this is DPO improved version of cloudyu/Mixtral_7Bx4_MOE_24B](https://huggingface.co/cloudyu/Mixtral_7Bx4_MOE_24B)
 
  * Metrics improved by DPO
  ![Metrsc improment](dpo.jpg)
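The README this commit edits points to TRL's DPO Trainer and claims metrics improved by DPO. As background on what the DPO objective optimizes, here is a minimal, self-contained sketch of the per-pair DPO loss; the log-probability values and `beta` below are illustrative assumptions, not numbers taken from this commit or from TRL's implementation:

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO loss for a single preference pair.

    logits = beta * ((log pi(y_w|x) - log ref(y_w|x))
                     - (log pi(y_l|x) - log ref(y_l|x)))
    loss   = -log(sigmoid(logits))
    """
    chosen_ratio = policy_chosen_logp - ref_chosen_logp
    rejected_ratio = policy_rejected_logp - ref_rejected_logp
    logits = beta * (chosen_ratio - rejected_ratio)
    # Numerically plain logistic loss; fine for a sketch with small logits.
    return -math.log(1.0 / (1.0 + math.exp(-logits)))

# Hypothetical sequence log-probabilities for one chosen/rejected pair.
loss = dpo_loss(-10.0, -12.0, -10.5, -11.0, beta=0.1)
```

When the policy equals the reference, the loss reduces to `-log(0.5)`; as the policy raises the chosen completion's likelihood relative to the rejected one (compared to the reference), the loss decreases. TRL's `DPOTrainer` wraps this objective with batching and the reference-model bookkeeping.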