Hugging Face
ChuGyouk's Collections: Instruction · DPO · CPT

DPO (updated Apr 23)
Popular DPO datasets
mlabonne/orpo-dpo-mix-40k • Updated 4 days ago • 7.81k • 186