Ji-Xiang's Collections
ORPO-DPO datasets (updated Jul 8)
mlabonne/orpo-dpo-mix-40k • Updated Oct 17 • 44.2k • 1.53k • 248