PDS DPO (pdsdpo)
https://pds-dpo.github.io/

AI & ML interests: None yet
Recent Activity
- Updated a dataset 23 days ago: pdsdpo/pdsdpo-v1_1-data
- Published a dataset 24 days ago: pdsdpo/pdsdpo-v1_1-data
- Upvoted a paper 4 months ago: "Multimodal Preference Data Synthetic Alignment with Reward Model"
Organizations: None yet
pdsdpo's activity
Liked 2 models 5 months ago:
- pdsdpo/PDS-DPO-7B • Image-Text-to-Text • Updated Dec 26, 2024 • 6 • 1
- pdsdpo/PDS-DPO-7B-LoRA • Image-Text-to-Text • Updated Dec 26, 2024 • 2 • 1