Sakil/DPO_dataset

Modalities: Text
Formats: CSV
Size: 1K–10K rows
Libraries: Datasets, pandas, Croissant (+1 more)
License: apache-2.0
DPO_dataset/README.md
---
license: apache-2.0
---
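
Since the card lists the Datasets and pandas libraries and a CSV format, the sketch below shows one way to load and inspect the data. The dataset ID comes from the card; the "train" split name and the column layout are assumptions, as the card does not document them.

```python
# Minimal loading sketch for Sakil/DPO_dataset (CSV-backed, 1K-10K rows).
from datasets import load_dataset

# Download and parse the dataset from the Hugging Face Hub.
dataset = load_dataset("Sakil/DPO_dataset")
print(dataset)  # prints the available splits and column names

# Optional pandas view; assumes a "train" split exists.
df = dataset["train"].to_pandas()
print(df.head())
```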