Ren-Wei / Safe-RLHF-DPO-helpless-opt-1b
Safetensors · opt
Safe-RLHF-DPO-helpless-opt-1b / merges.txt
AAAhWei · Initial commit · ea33e13 · 27 days ago
456 kB
File too large to display; check the raw version instead.
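
To inspect the file locally rather than through the web viewer, one option is to fetch it with the huggingface_hub client. The sketch below assumes the repo id shown above (Ren-Wei/Safe-RLHF-DPO-helpless-opt-1b) and that the huggingface_hub package is installed; it is a minimal example, not the only way to retrieve the file.

```python
# Minimal sketch: download merges.txt from the Hub and preview it locally.
# Assumes: repo id "Ren-Wei/Safe-RLHF-DPO-helpless-opt-1b" (taken from the
# page above) and `pip install huggingface_hub`.
from huggingface_hub import hf_hub_download

# Download the tokenizer merges file into the local cache and get its path.
local_path = hf_hub_download(
    repo_id="Ren-Wei/Safe-RLHF-DPO-helpless-opt-1b",
    filename="merges.txt",
)

# Print the first few BPE merge rules as a quick sanity check.
with open(local_path, encoding="utf-8") as f:
    for i, line in enumerate(f):
        if i >= 5:
            break
        print(line.rstrip())
```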