Hugging Face
MichaelFFan / dpo_model
Tags: PEFT, Safetensors
Commit History (branch: main)
- Update README.md — fd389ee (verified) — MichaelFFan committed on Oct 16
- Update README.md — a7056ea (verified) — MichaelFFan committed on Oct 16
- Update README.md — 90276f3 (verified) — MichaelFFan committed on Oct 16
- Update README.md — f94939a (verified) — MichaelFFan committed on Oct 16
- Upload folder using huggingface_hub — 551e5d9 (verified) — MichaelFFan committed on Oct 16
- initial commit — 9c4c0fb (verified) — MichaelFFan committed on Oct 16