cloudyu/Nemo-DPO-v12
Tags: Safetensors · qwen2 · 4-bit precision · bitsandbytes
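Given the qwen2, 4-bit precision, and bitsandbytes tags, below is a minimal sketch of loading this repo with the transformers library. The compute dtype and device map are assumptions on my part, and if the repo's config.json already embeds a bitsandbytes quantization config, the explicit BitsAndBytesConfig may be redundant:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Assumed 4-bit bitsandbytes settings; compute dtype is a guess, not from the repo
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "cloudyu/Nemo-DPO-v12",
    quantization_config=bnb_config,
    device_map="auto",  # place layers automatically across available devices
)
tokenizer = AutoTokenizer.from_pretrained("cloudyu/Nemo-DPO-v12")
```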
Nemo-DPO-v12 / vocab.json

Latest commit e6d5df0 by cloudyu, verified, 5 months ago: "Upload folder using huggingface_hub"
Safe · 2.78 MB

File too large to display; you can check the raw version instead.
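Since the file is too large for the web viewer, one way to inspect it locally is to download it with the huggingface_hub client, the same library named in the commit message. A minimal sketch; the returned path points into the local Hub cache, and the assumption that vocab.json is a token-to-id JSON mapping follows the usual BPE tokenizer layout:

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw vocab.json from the repo's default branch into the local cache
path = hf_hub_download(repo_id="cloudyu/Nemo-DPO-v12", filename="vocab.json")

with open(path, encoding="utf-8") as f:
    vocab = json.load(f)

# Number of entries in the (assumed) token-to-id mapping
print(len(vocab))
```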