---
library_name: peft
datasets:
- wasertech/OneOS
- wasertech/assistant-llama2-7b-chat-dpo
---
## Training procedure

### Framework versions

- PEFT 0.6.0.dev0
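
### Loading the adapter (sketch)

The card does not include usage instructions. As a minimal sketch, the adapter can presumably be loaded with PEFT on top of the base chat model; the base model id below is an assumption inferred from the repository name, not stated in this card.

```python
# Minimal sketch, assuming the base model is meta-llama/Llama-2-7b-chat-hf
# (inferred from the repo name; not confirmed by this card).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_id = "meta-llama/Llama-2-7b-chat-hf"  # assumed base model
adapter_id = "wasertech/assistant-llama2-7b-chat-dpo-qlora"

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
base_model = AutoModelForCausalLM.from_pretrained(base_model_id, device_map="auto")

# Attach the DPO-trained LoRA adapter weights on top of the base model.
model = PeftModel.from_pretrained(base_model, adapter_id)

prompt = "Hello, how can I help you today?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```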