Hellisotherpeople/toxic-dpo-v2-mistral7b-lora
PEFT · Safetensors · Dataset: unalignment/toxic-dpo-v0.2 · English · Not-For-All-Audiences · License: mit
toxic-dpo-v2-mistral7b-lora / adapter_model.safetensors (branch: main)
Commit History
Upload adapter_model.safetensors · a6c4cb5 (verified) · Hellisotherpeople committed on Feb 21