Suparious/CerebrumHyperion-7B-DPO-exl2
Tags: Merge, mergekit, lazymergekit
Merged models: Locutusque/OpenCerebrum-1.0-7b-DPO, Locutusque/Hyperion-3.0-Mistral-7B-DPO (loading sketch below)
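The tags above indicate a mergekit/lazymergekit merge of the two listed models, published here as an ExLlamaV2 (exl2) quantization. As a minimal sketch only, the snippet below shows one way such a quant could be downloaded from the Hub and run with ExLlamaV2; the repo id is taken from this page, while the prompt, sampling settings, and token budget are illustrative assumptions, not values from the model card.

```python
# Minimal sketch: fetch the exl2 quant and generate with ExLlamaV2.
# Repo id is from this page; prompt and sampling settings are assumptions.
from huggingface_hub import snapshot_download
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Download the quantized weights to a local directory.
model_dir = snapshot_download("Suparious/CerebrumHyperion-7B-DPO-exl2")

# Load model, KV cache, and tokenizer.
config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)
tokenizer = ExLlamaV2Tokenizer(config)

# Simple generation with assumed sampling settings.
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7
settings.top_p = 0.9

print(generator.generate_simple("Explain model merging in one sentence.", settings, 128))
```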
Commit History
initial commit (da19774, verified) by Suparious, committed on Mar 28