kvssetty's Collections

LLMs
Updated Mar 15
NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO · Text Generation · Updated Apr 30 · 3.72k downloads · 419 likes