yunconglong/Mixtral_7Bx2_MoE_13B_DPO

Text Generation · Transformers · Safetensors · mixtral · Mixture of Experts · conversational · Inference Endpoints · text-generation-inference
License: cc-by-nc-4.0
cloudyu committed on Jan 27
Commit 4c603f2 • 1 Parent(s): 8863572

Update README.md

Files changed (1): README.md (+1, −1)
README.md CHANGED

@@ -4,7 +4,7 @@
 - moe
 ---
 
-# Mixtral MOE 2x7B
+# Fine Tuned Mixtral MOE 2x7B