Tags: Text Generation · Transformers · Safetensors · Chinese · English · mixtral · Mistral · conversational · Inference Endpoints · text-generation-inference
qq8933 committed on
Commit
04064a4
1 Parent(s): f118627

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -15,4 +15,4 @@ pipeline_tag: conversational
15
16   We present to you the Zephyr-8x7b, a Mixtral 8x7B MoE model that SFT-only training on a dataset of nearly four million conversation corpora.
17
18 - It has demonstrated strong contextual understanding, reasoning, and human moral alignment without alignment like DPO, and we invite you to participate in our exploration!
18 + It has demonstrated strong contextual understanding, reasoning, and human moral alignment without alignment techniques like DPO, and we invite you to participate in our exploration!