Mixtral MoE 2x7B

This is a Mixture of Experts (MoE) merge of the following models, built with mergekit and then fine-tuned with DPO.
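The merged experts are not listed in this excerpt, but a 2x7B MoE merge like this is typically produced with a `mergekit-moe` configuration along the lines of the sketch below. The model names are placeholders, and the `gate_mode` and `dtype` values shown are common choices, not confirmed from this card:

```yaml
# Hypothetical mergekit-moe config for a 2x7B Mixtral-style merge.
# base_model and both source_model entries are placeholders — the card
# does not name the actual constituent models in this excerpt.
base_model: some-org/base-mistral-7b
gate_mode: hidden          # route by hidden-state similarity to the prompts below
dtype: bfloat16            # matches the BF16 tensor type reported on the card
experts:
  - source_model: some-org/expert-model-1
    positive_prompts:
      - "chat and general assistance"
  - source_model: some-org/expert-model-2
    positive_prompts:
      - "reasoning and problem solving"
```

Running `mergekit-moe config.yml ./output-dir` on such a file produces the merged checkpoint, which can then be DPO fine-tuned.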

Downloads last month: 724
Model size: 12.9B params (Safetensors)
Tensor type: BF16
Inference API: yunconglong/Mixtral_7Bx2_MoE_13B_DPO is too large to load in the serverless Inference API. To try the model, launch it on dedicated Inference Endpoints instead.