
# DraftReasoner-2x7B-MoE-v0.1

An experimental 2-expert Mixture-of-Experts (MoE) merge using mlabonne/Marcoro14-7B-slerp as the base model.

## Notes

Please evaluate this model before using it in any application pipeline. The math expert is gated on keywords such as 'math', 'reason', 'solve', and 'count'.
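
A minimal usage sketch for such an evaluation, assuming the standard transformers causal-LM interface; the repo id, prompt, and generation settings below are illustrative and not taken from this card:

```python
# Minimal evaluation sketch (assumptions: standard transformers AutoModelForCausalLM
# interface; adjust the repo id and generation settings to your own pipeline).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DraftReasoner-2x7B-MoE-v0.1"  # replace with the full <org>/... repo id on the Hub

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are stored in BF16
    device_map="auto",
)

# A prompt containing one of the math-gate keywords ("solve", "reason")
# should route tokens through the math expert.
prompt = "Solve for x: 2x + 7 = 19. Reason step by step."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```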

Model size: 12.9B parameters (Safetensors, BF16).

The model is too large for the serverless Inference API; launch it on dedicated Inference Endpoints instead.
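
For deployment planning, a back-of-the-envelope estimate of the weight memory alone, assuming BF16 (2 bytes per parameter); KV cache, activations, and framework overhead come on top of this:

```python
# Rough memory estimate for the model weights only (assumption: BF16 = 2 bytes/param).
params = 12.9e9
bytes_per_param = 2  # bfloat16
weight_gib = params * bytes_per_param / 1024**3
print(f"Approximate weight memory: {weight_gib:.1f} GiB")  # ~24.0 GiB
```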