
How much introduction do you need? You know what it is. If you want something that's closer to regular-flavor Mistral Large, here you go.

A basic-af 50/50 SLERP merge of anthracite-org/magnum-v2-123b with mistralai/Mistral-Large-Instruct-2407.
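For reference, a 50/50 SLERP merge like this can be reproduced with mergekit. The config below is a sketch, not the exact file used for this model; the `dtype` choice and using Mistral-Large-Instruct-2407 as `base_model` are assumptions.

```yaml
# Hypothetical mergekit config for a 50/50 SLERP merge (not the author's exact file).
models:
  - model: mistralai/Mistral-Large-Instruct-2407
  - model: anthracite-org/magnum-v2-123b
merge_method: slerp
base_model: mistralai/Mistral-Large-Instruct-2407  # assumed; either parent could serve as base
parameters:
  t: 0.5  # interpolation factor: 0.5 weights both parents equally
dtype: bfloat16  # assumed precision
```

Run with `mergekit-yaml config.yaml ./output-dir` to produce the merged weights, then quantize separately (this repo is an EXL2 quantization of the merge).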


Model: drexample/magstral-123b-exl2-5.8bpw