
Midnight-Miqu-103B-v1.5-exl2-3.5bpw-rpcal

This is a 3.5bpw EXL2 quant of FluffyKaeloky/Midnight-Miqu-103B-v1.5.
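
As a minimal sketch of how you might run this quant (assuming the exllamav2 Python package is installed and the repo files have been downloaded locally; the directory path is a placeholder), loading and sampling look roughly like this:

```python
# Hedged example: load the EXL2 quant with exllamav2 and generate a reply.
# Paths and sampler values are illustrative, not recommendations.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "Midnight-Miqu-103B-v1.5-exl2-3.5bpw-rpcal"  # local download of this repo
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # allocate cache lazily so autosplit can size it
model.load_autosplit(cache)                # split layers across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.85
settings.top_p = 0.8

print(generator.generate_simple("Describe a moonlit tavern scene.", settings, 200))
```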

The PIPPA calibration file used for this quant is optimised for roleplay. The measurement file is included in this repo's files if you want to make your own quants.
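
If you want to reuse that measurement file for a different bitrate, a sketch of the conversion step is below. It assumes a local checkout of the exllamav2 repo (whose convert.py performs EXL2 quantization) and an fp16 copy of the source model; the directory names and the 4.0bpw target are placeholders.

```python
# Hedged example: re-quantize the fp16 model at a new bitrate, skipping the
# measurement pass by reusing measurement.json from this repo.
import subprocess

subprocess.run([
    "python", "convert.py",                          # exllamav2's conversion script
    "-i", "Midnight-Miqu-103B-v1.5",                 # fp16 source model directory
    "-o", "work",                                    # scratch/working directory
    "-cf", "Midnight-Miqu-103B-v1.5-exl2-4.0bpw",    # output directory for the finished quant
    "-m", "measurement.json",                        # measurement file from this repo
    "-b", "4.0",                                     # target bits per weight
], check=True)
```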

Details about the model and the merge configuration can be found at the fp16 model link above.
