EXL2 quants for Aqueducts 18B - https://huggingface.co/MarsupialAI/Aqueducts-18B
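Below is a minimal sketch of how an EXL2 quant like this one can be downloaded and run with ExLlamaV2. It is an assumed workflow, not part of the original card: the repo id is a placeholder for this quant repo, and you should check the repo's branches for the bitrates actually provided before picking a `revision`.

```python
# Sketch: load an EXL2 quant with ExLlamaV2 (assumed workflow).
# REPO_ID is a placeholder for this quant repo's id; pass revision=<branch>
# to snapshot_download if the repo stores each bitrate on its own branch.
from huggingface_hub import snapshot_download
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

REPO_ID = "<this-repo-id>"               # placeholder: the EXL2 quant repo
model_dir = snapshot_download(REPO_ID)   # download the quantized weights locally

# Standard ExLlamaV2 loading sequence
config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # allocate KV cache as layers load
model.load_autosplit(cache)               # split the model across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

print(generator.generate_simple("The aqueducts of Rome were", settings, 64))
```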
