
A 5.90 bpw EXL2 quantization of c4ai-command-r-plus.

Fits in 96 GB VRAM with 20k context.
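The 96 GB figure can be sanity-checked with a back-of-envelope estimate. The sketch below assumes Command R+'s published architecture (roughly 104B parameters, 64 layers, 8 grouped-query KV heads of head dimension 128) and an FP16 KV cache; these figures and the resulting totals are approximations, not measured allocations.

```python
# Rough VRAM estimate for a 5.90 bpw quant at ~20k context.
# Architecture numbers are assumptions based on Command R+'s public specs.

PARAMS = 104e9      # approximate total parameter count
BPW = 5.90          # bits per weight of this quant
LAYERS = 64
KV_HEADS = 8        # grouped-query attention KV heads
HEAD_DIM = 128
CONTEXT = 20_480    # ~20k tokens
KV_BYTES = 2        # FP16 cache

# Quantized weights: params * bits-per-weight / 8 bytes
weights_gb = PARAMS * BPW / 8 / 1e9

# KV cache per token: K and V, across all layers and KV heads
kv_gb = 2 * LAYERS * KV_HEADS * HEAD_DIM * KV_BYTES * CONTEXT / 1e9

print(f"weights ~{weights_gb:.1f} GB, KV cache ~{kv_gb:.1f} GB, "
      f"total ~{weights_gb + kv_gb:.1f} GB")
```

This lands around 82 GB before activation buffers and framework overhead, which is consistent with the model fitting in 96 GB at 20k context.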

More quants are available at https://huggingface.co/turboderp/command-r-plus-103B-exl2.
