
A 5.75 bits-per-weight (bpw), 6-bit-head (h6) EXL2 quantization of DavidAU's L3.1-RP-Hero-Dirty_Harry-8B.

Link to the original model and creator: https://huggingface.co/DavidAU/L3.1-RP-Hero-Dirty_Harry-8B-GGUF
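
EXL2 quants are loaded with the ExLlamaV2 backend rather than the standard `transformers` pipeline. Below is a minimal loading sketch, assuming the `exllamav2` Python package is installed and the quantized weights have been downloaded to a local directory; the path, sampling settings, and prompt are placeholders, and exact class/constructor signatures may differ slightly between `exllamav2` versions.

```python
# Sketch: loading an EXL2-quantized model with the exllamav2 library.
# Assumes `pip install exllamav2` and a local copy of the repo, e.g. via
# `huggingface-cli download James2313123/L3.1-RP-Hero-Dirty_Harry-8B_5.75bpw-h6-exl2`.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

model_dir = "/path/to/L3.1-RP-Hero-Dirty_Harry-8B_5.75bpw-h6-exl2"  # placeholder path

config = ExLlamaV2Config(model_dir)       # reads config.json from the model directory
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # KV cache; lazy=True defers allocation until load
model.load_autosplit(cache)               # split layers across available GPUs automatically
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

prompt = "You are Dirty Harry. Introduce yourself."  # placeholder prompt
output = generator.generate_simple(prompt, settings, 200)  # generate up to 200 new tokens
print(output)
```

The same quant can also be served through front ends that wrap ExLlamaV2, such as text-generation-webui or TabbyAPI, by pointing them at the downloaded model directory.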

