
6.75bpw-h8-exl2 quant (EXL2 format, 6.75 bits per weight, 8-bit output head) of DavidAU's L3.1-RP-Hero-BigTalker-8B

Link to original model and creator: https://huggingface.co/DavidAU/L3.1-RP-Hero-BigTalker-8B-GGUF
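As a rough sketch of how an EXL2 quant like this is typically loaded, below is a minimal example using the exllamav2 Python library. The local model directory, prompt, and sampler settings are placeholders, and the exact API can differ between exllamav2 versions.

```python
# Minimal sketch: loading an EXL2-quantized model with the exllamav2 library.
# Paths, prompt, and sampler values below are placeholders, not part of this model card.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "./L3.1-RP-Hero-BigTalker-8B_6.75bpw-h8-exl2"  # local download path (assumption)
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # load weights, splitting layers across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

print(generator.generate_simple("Write a short in-character greeting.", settings, 200))
```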
