YiSM-34B-0rn

ExLlamaV2 3.2 bpw quant of https://huggingface.co/altomek/YiSM-34B-0rn

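Below is a minimal loading and generation sketch using the exllamav2 Python package. The local path and sampling settings are placeholders, the repo is assumed to have been downloaded first (e.g. with `huggingface-cli download altomek/YiSM-34B-0rn-3.2bpw-EXL2 --local-dir ./YiSM-34B-0rn-3.2bpw-EXL2`), and the API shown follows exllamav2's basic-generator examples, so details may differ between versions.

```python
# Sketch: load the 3.2 bpw EXL2 quant with exllamav2 and run a short generation.
# Paths and sampling values are illustrative; the exllamav2 API may change between versions.

from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

model_dir = "./YiSM-34B-0rn-3.2bpw-EXL2"  # local path to the downloaded quant

config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # allocate KV cache lazily as layers load
model.load_autosplit(cache)                # split layers across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

prompt = "Write a haiku about quantization."
print(generator.generate_simple(prompt, settings, 128))  # generate up to 128 new tokens
```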