YiSM-34B-0rn

6.5 bpw ExLlamaV2 (EXL2) quant of https://huggingface.co/altomek/YiSM-34B-0rn
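
For reference, a minimal inference sketch using the ExLlamaV2 Python API, roughly following the library's standard example pattern; the prompt and sampling values are placeholders, and `snapshot_download` pulls this repo (altomek/YiSM-34B-0rn-6.5bpw-EXL2) from the Hub. Adjust for your GPU setup.

```python
# Minimal ExLlamaV2 inference sketch (pip install exllamav2 huggingface_hub).
# Prompt and sampling settings below are illustrative placeholders.
from huggingface_hub import snapshot_download
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Download the 6.5 bpw EXL2 weights from the Hub
model_dir = snapshot_download("altomek/YiSM-34B-0rn-6.5bpw-EXL2")

# Prepare config, model, KV cache, and tokenizer
config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)   # split layers across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

# Simple generation with basic sampling settings
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7
settings.top_p = 0.9

prompt = "Write a short explanation of weight quantization."
output = generator.generate_simple(prompt, settings, num_tokens=200)
print(output)
```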
