YiSM-34B-0rn

ExLlamaV2 4.65 bits-per-weight (bpw) quants of https://huggingface.co/altomek/YiSM-34B-0rn
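The 4.65 bpw figure largely determines the weight footprint of the quantized model. A rough back-of-envelope estimate (treating the model as a flat 34B parameters and ignoring mixed-precision layers and runtime KV-cache overhead, so the real requirement is somewhat higher):

```python
# Rough size estimate for a 4.65 bpw quant of a ~34B-parameter model.
# This is a sketch: it ignores that EXL2 quantization uses varying
# precision per layer and that inference needs extra VRAM for the
# KV cache, activations, and framework overhead.
def quant_size_gib(n_params: float, bpw: float) -> float:
    """Approximate weight size in GiB at the given bits per weight."""
    return n_params * bpw / 8 / 1024**3

size = quant_size_gib(34e9, 4.65)
print(f"~{size:.1f} GiB of weights")  # roughly 18.4 GiB
```

This puts the quant comfortably within a single 24 GB GPU for moderate context lengths, which is the usual motivation for this bpw range.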

Model tree: altomek/YiSM-34B-0rn-4.65bpw-EXL2, one of 9 quantized versions of altomek/YiSM-34B-0rn.