YiSM-34B-0rn

ExLlamaV2 4.65 bpw quant of https://huggingface.co/altomek/YiSM-34B-0rn

