Just the Q3_K_L quantisation from bullerwins' quantisations, for use in vLLM.
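
A minimal sketch of loading a GGUF quant like this with vLLM's Python API. The GGUF filename and the base-model tokenizer path below are assumptions; point them at the actual file in this repo and the original (unquantised) model:

```python
from vllm import LLM, SamplingParams

# Assumed filename for the Q3_K_L GGUF file in this repo; adjust as needed.
llm = LLM(
    model="model-Q3_K_L.gguf",        # path to the downloaded GGUF quant
    tokenizer="path/to/base-model",   # GGUF runs typically need the original model's tokenizer
)

params = SamplingParams(temperature=0.7, max_tokens=128)
outputs = llm.generate(["Hello, how are you?"], params)
print(outputs[0].outputs[0].text)
```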