Smaug 72b q1_s gguf?

#2
by vonjack - opened

Hi @Nexesenex , thank you for your hard work quantizing models for low-VRAM users.
Could you please also quantize Smaug-72B-v0.1, the leader on the Open LLM Leaderboard?

Hey @vonjack . You're welcome!

As for your request, I believe some quants are already available here: https://huggingface.co/dranger003/Smaug-72B-v0.1-iMat.GGUF

Yeah, but not a q1_s. I'd love to provide the full set, but I'm unable to convert the Smaug model to GGUF. That said, a q1_s of a 72B model probably won't make anyone happy, either.
