Quantized model
#1 by MaziyarPanahi - opened
Hi,
Thanks for sharing your model! I have quantized it to GGUF format for users with limited resources: https://huggingface.co/MaziyarPanahi/Saul-Instruct-v1-GGUF
This is much appreciated, thanks!
Thanks so much!
The file is missing:
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/MaziyarPanahi/Saul-Instruct-v1-GGUF/resolve/main/Saul-Instruct-v1-GGUF.Q4_K_M.gguf
I used the original model's name for the GGUF files rather than my own repo name, which carries the -GGUF suffix. So the actual files are named like: https://huggingface.co/MaziyarPanahi/Saul-Instruct-v1-GGUF/blob/main/Saul-Instruct-v1.Q4_K_M.gguf
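For anyone hitting the same 404, a minimal sketch (assuming the huggingface_hub Python package is installed) that downloads the quant by its actual filename; note the file name drops the -GGUF suffix that only the repo id carries:

from huggingface_hub import hf_hub_download

# Download Saul-Instruct-v1.Q4_K_M.gguf from the -GGUF repo;
# the repo id has "-GGUF", the file name does not.
model_path = hf_hub_download(
    repo_id="MaziyarPanahi/Saul-Instruct-v1-GGUF",
    filename="Saul-Instruct-v1.Q4_K_M.gguf",
)
print(model_path)  # local path to the downloaded GGUF file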
PierreColombo changed discussion status to closed