If you have a partitioned model whose files are named like `*.gguf.0`, `*.gguf.1`, and so on, combine them into a single GGUF file, for example:
```sh
cat codellama-13b-python-q4_0.gguf.* > codellama-13b-python-q4_0.gguf
```
The `q4_0` in the example filename indicates a 4-bit quantized model.
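Note that the shell glob above expands in lexicographic order, which is only correct while there are at most ten parts (`.10` would sort before `.2`). For models split into more parts, a small script can concatenate them in numeric order instead. A minimal sketch, assuming the parts are named `<base>.<N>` with purely numeric suffixes (the function name `combine_gguf` is hypothetical, not part of any tool):

```python
import glob
import re
import shutil


def combine_gguf(base: str) -> None:
    """Concatenate partitioned files named base.N into base, in numeric order."""
    # Find candidate parts and keep only those with a purely numeric suffix.
    candidates = glob.glob(base + ".*")
    parts = sorted(
        (p for p in candidates if re.fullmatch(r"\d+", p.rsplit(".", 1)[1])),
        key=lambda p: int(p.rsplit(".", 1)[1]),  # numeric, not lexicographic, order
    )
    if not parts:
        raise FileNotFoundError(f"no parts found matching {base}.N")
    with open(base, "wb") as out:
        for part in parts:
            with open(part, "rb") as f:
                shutil.copyfileobj(f, out)  # stream each part; avoids loading it whole
```

After combining, the individual `.gguf.N` part files can be deleted to reclaim disk space.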