Model type 'llama' is not supported.

#4
by harshini-kumar - opened

I am trying to run the GGUF Q5 version of the model. When I load it from a local path, I get the following error, but it works fine when I load it directly from the Hugging Face Hub.

RuntimeError: Failed to create LLM 'llama' from './Model/codellama-34B/codellama-34b.Q5_K_M.gguf'.

(Attached screenshot: Hf_codellama.png)
