Issue Loading "TheBloke/WizardCoder-Guanaco-15B-V1.0-GGML" Model in "text-generation-webui"

#7 opened by ors229

I am encountering an error while attempting to run the "TheBloke/WizardCoder-Guanaco-15B-V1.0-GGML" model with "text-generation-webui".
The error message I'm receiving is as follows:

"llama.cpp: loading model from models\TheBlokeWizardCoder-15B-1.0-GGML\WizardCoder-15B-1.0.ggmlv3.q4_0.bin
error loading model: unexpectedly reached end of file
llama_load_model_from_file: failed to load model
2023-07-23 20:20:03 ERROR:Failed to load the model.
...
AssertionError
...
AttributeError: 'Llama' object has no attribute 'ctx'
AttributeError: 'LlamaCppModel' object has no attribute 'model'"

llama-cpp-python=0.1.73

I have checked the model file, and it appears to be present and uncorrupted. However, I'm unable to load the model successfully. Could you kindly assist me in resolving this issue?
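For reference, the same failure can be reproduced outside the webui by loading the file directly with llama-cpp-python 0.1.73; this is only a minimal sketch, and the model path below is a placeholder for the local .bin file:

# Minimal repro outside text-generation-webui (llama-cpp-python 0.1.73).
# The model path is a placeholder; adjust it to the local .bin file.
from llama_cpp import Llama

try:
    llm = Llama(model_path="models/WizardCoder-15B-1.0.ggmlv3.q4_0.bin")
except Exception as exc:
    # Fails with the same "unexpectedly reached end of file" error seen in the webui log.
    print(f"load failed: {exc}")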

Thank you very much for your help.

This is not a Llama model, so its GGML files cannot be loaded by llama.cpp, which is what text-generation-webui uses for the GGML format. Please see the README for a list of compatible clients.
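For example, the GGML files from this repo can typically be loaded in Python with ctransformers, one of the compatible clients. A minimal sketch, assuming ctransformers is installed; the model_file name is a placeholder, so check the repo for the actual filename:

# Minimal sketch using ctransformers, which supports StarCoder-family GGML files.
# model_file below is a placeholder; use an actual .bin filename from the repo.
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/WizardCoder-Guanaco-15B-V1.0-GGML",
    model_file="wizardcoder-guanaco-15b-v1.0.ggmlv3.q4_0.bin",  # placeholder
    model_type="starcoder",  # WizardCoder is StarCoder-based, not Llama
)
print(llm("def fibonacci(n):", max_new_tokens=64))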
