Error with llama.cpp under Oobabooga

#1
by Mykee - opened

I tried loading the q5_0 and q5_1 models in oobabooga with llama.cpp, but I get this error:

llama.cpp: loading model from models\Fredithefish_RedPajama-INCITE-7B-Chat-GGML\ggml-RedPajama-INCITE-7B-Chat-q5_0.bin
error loading model: unexpectedly reached end of file
llama_load_model_from_file: failed to load model
2023-07-17 10:15:32 ERROR:Failed to load the model.
Traceback (most recent call last):
File "C:\oobabooga\text-generation-webui\server.py", line 68, in load_model_wrapper
shared.model, shared.tokenizer = load_model(shared.model_name, loader)
File "C:\oobabooga\text-generation-webui\modules\models.py", line 79, in load_model
output = load_func_map[loader]
File "C:\oobabooga\text-generation-webui\modules\models.py", line 268, in llamacpp_loader
model, tokenizer = LlamaCppModel.from_pretrained(model_file)
File "C:\oobabooga\text-generation-webui\modules\llamacpp_model.py", line 56, in from_pretrained
result.model = Llama(**params)
File "C:\oobabooga\installer_files\env\lib\site-packages\llama_cpp\llama.py", line 305, in __init__
assert self.model is not None
AssertionError

Exception ignored in: <function Llama.__del__ at 0x0000015D35DDECB0>
Traceback (most recent call last):
File "C:\oobabooga\installer_files\env\lib\site-packages\llama_cpp\llama.py", line 1502, in __del__
if self.ctx is not None:
AttributeError: 'Llama' object has no attribute 'ctx'
Exception ignored in: <function LlamaCppModel.__del__ at 0x0000015D35D52290>
Traceback (most recent call last):
File "C:\oobabooga\text-generation-webui\modules\llamacpp_model.py", line 29, in __del__
self.model.__del__()
AttributeError: 'LlamaCppModel' object has no attribute 'model'

How can I fix it?
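The "unexpectedly reached end of file" message usually means the .bin file is truncated or corrupted, often from an interrupted download. A quick sanity check (just a sketch, not part of oobabooga or llama.cpp; the magic values below are assumptions taken from mid-2023 llama.cpp sources) is to verify the GGML magic bytes at the start of the file:

```python
import struct
from pathlib import Path

# GGML-era magic values (little-endian uint32 at the start of the file).
# Assumed from llama.cpp sources circa mid-2023; treat as illustrative.
GGML_MAGICS = {
    0x67676D6C: "ggml (unversioned)",
    0x67676D66: "ggmf (v1)",
    0x67676A74: "ggjt (v1-v3)",
}

def check_ggml_header(path):
    """Return the format name if the file starts with a known GGML magic,
    otherwise None. A missing or wrong magic usually means a truncated or
    corrupted download."""
    p = Path(path)
    if not p.is_file() or p.stat().st_size < 4:
        return None
    with p.open("rb") as f:
        magic = struct.unpack("<I", f.read(4))[0]
    return GGML_MAGICS.get(magic)
```

If this returns None for your ggml-RedPajama-INCITE-7B-Chat-q5_0.bin, re-download the file; it also helps to compare the local file size against the size shown on the repository's "Files and versions" page.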
