Bug with this model under Oobabooga "Internal: unk is not defined"

#1
by Lordwind - opened

Hello,
I downloaded the model, and the download completed, but when I try to load it I get the following error message:

2023-08-04 13:53:52 INFO:Loading TheBloke_Pygmalion-7B-SuperHOT-8K-GPTQ...
2023-08-04 13:53:53 ERROR:Failed to load the model.
Traceback (most recent call last):
File "C:\Users\User\Desktop\oobabooga_windows\oobabooga_windows\text-generation-webui\server.py", line 68, in load_model_wrapper
shared.model, shared.tokenizer = load_model(shared.model_name, loader)
File "C:\Users\User\Desktop\oobabooga_windows\oobabooga_windows\text-generation-webui\modules\models.py", line 78, in load_model
output = load_func_maploader
File "C:\Users\User\Desktop\oobabooga_windows\oobabooga_windows\text-generation-webui\modules\models.py", line 293, in ExLlama_loader
model, tokenizer = ExllamaModel.from_pretrained(model_name)
File "C:\Users\User\Desktop\oobabooga_windows\oobabooga_windows\text-generation-webui\modules\exllama.py", line 68, in from_pretrained
tokenizer = ExLlamaTokenizer(str(tokenizer_model_path))
File "C:\Users\User\Desktop\oobabooga_windows\oobabooga_windows\installer_files\env\lib\site-packages\exllama\tokenizer.py", line 10, in init
self.tokenizer = SentencePieceProcessor(model_file = self.path)
File "C:\Users\User\Desktop\oobabooga_windows\oobabooga_windows\installer_files\env\lib\site-packages\sentencepiece_init.py", line 447, in Init
self.Load(model_file=model_file, model_proto=model_proto)
File "C:\Users\User\Desktop\oobabooga_windows\oobabooga_windows\installer_files\env\lib\site-packages\sentencepiece_init
.py", line 905, in Load
return self.LoadFromFile(model_file)
File "C:\Users\User\Desktop\oobabooga_windows\oobabooga_windows\installer_files\env\lib\site-packages\sentencepiece_init_.py", line 310, in LoadFromFile
return _sentencepiece.SentencePieceProcessor_LoadFromFile(self, arg)
RuntimeError: Internal: unk is not defined.

It can't be loaded. How can I solve this problem?
I'm running a new Windows 11 machine with an Nvidia RTX 4080.

Thank you
Holger

Lordwind changed discussion title from "Bug with this model under Oogabooga" to "Bug with this model under Oobabooga 'Internal: unk is not defined'"

First thing to do is try downloading again, to confirm all model files are correctly downloaded.
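If it helps, here is one way to re-pull the whole repo from a Python prompt using the huggingface_hub package. This is just a sketch, not the only way to do it: the repo id is taken from the log above, and the local_dir is an example path, so point it at wherever your text-generation-webui keeps its models.

```python
# Re-download every file in the repo so nothing is left truncated.
# Assumes the huggingface_hub package is installed.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="TheBloke/Pygmalion-7B-SuperHOT-8K-GPTQ",
    local_dir="models/TheBloke_Pygmalion-7B-SuperHOT-8K-GPTQ",  # example path, adjust to your setup
)
```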

I downloaded the 7B version 4 times and the 13B version 2 times :-) I also tried other forks. Everything above Pygmalion 6 results in the same issue. All from Hugging Face. I even disabled all firewall and antivirus software to make sure the download wasn't interrupted. I think (but that's rather a guess) that the references in the current Oobabooga point to the wrong folders. If you have an Oobabooga version where these references work, that would be helpful. Or maybe I have to change the references in the code, but I have no idea what I should replace there, or with what :-) My Oobabooga is the latest version.

(screenshot attachment: image.png)

Yeah, OK, I just saw someone else reporting this as well in a different context. I don't yet know what the issue is. I think maybe some code has been updated in one of the libraries that ooba uses, and now certain older model configs no longer work. But I don't know exactly what, as this model does have an <unk> token defined. If I learn what to change I'll update the model.
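If anyone wants to test the tokenizer on its own, here is a minimal sketch that loads tokenizer.model the same way the exllama tokenizer in the traceback does (SentencePieceProcessor with model_file). It assumes sentencepiece is installed and the path below matches your download folder; if the file is broken you should see the same "Internal: unk is not defined" error outside of ooba.

```python
# Load tokenizer.model directly and print its unk entry.
# A broken file raises the same RuntimeError as in the traceback above.
from sentencepiece import SentencePieceProcessor

sp = SentencePieceProcessor(
    model_file="models/TheBloke_Pygmalion-7B-SuperHOT-8K-GPTQ/tokenizer.model"  # example path
)
print("unk id:", sp.unk_id())
print("unk piece:", sp.id_to_piece(sp.unk_id()))
```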

I haven't gotten this error, but I've seen a lot of weird errors in Ooba today.

Thank you for taking care of this. The problem seems to happen with all kinds of Llama 2 models. I didn't try GPT. Please send a notification when you've found the issue. I think many users of your model (and other models) would be happy :-)

Hello. I got an answer from Oobabooga:

wrong repository

But that was all. Does it help to solve this issue?

Thank you for the help. Really great.

Holger

I had this error too.
For me, the weight files (.h5 and .bin) had zero size. I downloaded them again, and now they are correct and it works.
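In case it saves someone else a guessing round, here is a quick sketch to spot zero-byte files in a model folder. The folder name is only an example; adjust it to your own download.

```python
# List every file in the model folder with its size so truncated
# or zero-byte downloads stand out immediately.
from pathlib import Path

model_dir = Path("models/TheBloke_Pygmalion-7B-SuperHOT-8K-GPTQ")  # example path
for f in sorted(model_dir.iterdir()):
    size = f.stat().st_size
    note = "  <-- zero bytes, re-download" if size == 0 else ""
    print(f"{f.name}: {size:,} bytes{note}")
```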
