
TypeError: not a string

#15
by m0-ch - opened

I am able to use other models such as "vicuna-13b-GPTQ-4bit-128g" and "gpt4-x-alpaca-13b-native-4bit-128g", but I am unable to use this model.

After reading through other discussions, I have already changed the folder name to "TheBloke_stable-vicuna-13B-4bit-128g-GPTQ" as instructed, and I have properly set the GPTQ parameters.

Here is the error when loading the model:

Traceback (most recent call last):
  File "C:\Users\USER\Documents\Programs\oobabooga_windows\text-generation-webui\server.py", line 60, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name)
  File "C:\Users\USER\Documents\Programs\oobabooga_windows\text-generation-webui\modules\models.py", line 242, in load_model
    tokenizer = LlamaTokenizer.from_pretrained(Path(f"{shared.args.model_dir}/{model_name}/"), clean_up_tokenization_spaces=True)
  File "C:\Users\USER\Documents\Programs\oobabooga_windows\installer_files\env\lib\site-packages\transformers\tokenization_utils_base.py", line 1811, in from_pretrained
    return cls.from_pretrained(
  File "C:\Users\USER\Documents\Programs\oobabooga_windows\installer_files\env\lib\site-packages\transformers\tokenization_utils_base.py", line 1965, in from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "C:\Users\USER\Documents\Programs\oobabooga_windows\installer_files\env\lib\site-packages\transformers\models\llama\tokenization_llama.py", line 96, in __init__
    self.sp_model.Load(vocab_file)
  File "C:\Users\USER\Documents\Programs\oobabooga_windows\installer_files\env\lib\site-packages\sentencepiece\__init__.py", line 905, in Load
    return self.LoadFromFile(model_file)
  File "C:\Users\USER\Documents\Programs\oobabooga_windows\installer_files\env\lib\site-packages\sentencepiece\__init__.py", line 310, in LoadFromFile
    return _sentencepiece.SentencePieceProcessor_LoadFromFile(self, arg)
TypeError: not a string

That's a weird error. Please check that all files downloaded correctly. It might be that your tokenizer.model is corrupt, or that that file or another required file is missing entirely.

You are right, tokenizer.model was missing. It works now. Thank you!

m0-ch changed discussion status to closed
