TypeError: stat: path should be string, bytes, os.PathLike or integer, not NoneType

#24
by lizongran - opened

The error above occurred when I loaded the model from a local path.

error:
Traceback (most recent call last):
  File "cli_demo.py", line 11, in <module>
    tokenizer = AutoTokenizer.from_pretrained("/home/admin/chatglm2-6b", trust_remote_code=True)
  File "/home/admin/anaconda3/envs/glm2/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 678, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/home/admin/anaconda3/envs/glm2/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1825, in from_pretrained
    return cls._from_pretrained(
  File "/home/admin/anaconda3/envs/glm2/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1988, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/home/admin/.cache/huggingface/modules/transformers_modules/chatglm2-6b/tokenization_chatglm.py", line 73, in __init__
    self.tokenizer = SPTokenizer(vocab_file)
  File "/home/admin/.cache/huggingface/modules/transformers_modules/chatglm2-6b/tokenization_chatglm.py", line 13, in __init__
    assert os.path.isfile(model_path), model_path
  File "/home/admin/anaconda3/envs/glm2/lib/python3.8/genericpath.py", line 30, in isfile
    st = os.stat(path)
TypeError: stat: path should be string, bytes, os.PathLike or integer, not NoneType

I have the same problem. I found that tokenizer.model was missing from my model_dir. You can try copying it into that directory.
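If tokenizer.model is missing, you can also fetch just that one file from the Hub instead of re-downloading everything. A minimal sketch, assuming your local copy came from the THUDM/chatglm2-6b repo and your huggingface_hub version supports local_dir (adjust repo_id and the local path to your setup):

    # Fetch only the missing tokenizer.model into the local model directory.
    from huggingface_hub import hf_hub_download

    hf_hub_download(
        repo_id="THUDM/chatglm2-6b",
        filename="tokenizer.model",
        local_dir="/home/admin/chatglm2-6b",  # local model path from the traceback
    )

After that, AutoTokenizer.from_pretrained("/home/admin/chatglm2-6b", trust_remote_code=True) should find the vocab file.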

I checked and found that this file is indeed missing. Thank you very much!

chatglm3-6b has the same problem. Did the ChatGLM developers never load the model from a local path?

lizongran changed discussion status to closed


Check your local folder; you may be missing the file called "tokenizer.model".
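A quick way to confirm (just a sketch; the path is the one from the traceback, replace it with your own):

    import os

    model_dir = "/home/admin/chatglm2-6b"
    vocab_file = os.path.join(model_dir, "tokenizer.model")
    print("tokenizer.model present:", os.path.isfile(vocab_file))

If it prints False, copy tokenizer.model from the original repo into that directory and try again.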
