RuntimeError: Internal: src/sentencepiece_processor.cc(1101) [model_proto->ParseFromArray(serialized.data(), serialized.size())]

#25
by charles2030 - opened

Traceback (most recent call last):
File "/home/work/wuxuesong/ChatGLM2-6B/web_demo.py", line 6, in <module>
tokenizer = AutoTokenizer.from_pretrained("/home/work/wuxuesong/ChatGLM2-6B/chatglm2-6b", trust_remote_code=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/work/miniconda3/envs/speed/lib/python3.11/site-packages/transformers/models/auto/tokenization_auto.py", line 678, in from_pretrained
return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/work/miniconda3/envs/speed/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 1825, in from_pretrained
return cls._from_pretrained(
^^^^^^^^^^^^^^^^^^^^^
File "/home/work/miniconda3/envs/speed/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 1988, in _from_pretrained
tokenizer = cls(*init_inputs, **init_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/work/.cache/huggingface/modules/transformers_modules/chatglm2-6b/tokenization_chatglm.py", line 73, in __init__
self.tokenizer = SPTokenizer(vocab_file)
^^^^^^^^^^^^^^^^^^^^^^^
File "/home/work/.cache/huggingface/modules/transformers_modules/chatglm2-6b/tokenization_chatglm.py", line 14, in __init__
self.sp_model = SentencePieceProcessor(model_file=model_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/work/miniconda3/envs/speed/lib/python3.11/site-packages/sentencepiece/__init__.py", line 447, in Init
self.Load(model_file=model_file, model_proto=model_proto)
File "/home/work/miniconda3/envs/speed/lib/python3.11/site-packages/sentencepiece/__init__.py", line 905, in Load
return self.LoadFromFile(model_file)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/work/miniconda3/envs/speed/lib/python3.11/site-packages/sentencepiece/__init__.py", line 310, in LoadFromFile
return _sentencepiece.SentencePieceProcessor_LoadFromFile(self, arg)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: Internal: src/sentencepiece_processor.cc(1101) [model_proto->ParseFromArray(serialized.data(), serialized.size())]

Knowledge Engineering Group (KEG) & Data Mining at Tsinghua University org

Please check that the tokenizer.model file you downloaded is identical to https://huggingface.co/THUDM/chatglm2-6b/blob/main/tokenizer.model
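This ParseFromArray failure usually means the local tokenizer.model is corrupted or incomplete — a common cause is cloning the repo without Git LFS, which leaves a small pointer stub in place of the real file. A minimal sketch for checking the local copy (the path below is the one from the traceback and is only an example; compare the printed digest against the SHA256 shown on the file's page on the Hub):

```python
import hashlib
import os

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in chunks so large model files are not loaded into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    # Hypothetical local path — substitute your own checkout location.
    path = "/home/work/wuxuesong/ChatGLM2-6B/chatglm2-6b/tokenizer.model"
    if os.path.exists(path):
        # A stub left by a missing Git LFS install is only ~100 bytes,
        # so the size alone is often enough to spot the problem.
        print(os.path.getsize(path), sha256_of(path))
```

If the digest (or file size) does not match the Hub page, re-download the file, e.g. with `git lfs pull` inside the checkout or via `huggingface_hub.hf_hub_download`.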

zxdu20 changed discussion status to closed
