runtime error

Downloading ice_text.model: 100%|██████████| 2.71M/2.71M [00:00<00:00, 21.9MB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 9, in <module>
    tokenizer_glm = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
  File "/home/user/.pyenv/versions/3.10.13/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 738, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/home/user/.pyenv/versions/3.10.13/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2045, in from_pretrained
    return cls._from_pretrained(
  File "/home/user/.pyenv/versions/3.10.13/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2256, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/home/user/.cache/huggingface/modules/transformers_modules/THUDM/chatglm-6b/8b7d33596d18c5e83e2da052d05ca4db02e60620/tokenization_chatglm.py", line 196, in __init__
    super().__init__(
  File "/home/user/.pyenv/versions/3.10.13/lib/python3.10/site-packages/transformers/tokenization_utils.py", line 366, in __init__
    self._add_tokens(self.all_special_tokens_extended, special_tokens=True)
  File "/home/user/.pyenv/versions/3.10.13/lib/python3.10/site-packages/transformers/tokenization_utils.py", line 462, in _add_tokens
    current_vocab = self.get_vocab().copy()
  File "/home/user/.cache/huggingface/modules/transformers_modules/THUDM/chatglm-6b/8b7d33596d18c5e83e2da052d05ca4db02e60620/tokenization_chatglm.py", line 248, in get_vocab
    vocab = {self._convert_id_to_token(i): i for i in range(self.vocab_size)}
  File "/home/user/.cache/huggingface/modules/transformers_modules/THUDM/chatglm-6b/8b7d33596d18c5e83e2da052d05ca4db02e60620/tokenization_chatglm.py", line 244, in vocab_size
    return self.sp_tokenizer.num_tokens
AttributeError: 'ChatGLMTokenizer' object has no attribute 'sp_tokenizer'. Did you mean: '_tokenize'?
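Reading the traceback bottom-up: `PreTrainedTokenizer.__init__` in the installed transformers (3.10.13 env) calls `self._add_tokens(...)`, which calls `self.get_vocab()` while the `ChatGLMTokenizer` subclass is still inside its own `super().__init__(...)` call, i.e. before it has had a chance to set `self.sp_tokenizer`. This is a likely incompatibility between the model's bundled `tokenization_chatglm.py` and the base-class init order in the installed transformers release; the commonly reported workaround is pinning an older transformers in the Space's `requirements.txt` (e.g. `transformers<4.34` — version boundary is an assumption, verify against your environment). The failure pattern itself can be reproduced in a minimal sketch (class names here are illustrative, not the real library classes):

```python
class Base:
    """Stands in for PreTrainedTokenizer: its __init__ calls an
    overridable method before the subclass finishes initializing."""

    def __init__(self):
        # Like _add_tokens -> get_vocab() in the traceback above.
        self.vocab = self.get_vocab()

    def get_vocab(self):
        return {}


class Child(Base):
    """Stands in for ChatGLMTokenizer: sets its backing attribute
    only AFTER super().__init__() has already run."""

    def __init__(self):
        super().__init__()       # Base.__init__ calls get_vocab() here...
        self.sp = {"a": 1}       # ...but sp is not set until this line.

    def get_vocab(self):
        # Fails during Base.__init__ because self.sp does not exist yet.
        return dict(self.sp)


try:
    Child()
except AttributeError as e:
    # Same shape as the real error: attribute accessed before assignment.
    print(f"AttributeError: {e}")
```

The fix in the real tokenizer would be either setting `sp_tokenizer` before calling `super().__init__()`, or using a transformers release whose base `__init__` does not invoke `get_vocab()` — hence the version pin.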
