Runtime error

Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.

Traceback (most recent call last):
  File "/home/user/app/app.py", line 20, in <module>
    tokenizer = AutoTokenizer.from_pretrained(llm)
  File "/home/user/.local/lib/python3.10/site-packages/ctransformers/hub.py", line 268, in from_pretrained
    return CTransformersTokenizer(model._llm)
  File "/home/user/.local/lib/python3.10/site-packages/ctransformers/transformers.py", line 84, in __init__
    super().__init__(**kwargs)
  File "/home/user/.local/lib/python3.10/site-packages/transformers/tokenization_utils.py", line 367, in __init__
    self._add_tokens(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/tokenization_utils.py", line 467, in _add_tokens
    current_vocab = self.get_vocab().copy()
  File "/home/user/.local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1687, in get_vocab
    raise NotImplementedError()
NotImplementedError
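Reading the traceback bottom-up: transformers' `PreTrainedTokenizer.__init__` registers special tokens via `_add_tokens`, which must read the current vocabulary with `get_vocab()`; the base class only provides a stub that raises `NotImplementedError`, and the ctransformers wrapper apparently never overrides it, so construction fails before the tokenizer is usable. A minimal sketch of this mechanism (the class names and simplified bodies below are illustrative stand-ins, not the real transformers/ctransformers code):

```python
class BaseTokenizer:
    """Stand-in for transformers' PreTrainedTokenizer (simplified)."""

    def __init__(self, special_tokens):
        # The real __init__ registers special tokens on construction,
        # which forces a read of the current vocabulary.
        self._add_tokens(special_tokens)

    def _add_tokens(self, tokens):
        # This is the call that blows up in the traceback:
        # current_vocab = self.get_vocab().copy()
        current_vocab = self.get_vocab().copy()
        for idx, tok in enumerate(tokens, start=len(current_vocab)):
            current_vocab[tok] = idx
        self.vocab = current_vocab

    def get_vocab(self):
        # Base-class stub: subclasses are expected to override this.
        raise NotImplementedError()


class WrapperTokenizer(BaseTokenizer):
    # Mirrors CTransformersTokenizer: inherits but never overrides
    # get_vocab, so __init__ cannot complete.
    pass


class WorkingTokenizer(BaseTokenizer):
    def get_vocab(self):
        # A real override returns the token -> id mapping.
        return {"<unk>": 0}


try:
    WrapperTokenizer(["<s>", "</s>"])
except NotImplementedError:
    print("WrapperTokenizer crashed, as in the traceback")

tok = WorkingTokenizer(["<s>", "</s>"])
print(tok.vocab)
```

In other words, the crash is in the wrapper, not in your `app.py`. A common workaround (assumption, not confirmed by this log) is to avoid ctransformers' `AutoTokenizer` entirely: either call the loaded model's own `tokenize`/`detokenize` methods, or load a tokenizer for the original (non-GGML) model repo with transformers' `AutoTokenizer.from_pretrained`.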
