runtime error
Exit code: 1. Reason:
config.json: 100%|██████████| 620/620 [00:00<00:00, 3.01MB/s]
You are using the default legacy behaviour of the <class 'transformers.models.llama.tokenization_llama.LlamaTokenizer'>. This is expected, and simply means that the `legacy` (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set `legacy=False`. This should only be set if you understand what it means, and thoroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565 - if you loaded a llama tokenizer from a GGUF file you can ignore this message
Traceback (most recent call last):
  File "/home/user/app/app.py", line 8, in <module>
    tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 1035, in from_pretrained
    return tokenizer_class_py.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2025, in from_pretrained
    return cls._from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2278, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/tokenization_llama.py", line 171, in __init__
    self.sp_model = self.get_spm_processor(kwargs.pop("from_slow", False))
  File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/tokenization_llama.py", line 198, in get_spm_processor
    tokenizer.Load(self.vocab_file)
  File "/usr/local/lib/python3.10/site-packages/sentencepiece/__init__.py", line 961, in Load
    return self.LoadFromFile(model_file)
  File "/usr/local/lib/python3.10/site-packages/sentencepiece/__init__.py", line 316, in LoadFromFile
    return _sentencepiece.SentencePieceProcessor_LoadFromFile(self, arg)
TypeError: not a string
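The `TypeError: not a string` is raised because sentencepiece is handed `self.vocab_file`, which ends up as `None` when the checkpoint does not ship a slow-tokenizer `tokenizer.model` file; forcing `use_fast=False` in app.py sends loading down that SentencePiece path. A minimal sketch of a likely fix is to let transformers pick the fast tokenizer, which reads `tokenizer.json` instead. The model id below is a placeholder, not taken from the log:

```python
from transformers import AutoTokenizer

# Placeholder: substitute the actual model_id used in app.py.
model_id = "your-org/your-llama-model"

# Dropping use_fast=False lets AutoTokenizer use the fast (Rust) tokenizer
# backed by tokenizer.json, so no SentencePiece tokenizer.model is required.
tokenizer = AutoTokenizer.from_pretrained(model_id)
```

If the slow tokenizer is genuinely needed, the alternative is to add a valid `tokenizer.model` (SentencePiece) file to the model repository so `vocab_file` resolves to a real path.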