Runtime error
Exit code: 1. Reason: `torch.utils._pytree._register_pytree_node` is deprecated. Please use `torch.utils._pytree.register_pytree_node` instead.
  _torch_pytree._register_pytree_node(
tokenizer_config.json: 100%|██████████| 2.10k/2.10k [00:00<00:00, 10.2MB/s]
tokenizer.model: 100%|██████████| 493k/493k [00:00<00:00, 49.9MB/s]
tokenizer.json: 100%|██████████| 1.80M/1.80M [00:00<00:00, 40.6MB/s]
special_tokens_map.json: 100%|██████████| 414/414 [00:00<00:00, 1.57MB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 19, in <module>
    from backend.query_llm import generate_hf, generate_openai
  File "/home/user/app/backend/query_llm.py", line 12, in <module>
    tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 787, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2028, in from_pretrained
    return cls._from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2260, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/tokenization_llama_fast.py", line 124, in __init__
    super().__init__(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 111, in __init__
    fast_tokenizer = TokenizerFast.from_file(fast_tokenizer_file)
Exception: data did not match any variant of untagged enum PyPreTokenizerTypeWrapper at line 40 column 3
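For reference, a minimal sketch that isolates the failing call from the traceback. The model ID and the call itself are taken from backend/query_llm.py (line 12 above); everything else is an illustrative assumption, not part of the Space's code.

# Minimal reproduction sketch, assuming the same transformers/tokenizers
# versions that the Space's container has installed.
from transformers import AutoTokenizer

# Same call as backend/query_llm.py, line 12 in the traceback above.
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")

The exception is raised while TokenizerFast.from_file parses the downloaded tokenizer.json in the Rust tokenizers backend; this kind of "data did not match any variant of untagged enum" deserialization error usually indicates an installed tokenizers/transformers build that is too old to read that file.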