runtime error
Exit code: 1. Reason:
Downloading base model...
Downloading LoRA adapter...
Loading base model and tokenizer...
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 403, in cached_file
    resolved_file = hf_hub_download(
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py", line 101, in inner_f
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 106, in _inner_fn
    validate_repo_id(arg_value)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 154, in validate_repo_id
    raise HFValidationError(
huggingface_hub.errors.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '/home/user/.cache/huggingface/hub/models--unsloth--Llama-3.2-3B-Instruct-GGUF/snapshots/c93dae3959fbdd2bfcf2306c53e741cfbaf5b724/Llama-3.2-3B-Instruct-Q8_0.gguf'. Use `repo_type` argument if needed.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 19, in <module>
    tokenizer = AutoTokenizer.from_pretrained(base_model_path)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 857, in from_pretrained
    tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 689, in get_tokenizer_config
    resolved_config_file = cached_file(
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 469, in cached_file
    raise EnvironmentError(
OSError: Incorrect path_or_model_id: '/home/user/.cache/huggingface/hub/models--unsloth--Llama-3.2-3B-Instruct-GGUF/snapshots/c93dae3959fbdd2bfcf2306c53e741cfbaf5b724/Llama-3.2-3B-Instruct-Q8_0.gguf'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
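For context: the root cause in the traceback is that `app.py` passes the path of the downloaded `.gguf` *file* to `AutoTokenizer.from_pretrained`, which only accepts a local *folder* or a Hub repo id. A minimal sketch reproducing the failing check offline (it only needs `huggingface_hub` installed, no network; the path is the one from the traceback):

```python
from huggingface_hub.utils import validate_repo_id
from huggingface_hub.errors import HFValidationError

# from_pretrained treats any argument that is not an existing local folder
# as a Hub repo id, and a repo id must look like 'repo_name' or
# 'namespace/repo_name' -- a filesystem path to a .gguf file fails this check.
gguf_file_path = (
    "/home/user/.cache/huggingface/hub/"
    "models--unsloth--Llama-3.2-3B-Instruct-GGUF/snapshots/"
    "c93dae3959fbdd2bfcf2306c53e741cfbaf5b724/Llama-3.2-3B-Instruct-Q8_0.gguf"
)

try:
    validate_repo_id(gguf_file_path)
except HFValidationError as exc:
    print("rejected as repo id:", exc)

# A proper repo id passes silently:
validate_repo_id("unsloth/Llama-3.2-3B-Instruct-GGUF")
```

One likely fix, given the repo in the traceback, is to pass the repo id rather than the downloaded file path, e.g. `AutoTokenizer.from_pretrained("unsloth/Llama-3.2-3B-Instruct-GGUF", gguf_file="Llama-3.2-3B-Instruct-Q8_0.gguf")` in transformers versions with GGUF support, or to point at a local folder that contains the tokenizer files. Which option applies depends on the installed transformers version.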