runtime error

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 398, in cached_file
    resolved_file = hf_hub_download(
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 106, in _inner_fn
    validate_repo_id(arg_value)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 154, in validate_repo_id
    raise HFValidationError(
huggingface_hub.errors.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '/home/user/app/..\models\glm-4-9b-chat'. Use `repo_type` argument if needed.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/user/app/trans_web_demo.py", line 60, in <module>
    model, tokenizer = load_model_and_tokenizer(MODEL_PATH, trust_remote_code=True)
  File "/home/user/app/trans_web_demo.py", line 50, in load_model_and_tokenizer
    model = AutoModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 484, in from_pretrained
    resolved_config_file = cached_file(
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 462, in cached_file
    raise EnvironmentError(
OSError: Incorrect path_or_model_id: '/home/user/app/..\models\glm-4-9b-chat'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
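The failure comes from the Windows-style backslashes in '..\models\glm-4-9b-chat': on the Linux container they are literal characters, so the string is not an existing local folder, and from_pretrained then treats it as a Hub repo id, which fails validation. Below is a minimal sketch of one way to build MODEL_PATH portably; the folder layout (a models/glm-4-9b-chat directory one level above trans_web_demo.py) and the THUDM/glm-4-9b-chat fallback repo id are assumptions, not taken from the logs.

```python
# Sketch: build MODEL_PATH portably instead of hard-coding "..\models\...".
# Assumptions: the weights live in <parent of app>/models/glm-4-9b-chat,
# and "THUDM/glm-4-9b-chat" is an acceptable Hub fallback; adjust as needed.
from pathlib import Path

from transformers import AutoModelForCausalLM, AutoTokenizer

# Resolve the folder relative to this file so pathlib uses the correct
# separator on Linux and Windows alike.
LOCAL_DIR = Path(__file__).resolve().parent.parent / "models" / "glm-4-9b-chat"

# Use the local folder if it exists, otherwise fall back to a Hub repo id.
MODEL_PATH = str(LOCAL_DIR) if LOCAL_DIR.is_dir() else "THUDM/glm-4-9b-chat"

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(MODEL_PATH, trust_remote_code=True)
```

With this, from_pretrained receives either a real local directory or a valid 'namespace/repo_name' string, so neither the HFValidationError nor the OSError above should be raised.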
