Runtime error

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 398, in cached_file
    resolved_file = hf_hub_download(
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 111, in _inner_fn
    validate_repo_id(arg_value)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 165, in validate_repo_id
    raise HFValidationError(
huggingface_hub.utils._validators.HFValidationError: Repo id must use alphanumeric chars or '-', '_', '.', '--' and '..' are forbidden, '-' and '.' cannot start or end the name, max length is 96: '@xenova/gpt-3.5-turbo'.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 8, in <module>
    nlp = pipeline("text-generation", model="@xenova/gpt-3.5-turbo", tokenizer="@xenova/gpt-3.5-turbo", device=0)  # specify the GPU device (optional parameter)
  File "/usr/local/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 779, in pipeline
    resolved_config_file = cached_file(
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 462, in cached_file
    raise EnvironmentError(
OSError: Incorrect path_or_model_id: '@xenova/gpt-3.5-turbo'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
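The root cause is the `@` in `@xenova/gpt-3.5-turbo`: that is an npm-style package name used by Transformers.js, not a Hugging Face Hub repo id, and the Hub's validation rule quoted in the error rejects it. A minimal sketch of that rule as a local pre-check, using a stdlib regex that mirrors the constraints stated in the error message (alphanumerics plus `-`, `_`, `.`; no `--` or `..`; `-` and `.` may not start or end a name; max length 96). This is an illustrative approximation of the check in `huggingface_hub`, not its actual implementation:

```python
import re

# One "name" segment: starts and ends with an alphanumeric character,
# may contain '-', '_', '.' in between. A repo id is one segment or
# "namespace/name". Approximation of the rule quoted in the traceback.
_SEGMENT = r"[A-Za-z0-9](?:[A-Za-z0-9._-]*[A-Za-z0-9])?"
REPO_ID_RE = re.compile(rf"^{_SEGMENT}(?:/{_SEGMENT})?$")

def is_valid_repo_id(repo_id: str) -> bool:
    # '--' and '..' are forbidden anywhere; max length is 96.
    if len(repo_id) > 96 or "--" in repo_id or ".." in repo_id:
        return False
    return REPO_ID_RE.match(repo_id) is not None

print(is_valid_repo_id("@xenova/gpt-3.5-turbo"))  # False: '@' is not allowed
print(is_valid_repo_id("Xenova/gpt-3.5-turbo"))   # True: valid Hub-style id
```

Passing a valid Hub repo id (or a local folder path) to `pipeline(...)` in `app.py` line 8 avoids both exceptions; note that dropping only the `@` does not guarantee the resulting repo hosts weights loadable by the Python `transformers` pipeline, so the replacement model needs to be checked on the Hub.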
