runtime error

(Request ID: Root=1-666a8d2c-0f5dd2fd0cfa311177c15e0e;f03d09e3-c033-4213-b1e5-eebcb0c1430a)
Repository Not Found for url: https://huggingface.co/clarin-knext/wsd-encoder/resolve/main/config.json.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
Organization API tokens are deprecated and have stopped working. Use User Access Tokens instead: https://huggingface.co/settings/tokens

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 41, in <module>
    model = load_model()
  File "/home/user/app/app.py", line 37, in load_model
    model = pipeline("feature-extraction", model=model_name, use_auth_token=auth_token)
  File "/usr/local/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 648, in pipeline
    config = AutoConfig.from_pretrained(model, _from_pipeline=task, **hub_kwargs, **model_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 776, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 559, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 614, in _get_config_dict
    resolved_config_file = cached_file(
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 424, in cached_file
    raise EnvironmentError(
OSError: clarin-knext/wsd-encoder is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo with `use_auth_token` or log in with `huggingface-cli login` and pass `use_auth_token=True`.
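The message points at the likely cause: the Space calls `pipeline(..., use_auth_token=auth_token)` with a token that no longer authenticates (organization API tokens are deprecated), so the private/gated repo looks nonexistent. A minimal sketch of a fix, assuming the User Access Token is stored in an `HF_TOKEN` environment variable (e.g. as a Space secret) and that the helper name `build_pipeline_kwargs` is our own:

```python
import os

def build_pipeline_kwargs(model_name: str) -> dict:
    """Collect arguments for transformers.pipeline(), reading a
    User Access Token (https://huggingface.co/settings/tokens) from
    the HF_TOKEN environment variable -- an assumed setup, not the
    Space's actual config."""
    kwargs = {"task": "feature-extraction", "model": model_name}
    token = os.environ.get("HF_TOKEN")
    if token:
        # Recent transformers releases accept `token=`; older ones
        # used the now-deprecated `use_auth_token=` keyword.
        kwargs["token"] = token
    return kwargs

kwargs = build_pipeline_kwargs("clarin-knext/wsd-encoder")
# from transformers import pipeline
# model = pipeline(**kwargs)  # needs transformers + network access
```

If the repo is truly private, the token's owner must also have read access to `clarin-knext/wsd-encoder`; a valid token without repo permission produces the same "Repository Not Found" error.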
