runtime error

l-MiniLM-L6-v2/resolve/main/config.json.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
User Access Token "colab" is expired

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 29, in <module>
    embedding_model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
  File "/usr/local/lib/python3.10/site-packages/sentence_transformers/SentenceTransformer.py", line 205, in __init__
    modules = self._load_auto_model(
  File "/usr/local/lib/python3.10/site-packages/sentence_transformers/SentenceTransformer.py", line 1197, in _load_auto_model
    transformer_model = Transformer(
  File "/usr/local/lib/python3.10/site-packages/sentence_transformers/models/Transformer.py", line 35, in __init__
    config = AutoConfig.from_pretrained(model_name_or_path, **model_args, cache_dir=cache_dir)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 928, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 631, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 686, in _get_config_dict
    resolved_config_file = cached_file(
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 421, in cached_file
    raise EnvironmentError(
OSError: sentence-transformers/all-MiniLM-L6-v2 is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`
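The key line is `User Access Token "colab" is expired`: the request for the model's config.json is rejected because of the stale cached token, and the loader then surfaces the misleading "not a valid model identifier" message even though `sentence-transformers/all-MiniLM-L6-v2` is a public model. A minimal sketch of one way to fix it, assuming the Space exposes a fresh token through a secret named `HF_TOKEN` (that secret name is an assumption; removing or replacing the expired token works just as well):

```python
import os

from huggingface_hub import login
from sentence_transformers import SentenceTransformer

# Authenticate with a current token instead of the expired "colab" token.
# HF_TOKEN is a hypothetical secret name configured in the Space settings.
hf_token = os.environ.get("HF_TOKEN")
if hf_token:
    login(token=hf_token)

# all-MiniLM-L6-v2 is public, so a valid login (or no stale token at all)
# is enough for the config/weights download to succeed.
embedding_model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
```

Alternatively, as the error message itself suggests, re-running `huggingface-cli login` with a non-expired token replaces the stale credential in the local cache.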
