runtime error

nection is on.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "app.py", line 16, in <module>
    mario_lm = MarioLM()
  File "/home/user/app/mario_gpt/lm.py", line 35, in __init__
    self.lm = self.load_pretrained_lm()
  File "/home/user/app/mario_gpt/lm.py", line 55, in load_pretrained_lm
    return AutoModelWithLMHead.from_pretrained(PRETRAINED_MODEL_PATH)
  File "/home/user/.pyenv/versions/3.8.9/lib/python3.8/site-packages/transformers/models/auto/modeling_auto.py", line 1570, in from_pretrained
    return super().from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs)
  File "/home/user/.pyenv/versions/3.8.9/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 526, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/home/user/.pyenv/versions/3.8.9/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py", line 1082, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/user/.pyenv/versions/3.8.9/lib/python3.8/site-packages/transformers/configuration_utils.py", line 644, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/user/.pyenv/versions/3.8.9/lib/python3.8/site-packages/transformers/configuration_utils.py", line 699, in _get_config_dict
    resolved_config_file = cached_file(
  File "/home/user/.pyenv/versions/3.8.9/lib/python3.8/site-packages/transformers/utils/hub.py", line 429, in cached_file
    raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like shyamsn97/Mario-GPT2-700-context-length is not the path to a directory containing a file named config.json. Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
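The OSError points to two possible fixes: restore network access to huggingface.co, or run transformers in offline mode against an already-populated local cache. A minimal sketch of the offline-mode approach, assuming the model files for shyamsn97/Mario-GPT2-700-context-length were previously downloaded into the local Hugging Face cache on a machine with connectivity (the environment variables come from the transformers offline-mode docs linked in the error, not from this log):

```python
import os

# Assumption: the model was already fetched into the local Hugging Face
# cache (~/.cache/huggingface) while the network was available. These
# variables must be set BEFORE transformers is imported: they make
# from_pretrained() resolve the repo id from the cache only, instead of
# attempting the https://huggingface.co connection that fails above.
os.environ["HF_HUB_OFFLINE"] = "1"        # huggingface_hub: skip network calls
os.environ["TRANSFORMERS_OFFLINE"] = "1"  # transformers: cache-only lookups

# After this point, AutoModelWithLMHead.from_pretrained(PRETRAINED_MODEL_PATH)
# reads the cached config.json and weights; it raises the same OSError only
# if the files were never cached in the first place.
```

Alternatively, the cache can be pre-populated with `huggingface_hub.snapshot_download("shyamsn97/Mario-GPT2-700-context-length")`, or the repo can be cloned to disk and its local directory path passed to `from_pretrained` directly.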
