Runtime error

co/meta-llama/Meta-Llama-3-8B/resolve/bf48d88ac1f13e640e9664a4baac7326023e4118/config.json. Access to model meta-llama/Meta-Llama-3-8B is restricted. You must be authenticated to access it.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/src/transformers/src/transformers/pipelines/base.py", line 279, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
  File "/src/transformers/src/transformers/modeling_utils.py", line 2983, in from_pretrained
    config, model_kwargs = cls.config_class.from_pretrained(
  File "/src/transformers/src/transformers/configuration_utils.py", line 602, in from_pretrained
    config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/src/transformers/src/transformers/configuration_utils.py", line 631, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/src/transformers/src/transformers/configuration_utils.py", line 686, in _get_config_dict
    resolved_config_file = cached_file(
  File "/src/transformers/src/transformers/utils/hub.py", line 416, in cached_file
    raise EnvironmentError(
OSError: You are trying to access a gated repo. Make sure to have access to it at https://huggingface.co/meta-llama/Meta-Llama-3-8B.
401 Client Error. (Request ID: Root=1-663f5fbd-7b77b13d6c18af93124234e0;a8f28a3b-c768-4f73-93cb-4a7388e68eca)
Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/bf48d88ac1f13e640e9664a4baac7326023e4118/config.json. Access to model meta-llama/Meta-Llama-3-8B is restricted. You must be authenticated to access it.
