runtime error

...eration_error
    raise http_error
  File "/home/user/.local/lib/python3.10/site-packages/huggingface_hub/inference/_client.py", line 1511, in text_generation
    bytes_output = self.post(json=payload, model=model, task="text-generation", stream=stream)  # type: ignore
  File "/home/user/.local/lib/python3.10/site-packages/huggingface_hub/inference/_client.py", line 240, in post
    hf_raise_for_status(response)
  File "/home/user/.local/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 333, in hf_raise_for_status
    raise HfHubHTTPError(str(e), response=response) from e
huggingface_hub.utils._errors.HfHubHTTPError: 500 Server Error: Internal Server Error for url: https://api-inference.huggingface.co/models/Ttimofeyka/MistralRP-Noromaid-NSFW-Mistral-7B-GGUF (Request ID: DIPVqvGfT3a12o3ByE3s3)

Could not load model Ttimofeyka/MistralRP-Noromaid-NSFW-Mistral-7B-GGUF with any of the following classes: (<class 'transformers.models.mistral.modeling_mistral.MistralForCausalLM'>,). See the original errors:

while loading with MistralForCausalLM, an error is thrown:
Traceback (most recent call last):
  File "/src/transformers/src/transformers/pipelines/base.py", line 278, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
  File "/src/transformers/src/transformers/modeling_utils.py", line 3483, in from_pretrained
    resolved_archive_file, sharded_metadata = get_checkpoint_shard_files(
  File "/src/transformers/src/transformers/utils/hub.py", line 1025, in get_checkpoint_shard_files
    cached_filename = cached_file(
  File "/src/transformers/src/transformers/utils/hub.py", line 380, in cached_file
    raise EnvironmentError(f"Could not locate {full_filename} inside {path_or_repo_id}.")
OSError: Could not locate model-00001-of-00008.safetensors inside Ttimofeyka/MistralRP-Noromaid-NSFW-Mistral-7B-GGUF.
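The final OSError suggests the cause: the backend tried to load the repo with transformers' MistralForCausalLM, which expects sharded safetensors (or PyTorch) weights, but a "-GGUF" repo ships only GGUF files (a llama.cpp format), so the expected shard is absent and the server returns a 500. Below is a minimal sketch of that mismatch, assuming a typical repo file listing; the `transformers_loadable` helper and the example file names are hypothetical, for illustration only.

```python
# Hypothetical helper: classify a repo file listing by whether it contains
# weight files in a format transformers' from_pretrained can load.
TRANSFORMERS_WEIGHT_SUFFIXES = (".safetensors", ".bin")

def transformers_loadable(repo_files):
    """Return True if the listing contains transformers-compatible weights."""
    return any(name.endswith(TRANSFORMERS_WEIGHT_SUFFIXES) for name in repo_files)

# A GGUF-only repo (like the one in the traceback) has no such weights:
gguf_repo = ["config.json", "README.md", "mistral-7b.Q4_K_M.gguf"]
print(transformers_loadable(gguf_repo))      # False -> the OSError above

# A standard transformers repo ships sharded safetensors instead:
standard_repo = ["config.json", "model-00001-of-00008.safetensors"]
print(transformers_loadable(standard_repo))  # True
```

In short: serving this checkpoint through the transformers-based Inference API cannot work as-is; GGUF weights need a llama.cpp-based runtime (or a repo that also publishes safetensors).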
