runtime error

```
…o install `accelerate` for faster and less memory-intense model loading. You can do so with:
```
pip install accelerate
```
.
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/diffusers/models/model_loading_utils.py", line 105, in load_state_dict
    return safetensors.torch.load_file(checkpoint_file, device="cpu")
AttributeError: module 'safetensors' has no attribute 'torch'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/diffusers/models/model_loading_utils.py", line 116, in load_state_dict
    if f.read().startswith("version"):
  File "/usr/local/lib/python3.10/codecs.py", line 322, in decode
    (result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xa8 in position 0: invalid start byte

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 128, in <module>
    pipeline_paella = PaellaImageRoundtripPipeline()
  File "/home/user/app/app.py", line 90, in __init__
    self.vqgan = PaellaVQModel.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/diffusers/models/modeling_utils.py", line 831, in from_pretrained
    state_dict = load_state_dict(model_file, variant=variant)
  File "/usr/local/lib/python3.10/site-packages/diffusers/models/model_loading_utils.py", line 128, in load_state_dict
    raise OSError(
OSError: Unable to load weights from checkpoint file for '/home/user/.cache/huggingface/hub/models--warp-ai--wuerstchen/snapshots/c3da41406ddd4d9c48c49aa93981a82354351b83/vqgan/diffusion_pytorch_model.safetensors' at '/home/user/.cache/huggingface/hub/models--warp-ai--wuerstchen/snapshots/c3da41406ddd4d9c48c49aa93981a82354351b83/vqgan/diffusion_pytorch_model.safetensors'.
```
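The root `AttributeError: module 'safetensors' has no attribute 'torch'` suggests that `safetensors.torch` was never imported in the running process: in Python, importing a package does not automatically import its submodules, so the `torch` attribute on `safetensors` only exists after `import safetensors.torch` has run (a broken or stale `safetensors` install can have the same effect). The sketch below demonstrates that import mechanism with the stdlib `xml` package rather than `safetensors` itself, so it runs anywhere; it is an illustration of the failure mode, not a reproduction of the exact diffusers code path.

```python
import subprocess
import sys

# Run in a fresh interpreter so no prior import has already bound the
# submodule attribute.
probe = (
    "import xml\n"
    # Bare package import: the submodule is not yet an attribute.
    "print(hasattr(xml, 'etree'))\n"
    "import xml.etree.ElementTree\n"
    # Importing the submodule binds it onto the parent package.
    "print(hasattr(xml, 'etree'))\n"
)
result = subprocess.run(
    [sys.executable, "-c", probe], capture_output=True, text=True
)
print(result.stdout.split())  # ['False', 'True']
```

If this is the cause here, a `pip install --upgrade safetensors` (and installing `accelerate`, as the warning at the top of the log already recommends) or an explicit `import safetensors.torch` early in `app.py` would be the usual fixes; the subsequent `UnicodeDecodeError` and `OSError` are just diffusers' fallback path failing to parse the binary `.safetensors` file as text.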
