runtime error
Exit code: 1. Reason:
/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py:2025: UserWarning: for vision_model.head.mlp.fc2.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta ')
Loading checkpoint shards: 100%|██████████| 4/4 [00:36<00:00, 9.11s/it]
ZeroGPU tensors packing:   0%|          | 0.00/16.8G [00:00<?, ?B/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 451, in <module>
    demo.launch()
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/gradio.py", line 142, in launch
    task(*task_args, **task_kwargs)
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/torch/patching.py", line 339, in pack
    _pack(Config.zerogpu_offload_dir)
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/torch/patching.py", line 331, in _pack
    pack = pack_tensors(originals, fakes, offload_dir, callback=update)
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/torch/packing.py", line 110, in pack_tensors
    os.posix_fallocate(fd, 0, total_asize)
OSError: [Errno 28] No space left on device