runtime error

Exit code: 1. Reason: k_proj.lora_B.lightx2v_2.weight, blocks.39.attn2.add_k_proj.lora_B.lightx2v_2.bias, blocks.39.attn2.add_v_proj.lora_A.lightx2v_2.weight, blocks.39.attn2.add_v_proj.lora_B.lightx2v_2.weight, blocks.39.attn2.add_v_proj.lora_B.lightx2v_2.bias.

  0%|          | 0/50 [00:00<?, ?it/s]
  0%|          | 0/50 [00:00<?, ?it/s]
skipping cudagraphs due to cpp wrapper enabled
W0913 23:52:50.948000 284 site-packages/torch/_inductor/utils.py:1436] Not enough SMs to use max_autotune_gemm mode
skipping cudagraphs due to cpp wrapper enabled
/usr/local/lib/python3.10/site-packages/gradio/helpers.py:162: UserWarning: In future versions of Gradio, the `cache_examples` parameter will no longer accept a value of 'lazy'. To enable lazy caching in Gradio, you should set `cache_examples=True`, and `cache_mode='lazy'` instead.
  warnings.warn(
Will cache examples in '/home/user/app/.gradio/cached_examples/20' directory at first use.
ZeroGPU tensors packing:   0%|          | 0.00/35.6G [00:00<?, ?B/s]
ZeroGPU tensors packing:   0%|          | 0.00/35.6G [00:01<?, ?B/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 233, in <module>
    demo.queue().launch(mcp_server=True)
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/gradio.py", line 162, in launch
    task(*task_args, **task_kwargs)
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/__init__.py", line 24, in startup
    total_size = torch.pack()
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/torch/patching.py", line 397, in pack
    total_size = _pack(Config.zerogpu_offload_dir)
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/torch/patching.py", line 387, in _pack
    pack = pack_tensors(originals, fakes, offload_dir, callback=update)
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/torch/packing.py", line 110, in pack_tensors
    fd = os.open(pack.path(), os.O_CREAT | os.O_WRONLY | os.O_DIRECT)
FileNotFoundError: [Errno 2] No such file or directory: '/data-nvme/zerogpu-offload/140209772753440'
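The final FileNotFoundError is consistent with a missing parent directory: os.open with O_CREAT creates the file itself but never intermediate directories, so if '/data-nvme/zerogpu-offload/' does not exist the call fails with [Errno 2] exactly as logged. A minimal sketch of that behavior follows; the directory and file names are stand-ins, not the actual Space's paths, and O_DIRECT (Linux-only, used in the log) is omitted for portability since it does not affect the ENOENT:

```python
import errno
import os
import tempfile

def open_for_write(path: str) -> int:
    """Open path for writing, creating missing parent directories first.

    os.open with O_CREAT creates the file, but not its parents; making
    them first avoids the FileNotFoundError seen in the traceback.
    """
    os.makedirs(os.path.dirname(path), exist_ok=True)
    return os.open(path, os.O_CREAT | os.O_WRONLY)

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as root:
        # Stand-in for '/data-nvme/zerogpu-offload/<id>' (hypothetical path)
        target = os.path.join(root, "zerogpu-offload", "pack-file")
        try:
            os.open(target, os.O_CREAT | os.O_WRONLY)  # parent dir missing
        except FileNotFoundError as e:
            assert e.errno == errno.ENOENT  # same [Errno 2] as in the log
        fd = open_for_write(target)  # succeeds once the directory exists
        os.close(fd)
```

If this is the cause, recreating or remounting the offload directory (or clearing a stale Config.zerogpu_offload_dir setting) before startup should let the tensor packing proceed.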
