runtime error
100%|██████████| 200M/200M [00:02<00:00, 88.0MB/s]
Traceback (most recent call last):
  File "/home/user/app/./run/gradio_ootd.py", line 20, in <module>
    parsing_model_hd = Parsing(0)
  File "/home/user/app/preprocess/humanparsing/run_parsing.py", line 20, in __init__
    self.session = ort.InferenceSession(os.path.join(Path(__file__).absolute().parents[2].absolute(), 'checkpoints/humanparsing/parsing_atr.onnx'),
  File "/usr/local/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/usr/local/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 452, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.NoSuchFile: [ONNXRuntimeError] : 3 : NO_SUCHFILE : Load model from /home/user/app/checkpoints/humanparsing/parsing_atr.onnx failed:Load model /home/user/app/checkpoints/humanparsing/parsing_atr.onnx failed. File doesn't exist