Runtime error: ONNX Runtime cannot load model.onnx (unsupported ai.onnx.ml opset 4)

Space failed. Exit code: 1. Reason:

model.onnx: 100%|██████████| 455M/455M [00:04<00:00, 103MB/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 267, in <module>
    main()
  File "/home/user/app/app.py", line 216, in main
    change_model("SwinV2")
  File "/home/user/app/app.py", line 65, in change_model
    model = load_model(SWIN_MODEL_REPO, MODEL_FILENAME)
  File "/home/user/app/app.py", line 57, in load_model
    model = rt.InferenceSession(path)
  File "/home/user/.local/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/home/user/.local/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 452, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from /home/user/.cache/huggingface/hub/models--SmilingWolf--wd-v1-4-swinv2-tagger-v2/snapshots/cdb0c7fdc70646f0af29c6f80f8df564344a69b6/model.onnx failed:/onnxruntime_src/onnxruntime/core/graph/model_load_utils.h:46 void onnxruntime::model_load_utils::ValidateOpsetForDomain(const std::unordered_map<std::basic_string<char>, int>&, const onnxruntime::logging::Logger&, bool, const string&, int) ONNX Runtime only *guarantees* support for models stamped with official released onnx opset versions. Opset 4 is under development and support for this is limited. The operator schemas and or other functionality may change before next ONNX release and in this case ONNX Runtime will not guarantee backward compatibility. Current official support for domain ai.onnx.ml is till opset 3.
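The error means the downloaded model.onnx declares the ai.onnx.ml domain at opset 4, while the onnxruntime build installed in the Space only recognizes that domain up to opset 3. A minimal diagnostic sketch to confirm which opsets the file actually declares is shown below; it assumes the onnx package is available (it is not shown in the traceback), and the cache path is copied verbatim from the error and may differ on another machine.

# Minimal sketch: print the opset domains/versions stamped into the model.
# Assumes `pip install onnx`; MODEL_PATH is taken from the traceback above.
import onnx

MODEL_PATH = (
    "/home/user/.cache/huggingface/hub/"
    "models--SmilingWolf--wd-v1-4-swinv2-tagger-v2/"
    "snapshots/cdb0c7fdc70646f0af29c6f80f8df564344a69b6/model.onnx"
)

model = onnx.load(MODEL_PATH)
# Each opset_import entry pairs an operator-set domain with the version the
# model was exported against; an empty domain string means the default ai.onnx.
for opset in model.opset_import:
    print(opset.domain or "ai.onnx", opset.version)

If ai.onnx.ml does show version 4, the usual ways out are to upgrade onnxruntime in the Space's requirements.txt to a release whose changelog lists ai.onnx.ml opset 4 support, or to use a model export stamped with an opset the installed runtime already handles; the right onnxruntime version depends on its release notes and is not something the traceback itself confirms.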
