Runtime error
Exit code: 1. Reason:

onnx_model.onnx:   0%|          | 0.00/40.0 [00:00<?, ?B/s]
onnx_model.onnx: 100%|██████████| 40.0/40.0 [00:00<00:00, 156kB/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 19, in <module>
    session = rt.InferenceSession(model_path)
  File "/usr/local/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 472, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/usr/local/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 550, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidProtobuf: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from /home/user/.cache/huggingface/hub/models--NaveenKumar5--Yolov8n-onnx-export/snapshots/8bc4991de3a9356ed1772895fa41b0c7210196d8/onnx_model.onnx failed:Protobuf parsing failed.
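The InvalidProtobuf error means onnxruntime could not parse the downloaded file as an ONNX graph. Note the download progress bar: only 40 B were fetched, which is far too small to be real model weights. A common cause of this on the Hugging Face Hub is that the repository contains a Git LFS pointer (or a truncated/placeholder file) instead of the actual `.onnx` binary. A minimal sketch for diagnosing the cached file before handing it to `InferenceSession` (the helper name `looks_like_lfs_pointer` is hypothetical, not part of any library):

```python
from pathlib import Path


def looks_like_lfs_pointer(path):
    """Heuristic check: is this file a Git LFS pointer rather than real weights?

    LFS pointer files are tiny text files beginning with a version line;
    any genuine ONNX model will be orders of magnitude larger than 1 KiB.
    """
    p = Path(path)
    if p.stat().st_size > 1024:  # real models are MBs, not bytes
        return False
    return p.read_bytes().startswith(b"version https://git-lfs.github.com")


# Demo with a fabricated pointer file (illustrative only):
demo = Path("onnx_model.onnx")
demo.write_text(
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:deadbeef\n"
    "size 12345678\n"
)
print(looks_like_lfs_pointer(demo))  # → True
demo.unlink()
```

If the check fires, the fix is on the repository side: push the real weights through Git LFS (or re-download with `huggingface_hub.hf_hub_download(..., force_download=True)` in case the cached copy is corrupt) rather than retrying `InferenceSession` on the bad file.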