---
license: mit
library_name: unity-sentis
---

These are [MiDaS](https://github.com/isl-org/MiDaS) models converted to ONNX for monocular depth estimation with Unity Sentis.

## How to Use

Import the package into Unity via [GitHub](https://github.com/julienkay/com.doji.midas), [OpenUPM](https://openupm.com/packages/com.doji.midas/), or the [Asset Store](https://assetstore.unity.com/packages/slug/268501).

Example source code to run these models can be found in the [Documentation](https://docs.doji-tech.com/com.doji.midas/manual/getting-started.html).

![image/png](https://cdn-uploads.huggingface.co/production/uploads/610320a1c00d060d893f2a93/hzyDJUdMyRgs2poyin4Xw.png)

## Model Details

The MiDaS models below were converted to ONNX using [this colab notebook](https://github.com/julienkay/com.doji.midas/blob/v1.0.0/tools/MiDaS_ONNX_Export.ipynb).

Input normalization is baked into the models, so the model input 'input_image' is expected to be in the [0, 1] range. Input sizes are static (e.g. 256, 384, or 512, denoted by the model suffix).

## Unity Sentis

Unity Sentis is the neural network inference engine that runs in Unity 3D. More information can be found [here](https://unity.com/products/sentis).
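
For orientation, here is a minimal sketch of running one of these models directly with the Sentis API (rather than the package's own wrapper, which the documentation above covers). It assumes the Sentis 1.x API; the class name `MidasDepthSample`, the serialized fields, and the 256-pixel input size are placeholders for this example, matching a `_256` model variant.

```csharp
using Unity.Sentis;
using UnityEngine;

// Minimal sketch (Sentis 1.x API, not the com.doji.midas wrapper):
// run a 256x256 MiDaS model on a texture and log the output shape.
public class MidasDepthSample : MonoBehaviour
{
    public ModelAsset modelAsset;   // a MiDaS ONNX model imported as a Sentis asset
    public Texture2D sourceTexture; // input image; resized to the model's static input size

    private IWorker _worker;

    void Start()
    {
        Model model = ModelLoader.Load(modelAsset);
        _worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, model);

        // Normalization is baked into the model, so a plain [0, 1] RGB tensor is enough.
        // 256 matches the _256 variant; use 384 or 512 for the larger models.
        using TensorFloat input = TextureConverter.ToTensor(sourceTexture, 256, 256, 3);
        _worker.Execute(input);

        // The output is a relative (inverse) depth map at the input resolution.
        TensorFloat depth = _worker.PeekOutput() as TensorFloat;
        Debug.Log($"MiDaS output shape: {depth.shape}");
    }

    void OnDestroy()
    {
        _worker?.Dispose();
    }
}
```

In a real project you would typically use the package's sample code from the documentation link above and convert the output tensor back to a texture (e.g. with `TextureConverter.RenderToTexture`) for visualization.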