Inverted depth maps

#2
by crimsK - opened

The depth maps generated by this model through the pipeline class seem inverted, in the sense that closer objects have higher values and farther objects have lower values. This was not the case in the demo. Any pointers on how to update the output to match the demo? Also, how do I scale pipe(image)['predicted_depth'] to get metric depth estimation in meters?
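
For context, a minimal sketch of the pipeline invocation in question (the checkpoint name is an assumption for illustration, not necessarily the exact model this discussion is attached to):

from transformers import pipeline
from PIL import Image

# Hypothetical checkpoint; substitute the model this discussion refers to.
pipe = pipeline("depth-estimation", model="LiheYoung/depth-anything-small-hf")

image = Image.open("example.jpg")
outputs = pipe(image)

# 'predicted_depth' is the raw model output (a torch.Tensor at model resolution);
# for relative models, larger values correspond to closer objects.
predicted_depth = outputs["predicted_depth"]
print(predicted_depth.min().item(), predicted_depth.max().item())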

Using an example image, this is the output in the demo:

[attached image: depth map from the demo]

When I run the model through the Hugging Face pipeline, this is what I get:

[attached image: depth map from the Hugging Face pipeline]

Rendering it using:

import matplotlib
from PIL import Image

def render_depth(values, colormap_name="magma_r") -> Image.Image:
    # Normalize to [0, 1] so the colormap covers the full value range
    min_value, max_value = values.min(), values.max()
    normalized_values = (values - min_value) / (max_value - min_value)

    colormap = matplotlib.colormaps[colormap_name]
    colors = colormap(normalized_values, bytes=True)
    colors = colors[:, :, :3]  # Discard the alpha channel
    return Image.fromarray(colors)
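
For completeness, this is how the function is applied to the pipeline output (a usage sketch continuing from the snippet above; `outputs` is the pipeline result):

depth_values = outputs["predicted_depth"].squeeze().numpy()
rendered = render_depth(depth_values)
rendered.save("depth_render.png")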

[attached image: rendered depth map]

How do I transform predicted_depth so I can get the correct metric depth estimation in meters?

Hi,

Refer to my demo notebook.

Hi @nielsr, thank you for the reference to the notebook. Upon a bit more digging, this turns out to be a trivial colormap issue; I get the same results as the ones in your notebook. I had assumed the output was a depth map, but the model actually outputs disparity, where closer objects have higher values. I will close this issue for now. I have opened a separate issue about loading the metric depth estimation models to output metric depth, so if you have any pointers there, feel free to comment. Again, I really appreciate all the work you have been doing; it makes experimenting a lot faster.
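
For anyone landing here later, a minimal sketch of two ways to make the disparity-like output look like the demo, reusing `render_depth` and `outputs` from the snippets above. This only fixes the visualization; the values remain relative, and obtaining meters would require the dedicated metric checkpoints discussed in the separate issue:

import numpy as np

disparity = outputs["predicted_depth"].squeeze().numpy()

# Option 1: keep the disparity values but use the non-reversed colormap,
# so closer (larger) values render bright, matching the demo's look.
demo_like = render_depth(disparity, colormap_name="magma")

# Option 2: invert the values into a relative depth map (larger = farther).
# Note: this is depth only up to an unknown scale and shift, not meters.
relative_depth = 1.0 / np.maximum(disparity, 1e-6)
depth_like = render_depth(relative_depth, colormap_name="magma_r")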

crimsK changed discussion status to closed
