Batched inference?

#11
by snimu - opened

I would like to run this model on a large number of images. Currently, inference is very slow on my device, and I would like to speed it up by batching the images, but I don't see how to do that. Can this be done?
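In case it helps frame the question: the pattern I have in mind is something like the following sketch, where the model and the preprocessed image tensors are placeholders (the real model, input sizes, and preprocessing are assumptions here, not this repo's actual API) and inputs are stacked into batches before the forward pass.

```python
import torch
from torch import nn

# Placeholder model and inputs -- substitute the real model and its
# preprocessing; only the batching pattern itself is illustrated here.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
images = [torch.rand(3, 32, 32) for _ in range(100)]  # stand-ins for preprocessed images

batch_size = 16
model.eval()
outputs = []
with torch.no_grad():  # no gradients needed for inference
    for i in range(0, len(images), batch_size):
        batch = torch.stack(images[i:i + batch_size])  # shape (B, C, H, W)
        outputs.append(model(batch))
preds = torch.cat(outputs)  # one output row per input image
```

Is something along these lines supported, or does the model only accept one image at a time?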
