Batched Inference

#3 by grimpeaper23 - opened

Hi!

Rookie question: I was going through the code and noticed there is no way to perform batched inference, so I have been writing my own code. Did I miss something, or is there really no batch support available? If there isn't, do you have any pointers on how I could go about it? Currently I've been modifying the predict function to support batches.

Teklia org

Hello! There is currently no support for batch processing. We plan to add it in the next version.

In the meantime, the best way to process in batches is to modify the predict function. Please note that you need to pad your images before processing them with the model. To pad them, you can take inspiration from the pad_images_masks method used during training, available in the train/utils/__init__.py file.
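To illustrate the idea, here is a minimal sketch of what a batched predict could look like, assuming a PyTorch model that takes a (N, C, H, W) float tensor. The pad_batch helper below only mimics the spirit of pad_images_masks (pad every image to the largest height and width in the batch so they can be stacked); the names model, pad_batch and predict_batch are placeholders, not part of the released API.

```python
# Hypothetical batched-inference sketch, not the library's official API.
import numpy as np
import torch

def pad_batch(images, pad_value=0):
    """Pad a list of (H, W, C) images to a common size and stack them."""
    max_h = max(img.shape[0] for img in images)
    max_w = max(img.shape[1] for img in images)
    batch = np.full(
        (len(images), max_h, max_w, images[0].shape[2]),
        pad_value, dtype=images[0].dtype,
    )
    for i, img in enumerate(images):
        # Top-left aligned padding, as is common for segmentation batches.
        batch[i, :img.shape[0], :img.shape[1]] = img
    return batch

@torch.no_grad()
def predict_batch(model, images, device="cpu"):
    """Run one forward pass over a list of images of varying sizes."""
    batch = pad_batch(images)                              # (N, H, W, C)
    tensor = torch.from_numpy(batch).permute(0, 3, 1, 2).float() / 255.0
    probs = model(tensor.to(device))                       # (N, classes, H, W)
    # Crop each prediction back to its original, unpadded size.
    return [
        probs[i, :, :img.shape[0], :img.shape[1]].cpu()
        for i, img in enumerate(images)
    ]
```

Cropping each output back to the original image size avoids spurious detections on the padded region; you would also want to apply the same normalisation (mean/std) that the single-image predict function uses before the forward pass.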

When will the next version be out? Is there a specific date in mind?
