**Batch size** defines the size of the mini-batch used per GPU during each inference iteration.
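
Because the setting applies per GPU, the effective (global) batch size scales with the number of GPUs. A minimal sketch of that relationship (the function name and values are illustrative, not part of any specific API):

```python
def effective_batch_size(per_gpu_batch_size: int, num_gpus: int) -> int:
    """Total number of samples processed in one inference iteration
    when each GPU handles its own mini-batch of the configured size."""
    return per_gpu_batch_size * num_gpus

# e.g. a per-GPU batch size of 32 on 4 GPUs processes 128 samples per iteration
print(effective_batch_size(32, 4))  # 128
```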