Any example of batch inference?
#46
by PrintScr
Hey!
I am looking for a more efficient way of running this model over a larger dataset of inputs.
Is there a way to provide batched input to the model? Do you have any examples?
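For context, something like the following is what I have in mind: a plain-Python helper that chunks a list of inputs into fixed-size batches. This is just a generic sketch, not this repo's API; whether the model's forward call accepts a whole batch at once is exactly what I'm asking about.

```python
def batched(items, batch_size):
    """Yield successive fixed-size batches from a list of inputs."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

# Hypothetical usage, assuming `model` can take a list of inputs per call:
# results = [out for batch in batched(inputs, 32) for out in model(batch)]
```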
Thanks!