Batch inference scores differ from single-request inference scores
#5 · opened by hrdxwandg1987
With a lot of data, following the README example, single-request inference is too slow, so I want to use batch inference. However, the batch inference scores differ from the single-request inference scores.
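The repository and model aren't identified in the issue, but a common cause of this symptom is padding: when shorter inputs are padded to a shared batch length and the padded positions aren't masked out, pooled features (and therefore scores) shift. The sketch below is a generic NumPy illustration of that effect, not code from this project; all names and values are made up for the demo.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Single-request: one sequence of 3 token embeddings, mean-pooled.
seq = np.array([[1.0, 2.0], [0.5, 1.5], [2.0, 0.5]])
single_score = softmax(seq.mean(axis=0))

# Batched: the same sequence zero-padded to length 5 so it can share
# a batch with longer inputs. Pooling over ALL positions (padding
# included) shifts the pooled vector and thus the score.
padded = np.vstack([seq, np.zeros((2, 2))])
batch_score_unmasked = softmax(padded.mean(axis=0))

# With a mask that excludes the padded positions, the batched score
# matches the single-request score again.
mask = np.array([1.0, 1.0, 1.0, 0.0, 0.0])
batch_score_masked = softmax((padded * mask[:, None]).sum(axis=0) / mask.sum())

print(np.allclose(single_score, batch_score_unmasked))  # prints False
print(np.allclose(single_score, batch_score_masked))    # prints True
```

If the project's batch path behaves like the unmasked case, checking that attention/padding masks are passed through (and that pooling respects them) is a reasonable first debugging step.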