ONNXRuntimeError

#2
by jackyzhang - opened

Hi Guys,

I tried to run the code in Python 3.7.14. After loading the input batch and clicking the "Submit" button, it encounters an error. Here is the log. Thanks.

[i] Input is list
[i] Input size: 26
[i] Data is URL
[i] Extracting contents using trafilatura
[i] Batch size: 26
[i] Running ESG classifier inference...
Traceback (most recent call last):
File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python37_64\lib\site-packages\gradio\routes.py", line 298, in run_predict
iterators=iterators,
File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python37_64\lib\site-packages\gradio\blocks.py", line 1007, in process_api
result = await self.call_function(fn_index, inputs, iterator, request)
File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python37_64\lib\site-packages\gradio\blocks.py", line 849, in call_function
block_fn.fn, *processed_input, limiter=self.limiter
File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python37_64\lib\site-packages\anyio\to_thread.py", line 32, in run_sync
func, *args, cancellable=cancellable, limiter=limiter
File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python37_64\lib\site-packages\anyio_backends_asyncio.py", line 937, in run_sync_in_worker_thread
return await future
File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python37_64\lib\site-packages\anyio_backends_asyncio.py", line 867, in run
result = context.run(func, *args)
File "H:\python\ESG_BERT\ESG_API_BATCH\app.py", line 340, in inference
prob_outs = _inference_classifier(input_batch_content)
File "H:\python\ESG_BERT\ESG_API_BATCH\app.py", line 267, in _inference_classifier
ort_outs = ort_session.run(None, input_feed=dict(inputs))
File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python37_64\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 200, in run
return self._sess.run(output_names, input_feed, run_options)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Unexpected input data type. Actual: (tensor(int32)) , expected: (tensor(int64))

jackyzhang changed discussion status to closed
TFM ESG UV IA org

Hi jackyzhang, I understand you found a solution for this issue. Perhaps you could share it here in case someone else encounters the same issue? Thanks

Sure, the solution is as follows: in app.py, add the line

inputs = {k: v.astype(np.int64) for k, v in inputs.items()}

to the following function to convert the inputs to the data type the model expects.

def _inference_classifier(text):
    tokenizer = AutoTokenizer.from_pretrained(MODEL_TRANSFORMER_BASED)
    inputs = tokenizer(_lematise_text(text), return_tensors="np", padding="max_length", truncation=True)  # this assumes head-only!
    inputs = {k: v.astype(np.int64) for k, v in inputs.items()}
    ort_session = onnxruntime.InferenceSession(MODEL_ONNX_FNAME)
    onnx_model = onnx.load(MODEL_ONNX_FNAME)
    onnx.checker.check_model(onnx_model)
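For anyone hitting the same INVALID_ARGUMENT error elsewhere: the root cause is that the ONNX graph declares its inputs as tensor(int64), while the NumPy arrays in the feed are int32 (common on Windows, where the default integer width differs). A minimal, self-contained sketch of the cast, using a hypothetical stand-in dict in place of the real tokenizer output:

```python
import numpy as np

# Hypothetical stand-in for a tokenizer's return_tensors="np" output.
# BERT-style ONNX exports typically declare these inputs as int64,
# but on some platforms the arrays come back as int32.
inputs = {
    "input_ids": np.array([[101, 2023, 102]], dtype=np.int32),
    "attention_mask": np.array([[1, 1, 1]], dtype=np.int32),
}

# The one-line fix: cast every array in the feed to int64
# before passing it to ort_session.run(None, input_feed=inputs).
inputs = {k: v.astype(np.int64) for k, v in inputs.items()}

print(all(v.dtype == np.int64 for v in inputs.values()))  # True
```

The cast is cheap relative to inference and leaves the token values unchanged, so it is safe to apply unconditionally before every run.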

jackyzhang changed discussion status to open
jackyzhang changed discussion status to closed
