ValueError during SDXL ONNX inference
While following the instructions at https://huggingface.co/docs/diffusers/optimization/onnx to run the Stable Diffusion XL (SDXL) model with ONNX Runtime, the following error pops up during inference.
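A minimal sketch of the kind of code that triggers this, adapted from the docs snippet linked above (the checkpoint id and the export=True flag are my assumptions, not verbatim from my script):

```python
from optimum.onnxruntime import ORTStableDiffusionPipeline

# Following the docs example, but pointing it at an SDXL checkpoint (assumed id).
pipeline = ORTStableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    export=True,  # export to ONNX on the fly
)

# Inference fails here with the ValueError below.
image = pipeline("a photo of an astronaut riding a horse on mars").images[0]
```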
Error info:
ValueError: Required inputs (['text_embeds', 'time_ids']) are missing from input feed (['sample', 'timestep', 'encoder_hidden_states']).
Error log:
My pip list:
Package Version
accelerate 0.22.0
aiohttp 3.8.5
aiosignal 1.3.1
async-timeout 4.0.3
attrs 23.1.0
certifi 2023.7.22
charset-normalizer 3.2.0
cmake 3.27.2
coloredlogs 15.0.1
datasets 2.14.4
diffusers 0.20.2
dill 0.3.7
evaluate 0.4.0
filelock 3.12.3
flatbuffers 23.5.26
frozenlist 1.4.0
fsspec 2023.6.0
huggingface-hub 0.16.4
humanfriendly 10.0
idna 3.4
importlib-metadata 6.8.0
invisible-watermark 0.2.0
Jinja2 3.1.2
lit 16.0.6
MarkupSafe 2.1.3
mpmath 1.3.0
multidict 6.0.4
multiprocess 0.70.15
networkx 3.1
numpy 1.24.4
nvidia-cublas-cu11 11.10.3.66
nvidia-cuda-cupti-cu11 11.7.101
nvidia-cuda-nvrtc-cu11 11.7.99
nvidia-cuda-runtime-cu11 11.7.99
nvidia-cudnn-cu11 8.5.0.96
nvidia-cufft-cu11 10.9.0.58
nvidia-curand-cu11 10.2.10.91
nvidia-cusolver-cu11 11.4.0.1
nvidia-cusparse-cu11 11.7.4.91
nvidia-nccl-cu11 2.14.3
nvidia-nvtx-cu11 11.7.91
onnx 1.14.1
onnxruntime 1.15.1
opencv-python 4.8.0.76
optimum 1.12.0
packaging 23.1
pandas 2.0.3
Pillow 10.0.0
pip 23.2.1
protobuf 4.24.2
psutil 5.9.5
pyarrow 13.0.0
python-dateutil 2.8.2
pytz 2023.3
PyWavelets 1.4.1
PyYAML 6.0.1
regex 2023.8.8
requests 2.31.0
responses 0.18.0
safetensors 0.3.3
sentencepiece 0.1.99
setuptools 68.0.0
six 1.16.0
sympy 1.12
tokenizers 0.13.3
torch 2.0.1
tqdm 4.66.1
transformers 4.32.1
triton 2.0.0
typing_extensions 4.7.1
tzdata 2023.3
urllib3 2.0.4
wheel 0.38.4
xxhash 3.3.0
yarl 1.9.2
zipp 3.16.2
Fixed by changing

from optimum.onnxruntime import ORTStableDiffusionPipeline

to

from optimum.onnxruntime import ORTStableDiffusionXLPipeline

and loading the model with ORTStableDiffusionXLPipeline instead. The SDXL UNet exported to ONNX expects the extra text_embeds and time_ids inputs, which the plain Stable Diffusion pipeline never feeds, hence the ValueError.
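For anyone hitting the same thing, a minimal working sketch (the checkpoint id and export=True reflect my setup and are assumptions, adjust to yours):

```python
from optimum.onnxruntime import ORTStableDiffusionXLPipeline

# The XL pipeline computes and passes the additional SDXL UNet inputs
# (text_embeds, time_ids) that ORTStableDiffusionPipeline does not.
pipeline = ORTStableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",  # assumed SDXL checkpoint
    export=True,  # drop this if loading an already-exported ONNX model
)

image = pipeline("a photo of an astronaut riding a horse on mars").images[0]
image.save("astronaut.png")
```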