Exporting to ONNX

#1
by JamesXanda - opened

Hi, I was curious how you managed to get the ONNX files, including the decoder_model_merged one, for this. I have tried exporting this model two ways.

First, I used the ORTModelForSeq2SeqLM class from optimum and then called its save_pretrained method. This seemed to work swimmingly but only outputs:

  1. encoder_model.onnx
  2. decoder_model.onnx
  3. decoder_with_past_model.onnx

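For later readers: the missing merged file can apparently be produced from the two decoder exports with optimum's merge_decoders utility. A minimal sketch, assuming a recent optimum release (the exact import path and the save_path keyword are assumptions about optimum's API, so check your installed version):

```python
# Sketch: build decoder_model_merged.onnx from the two decoder exports
# that save_pretrained produced in model_dir.
from pathlib import Path

def merged_decoder_path(model_dir: str) -> Path:
    """Where the merged decoder would be written (pure path helper)."""
    return Path(model_dir) / "decoder_model_merged.onnx"

def merge_exported_decoders(model_dir: str) -> Path:
    # Lazy import so the helper above stays dependency-free; the
    # merge_decoders signature here is an assumption about optimum's API.
    from optimum.onnx import merge_decoders

    out = merged_decoder_path(model_dir)
    merge_decoders(
        Path(model_dir) / "decoder_model.onnx",
        Path(model_dir) / "decoder_with_past_model.onnx",
        save_path=out,
    )
    return out
```

Calling merge_exported_decoders("onnx_model") would then leave the three files from save_pretrained in place and add the merged decoder next to them.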
I then tried the optimum-cli, using:

optimum-cli export onnx --model google-t5/t5-small onnx_model

This throws the error:

Exception: An error occured during validation, but the model was saved nonetheless at onnx_model. Detailed error: [ONNXRuntimeError] : 1 : FAIL : Load model from onnx_model/decoder_model_merged.onnx failed:/Users/runner/work/1/s/onnxruntime/core/graph/model.cc:180 onnxruntime::Model::Model(ModelProto &&, const PathString &, const IOnnxRuntimeOpSchemaRegistryList *, const logging::Logger &, const ModelOptions &) Unsupported model IR version: 10, max supported IR version: 9

I recognise you are not part of the optimum team, but you somehow managed to make this work, and I was wondering how?

You should be able to get it working by downgrading to onnx<1.17.0.

That's odd, because my onnx version is 1.16.0.

According to https://github.com/microsoft/onnxruntime/issues/16638#issuecomment-2095029846, it looks like you actually need to downgrade further, to onnx<1.16.0. Sorry about that!
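The version logic behind this fix can be sketched as follows. Per the thread above, onnx 1.16.0 started writing models with IR version 10, while the onnxruntime build in use loads at most IR version 9, hence the "Unsupported model IR version: 10" error (the exact version-to-IR mapping below is an assumption drawn from this thread, not an authoritative table):

```python
# Why "onnx<1.17.0" was not enough: 1.16.0 already emits IR version 10.
def emitted_ir_version(onnx_version: str) -> int:
    """IR version an onnx release writes (assumption: >=1.16 -> 10, else 9)."""
    major, minor = (int(p) for p in onnx_version.split(".")[:2])
    return 10 if (major, minor) >= (1, 16) else 9

def loadable(onnx_version: str, max_supported_ir: int = 9) -> bool:
    """Can a runtime capped at max_supported_ir load this export?"""
    return emitted_ir_version(onnx_version) <= max_supported_ir
```

Under this model, loadable("1.16.0") is False while loadable("1.15.0") is True, which matches the advice to pin onnx<1.16.0.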

Ah, ok thanks
