You can export models to ONNX from two frameworks in 🤗 Optimum: PyTorch and TensorFlow. There is an export function for each framework, export_pytorch() and export_tensorflow(), but the recommended entry point is the main export() function, which takes care of dispatching to the proper framework-specific function based on the frameworks available on the system and on the model class.
export_pytorch()

( model: typing.Union[ForwardRef('PreTrainedModel'), ForwardRef('ModelMixin')] config: OnnxConfig opset: int output: Path device: str = 'cpu' input_shapes: typing.Optional[typing.Dict] = None ) → Tuple[List[str], List[str]]

Parameters

model (PreTrainedModel) — The model to export.
config (OnnxConfig) — The ONNX configuration associated with the exported model.
opset (int) — The version of the ONNX operator set to use.
output (Path) — Directory to store the exported ONNX model.
device (str, defaults to "cpu") — The device on which the ONNX model will be exported. Either cpu or cuda. Only PyTorch supports export on CUDA devices.
input_shapes (Optional[Dict], defaults to None) — If specified, allows using specific shapes for the example inputs provided to the ONNX exporter.

Returns

Tuple[List[str], List[str]]

A tuple with an ordered list of the model’s inputs, and the named inputs from the ONNX configuration.

Exports a PyTorch model to an ONNX Intermediate Representation.
export_tensorflow()

( model: TFPreTrainedModel config: OnnxConfig opset: int output: Path ) → Tuple[List[str], List[str]]

Parameters

model (TFPreTrainedModel) — The model to export.
config (OnnxConfig) — The ONNX configuration associated with the exported model.
opset (int) — The version of the ONNX operator set to use.
output (Path) — Directory to store the exported ONNX model.
device (str, optional, defaults to cpu) — The device on which the ONNX model will be exported. Either cpu or cuda. Only PyTorch supports export on CUDA devices.

Returns

Tuple[List[str], List[str]]

A tuple with an ordered list of the model’s inputs, and the named inputs from the ONNX configuration.

Exports a TensorFlow model to an ONNX Intermediate Representation.
export()

( model: typing.Union[ForwardRef('PreTrainedModel'), ForwardRef('TFPreTrainedModel'), ForwardRef('ModelMixin')] config: OnnxConfig output: Path opset: typing.Optional[int] = None device: str = 'cpu' input_shapes: typing.Optional[typing.Dict] = None ) → Tuple[List[str], List[str]]

Parameters

model (PreTrainedModel or TFPreTrainedModel) — The model to export.
config (OnnxConfig) — The ONNX configuration associated with the exported model.
output (Path) — Directory to store the exported ONNX model.
opset (Optional[int], defaults to None) — The version of the ONNX operator set to use.
device (str, optional, defaults to cpu) — The device on which the ONNX model will be exported. Either cpu or cuda. Only PyTorch supports export on CUDA devices.
input_shapes (Optional[Dict], defaults to None) — If specified, allows using specific shapes for the example inputs provided to the ONNX exporter.

Returns

Tuple[List[str], List[str]]

A tuple with an ordered list of the model’s inputs, and the named inputs from the ONNX configuration.

Exports a PyTorch or TensorFlow model to an ONNX Intermediate Representation.
check_dummy_inputs_are_allowed()

( model: typing.Union[ForwardRef('PreTrainedModel'), ForwardRef('TFPreTrainedModel'), ForwardRef('ModelMixin')] dummy_input_names: typing.Iterable[str] )

Checks that the dummy inputs from the ONNX config are a subset of the allowed inputs for model.
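Conceptually, this check compares the dummy input names against the parameters of the model's forward signature. A simplified sketch of that idea (not the actual Optimum implementation, which inspects the model object itself):

```python
import inspect


def dummy_inputs_are_allowed(model_forward, dummy_input_names):
    """Return True if every dummy input name appears in the callable's signature."""
    forward_parameters = set(inspect.signature(model_forward).parameters)
    return set(dummy_input_names) <= forward_parameters


# Hypothetical forward with the usual Transformers argument names.
def forward(input_ids, attention_mask=None, token_type_ids=None):
    pass


assert dummy_inputs_are_allowed(forward, ["input_ids", "attention_mask"])
assert not dummy_inputs_are_allowed(forward, ["pixel_values"])
```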
validate_model_outputs()

( config: OnnxConfig reference_model: typing.Union[ForwardRef('PreTrainedModel'), ForwardRef('TFPreTrainedModel'), ForwardRef('ModelMixin')] onnx_model: Path onnx_named_outputs: typing.List[str] atol: typing.Optional[float] = None input_shapes: typing.Optional[typing.Dict] = None device: str = 'cpu' )

Parameters

config (~OnnxConfig) — The configuration used to export the model.
reference_model (~PreTrainedModel or ~TFPreTrainedModel) — The model used for the export.
onnx_model (Path) — The path to the exported model.
onnx_named_outputs (List[str]) — The names of the outputs to check.
atol (Optional[float], defaults to None) — The absolute tolerance allowed on the difference between the outputs of the reference and the exported model.
input_shapes (Optional[Dict], defaults to None) — If specified, allows using specific shapes to validate the ONNX model on.
device (str, defaults to "cpu") — The device on which the ONNX model will be validated. Either cpu or cuda. Validation on a CUDA device is supported only for PyTorch.

Raises

ValueError — If the output shapes or values do not match between the reference and the exported model.

Validates the export by checking that the outputs from both the reference and the exported model match.
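The numerical core of this validation can be sketched with NumPy: each named output is compared element-wise within the absolute tolerance atol, and a ValueError is raised on a shape or value mismatch. This is a simplified illustration of the contract, not the actual implementation:

```python
import numpy as np


def check_outputs(reference_outputs, onnx_outputs, atol=1e-5):
    """Raise ValueError if any pair of outputs differs in shape or beyond atol."""
    for ref, onnx in zip(reference_outputs, onnx_outputs):
        if ref.shape != onnx.shape:
            raise ValueError(f"Shape mismatch: {ref.shape} vs {onnx.shape}")
        if not np.allclose(ref, onnx, atol=atol):
            max_diff = np.abs(ref - onnx).max()
            raise ValueError(f"Max absolute difference {max_diff} exceeds atol={atol}")


# Outputs within tolerance pass silently.
check_outputs([np.zeros((2, 3))], [np.full((2, 3), 1e-6)], atol=1e-5)
```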