Exporting a model from one framework to a given format (also called a backend here) requires specifying the input and output information that the export function needs. The `optimum.exporters` package is structured as follows for each backend:
The role of the `TasksManager` is to be the main entry point for loading a model given a name and a task, and for getting the proper configuration for a given (architecture, backend) pair. It is thus the centralized place where the `task -> model class` and `(architecture, backend) -> configuration` mappings are registered, which lets the export functions use it and rely on the various checks it provides.
The supported tasks can depend on the backend; the tables below give the mapping between task names and auto classes for both PyTorch and TensorFlow. To find out which tasks are supported for a model on a given backend:
```python
>>> from optimum.exporters.tasks import TasksManager
>>> model_type = "distilbert"
>>> # For instance, for the ONNX export.
>>> backend = "onnx"
>>> distilbert_tasks = list(TasksManager.get_supported_tasks_for_model_type(model_type, backend).keys())
>>> print(distilbert_tasks)
["default", "masked-lm", "causal-lm", "sequence-classification", "token-classification", "question-answering"]
```
PyTorch:

| Task | Auto Class |
|---|---|
| `causal-lm`, `causal-lm-with-past` | `AutoModelForCausalLM` |
| `default`, `default-with-past` | `AutoModel` |
| `masked-lm` | `AutoModelForMaskedLM` |
| `question-answering` | `AutoModelForQuestionAnswering` |
| `seq2seq-lm`, `seq2seq-lm-with-past` | `AutoModelForSeq2SeqLM` |
| `sequence-classification` | `AutoModelForSequenceClassification` |
| `token-classification` | `AutoModelForTokenClassification` |
| `multiple-choice` | `AutoModelForMultipleChoice` |
| `image-classification` | `AutoModelForImageClassification` |
| `object-detection` | `AutoModelForObjectDetection` |
| `image-segmentation` | `AutoModelForImageSegmentation` |
| `masked-im` | `AutoModelForMaskedImageModeling` |
| `semantic-segmentation` | `AutoModelForSemanticSegmentation` |
| `speech2seq-lm` | `AutoModelForSpeechSeq2Seq` |
TensorFlow:

| Task | Auto Class |
|---|---|
| `causal-lm`, `causal-lm-with-past` | `TFAutoModelForCausalLM` |
| `default`, `default-with-past` | `TFAutoModel` |
| `masked-lm` | `TFAutoModelForMaskedLM` |
| `question-answering` | `TFAutoModelForQuestionAnswering` |
| `seq2seq-lm`, `seq2seq-lm-with-past` | `TFAutoModelForSeq2SeqLM` |
| `sequence-classification` | `TFAutoModelForSequenceClassification` |
| `token-classification` | `TFAutoModelForTokenClassification` |
| `multiple-choice` | `TFAutoModelForMultipleChoice` |
| `semantic-segmentation` | `TFAutoModelForSemanticSegmentation` |
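The mapping above can be sketched as a simple lookup; the following is a minimal illustration, not the real registry (class names are stored as strings here rather than the actual `transformers` auto classes, and only a subset of tasks is included):

```python
# Illustrative sketch of the task -> auto class mapping (subset only).
_TASKS_TO_AUTOMODELS = {
    "default": "AutoModel",
    "causal-lm": "AutoModelForCausalLM",
    "masked-lm": "AutoModelForMaskedLM",
    "question-answering": "AutoModelForQuestionAnswering",
    "seq2seq-lm": "AutoModelForSeq2SeqLM",
    "sequence-classification": "AutoModelForSequenceClassification",
    "token-classification": "AutoModelForTokenClassification",
}

def auto_class_for_task(task: str, framework: str = "pt") -> str:
    # "-with-past" variants share the auto class of their base task.
    base_task = task.replace("-with-past", "")
    name = _TASKS_TO_AUTOMODELS[base_task]
    # TensorFlow auto classes carry a "TF" prefix.
    return name if framework == "pt" else "TF" + name

print(auto_class_for_task("causal-lm-with-past"))        # AutoModelForCausalLM
print(auto_class_for_task("masked-lm", framework="tf"))  # TFAutoModelForMaskedLM
```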
The `TasksManager` class handles the `task name -> model class` and `architecture -> configuration` mappings.
`determine_framework(model: str, framework: str = None) → str`

Determines the framework to use for the export.

The priority is in the following order:

1. User input via `framework`.
2. If a local checkpoint is provided, the framework of the saved checkpoint weights.
3. The framework available in the environment, with priority given to PyTorch.
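This priority order can be sketched as a pure function; the following is an illustrative stand-in under the assumptions stated in the comments, not the real implementation (the weight-file names and availability flags are simplifications):

```python
import os

# Illustrative sketch of the priority order: explicit user input first,
# then a local checkpoint's weight files, then whichever framework is
# available in the environment (PyTorch preferred). Not the real code.
def determine_framework_sketch(
    model: str,
    framework: str = None,
    torch_available: bool = True,
    tf_available: bool = True,
) -> str:
    if framework is not None:  # 1. explicit user input wins
        return framework
    if os.path.isdir(model):   # 2. local checkpoint: inspect the saved weights
        files = os.listdir(model)
        if "pytorch_model.bin" in files:
            return "pt"
        if "tf_model.h5" in files:
            return "tf"
    # 3. environment availability, with priority given to PyTorch
    if torch_available:
        return "pt"
    if tf_available:
        return "tf"
    raise EnvironmentError("Neither PyTorch nor TensorFlow is available.")

print(determine_framework_sketch("distilbert-base-uncased", framework="tf"))  # tf
```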
`get_exporter_config_constructor(model_type: str, exporter: str, task: str = 'default', model_name: typing.Optional[str] = None) → ExportConfigConstructor`

Gets the `ExportConfigConstructor` for a model type and task combination.
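Conceptually, this is a nested lookup keyed by model type, exporter, and task. The following sketch illustrates the idea with a hypothetical `SketchExportConfig` class standing in for a real export config; the registry contents are invented for illustration:

```python
from functools import partial

# Hypothetical config class standing in for a real export config.
class SketchExportConfig:
    def __init__(self, model_type: str, task: str):
        self.model_type = model_type
        self.task = task

# Sketch of a (model_type, exporter) -> {task: constructor} registry.
_SUPPORTED = {
    "distilbert": {
        "onnx": {
            "default": partial(SketchExportConfig, "distilbert", "default"),
            "sequence-classification": partial(
                SketchExportConfig, "distilbert", "sequence-classification"
            ),
        }
    }
}

def get_config_constructor(model_type: str, exporter: str, task: str = "default"):
    try:
        return _SUPPORTED[model_type][exporter][task]
    except KeyError:
        raise KeyError(f"{task} is not supported for {model_type} on {exporter}")

constructor = get_config_constructor("distilbert", "onnx", "sequence-classification")
config = constructor()
print(config.task)  # sequence-classification
```

Storing `partial` constructors rather than instances lets the caller decide when (and with what extra arguments) to build the config.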
`get_model_class_for_task(task: str, framework: str = 'pt')`

Attempts to retrieve an AutoModel class from a task name.
`get_model_from_task(task: str, model: str, framework: str = None, cache_dir: str = None)`

Retrieves a model from its name and the task to be enabled.

Parameters:

- **task** (`str`) — The task required.
- **model** (`str`) — The name of the model to export.
- **framework** (`str`, *optional*) — The framework to use for the export. See `TasksManager.determine_framework` for the priority order should none be provided.
- **cache_dir** (`str`, *optional*) — Path to a directory in which the downloaded pretrained model weights have been cached if the standard cache should not be used.
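In essence, this method resolves the auto class for the task and then loads the model through it. The sketch below illustrates that flow with stand-in loader classes (the real method calls `.from_pretrained` on a `transformers` auto class; the `Sketch*` names are hypothetical):

```python
# Stand-in loader classes; the real method uses transformers auto classes.
class SketchAutoModel:
    @classmethod
    def from_pretrained(cls, name: str, cache_dir: str = None):
        return f"{cls.__name__}({name})"

class SketchAutoModelForMaskedLM(SketchAutoModel):
    pass

_TASK_TO_CLASS = {
    "default": SketchAutoModel,
    "masked-lm": SketchAutoModelForMaskedLM,
}

def get_model_from_task_sketch(task: str, model: str, cache_dir: str = None):
    # Resolve the class for the task, then delegate loading to it.
    model_class = _TASK_TO_CLASS[task]
    return model_class.from_pretrained(model, cache_dir=cache_dir)

print(get_model_from_task_sketch("masked-lm", "distilbert-base-uncased"))
# SketchAutoModelForMaskedLM(distilbert-base-uncased)
```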
`get_supported_tasks_for_model_type(model_type: str, exporter: str, model_name: typing.Optional[str] = None) → TaskNameToExportConfigDict`

Retrieves the `task -> exporter backend config constructors` map from the model type.

Parameters:

- **model_type** (`str`) — The model type to retrieve the supported tasks for.
- **exporter** (`str`) — The name of the exporter.
- **model_name** (`str`, *optional*) — The name attribute of the model object, only used for the exception message.

Returns: `TaskNameToExportConfigDict` — The dictionary mapping each task to a corresponding `ExportConfig` constructor.
`infer_task_from_model(model_name_or_path: str) → str`

Infers the task from the model repo.
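One plausible way such inference can work is to inspect the architecture name recorded in a model's config; the sketch below assumes a suffix-to-task table and is purely illustrative (the real method may also rely on Hub metadata rather than architecture names alone):

```python
# Assumed suffix -> task table; illustrative only.
_SUFFIX_TO_TASK = {
    "ForCausalLM": "causal-lm",
    "ForMaskedLM": "masked-lm",
    "ForQuestionAnswering": "question-answering",
    "ForSequenceClassification": "sequence-classification",
    "ForTokenClassification": "token-classification",
    "ForConditionalGeneration": "seq2seq-lm",
}

def infer_task_from_architecture(architecture: str) -> str:
    # Match the architecture name against known head suffixes.
    for suffix, task in _SUFFIX_TO_TASK.items():
        if architecture.endswith(suffix):
            return task
    return "default"  # plain backbone architectures fall back to "default"

print(infer_task_from_architecture("DistilBertForSequenceClassification"))
# sequence-classification
```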