Normalized Configurations

Model configuration classes in 🤗 Transformers are not standardized. Although Transformers implements an attribute_map attribute that mitigates the issue to some extent, it does not make it easy to reason about common configuration attributes in code. NormalizedConfig classes address this by exposing the attributes of the configuration they wrap in a standardized way.
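
For instance, the number of layers is exposed under a different name depending on the model (a minimal illustration using the bert-base-uncased and gpt2 checkpoints):

```python
from transformers import AutoConfig

bert_config = AutoConfig.from_pretrained("bert-base-uncased")
gpt2_config = AutoConfig.from_pretrained("gpt2")

# The same concept goes by different names depending on the model type:
print(bert_config.num_hidden_layers)  # 12
print(gpt2_config.n_layer)            # 12
```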

Base class

While it is possible to create NormalizedConfig subclasses for common use cases, the mapping between original and normalized attribute names can also be overridden directly with the with_args() class method.
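
For example, a GPT-2-style model names these attributes n_layer, n_head and n_embd; a minimal sketch (using NormalizedTextConfig, one of the ready-made subclasses listed below) remaps them onto the normalized names:

```python
from optimum.utils import NormalizedTextConfig

# Sketch: map GPT-2's attribute names onto the normalized ones. with_args is
# expected to return a pre-configured factory that is later called with the
# actual config instance.
GPT2NormalizedConfig = NormalizedTextConfig.with_args(
    num_layers="n_layer",
    num_attention_heads="n_head",
    hidden_size="n_embd",
)
```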

class optimum.utils.NormalizedConfig

( config: typing.Union[transformers.configuration_utils.PretrainedConfig, typing.Dict], allow_new: bool = False, **kwargs )

Parameters

  • config (PretrainedConfig or Dict) — The config to normalize.

Handles the normalization of PretrainedConfig attribute names, allowing attributes to be accessed in a standardized way.
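
Based on the signature above, extra keyword arguments appear to override the attribute mapping at instantiation time, with allow_new controlling whether entries that are not predefined are accepted. A minimal sketch under that assumption:

```python
from transformers import AutoConfig
from optimum.utils import NormalizedConfig

config = AutoConfig.from_pretrained("gpt2")

# Assumption: keyword arguments remap a normalized name to the name used by
# this particular config, and allow_new=True accepts mapping entries that the
# base class does not predefine.
normalized_config = NormalizedConfig(config, allow_new=True, num_layers="n_layer")

print(normalized_config.num_layers)  # expected to resolve to config.n_layer (12 for gpt2)
```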

Existing normalized configurations

class optimum.utils.NormalizedTextConfig

( config: typing.Union[transformers.configuration_utils.PretrainedConfig, typing.Dict], allow_new: bool = False, **kwargs )

class optimum.utils.NormalizedSeq2SeqConfig

( config: typing.Union[transformers.configuration_utils.PretrainedConfig, typing.Dict], allow_new: bool = False, **kwargs )

class optimum.utils.NormalizedVisionConfig

( config: typing.Union[transformers.configuration_utils.PretrainedConfig, typing.Dict], allow_new: bool = False, **kwargs )

class optimum.utils.NormalizedTextAndVisionConfig

( config: typing.Union[transformers.configuration_utils.PretrainedConfig, typing.Dict], allow_new: bool = False, **kwargs )
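
As a usage sketch, NormalizedTextConfig can wrap a text model configuration so that common attributes are read through the normalized names (assuming its default mapping follows the standard Transformers names such as num_hidden_layers and hidden_size):

```python
from transformers import AutoConfig
from optimum.utils import NormalizedTextConfig

config = AutoConfig.from_pretrained("bert-base-uncased")
normalized_config = NormalizedTextConfig(config)

print(normalized_config.num_layers)           # 12, read from config.num_hidden_layers
print(normalized_config.num_attention_heads)  # 12
print(normalized_config.hidden_size)          # 768
```

The other classes above follow the same pattern for their respective modalities.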