Normalized Configurations
Model configuration classes in 🤗 Transformers are not standardized. Although Transformers implements an attribute_map
attribute that mitigates the issue to some extent, it still does not make it easy to reason about common configuration attributes in code.
NormalizedConfig classes address this by exposing the configuration
attributes they wrap in a standardized way.
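The core idea can be illustrated with a minimal, self-contained sketch. Note that SimpleNormalizedConfig, its NORMALIZED_TO_ORIGINAL mapping, and the GPT-2-style attribute names below are hypothetical illustrations, not Optimum's actual implementation:

```python
# Minimal sketch of attribute normalization: a wrapper that translates
# standardized attribute names into whatever names the wrapped config uses.
# SimpleNormalizedConfig and its mapping are hypothetical, for illustration.


class SimpleNormalizedConfig:
    # normalized name -> name actually used by the wrapped config
    NORMALIZED_TO_ORIGINAL = {
        "num_attention_heads": "n_head",
        "num_layers": "n_layer",
        "hidden_size": "n_embd",
    }

    def __init__(self, config):
        self.config = config

    def __getattr__(self, name):
        # Fall back to the original name when no normalization is registered.
        original_name = self.NORMALIZED_TO_ORIGINAL.get(name, name)
        return getattr(self.config, original_name)


# A GPT-2-style config stores n_head / n_layer / n_embd instead of the
# more common num_attention_heads / num_hidden_layers / hidden_size.
class Gpt2StyleConfig:
    n_head = 12
    n_layer = 12
    n_embd = 768


normalized = SimpleNormalizedConfig(Gpt2StyleConfig())
print(normalized.num_attention_heads)  # 12
print(normalized.hidden_size)          # 768
```

Code written against the normalized names then works unchanged regardless of how the underlying model config spells its attributes.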
Base class
While it is possible to create NormalizedConfig subclasses for common use-cases, it is also possible to override the original attribute name -> normalized attribute name mapping directly, using the with_args() class method.
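The with_args() pattern can be sketched as a class method that builds a subclass carrying an updated mapping. This is a simplified, hypothetical reimplementation to show the mechanism, not Optimum's code:

```python
# Hypothetical sketch of a with_args()-style class method: it returns a new
# subclass whose normalized-name -> original-name mapping is overridden by
# the given keyword arguments. Illustrative only, not Optimum's code.


class SketchNormalizedConfig:
    NORMALIZED_TO_ORIGINAL = {}

    def __init__(self, config):
        self.config = config

    def __getattr__(self, name):
        return getattr(self.config, self.NORMALIZED_TO_ORIGINAL.get(name, name))

    @classmethod
    def with_args(cls, **kwargs):
        # Build a subclass on the fly with the merged mapping.
        mapping = {**cls.NORMALIZED_TO_ORIGINAL, **kwargs}
        return type(cls.__name__, (cls,), {"NORMALIZED_TO_ORIGINAL": mapping})


# Map the normalized num_layers / num_attention_heads names onto a config
# that stores them as n_layer / n_head (GPT-2-style naming).
GPT2NormalizedConfig = SketchNormalizedConfig.with_args(
    num_layers="n_layer", num_attention_heads="n_head"
)


class FakeGpt2Config:
    n_layer = 24
    n_head = 16


gpt2_normalized = GPT2NormalizedConfig(FakeGpt2Config())
print(gpt2_normalized.num_layers)  # 24
```

Because with_args() returns a class rather than an instance, the customized mapping can be declared once and reused for every config of that model type.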
class optimum.utils.NormalizedConfig
( config: typing.Union[transformers.configuration_utils.PretrainedConfig, typing.Dict], allow_new: bool = False, **kwargs )
Handles the normalization of PretrainedConfig
attribute names, allowing attributes to be accessed in a generic way.
Existing normalized configurations
class optimum.utils.NormalizedTextConfig
( config: typing.Union[transformers.configuration_utils.PretrainedConfig, typing.Dict], allow_new: bool = False, **kwargs )
class optimum.utils.NormalizedSeq2SeqConfig
( config: typing.Union[transformers.configuration_utils.PretrainedConfig, typing.Dict], allow_new: bool = False, **kwargs )
class optimum.utils.NormalizedVisionConfig
( config: typing.Union[transformers.configuration_utils.PretrainedConfig, typing.Dict], allow_new: bool = False, **kwargs )
class optimum.utils.NormalizedTextAndVisionConfig
( config: typing.Union[transformers.configuration_utils.PretrainedConfig, typing.Dict], allow_new: bool = False, **kwargs )