The base class PretrainedConfig implements the common methods for loading/saving a configuration either from a local file or directory, or from a pretrained model configuration provided by the library (downloaded from HuggingFace’s AWS S3 repository).


class transformers.PretrainedConfig(**kwargs)[source]

Base class for all configuration classes. Handles a few parameters common to all models’ configurations as well as methods for loading/downloading/saving configurations.


A configuration file can be loaded and saved to disk. Loading the configuration file and using this file to initialize a model does not load the model weights. It only affects the model’s configuration.

Class attributes (overridden by derived classes):
  • pretrained_config_archive_map: a Python dict with shortcut names (string) as keys and URLs (string) of the associated pretrained model configurations as values.

  • finetuning_task – string, default None. Name of the task used to fine-tune the model. This can be used when converting from an original (TensorFlow or PyTorch) checkpoint.

  • num_labels – integer, default 2. Number of classes to use when the model is a classification model (sequences/tokens).

  • output_attentions – boolean, default False. Whether the model should return attention weights.

  • output_hidden_states – boolean, default False. Whether the model should return all hidden states.

  • torchscript – boolean, default False. Whether the model is to be used with TorchScript.

classmethod from_dict(json_object)[source]

Constructs a PretrainedConfig from a Python dictionary of parameters.

classmethod from_json_file(json_file)[source]

Constructs a PretrainedConfig from a JSON file of parameters.
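What these two constructors do can be sketched with a toy stand-in class (illustrative only, not the library's actual implementation): from_dict turns each key/value pair into a config attribute, and from_json_file simply parses the file into a dict first and delegates to from_dict.

```python
import json
import tempfile

class SketchConfig:
    """Toy stand-in for PretrainedConfig (illustrative, not the real class)."""
    def __init__(self, **kwargs):
        for key, value in kwargs.items():
            setattr(self, key, value)

    @classmethod
    def from_dict(cls, json_object):
        # Each dictionary entry becomes a configuration attribute.
        return cls(**json_object)

    @classmethod
    def from_json_file(cls, json_file):
        # Parse the JSON file into a dict, then reuse from_dict.
        with open(json_file, 'r', encoding='utf-8') as reader:
            return cls.from_dict(json.loads(reader.read()))

config = SketchConfig.from_dict({'num_labels': 2, 'output_attentions': False})
print(config.num_labels)  # 2
```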

classmethod from_pretrained(pretrained_model_name_or_path, **kwargs)[source]

Instantiate a PretrainedConfig (or a derived class) from a pre-trained model configuration.

  • pretrained_model_name_or_path


    • a string with the shortcut name of a pre-trained model configuration to load from cache or download, e.g.: bert-base-uncased.

    • a path to a directory containing a configuration file saved using the save_pretrained() method, e.g.: ./my_model_directory/.

    • a path or url to a saved configuration JSON file, e.g.: ./my_model_directory/configuration.json.

  • cache_dir – (optional) string: Path to a directory in which a downloaded pre-trained model configuration should be cached if the standard cache should not be used.

  • kwargs

    (optional) dict: key/value pairs with which to update the configuration object after loading.

    • The values in kwargs of any keys which are configuration attributes will be used to override the loaded values.

    • Behavior concerning key/value pairs whose keys are not configuration attributes is controlled by the return_unused_kwargs keyword parameter.

  • force_download – (optional) boolean, default False: Force (re-)downloading the model weights and configuration files, overriding the cached versions if they exist.

  • proxies – (optional) dict, default None: A dictionary of proxy servers to use, by protocol or endpoint, e.g.: {'http': '', 'http://hostname': ''}. The proxies are used on each request.

  • return_unused_kwargs

    (optional) bool:

    • If False, then this function returns just the final configuration object.

    • If True, then this function returns a tuple (config, unused_kwargs) where unused_kwargs is a dictionary consisting of the key/value pairs whose keys are not configuration attributes: i.e., the part of kwargs which has not been used to update config and is otherwise ignored.


# We can't instantiate the base class `PretrainedConfig` directly, so let's show the examples on a
# derived class: BertConfig
config = BertConfig.from_pretrained('bert-base-uncased')    # Download configuration from S3 and cache.
config = BertConfig.from_pretrained('./test/saved_model/')  # E.g. config (or model) was saved using `save_pretrained('./test/saved_model/')`
config = BertConfig.from_pretrained('./test/saved_model/my_configuration.json')
config = BertConfig.from_pretrained('bert-base-uncased', output_attentions=True, foo=False)
assert config.output_attentions == True
config, unused_kwargs = BertConfig.from_pretrained('bert-base-uncased', output_attentions=True,
                                                   foo=False, return_unused_kwargs=True)
assert config.output_attentions == True
assert unused_kwargs == {'foo': False}

save_pretrained(save_directory)[source]

Save a configuration object to the directory save_directory, so that it can be re-loaded using the from_pretrained() class method.
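The save/load round trip can be sketched with a toy config class (illustrative only; the real method writes a JSON configuration file into save_directory, which from_pretrained then reads back):

```python
import json
import os
import tempfile

class SketchConfig:
    """Toy stand-in for PretrainedConfig (illustrative only)."""
    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)

    def save_pretrained(self, save_directory):
        # Write the configuration as JSON under a fixed file name
        # ('config.json' here is an assumption for the sketch).
        os.makedirs(save_directory, exist_ok=True)
        with open(os.path.join(save_directory, 'config.json'), 'w') as writer:
            writer.write(json.dumps(self.__dict__, indent=2))

    @classmethod
    def from_pretrained(cls, save_directory):
        # Read the JSON file back and rebuild the config from it.
        with open(os.path.join(save_directory, 'config.json')) as reader:
            return cls(**json.load(reader))

with tempfile.TemporaryDirectory() as tmp:
    SketchConfig(num_labels=2, hidden_size=768).save_pretrained(tmp)
    reloaded = SketchConfig.from_pretrained(tmp)
    print(reloaded.hidden_size)  # 768
```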


to_dict()[source]

Serializes this instance to a Python dictionary.


to_json_file(json_file_path)[source]

Save this instance to a JSON file.


to_json_string()[source]

Serializes this instance to a JSON string.
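These serialization methods build on one another; a rough sketch of the relationship (not the library's exact code): the JSON string is the dictionary pretty-printed, and the file method just writes that string out. Returning a copy from to_dict keeps callers from mutating the config through the returned dictionary.

```python
import copy
import json

class SketchConfig:
    """Toy stand-in showing how the serialization methods relate."""
    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)

    def to_dict(self):
        # Return a copy so callers cannot mutate the config through it.
        return copy.deepcopy(self.__dict__)

    def to_json_string(self):
        # The JSON string is just the dict, pretty-printed with sorted keys.
        return json.dumps(self.to_dict(), indent=2, sort_keys=True) + "\n"

    def to_json_file(self, json_file_path):
        # Writing the file reuses the string form.
        with open(json_file_path, 'w', encoding='utf-8') as writer:
            writer.write(self.to_json_string())

config = SketchConfig(num_labels=2)
print(config.to_json_string())
```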