Diffusers documentation

Configuration

Configuration handling in Diffusers is done with the ConfigMixin class, which schedulers and models inherit from so that the parameters passed to their __init__ methods are stored and can be serialized to JSON.
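
As a minimal sketch (the exact keys depend on the scheduler class and library version), every argument passed to a scheduler's __init__ ends up in a frozen config object:

>>> from diffusers import DDPMScheduler

>>> scheduler = DDPMScheduler(num_train_timesteps=1000, beta_schedule="linear")

>>> # All __init__ arguments are recorded in the frozen config and can be read as attributes
>>> scheduler.config.num_train_timesteps
1000
>>> scheduler.config.beta_schedule
'linear'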

class diffusers.ConfigMixin

( )

Base class for all configuration classes. Stores all configuration parameters under self.config. Also handles the methods for loading, downloading, and saving classes that inherit from ConfigMixin, such as from_config() and save_config().

Class attributes:

  • config_name (str) — A filename under which the config should be stored when calling save_config() (should be overridden by the subclass, as in the sketch after this list).
  • ignore_for_config (List[str]) — A list of attributes that should not be saved in the config (should be overridden by subclass).
  • has_compatibles (bool) — Whether the class has compatible classes (should be overridden by subclass).
  • _deprecated_kwargs (List[str]) — Keyword arguments that are deprecated. Note that the __init__ method should only accept a **kwargs argument if at least one keyword argument is deprecated (should be overridden by subclass).
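
A hedged sketch of how a custom class might set these attributes; the class name, config_name value, and constructor arguments below are purely illustrative:

>>> from diffusers.configuration_utils import ConfigMixin, register_to_config

>>> class MyScheduler(ConfigMixin):
...     config_name = "my_scheduler_config.json"  # filename used by save_config()
...
...     @register_to_config  # records the __init__ arguments under self.config
...     def __init__(self, num_train_timesteps: int = 1000, beta_start: float = 0.0001):
...         pass

>>> MyScheduler(num_train_timesteps=500).config.num_train_timesteps
500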

from_config

( config: typing.Union[diffusers.configuration_utils.FrozenDict, typing.Dict[str, typing.Any]] = None, return_unused_kwargs = False, **kwargs )

Parameters

  • config (Dict[str, Any]) — A config dictionary from which the Python class will be instantiated. Make sure to only load configuration files of compatible classes.
  • return_unused_kwargs (bool, optional, defaults to False) — Whether kwargs that are not consumed by the Python class should be returned or not.
  • kwargs (remaining dictionary of keyword arguments, optional) — Can be used to update the configuration object (after it is loaded) and to initialize the Python class. **kwargs are passed directly to the underlying scheduler/model’s __init__ method and eventually overwrite the same-named arguments in config.

Instantiate a Python class from a config dictionary.

Examples:

>>> from diffusers import DDPMScheduler, DDIMScheduler, PNDMScheduler

>>> # Download scheduler from huggingface.co and cache.
>>> scheduler = DDPMScheduler.from_pretrained("google/ddpm-cifar10-32")

>>> # Instantiate DDIM scheduler class with same config as DDPM
>>> scheduler = DDIMScheduler.from_config(scheduler.config)

>>> # Instantiate PNDM scheduler class with same config as DDPM
>>> scheduler = PNDMScheduler.from_config(scheduler.config)

load_config

( pretrained_model_name_or_path: typing.Union[str, os.PathLike], return_unused_kwargs = False, **kwargs )

Parameters

  • pretrained_model_name_or_path (str or os.PathLike, optional) — Can be either:

    • A string, the model id of a model repo on huggingface.co. Valid model ids should have an organization name, like google/ddpm-celebahq-256.
    • A path to a directory containing model weights saved using save_config(), e.g., ./my_model_directory/.
  • cache_dir (Union[str, os.PathLike], optional) — Path to a directory in which a downloaded pretrained model configuration should be cached if the standard cache should not be used.
  • force_download (bool, optional, defaults to False) — Whether or not to force the (re-)download of the model weights and configuration files, overriding the cached versions if they exist.
  • resume_download (bool, optional, defaults to False) — Whether or not to resume downloading incompletely received files instead of deleting them and starting over.
  • proxies (Dict[str, str], optional) — A dictionary of proxy servers to use by protocol or endpoint, e.g., {'http': 'foo.bar:3128', 'http://hostname': 'foo.bar:4012'}. The proxies are used on each request.
  • output_loading_info (bool, optional, defaults to False) — Whether or not to also return a dictionary containing missing keys, unexpected keys and error messages.
  • local_files_only (bool, optional, defaults to False) — Whether or not to only look at local files (i.e., do not try to download the model).
  • use_auth_token (str or bool, optional) — The token to use as HTTP bearer authorization for remote files. If True, will use the token generated when running huggingface-cli login (stored in ~/.huggingface).
  • revision (str, optional, defaults to "main") — The specific model version to use. It can be a branch name, a tag name, or a commit id, since we use a git-based system for storing models and other artifacts on huggingface.co, so revision can be any identifier allowed by git.
  • subfolder (str, optional, defaults to "") — In case the relevant files are located inside a subfolder of the model repo (either remote in huggingface.co or downloaded locally), you can specify the folder name here.

Load a class's configuration (a dictionary) from a model repository on huggingface.co or from a local directory.

You need to be logged in (huggingface-cli login) to use private or gated models.

Activate the special “offline-mode” to use this method in a firewalled environment.
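
A brief, hedged sketch of fetching a raw configuration dictionary from the Hub; the repository id is the same one used in the from_config() example above, and the available keys depend on the library version:

>>> from diffusers import DDPMScheduler

>>> # Download scheduler_config.json from the repo and return it as a plain dict
>>> config = DDPMScheduler.load_config("google/ddpm-cifar10-32")

>>> # The dictionary can then be passed to from_config() to instantiate a scheduler
>>> scheduler = DDPMScheduler.from_config(config)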

save_config

( save_directory: typing.Union[str, os.PathLike], push_to_hub: bool = False, **kwargs )

Parameters

  • save_directory (str or os.PathLike) — Directory where the configuration JSON file will be saved (will be created if it does not exist).

Save a configuration object to the directory save_directory, so that it can be re-loaded using the from_config() class method.
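
A minimal sketch of a save/re-load round trip; the directory path is illustrative:

>>> from diffusers import DDPMScheduler

>>> scheduler = DDPMScheduler()

>>> # Writes the config JSON (scheduler_config.json for schedulers) into the directory
>>> scheduler.save_config("./my_scheduler")

>>> # Re-load the saved configuration and instantiate a new scheduler from it
>>> config = DDPMScheduler.load_config("./my_scheduler")
>>> new_scheduler = DDPMScheduler.from_config(config)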

to_json_file

( json_file_path: typing.Union[str, os.PathLike] )

Parameters

  • json_file_path (str or os.PathLike) — Path to the JSON file in which this configuration instance’s parameters will be saved.

Save this instance to a JSON file.
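
A hedged sketch; the output path is illustrative:

>>> from diffusers import DDPMScheduler

>>> scheduler = DDPMScheduler()
>>> # Write this instance's configuration parameters to an explicit JSON file path
>>> scheduler.to_json_file("./ddpm_scheduler_config.json")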

to_json_string

( ) → str

Returns

str

String containing all the attributes that make up this configuration instance in JSON format.

Serializes this instance to a JSON string.
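
A hedged sketch; the exact keys and values in the string depend on the class and library version:

>>> from diffusers import DDPMScheduler

>>> scheduler = DDPMScheduler()
>>> json_string = scheduler.to_json_string()
>>> # json_string contains the same parameters as scheduler.config, serialized as JSON
>>> isinstance(json_string, str)
True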

Under further construction 🚧, open a PR if you want to contribute!