Normalization layers
Customized normalization layers for supporting various models in 🤗 Diffusers.
AdaLayerNorm
class diffusers.models.normalization.AdaLayerNorm
( embedding_dim: int, num_embeddings: int )
Norm layer modified to incorporate timestep embeddings.
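The idea can be sketched as a layer norm whose scale and shift are predicted from a timestep embedding instead of being learned as fixed parameters. Below is a minimal PyTorch sketch of the technique, not the library's exact implementation; the class name and tensor shapes are illustrative assumptions:

```python
import torch
import torch.nn as nn

class AdaLayerNormSketch(nn.Module):
    """Sketch: LayerNorm whose scale/shift come from a timestep embedding."""
    def __init__(self, embedding_dim: int, num_embeddings: int):
        super().__init__()
        self.emb = nn.Embedding(num_embeddings, embedding_dim)
        self.silu = nn.SiLU()
        # Predict one scale and one shift per channel from the timestep embedding.
        self.linear = nn.Linear(embedding_dim, embedding_dim * 2)
        # elementwise_affine=False: the affine parameters come from the timestep.
        self.norm = nn.LayerNorm(embedding_dim, elementwise_affine=False)

    def forward(self, x: torch.Tensor, timestep: torch.Tensor) -> torch.Tensor:
        emb = self.linear(self.silu(self.emb(timestep)))
        scale, shift = torch.chunk(emb, 2, dim=-1)
        # Broadcast (batch, channels) modulation over the sequence dimension.
        return self.norm(x) * (1 + scale[:, None]) + shift[:, None]

x = torch.randn(2, 8, 64)        # (batch, seq, channels)
t = torch.tensor([0, 5])         # one timestep index per batch element
out = AdaLayerNormSketch(64, 10)(x, t)
```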
AdaLayerNormZero
class diffusers.models.normalization.AdaLayerNormZero
( embedding_dim: int, num_embeddings: int )
Norm layer implementing adaptive layer norm zero (adaLN-Zero).
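adaLN-Zero extends the adaptive layer norm by also predicting gate values, with the projection zero-initialized so each residual branch starts as the identity. A hedged PyTorch sketch of the technique (illustrative names and shapes, not the library's exact module):

```python
import torch
import torch.nn as nn

class AdaLayerNormZeroSketch(nn.Module):
    """Sketch of adaLN-Zero: shift/scale/gate for the attention and MLP branches,
    zero-initialized so modulation starts as a no-op."""
    def __init__(self, embedding_dim: int, num_embeddings: int):
        super().__init__()
        self.emb = nn.Embedding(num_embeddings, embedding_dim)
        self.silu = nn.SiLU()
        # Six modulation tensors: shift/scale/gate for MSA and for the MLP.
        self.linear = nn.Linear(embedding_dim, 6 * embedding_dim)
        nn.init.zeros_(self.linear.weight)  # the "zero" in adaLN-Zero
        nn.init.zeros_(self.linear.bias)
        self.norm = nn.LayerNorm(embedding_dim, elementwise_affine=False)

    def forward(self, x, timestep):
        emb = self.linear(self.silu(self.emb(timestep)))
        shift_msa, scale_msa, gate_msa, shift_mlp, scale_mlp, gate_mlp = emb.chunk(6, dim=-1)
        x = self.norm(x) * (1 + scale_msa[:, None]) + shift_msa[:, None]
        return x, gate_msa, shift_mlp, scale_mlp, gate_mlp

m = AdaLayerNormZeroSketch(32, 4)
y, gate_msa, *rest = m(torch.randn(2, 5, 32), torch.tensor([1, 2]))
```

Because of the zero initialization, all gates are zero at the start of training, so a transformer block using them contributes nothing to the residual stream until the projection learns otherwise.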
AdaLayerNormSingle
class diffusers.models.normalization.AdaLayerNormSingle
( embedding_dim: int, use_additional_conditions: bool = False )
Norm layer implementing adaptive layer norm single (adaLN-single).
As proposed in PixArt-Alpha (see: https://arxiv.org/abs/2310.00426; Section 2.3).
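The "single" variant computes the modulation parameters once from the timestep embedding and shares them across all transformer blocks, with each block only adding a small learned table. A minimal sketch under those assumptions (the class and variable names here are illustrative, not the library API):

```python
import torch
import torch.nn as nn

class AdaLNSingleSketch(nn.Module):
    """Sketch of adaLN-single: one shared projection produces the six modulation
    tensors; individual blocks add per-block learned biases to them."""
    def __init__(self, embedding_dim: int, num_embeddings: int = 1000):
        super().__init__()
        self.emb = nn.Embedding(num_embeddings, embedding_dim)
        self.silu = nn.SiLU()
        self.linear = nn.Linear(embedding_dim, 6 * embedding_dim)

    def forward(self, timestep: torch.Tensor) -> torch.Tensor:
        # Computed once per forward pass, shared by every block.
        return self.linear(self.silu(self.emb(timestep)))

dim = 32
shared = AdaLNSingleSketch(dim)(torch.tensor([3]))
# Each block keeps only a learned bias table instead of its own full MLP:
block_table = nn.Parameter(torch.zeros(6 * dim))
shift_msa, scale_msa, gate_msa, shift_mlp, scale_mlp, gate_mlp = (shared + block_table).chunk(6, dim=-1)
```

Sharing one projection across blocks is what makes adaLN-single cheaper in parameters than giving every block its own adaLN-Zero MLP.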
AdaGroupNorm
class diffusers.models.normalization.AdaGroupNorm
( embedding_dim: int, out_dim: int, num_groups: int, act_fn: Optional[str] = None, eps: float = 1e-05 )
Parameters
- embedding_dim (int) — The size of each embedding vector.
- out_dim (int) — The number of output channels.
- num_groups (int) — The number of groups to separate the channels into.
- act_fn (str, optional, defaults to None) — The activation function to use.
- eps (float, optional, defaults to 1e-5) — The epsilon value to use for numerical stability.
GroupNorm layer modified to incorporate timestep embeddings.
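The same conditioning pattern applies to group norm: normalize without affine parameters, then apply a per-channel scale and shift projected from the conditioning embedding. A hedged PyTorch sketch of the idea (illustrative names and shapes, not the library's exact implementation):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaGroupNormSketch(nn.Module):
    """Sketch: GroupNorm whose per-channel scale/shift come from an embedding."""
    def __init__(self, embedding_dim: int, out_dim: int, num_groups: int, eps: float = 1e-5):
        super().__init__()
        self.num_groups = num_groups
        self.eps = eps
        self.linear = nn.Linear(embedding_dim, out_dim * 2)

    def forward(self, x: torch.Tensor, emb: torch.Tensor) -> torch.Tensor:
        scale, shift = self.linear(emb).chunk(2, dim=1)
        # Plain (non-affine) group norm, then the embedding-derived affine transform.
        x = F.group_norm(x, self.num_groups, eps=self.eps)
        return x * (1 + scale[:, :, None, None]) + shift[:, :, None, None]

x = torch.randn(2, 16, 8, 8)   # (batch, channels, height, width)
emb = torch.randn(2, 32)       # conditioning embedding, e.g. a timestep embedding
out = AdaGroupNormSketch(embedding_dim=32, out_dim=16, num_groups=4)(x, emb)
```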