Activation functions
Customized activation functions for supporting various models in 🤗 Diffusers.
GELU
class diffusers.models.activations.GELU
( dim_in: int, dim_out: int, approximate: str = 'none', bias: bool = True )
GELU activation function, with optional support for the tanh approximation via approximate="tanh".
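The forward pass can be sketched as a linear projection followed by GELU. This is a minimal illustration using plain PyTorch (the dimensions are hypothetical, not part of the API):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative dims: project from dim_in=64 to dim_out=128,
# then apply GELU with the tanh approximation.
proj = nn.Linear(64, 128)
x = torch.randn(2, 64)
out = F.gelu(proj(x), approximate="tanh")
print(out.shape)  # torch.Size([2, 128])
```

Passing approximate="none" instead uses the exact GELU computed from the Gaussian CDF.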
GEGLU
class diffusers.models.activations.GEGLU
( dim_in: int, dim_out: int, bias: bool = True )
A variant of the gated linear unit (GLU) activation function, in which the gate is passed through GELU.
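The gating idea can be sketched in plain PyTorch: project to twice the output width, split the result in half, and multiply one half by the GELU of the other. Dimensions here are illustrative assumptions, not part of the API:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative dims: dim_in=64, dim_out=128.
# The projection produces 2 * dim_out features so the output
# can be split into a value half and a gate half.
dim_in, dim_out = 64, 128
proj = nn.Linear(dim_in, dim_out * 2)
x = torch.randn(2, dim_in)
hidden, gate = proj(x).chunk(2, dim=-1)
out = hidden * F.gelu(gate)
print(out.shape)  # torch.Size([2, 128])
```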
ApproximateGELU
class diffusers.models.activations.ApproximateGELU
( dim_in: int, dim_out: int, bias: bool = True )
The approximate form of the Gaussian Error Linear Unit (GELU). For more details, see section 2 of the GELU paper (Hendrycks & Gimpel, 2016, arXiv:1606.08415).
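This sigmoid-based approximation computes x * sigmoid(1.702 * x). A minimal sketch in plain PyTorch, applying it after a linear projection (dimensions are illustrative assumptions):

```python
import torch
import torch.nn as nn

# Illustrative dims: dim_in=64, dim_out=128.
proj = nn.Linear(64, 128)
x = torch.randn(2, 64)
h = proj(x)
# Sigmoid approximation of GELU: x * sigmoid(1.702 * x).
out = h * torch.sigmoid(1.702 * h)
print(out.shape)  # torch.Size([2, 128])
```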