Diffusers documentation
Activation functions
Customized activation functions for supporting various models in 🤗 Diffusers.
GELU
class diffusers.models.activations.GELU
( dim_in: int, dim_out: int, approximate: str = 'none', bias: bool = True )
GELU activation function, with optional tanh approximation enabled via approximate="tanh".
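To make the approximate="tanh" option concrete, here is a minimal pure-Python sketch of the two formulas involved: the exact GELU (via the error function) and its tanh approximation. This illustrates only the activation math; the diffusers class additionally wraps a learned linear projection from dim_in to dim_out, which is omitted here.

```python
import math

def gelu_exact(x):
    # Exact GELU: 0.5 * x * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x):
    # Tanh approximation:
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x**3)))
    return 0.5 * x * (
        1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3))
    )
```

The tanh form avoids evaluating erf and is close to the exact GELU everywhere, which is why frameworks expose it as a faster alternative.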
GEGLU
class diffusers.models.activations.GEGLU
( dim_in: int, dim_out: int, bias: bool = True )
A variant of the gated linear unit activation function.
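A hedged sketch of the gating idea, in plain Python for illustration: the input is projected twice, one projection is passed through GELU, and the result gates the other elementwise, i.e. GEGLU(x) = GELU(xW) ⊙ (xV). (In diffusers the two projections are typically fused into one linear layer of width 2 × dim_out and split; the separate W and V matrices here are an assumption made for readability.)

```python
import math

def gelu(x):
    # Exact GELU: 0.5 * x * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def geglu(x, w_gate, w_value):
    # Gated linear unit variant: GELU(x @ W) * (x @ V), elementwise.
    # w_gate and w_value are (dim_in x dim_out) matrices given as lists of rows.
    gate = [sum(xi * wij for xi, wij in zip(x, col)) for col in zip(*w_gate)]
    value = [sum(xi * vij for xi, vij in zip(x, col)) for col in zip(*w_value)]
    return [gelu(g) * v for g, v in zip(gate, value)]
```

The gate half decides, per feature, how much of the value half passes through, which is why GEGLU is a common drop-in for the plain activation in transformer feed-forward blocks.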
ApproximateGELU
class diffusers.models.activations.ApproximateGELU
( dim_in: int, dim_out: int, bias: bool = True )
The approximate form of the Gaussian Error Linear Unit (GELU). For more details, see section 2 of the GELU paper (Hendrycks & Gimpel, 2016).
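The sigmoid approximation from section 2 of that paper is x · σ(1.702x). A minimal sketch of just that formula (the diffusers class also applies a linear projection first, omitted here):

```python
import math

def approximate_gelu(x):
    # Sigmoid approximation of GELU: x * sigmoid(1.702 * x)
    # which expands to x / (1 + exp(-1.702 * x)).
    return x / (1.0 + math.exp(-1.702 * x))

def gelu_exact(x):
    # Exact GELU for comparison: 0.5 * x * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))
```

The constant 1.702 is chosen so the sigmoid curve tracks the Gaussian CDF, trading a small amount of accuracy for a cheaper elementwise operation.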