
# UNet

Some training methods - like LoRA and Custom Diffusion - typically target the UNet's attention layers, but they can also target other, non-attention layers. Instead of training all of a model's parameters, only a subset of them is trained, which is faster and more efficient. This class is useful if you're *only* loading weights into a UNet. If you need to load weights into the text encoder, or into both a text encoder and a UNet, use the [`~loaders.LoraLoaderMixin.load_lora_weights`] function instead.
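Below is a minimal sketch of the difference. It assumes a Stable Diffusion checkpoint and a LoRA checkpoint whose path (`"path/to/lora"`) is a placeholder; the UNet-level call only updates the UNet, while the pipeline-level call can also update the text encoder.

```python
import torch
from diffusers import StableDiffusionPipeline

pipeline = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Load LoRA weights into the UNet only; the text encoder is left untouched.
# "path/to/lora" is a placeholder for a Hub repository or local folder
# containing UNet LoRA weights.
pipeline.unet.load_attn_procs("path/to/lora")

# If the checkpoint also contains text encoder LoRA weights, load it at the
# pipeline level instead so both components are updated.
pipeline.load_lora_weights("path/to/lora")
```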

The [`UNet2DConditionLoadersMixin`] class provides functions for loading and saving weights, fusing and unfusing LoRAs, disabling and enabling LoRAs, and setting and deleting adapters.
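The sketch below shows how these functions fit together. It assumes a pipeline has already been created as above and that two LoRA checkpoints are available; the adapter names (`"style_a"`, `"style_b"`) and paths are placeholders.

```python
# Load two LoRAs under distinct adapter names (placeholder paths and names).
pipeline.load_lora_weights("path/to/lora_a", adapter_name="style_a")
pipeline.load_lora_weights("path/to/lora_b", adapter_name="style_b")

# Activate both adapters on the UNet and weight their contributions.
pipeline.unet.set_adapters(["style_a", "style_b"], weights=[1.0, 0.5])

# Temporarily switch the LoRA layers off and back on.
pipeline.unet.disable_lora()
pipeline.unet.enable_lora()

# Merge the active LoRA weights into the UNet for faster inference,
# then split them back out again.
pipeline.unet.fuse_lora()
pipeline.unet.unfuse_lora()

# Remove an adapter entirely.
pipeline.unet.delete_adapters("style_b")
```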

To learn more about how to load LoRA weights, see the LoRA loading guide.

## UNet2DConditionLoadersMixin

[[autodoc]] loaders.unet.UNet2DConditionLoadersMixin