ValueError: Checkpoint not supported

#4
by foolmoron - opened

I'm getting this error on the latest Colab. The layer key is lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight, but the loader code doesn't have a case for down_blocks, so it fails.
https://github.com/huggingface/diffusers/blob/main/src/diffusers/loaders.py#L1098

Any ideas?
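For illustration only (this is not the diffusers implementation): the quoted key is in the flat Kohya-style format, and the block it targets is down_blocks, which is what the loader reportedly has no case for. A minimal sketch of pulling the top-level unet block name out of such a key:

```python
def unet_block_of(key: str) -> str:
    """Extract the top-level unet block name from a Kohya-style LoRA key.

    Illustrative helper, not part of diffusers.
    """
    prefix = "lora_unet_"
    if not key.startswith(prefix):
        return "not a unet key"
    rest = key[len(prefix):]
    # Top-level unet blocks are named down_blocks_N, mid_block, up_blocks_N.
    for block in ("down_blocks", "mid_block", "up_blocks"):
        if rest.startswith(block):
            return block
    return "unknown"

key = "lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight"
print(unet_block_of(key))  # prints: down_blocks
```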

It might be grabbing the safetensors file by default instead of the bin?

Most likely yes.

I removed it for now just in case, until there's been further testing (it might have to wait for another diffusers release with more safetensors support).

Sounds good. I also found that you can pass weight_name="pytorch_lora_weights.bin" to load_lora_weights to force the .bin file.

foolmoron changed discussion status to closed