Sorry, I was not able to load the weights to use with SDXL

#1
by vionwinnie - opened

Can you show me an example of how to load the weights?

I am getting a "Checkpoint not supported" error :(

ValueError: Checkpoint not supported

ValueError                                Traceback (most recent call last)
File :1
----> 1 pipe.load_lora_weights("ostris/photorealistic-slider-sdxl-lora", weight_name="sdxl_photorealistic_slider_v1-0.safetensors")

File :1163, in StableDiffusionXLInpaintPipeline.load_lora_weights(self, pretrained_model_name_or_path_or_dict, **kwargs)
   1159 def load_lora_weights(self, pretrained_model_name_or_path_or_dict: Union[str, Dict[str, torch.Tensor]], **kwargs):
   1160     # We could have accessed the unet config from lora_state_dict() too. We pass
   1161     # it here explicitly to be able to tell that it's coming from an SDXL
   1162     # pipeline.
-> 1163     state_dict, network_alphas = self.lora_state_dict(
   1164         pretrained_model_name_or_path_or_dict,
   1165         unet_config=self.unet.config,
   1166         **kwargs,
   1167     )
   1168     self.load_lora_into_unet(state_dict, network_alphas=network_alphas, unet=self.unet)
   1170     text_encoder_state_dict = {k: v for k, v in state_dict.items() if "text_encoder." in k}

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-151e0cf5-b984-4ece-8168-b955fdbb222d/lib/python3.10/site-packages/diffusers/loaders.py:1075, in LoraLoaderMixin.lora_state_dict(cls, pretrained_model_name_or_path_or_dict, **kwargs)
   1063 if all(
   1064     (
   1065         k.startswith("lora_te_")
   (...)
   1071 ):
   1072     # Map SDXL blocks correctly.
   1073     if unet_config is not None:
   1074         # use unet config to remap block numbers
-> 1075         state_dict = cls._map_sgm_blocks_to_diffusers(state_dict, unet_config)
   1076     state_dict, network_alphas = cls._convert_kohya_lora_to_diffusers(state_dict)
   1078 return state_dict, network_alphas

File /local_disk0/.ephemeral_nfs/envs/pythonEnv-151e0cf5-b984-4ece-8168-b955fdbb222d/lib/python3.10/site-packages/diffusers/loaders.py:1098, in LoraLoaderMixin._map_sgm_blocks_to_diffusers(cls, state_dict, unet_config, delimiter, block_slice_pos)
   1096         output_block_ids.add(layer_id)
   1097     else:
-> 1098         raise ValueError("Checkpoint not supported")
   1100 input_blocks = {
   1101     layer_id: [key for key in state_dict if f"input_blocks{delimiter}{layer_id}" in key]
   1102     for layer_id in input_block_ids
   1103 }
   1104 middle_blocks = {
   1105     layer_id: [key for key in state_dict if f"middle_block{delimiter}{layer_id}" in key]
   1106     for layer_id in middle_block_ids
   1107 }

ValueError: Checkpoint not supported
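For what it's worth, the traceback shows the error is raised inside diffusers' `_map_sgm_blocks_to_diffusers`, i.e. while remapping the checkpoint's kohya/SGM-style block names, so the installed diffusers version does not recognize this file's key layout. A quick way to see which naming convention a LoRA file uses is to inspect its keys. This is a minimal sketch; `describe_lora_keys` is a hypothetical helper, not part of diffusers:

```python
def describe_lora_keys(state_dict):
    """Roughly classify a LoRA checkpoint by its key naming convention.

    Kohya-style checkpoints prefix keys with 'lora_unet_' / 'lora_te_';
    diffusers-native checkpoints use 'unet.' / 'text_encoder.' prefixes.
    A layout matching neither is likely what trips the block remapping.
    """
    kohya = [k for k in state_dict if k.startswith(("lora_unet_", "lora_te_"))]
    native = [k for k in state_dict if k.startswith(("unet.", "text_encoder."))]
    if kohya and not native:
        return "kohya"
    if native and not kohya:
        return "diffusers"
    return "unknown"


# Example with synthetic keys; for a real .safetensors file you would
# build the dict with safetensors.torch.load_file and pass it in.
sample = {
    "lora_unet_input_blocks_1_0_emb_layers_1.lora_down.weight": None,
    "lora_te_text_model_encoder_layers_0_mlp_fc1.lora_up.weight": None,
}
print(describe_lora_keys(sample))  # -> kohya
```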


Use "LoRA the Explorer" from huggingface.co.


Use LoRA the Explorer (multimodalart/LoraTheExplorer).
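Since a code example was requested, here is a minimal sketch of loading this slider LoRA directly with diffusers. It assumes a recent diffusers release; older releases could not remap some kohya-style key layouts, which is what raises "Checkpoint not supported", so trying `pip install -U diffusers` first is worthwhile:

```python
def load_slider_pipeline():
    """Build an SDXL pipeline with the photorealistic slider LoRA attached.

    Imports are kept inside the function so the sketch can be defined
    without torch/diffusers installed; actually calling it needs both,
    plus network access and ideally a CUDA GPU.
    """
    import torch
    from diffusers import StableDiffusionXLPipeline

    pipe = StableDiffusionXLPipeline.from_pretrained(
        "stabilityai/stable-diffusion-xl-base-1.0",
        torch_dtype=torch.float16,
        variant="fp16",
    ).to("cuda")
    # Same call as in the traceback; a recent diffusers release remaps
    # the kohya-style keys instead of raising ValueError.
    pipe.load_lora_weights(
        "ostris/photorealistic-slider-sdxl-lora",
        weight_name="sdxl_photorealistic_slider_v1-0.safetensors",
    )
    return pipe
```

Slider LoRAs are typically applied with a signed strength, e.g. `pipe(prompt, cross_attention_kwargs={"scale": -1.0})`, but check the model card for the intended range.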
