img2img example error

#10
by tintwotin - opened

After adding "import torch" to the img2img example, I get this error:

The config attributes {'feature_extractor': [None, None], 'image_encoder': [None, None]} were passed to StableDiffusionXLImg2ImgPipeline, but are not expected and will be ignored. Please verify your model_index.json configuration file.
Keyword arguments {'feature_extractor': [None, None], 'image_encoder': [None, None]} are not expected by StableDiffusionXLImg2ImgPipeline and will be ignored.
Loading pipeline components...: 100%|████████████████████████████████████████████████████| 7/7 [00:02<00:00, 2.57it/s]
Error: Python: Traceback (most recent call last):
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\torch\utils_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "C:\Program Files\Blender Foundation\Blender 4.0\4.0\python\lib\site-packages\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py", line 1074, in call
) = self.encode_prompt(
File "C:\Program Files\Blender Foundation\Blender 4.0\4.0\python\lib\site-packages\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py", line 360, in encode_prompt
prompt_embeds = text_encoder(text_input_ids.to(device), output_hidden_states=True)
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\transformers\models\clip\modeling_clip.py", line 798, in forward
return self.text_model(
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\transformers\models\clip\modeling_clip.py", line 703, in forward
encoder_outputs = self.encoder(
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\transformers\models\clip\modeling_clip.py", line 630, in forward
layer_outputs = encoder_layer(
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\transformers\models\clip\modeling_clip.py", line 371, in forward
hidden_states = self.layer_norm1(hidden_states)
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\torch\nn\modules\normalization.py", line 196, in forward
return F.layer_norm(
File "C:\Users\45239\AppData\Roaming\Python\Python310\site-packages\torch\nn\functional.py", line 2543, in layer_norm
return torch.layer_norm(input, normalized_shape, weight, bias, eps, torch.backends.cudnn.enabled)
RuntimeError: "LayerNormKernelImpl" not implemented for 'Half'

Can you make sure to run your model on the GPU, or remove torch_dtype=torch.float16?

Ah, yes, it was just pipe.to("cuda") missing from that example.
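For reference, a minimal sketch of what the corrected example looks like; the checkpoint id, input image path, and prompt below are illustrative assumptions, not necessarily the exact ones from the original example:

```python
# Sketch of the fixed img2img setup (checkpoint id, image path, and prompt are
# placeholders; the key point is moving the fp16 pipeline onto the GPU).
import torch
from diffusers import StableDiffusionXLImg2ImgPipeline
from diffusers.utils import load_image

pipe = StableDiffusionXLImg2ImgPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-refiner-1.0",  # assumed checkpoint
    torch_dtype=torch.float16,
    variant="fp16",
)
pipe = pipe.to("cuda")  # the missing line: fp16 weights need to run on the GPU

init_image = load_image("input.png")  # placeholder input image
prompt = "a photo of an astronaut riding a horse on mars"  # placeholder prompt
image = pipe(prompt, image=init_image).images[0]
image.save("output.png")
```

Alternatively, to stay on the CPU, drop torch_dtype=torch.float16 (and variant="fp16") so the pipeline runs in float32, since LayerNorm has no half-precision CPU kernel, which is what the "LayerNormKernelImpl" not implemented for 'Half' error is complaining about.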

tintwotin changed discussion status to closed
