Model not working now?

#18
by LunSei - opened

Hey, it hasn't been working for days now! What's going on? This was my favorite Stable tool. :(


Getting a bunch of errors!

Also giving me problems all of a sudden:

```
Exception: Expected is_sm8x || is_sm75 to be true, but got false. (Could this error message be improved? If so, please report an enhancement request to PyTorch.)
Traceback (most recent call last):
  File "/root/app/app/server.py", line 66, in createimage
    stableDiffusion = getStableDiffusionImage(data['input'])
  File "/root/app/app/models.py", line 56, in getStableDiffusionImage
    output = gpus[gpu][model](
  File "/opt/conda/lib/python3.9/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/opt/conda/lib/python3.9/site-packages/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py", line 517, in __call__
    noise_pred = self.unet(latent_model_input, t, encoder_hidden_states=text_embeddings).sample
  File "/opt/conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "/opt/conda/lib/python3.9/site-packages/diffusers/models/unet_2d_condition.py", line 392, in forward
    sample = self.mid_block(sample, emb, encoder_hidden_states=encoder_hidden_states)
  File "/opt/conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "/opt/conda/lib/python3.9/site-packages/diffusers/models/unet_2d_blocks.py", line 414, in forward
    hidden_states = attn(hidden_states, encoder_hidden_states).sample
  File "/opt/conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "/opt/conda/lib/python3.9/site-packages/diffusers/models/attention.py", line 216, in forward
    hidden_states = block(hidden_states, context=encoder_hidden_states, timestep=timestep)
  File "/opt/conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "/opt/conda/lib/python3.9/site-packages/diffusers/models/attention.py", line 484, in forward
    hidden_states = self.attn1(norm_hidden_states) + hidden_states
  File "/opt/conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "/opt/conda/lib/python3.9/site-packages/diffusers/models/attention.py", line 584, in forward
    hidden_states = self._memory_efficient_attention_xformers(query, key, value)
  File "/opt/conda/lib/python3.9/site-packages/diffusers/models/attention.py", line 663, in _memory_efficient_attention_xformers
    hidden_states = xformers.ops.memory_efficient_attention(query, key, value, attn_bias=None)
  File "/opt/conda/lib/python3.9/site-packages/xformers/ops/fmha/__init__.py", line 191, in memory_efficient_attention
    return _memory_efficient_attention(
  File "/opt/conda/lib/python3.9/site-packages/xformers/ops/fmha/__init__.py", line 287, in _memory_efficient_attention
    return _memory_efficient_attention_forward(
  File "/opt/conda/lib/python3.9/site-packages/xformers/ops/fmha/__init__.py", line 308, in _memory_efficient_attention_forward
    out, *_ = op.apply(inp, needs_gradient=False)
  File "/opt/conda/lib/python3.9/site-packages/xformers/ops/fmha/flash.py", line 155, in apply
    softmax_lse, *rest = cls.OPERATOR(
RuntimeError: Expected is_sm8x || is_sm75 to be true, but got false. (Could this error message be improved? If so, please report an enhancement request to PyTorch.)
```

My issue was that I was using the xformers library via
enable_xformers_memory_efficient_attention()
Some models work with it and some do not. It uses less memory and runs faster, but it seems a little unstable.
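For anyone hitting the same `is_sm8x || is_sm75` error: it comes from the flash-attention kernel checking the GPU's CUDA compute capability (sm75 = Turing, sm8x = Ampere). A minimal sketch of that condition (function name mine, not from xformers) that you could feed from `torch.cuda.get_device_capability()`:

```python
def supports_flash_attention(major: int, minor: int) -> bool:
    """Mirror the is_sm8x || is_sm75 check from the traceback above.

    major/minor are the CUDA compute capability digits, e.g. (7, 5)
    for a T4, (8, 0) for an A100, (7, 0) for a V100.
    """
    is_sm75 = major == 7 and minor == 5   # Turing (T4, RTX 20xx)
    is_sm8x = major == 8                  # Ampere (A100, RTX 30xx, ...)
    return is_sm75 or is_sm8x

# Cards older than Turing fail the check and raise the RuntimeError:
assert supports_flash_attention(7, 5)        # T4: ok
assert supports_flash_attention(8, 6)        # RTX 3090: ok
assert not supports_flash_attention(7, 0)    # V100: the error above
```

So one workaround is to only call `enable_xformers_memory_efficient_attention()` when `torch.cuda.get_device_capability()` passes this check, and fall back to the default attention otherwise.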
