Runtime error
ule)
  File "/usr/local/lib/python3.9/site-packages/diffusers/pipelines/pipeline_utils.py", line 1480, in fn_recursive_set_mem_eff
    module.set_use_memory_efficient_attention_xformers(valid, attention_op)
  File "/usr/local/lib/python3.9/site-packages/diffusers/models/modeling_utils.py", line 235, in set_use_memory_efficient_attention_xformers
    fn_recursive_set_mem_eff(module)
  File "/usr/local/lib/python3.9/site-packages/diffusers/models/modeling_utils.py", line 231, in fn_recursive_set_mem_eff
    fn_recursive_set_mem_eff(child)
  File "/usr/local/lib/python3.9/site-packages/diffusers/models/modeling_utils.py", line 231, in fn_recursive_set_mem_eff
    fn_recursive_set_mem_eff(child)
  File "/usr/local/lib/python3.9/site-packages/diffusers/models/modeling_utils.py", line 231, in fn_recursive_set_mem_eff
    fn_recursive_set_mem_eff(child)
  File "/usr/local/lib/python3.9/site-packages/diffusers/models/modeling_utils.py", line 228, in fn_recursive_set_mem_eff
    module.set_use_memory_efficient_attention_xformers(valid, attention_op)
  File "/usr/local/lib/python3.9/site-packages/diffusers/models/modeling_utils.py", line 235, in set_use_memory_efficient_attention_xformers
    fn_recursive_set_mem_eff(module)
  File "/usr/local/lib/python3.9/site-packages/diffusers/models/modeling_utils.py", line 231, in fn_recursive_set_mem_eff
    fn_recursive_set_mem_eff(child)
  File "/usr/local/lib/python3.9/site-packages/diffusers/models/modeling_utils.py", line 231, in fn_recursive_set_mem_eff
    fn_recursive_set_mem_eff(child)
  File "/usr/local/lib/python3.9/site-packages/diffusers/models/modeling_utils.py", line 228, in fn_recursive_set_mem_eff
    module.set_use_memory_efficient_attention_xformers(valid, attention_op)
  File "/usr/local/lib/python3.9/site-packages/diffusers/models/attention_processor.py", line 199, in set_use_memory_efficient_attention_xformers
    raise ValueError(
ValueError: torch.cuda.is_available() should be True but is False. xformers' memory efficient attention is only available for GPU
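The ValueError at the bottom of the traceback means the container is running on CPU-only hardware while the code unconditionally enables xformers memory-efficient attention, which diffusers only supports on a CUDA device. A minimal guard, as a sketch: check for a CUDA device and the xformers package before enabling the feature. The helper names `xformers_supported` and `maybe_enable_xformers` are illustrative, not part of the diffusers API.

```python
import importlib.util


def xformers_supported() -> bool:
    """Return True only if a CUDA device and the xformers package are both present."""
    try:
        import torch
    except ImportError:
        return False
    if not torch.cuda.is_available():
        return False
    return importlib.util.find_spec("xformers") is not None


def maybe_enable_xformers(pipe) -> bool:
    """Enable memory-efficient attention on the pipeline only when it can work.

    Returns True if it was enabled, False if the runtime lacks GPU/xformers
    support, so the pipeline silently falls back to the default attention.
    """
    if xformers_supported():
        pipe.enable_xformers_memory_efficient_attention()
        return True
    return False
```

With this guard in place of the unconditional call, the same code runs on a CPU-only Space (falling back to standard attention) and still gets the memory savings when a GPU is available.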