"Torch not compiled with CUDA enabled" on Mac M1, how do people on OSX avoid this?

#19
by OriginalJunglist - opened

(Sorry for double posting in a comment further down)

Hi, I'm trying to get this to work on a Mac (M1 Max, 64 GB RAM) using the workflow shown below, but I get an error saying that CUDA is not enabled. How do you tell ControlNet (or PyTorch?) not to try to run on CUDA? Thanks in advance for any help!

!!! Exception during processing !!! Torch not compiled with CUDA enabled
Traceback (most recent call last):
  File "/Applications/Flux/ComfyUI/execution.py", line 317, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Applications/Flux/ComfyUI/execution.py", line 192, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Applications/Flux/ComfyUI/execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "/Applications/Flux/ComfyUI/execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Applications/Flux/ComfyUI/custom_nodes/x-flux-comfyui/nodes.py", line 304, in sampling
    if torch.cuda.is_bf16_supported():
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Applications/Flux/ComfyUI/pytorch_env/lib/python3.12/site-packages/torch/cuda/__init__.py", line 128, in is_bf16_supported
    device = torch.cuda.current_device()
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Applications/Flux/ComfyUI/pytorch_env/lib/python3.12/site-packages/torch/cuda/__init__.py", line 778, in current_device
    _lazy_init()
  File "/Applications/Flux/ComfyUI/pytorch_env/lib/python3.12/site-packages/torch/cuda/__init__.py", line 284, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
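For context: the traceback shows the custom node calling `torch.cuda.is_bf16_supported()` without first checking that CUDA exists, which raises this `AssertionError` on any CPU/MPS-only PyTorch build. A minimal sketch of device-agnostic selection on Apple Silicon might look like the following (this is an illustration of the general PyTorch pattern, not the actual x-flux-comfyui code; `pick_device_and_dtype` is a hypothetical helper name):

```python
import torch

def pick_device_and_dtype():
    """Choose a device and dtype without touching torch.cuda
    unless CUDA is actually available (avoids the AssertionError
    seen above on CPU/MPS-only builds)."""
    if torch.cuda.is_available():
        device = torch.device("cuda")
        # Only now is it safe to query CUDA capabilities.
        dtype = torch.bfloat16 if torch.cuda.is_bf16_supported() else torch.float16
    elif torch.backends.mps.is_available():
        # Apple Silicon GPU via the Metal Performance Shaders backend.
        device = torch.device("mps")
        dtype = torch.float16
    else:
        device = torch.device("cpu")
        dtype = torch.float32
    return device, dtype

device, dtype = pick_device_and_dtype()
```

Patching the unguarded `if torch.cuda.is_bf16_supported():` in `nodes.py` with a check like `torch.cuda.is_available()` first is the usual workaround when a custom node assumes CUDA.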

[Attached image of the ComfyUI workflow: Screenshot 2024-08-29 at 14.58.49.png]
