How to run it on a Mac?

#2
by chenliangjing - opened

Dear Ferret-UI-Llama8b repository authors,

I am writing to inquire about running the model_UI code on a Mac computer in a CPU-only environment. I have reviewed the code and made some modifications to ensure it can run without CUDA support, but I would appreciate your guidance on the best approach.

Specifically, I have made the following changes to the code:

  1. Disabled CUDA support by patching `torch.cuda.is_available = lambda: False` and setting the `CUDA_VISIBLE_DEVICES` and `TORCH_DEVICE` environment variables.
  2. Set the `data_type` argument to `torch.float16`, `torch.bfloat16`, or `torch.float32` depending on the user's preference, to control the precision/memory trade-off on the CPU.
  3. Modified the image preprocessing to use a custom `image_process_func` that resizes images without center cropping, since the original code assumes CUDA availability.
  4. Ensured that any region masks are converted to the appropriate data type before being passed to the model.
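
To make the intent concrete, the changes above can be sketched roughly as follows. This is only an illustration of my approach, not the repository's actual code: `TORCH_DEVICE` is the environment variable mentioned above (not a built-in PyTorch one), and the resize call stands in for the repo's `image_process_func`, whose exact signature I am not reproducing here.

```python
import os

# Hide all GPUs before torch initializes any CUDA context.
os.environ["CUDA_VISIBLE_DEVICES"] = ""
os.environ["TORCH_DEVICE"] = "cpu"  # variable referenced in the post, not standard PyTorch

import torch

# Step 1: make every torch.cuda.is_available() check report False.
torch.cuda.is_available = lambda: False

# Step 2: map a user-preference string to a torch dtype.
DTYPES = {
    "float16": torch.float16,
    "bfloat16": torch.bfloat16,
    "float32": torch.float32,
}

def pick_dtype(name: str) -> torch.dtype:
    return DTYPES[name]

dtype = pick_dtype("float32")

# Step 3: resize without center cropping (plain bilinear interpolation
# standing in for the custom image_process_func).
image = torch.rand(1, 3, 500, 400)  # dummy NCHW image batch
resized = torch.nn.functional.interpolate(
    image, size=(336, 336), mode="bilinear", align_corners=False
)

# Step 4: cast region masks to the chosen dtype before the forward pass.
region_mask = torch.zeros(1, 336, 336, dtype=torch.bool).to(dtype)
```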

These changes should allow the model_UI code to run on a Mac in a CPU-only environment. However, I would appreciate it if you could provide any additional guidance or considerations for this configuration. For example, are there any specific requirements or recommendations for the CPU hardware, or other optimizations that could improve performance on a CPU-only system?
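On the performance question, two generic PyTorch CPU settings (not specific to Ferret-UI) seem worth trying; a minimal sketch:

```python
import os
import torch

# Intra-op thread count: PyTorch's default is often reasonable, but it is
# worth tuning on machines with many cores.
torch.set_num_threads(os.cpu_count() or 1)

# inference_mode skips autograd bookkeeping entirely, which helps on CPU.
with torch.inference_mode():
    logits = torch.randn(2, 8)
    probs = torch.softmax(logits, dim=-1)
```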

Thank you in advance for your assistance. I look forward to your response and to continuing to work with your excellent Ferret-UI-Llama8b project.

Best regards,
chenliangjing
