How to run it on a Mac?
Dear Ferret-UI-Llama8b repository authors,
I am writing to inquire about running the `model_UI` code on a Mac in a CPU-only environment. I have reviewed the code and made some modifications so that it can run without CUDA support, but I would appreciate your guidance on the best approach.
Specifically, I have made the following changes to the code (a rough sketch of the combined changes is included after the list):
- Disabled CUDA support by setting `torch.cuda.is_available = lambda: False` and by setting the `CUDA_VISIBLE_DEVICES` and `TORCH_DEVICE` environment variables.
- Set the `data_type` argument to use either `torch.float16`, `torch.bfloat16`, or `torch.float32`, depending on the user's preference, in order to leverage mixed precision on the CPU.
- Modified the image preprocessing function to use a custom `image_process_func` that resizes the images without center cropping, as the original code assumes CUDA availability.
- Ensured that any region masks are converted to the appropriate data type before being used in the model.
These changes should allow the `model_UI` code to run on a Mac in a CPU-only environment. However, I would appreciate it if you could provide any additional guidance or considerations for running the code in this configuration. For example, are there any specific requirements or recommendations for the CPU hardware, or any other optimizations that could improve performance on a CPU-only system?
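In particular, I have been experimenting with CPU-side settings like the ones below; these are standard PyTorch calls rather than anything taken from your repository, and whether they actually help for Ferret-UI-Llama8b is exactly what I am asking about.

```python
import os

import torch

# Thread-pool sizes must be set before any inference work starts.
torch.set_num_threads(os.cpu_count() or 1)  # intra-op parallelism
torch.set_num_interop_threads(1)            # inter-op parallelism

with torch.inference_mode():                # skip autograd bookkeeping
    ...  # run the model_UI inference here
```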
Thank you in advance for your assistance. I look forward to your response and to continuing to work with your excellent Ferret-UI-Llama8b project.
Best regards,
chenliangjing