Apply for community grant: Academic project (gpu)
StableDesign is a SOTA deep learning model that transforms images of empty rooms into fully furnished spaces based on text descriptions. The pipeline won 2nd place in the Generative Interior Design 2024 competition (https://www.aicrowd.com/challenges/generative-interior-design-challenge-2024/leaderboards?challenge_round_id=1314) and outperforms the current best pipeline for the same task (https://huggingface.co/spaces/ml6team/controlnet-interior-design).
This demo would benefit greatly from GPU hardware for better interactivity and user experience.
It would also be great to have a GPU with 24 GB of VRAM, to make it possible to deploy the model for high-resolution images.
Please refer to our GitHub repository (https://github.com/Lavreniuk/generative-interior-design) for more details.
Hi @hysts, thank you very much!
However, I have a problem utilizing the GPU. Locally it works fine, but in the Space I hit the following issue: I set `device = 'cuda'`, but after `self.pipe = self.pipe.to(device)` the pipeline (and any other model) stays on the CPU, even though printing the device shows `device:0` on an A100 GPU. At the same time, the `image_to_depth` image does get moved to the GPU...
Could you please help with this?
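For context, a minimal sketch of the device-selection pattern described above (the `pick_device` helper is illustrative, not from the actual app):

```python
def pick_device() -> str:
    """Prefer CUDA when available; fall back to CPU (illustrative helper)."""
    try:
        import torch
    except ImportError:
        # torch not installed locally; assume CPU
        return "cpu"
    return "cuda" if torch.cuda.is_available() else "cpu"

device = pick_device()
# The app then moves its models to this device, e.g.:
#   self.pipe = self.pipe.to(device)
print(device)
```

Locally this reports `cuda` on a GPU machine, yet in the Space the models were still landing on the CPU after `.to(device)`.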
Awesome!! What an amazing project, congrats!
@MykolaL Sorry about this issue. It's a known issue with gradio 4.25.0 and ZeroGPU, which was fixed in gradio==4.26.0, so upgrading the gradio version should fix it. You can change the gradio version in the README.md. https://huggingface.co/spaces/MykolaL/StableDesign/blob/f17a90ea63e1f6896e9fd2859c861b12aab31b0c/README.md?code=true#L7
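For reference, the version pin lives in the YAML front matter at the top of the Space's README.md; the relevant lines look roughly like this (other front-matter fields elided):

```yaml
sdk: gradio
sdk_version: 4.26.0
```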
I'm seeing this error:

```
  File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 55, in create_examples
    examples_obj = Examples(
  File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 153, in __init__
    raise ValueError("If caching examples, `fn` and `outputs` must be provided")
ValueError: If caching examples, `fn` and `outputs` must be provided
```
I think it's related to this issue, and you can avoid the error for now by adding `cache_examples=False` to `gr.Examples`. https://huggingface.co/spaces/MykolaL/StableDesign/blob/6369e62ee776689bccb4b429f96c49030e974acc/app.py#L325-L326
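The check behind that error boils down to the following (a plain-Python paraphrase of the condition raised in gradio's `helpers.py`, not gradio's actual code):

```python
# Paraphrase of the validation in gradio/helpers.py that produces the
# traceback above: caching examples requires both `fn` and `outputs`.
def check_examples_config(cache_examples: bool, fn=None, outputs=None) -> None:
    if cache_examples and (fn is None or outputs is None):
        raise ValueError("If caching examples, `fn` and `outputs` must be provided")

# With caching disabled, no fn/outputs are needed, so the check passes:
check_examples_config(cache_examples=False)

# With caching enabled but fn/outputs missing, the ValueError is raised:
# check_examples_config(cache_examples=True)
```

This is why passing `cache_examples=False` to `gr.Examples` sidesteps the error until `fn` and `outputs` are wired up.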