---
title: ControlLight
emoji: 📊
colorFrom: red
colorTo: indigo
sdk: gradio
sdk_version: 3.28.2
app_file: app.py
pinned: false
license: cc-by-4.0
tags:
  - stable-diffusion
  - stable-diffusion-diffusers
  - text-to-image
  - diffusers
  - controlnet
  - jax-diffusers-event
---

# ControlLight: Light control through ControlNet and depth map conditioning

We propose a ControlNet with depth map conditioning that can control the light direction in a scene while attempting to preserve the scene's integrity. The model was trained on the VIDIT dataset and "A Dataset of Flash and Ambient Illumination Pairs from the Crowd" as part of the JAX Diffusers Event.
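
As a rough illustration of how such a model might be used, the sketch below loads a ControlNet checkpoint into a diffusers pipeline and conditions generation on a depth map, with the light direction expressed in the text prompt. The repository IDs, file names, and prompt wording are illustrative assumptions, not this repo's actual identifiers.

```python
import torch
from PIL import Image
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline, UniPCMultistepScheduler

# Hypothetical checkpoint ID -- substitute the actual ControlLight weights.
controlnet = ControlNetModel.from_pretrained("Nahrawy/ControlLight", torch_dtype=torch.float16)

# Assumed Stable Diffusion base model; swap in whichever base the checkpoint was trained against.
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
)
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config)
pipe.to("cuda")

# Depth map of the scene used as the conditioning image (placeholder file name).
depth_map = Image.open("scene_depth.png")

# The desired light direction is described in the prompt; the depth map fixes the geometry.
prompt = "a living room with a sofa, light coming from the left"
image = pipe(prompt, image=depth_map, num_inference_steps=30).images[0]
image.save("relit_scene.png")
```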

Due to the limited data available, the model is clearly overfit, but it serves as a proof of concept of what could be achieved with enough data.

A large part of the training data is synthetic, so we encourage further training on synthetically generated scenes, for example rendered with Unreal Engine.

The WandB training logs can be found here. Note that the model was deliberately left to overfit for experimentation, so we advise using the 8K-step weights or an earlier checkpoint.

This project is joint work between ParityError and Nahrawy.