Research

Diffusion-Based 3DGS Relighting

Michael Rubloff

Jul 2, 2024

Diffusion Radiance Field

It was only recently that I was discussing how improving reflections in captures was an active topic of research, and I have steadily watched paper after paper targeting relighting come through. Starting with IllumiNeRF, then LE3D, DiLightNet, Neural Gaffer, and most recently GS-ROR, relighting as a topic is truly on fire.

Today we're looking at another relighting method, one that uses a diffusion prior to relight Radiance Fields. The authors build on 3DGS for the advantages of its explicit representation, but the approach could be adapted to NeRFs or any subsequent Radiance Field method.

The first step in their method is the creation of a 2D relighting neural network with explicit control over lighting direction. This is achieved by fine-tuning a pre-trained Stable Diffusion (SD) model, via ControlNet, on a multi-illumination dataset containing images of various scenes captured under different lighting conditions, which enables the network to generate realistic relit versions of each image. By conditioning on depth maps and a spherical harmonics encoding of the lighting direction, the network can accurately control the light direction during relighting.
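
To make the lighting-direction conditioning concrete, here is a minimal sketch (not the authors' code) of encoding a direction vector with low-order real spherical harmonics; a coefficient vector like this is the kind of signal that could be fed to ControlNet alongside a depth map.

```python
# Minimal sketch: encode a lighting direction with real SH bands l=0..2.
import numpy as np

def sh_encode_direction(d):
    """Encode a unit direction vector as 9 real spherical harmonic coefficients."""
    x, y, z = d / np.linalg.norm(d)
    return np.array([
        0.28209479177387814,                     # l=0
        0.4886025119029199 * y,                  # l=1, m=-1
        0.4886025119029199 * z,                  # l=1, m= 0
        0.4886025119029199 * x,                  # l=1, m= 1
        1.0925484305920792 * x * y,              # l=2, m=-2
        1.0925484305920792 * y * z,              # l=2, m=-1
        0.31539156525252005 * (3 * z * z - 1),   # l=2, m= 0
        1.0925484305920792 * x * z,              # l=2, m= 1
        0.5462742152960396 * (x * x - y * y),    # l=2, m= 2
    ])

# Example: a light coming roughly from the upper left of the scene.
print(sh_encode_direction(np.array([-1.0, 1.0, 0.5])))
```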

Once the 2D relighting network is established, it's used to augment a multi-view dataset captured under a single lighting condition. This involves generating multiple relit versions of each image in the dataset, effectively transforming the original single-illumination dataset into a comprehensive multi-illumination dataset. This augmentation is crucial as it provides the necessary data diversity to train a robust relightable radiance field.
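
The augmentation loop can be pictured roughly as follows. This is a sketch under my own assumptions: `relight_image` is a hypothetical wrapper around the fine-tuned ControlNet/SD model, and the way light directions are sampled is illustrative rather than taken from the paper.

```python
# Sketch of the dataset-augmentation step. `relight_image(image, depth, light_dir)`
# stands in for the fine-tuned ControlNet/SD model (it would SH-encode the light
# direction internally, as in the sketch above); nothing here is released code.
import numpy as np

def sample_light_directions(n, seed=0):
    """Sample n random unit light directions on the upper hemisphere."""
    rng = np.random.default_rng(seed)
    dirs = rng.normal(size=(n, 3))
    dirs[:, 2] = np.abs(dirs[:, 2])              # keep lights above the horizon
    return dirs / np.linalg.norm(dirs, axis=1, keepdims=True)

def augment_dataset(images, depths, relight_image, n_lights=8):
    """Expand a single-illumination capture into a multi-illumination dataset."""
    light_dirs = sample_light_directions(n_lights)
    relit = []
    for img, depth in zip(images, depths):
        for d in light_dirs:
            relit.append({"image": relight_image(img, depth, d), "light_dir": d})
    return relit
```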

The final step in the method trains a relightable radiance field using 3D Gaussian Splatting (3DGS). The representation is enhanced with a small Multi-Layer Perceptron (MLP), which replaces the spherical harmonic coefficients, and an auxiliary feature vector that addresses inconsistencies and inaccuracies in the synthesized relit images. The MLP allows direct control of the lighting direction, enabling realistic, interactive relighting of full scenes in real time, while optimizing a per-image auxiliary latent vector keeps the result multi-view consistent despite the varying illumination across views. Training begins with the original single-illumination images to establish a baseline radiance field, then integrates the multi-illumination data to build a robust model capable of interactive, realistic relighting.
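
A rough PyTorch sketch of what such an MLP could look like is below. The feature and latent dimensions, layer widths, and exact inputs are my assumptions, not the released architecture; the point is that per-Gaussian color becomes a function of the lighting direction and a per-image latent rather than fixed SH coefficients.

```python
# Assumed sketch: a small MLP replacing per-Gaussian SH color coefficients.
import torch
import torch.nn as nn

class RelightableColorMLP(nn.Module):
    def __init__(self, feat_dim=32, aux_dim=16, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim + 3 + aux_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3), nn.Sigmoid(),   # RGB in [0, 1]
        )

    def forward(self, gaussian_feat, light_dir, aux_latent):
        # gaussian_feat: (N, feat_dim) per-Gaussian feature vectors
        # light_dir:     (3,) normalized lighting direction, shared per frame
        # aux_latent:    (aux_dim,) per-image latent absorbing inconsistencies
        n = gaussian_feat.shape[0]
        cond = torch.cat([light_dir, aux_latent]).expand(n, -1)
        return self.net(torch.cat([gaussian_feat, cond], dim=-1))

# Usage (illustrative): colors = mlp(features, light_dir, aux_latents[image_index])
```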

To improve the quality of the relit images, the researchers adopt additional strategies to enhance brightness, contrast, and edge sharpness. These include adjusting the denoising objective and performing color matching between the input and predicted images to keep color and brightness consistent across lighting conditions. To keep the generated relighting temporally consistent as well, the auxiliary latent vectors are optimized to correct for discrepancies in lighting direction, producing smooth transitions across different views.
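
As an illustration of the color-matching idea, here is a minimal per-channel mean/std matching sketch; the paper's actual formulation may differ.

```python
# Minimal sketch, assuming simple per-channel statistics matching.
import torch

def match_color(pred, ref, eps=1e-6):
    """Shift/scale each channel of `pred` to match the statistics of `ref`.

    pred, ref: (3, H, W) tensors in [0, 1].
    """
    p_mean, p_std = pred.mean(dim=(1, 2), keepdim=True), pred.std(dim=(1, 2), keepdim=True)
    r_mean, r_std = ref.mean(dim=(1, 2), keepdim=True), ref.std(dim=(1, 2), keepdim=True)
    return ((pred - p_mean) / (p_std + eps) * r_std + r_mean).clamp(0, 1)
```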

Once training is complete, the scene can be rendered with full relighting control, still at 30 fps.

The ControlNet they use specializes in indoor scenes, which appears to be a consequence of the research focus rather than a fundamental limitation. Another limitation is that ControlNet does not perform physics-based reasoning, so shadows can occasionally appear where they shouldn't, and vice versa. There's still work to be done on that front, but it seems by no means insurmountable.

In total, training runs for 30K steps, with the first 5K dedicated to the base 3DGS and the subsequent 25K training the full multi-illumination solution. VRAM consumption is similar to the original 3DGS, meaning it can be done on consumer-grade hardware. The code has not been published yet, but with the paper's acceptance to EGSR 2024, we might be seeing it sooner rather than later! Their project page can be found here for more information.
