Research

MixNeRF: Faster and Cleaner than Instant-NGP?

Michael Rubloff

Apr 26, 2023


For over a year, Instant-NGP has been regarded as the fastest NeRF method that trains locally on a consumer-grade GPU. Yesterday, a paper introducing MixNeRF was released, which trains faster and uses fewer parameters than Instant-NGP.

NeRFs have made significant advancements in quality and efficiency, but storing features in dense grids can lead to memory bottlenecks and lengthy training times.

One of the common issues with NeRFs so far has been the amount of memory, or VRAM, required to run them. MixNeRF attempts to solve this problem with a mixed-up hash table that improves memory efficiency, while also achieving outstanding training times and maintaining reconstruction quality.

This is done primarily in two steps. The first, as mentioned above, is the mixed-up hash table, which mixes multiple hash tables into a single one. Inherently, that also decreases the number of encoding parameters. However, it can also lead to a decline in learning performance due to the limited memory space, so the number of hash tables mixed together has to be carefully optimized.

Afterwards, they design an index transformation to recover the correct index of a grid point within the shared table.
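To make the idea concrete, here is a minimal Python sketch of what sharing one table across several hash-grid levels, with an index transformation, could look like. This is not the authors' implementation: the table size, hashing primes, and the per-level slicing scheme below are assumptions made purely for illustration.

```python
import numpy as np

# Illustrative sketch (not the paper's code): several resolution levels of a
# hash-grid encoding share a single feature table, and an index transformation
# keeps each level addressing its own slice of that table. Table size, primes,
# and the offset scheme are assumptions for this example.

PRIMES = (1, 2654435761, 805459861)  # spatial-hash primes commonly used for 3D grids
MASK64 = (1 << 64) - 1

def shared_index(grid_coord, level, table_size, num_levels):
    """Hash a 3D integer grid coordinate at `level` into one shared table."""
    h = 0
    for c, p in zip(grid_coord, PRIMES):
        h ^= (c * p) & MASK64
    # Hypothetical index transformation: give each level its own slice of the
    # shared table and wrap the hash within that slice.
    slice_size = table_size // num_levels
    return level * slice_size + (h % slice_size)

# One shared feature table instead of one table per level.
table_size, feature_dim, num_levels = 2 ** 19, 2, 8
shared_table = np.zeros((table_size, feature_dim), dtype=np.float32)

idx = shared_index((123, 45, 67), level=3,
                   table_size=table_size, num_levels=num_levels)
feature = shared_table[idx]  # feature vector that would be fed to the MLP
print(idx, feature)
```

The trade-off the paper describes shows up directly in a sketch like this: a smaller shared table means fewer encoding parameters but more hash collisions, which is why the number of tables mixed together needs to be tuned.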

The researchers benchmarked MixNeRF against state-of-the-art methods such as Instant-NGP, TensoRF, and DVGO. Results showed that MixNeRF achieved the fastest training time on the same GPU hardware while maintaining similar or higher rendering quality. For example, compared to Instant-NGP, the MixNeRF-8(222) setup achieved around a 12% training time reduction on the same GPU with a 0.2 higher average PSNR on the synthetic NeRF dataset.

With its memory-efficient approach, MixNeRF could become a go-to framework for fast, high-quality 3D scene representation.

"Compared to Instant-NGP, we can see that MixNeRF-8 achieved a parameter reduction of 47.00%/47.44%/47.23%/45.94%, and training time reductions of 4s/38s/175s/290s, with a 0.09/0.2/0.24/0.18 PSNR improvement at all four different T settings, respectively."

MixNeRF improves memory efficiency and reduces training and rendering time, while maintaining reconstruction quality.

It seems that the NeRF training times are only going to get faster and faster, while still keeping the same reconstruction quality. I'm excited to see how this all progresses and how our definition of "instant" evolves.

Check out the full paper here!
