Memory-Efficient Radiance Fields (MERF) Announced by Google Research

Michael Rubloff

Feb 24, 2023

Google Research

Almost coincidentally, just hours after Luma AI announced realtime rendering in the web browser, a new radiance field method has been published: Memory-Efficient Radiance Fields, or MERF. The paper appears to be a collaboration between Google Research and the University of Tübingen AI Center in Germany, though on refreshing the page, the author list has been removed.

MERF reduces the memory consumption of prior sparse volumetric radiance fields using a combination of a sparse feature grid and high-resolution 2D feature planes. To support large-scale unbounded scenes, we introduce a novel contraction function that maps scene coordinates into a bounded volume while still allowing for efficient ray-box intersection.

Abstract
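
For a sense of what the contraction mentioned in the abstract does: the idea is to map unbounded scene coordinates into a bounded cube, so a fixed-size feature grid can cover the whole scene while straight rays remain piecewise linear and ray-box intersection stays cheap. Below is a minimal NumPy sketch of a piecewise-projective contraction in that spirit; the function name and implementation details are illustrative and not taken from the authors' code.

```python
import numpy as np

def contract(x):
    """Map an unbounded 3D point into the bounded cube [-2, 2]^3.

    Minimal sketch of a piecewise-projective contraction in the spirit of
    the one described in the MERF paper; names and details are illustrative.
    """
    x = np.asarray(x, dtype=np.float64)
    inf_norm = np.max(np.abs(x))
    if inf_norm <= 1.0:
        return x  # points inside the unit cube are left unchanged
    out = x / inf_norm  # non-dominant coordinates are scaled by the largest magnitude
    # The dominant coordinate(s) are squashed projectively toward +/-2,
    # which keeps the mapping invertible and rays piecewise linear.
    dominant = np.abs(x) == inf_norm
    out[dominant] = np.sign(x[dominant]) * (2.0 - 1.0 / np.abs(x[dominant]))
    return out

# Example: a far-away point lands just inside the [-2, 2]^3 cube.
print(contract([100.0, 3.0, -0.5]))  # -> [1.99, 0.03, -0.005]
```
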

In practical terms, MERF addresses the memory strain that NeRFs place on a device, reducing it to the point of enabling realtime rendering in a web browser. In the demo video, a 3090 GPU displays the MERF at 60fps without issue.

MERF appears to be significantly better suited to reducing VRAM consumption than existing methods.

The team has published a collection of interactive demos here. Give them all a try; each one is astounding in its own right. Additionally, see below for a detailed comparison of the different NeRF methods, including Instant-NGP.

Whether or not this technology is powering the new announcement by Luma AI, it represents a positive step toward lowering the barrier to entry for NeRFs. The rate at which NeRF advancements have been occurring is nothing short of stunning, and what was once thought to be a fanciful request becomes more real every day.
