How EVER (Exact Volumetric Ellipsoid Rendering) Does This Work?

Michael Rubloff

Oct 3, 2024

The world of real-time 3D rendering through Radiance Field-based methods has seen impressive advancements over the last four years, particularly through methods such as 3D Gaussian Splatting (3DGS). However, even as these techniques have made leaps in speed and efficiency, they still fall short in certain key areas, such as rendering consistency and accuracy.

Exact Volumetric Ellipsoid Rendering (EVER), a newly announced method from Google, has something to say about that. Built on a foundation of ray tracing and an ellipsoid-based scene representation, EVER eliminates many of the artifacts that hinder 3DGS.

Like 3D Gaussian Ray Tracing (3DGRT) from NVIDIA, EVER is a complete departure from rasterization-based Gaussian Splatting, while still building an underlying representation of the Radiance Field, and it adopts 3DGRT's ray tracing acceleration method. Where Gaussian Splatting rasterizes Gaussians, however, EVER ray traces constant-density ellipsoids.

This shift allows EVER to sidestep the most persistent problem in 3DGS: the "popping" artifacts that occur when primitives overlap or reorder as the camera moves. While the "Stop the Pop" paper presented at SIGGRAPH 2024 helped mitigate these issues, EVER goes further by eliminating them entirely, offering smoother and more consistent results. Because its constant-density ellipsoid representation lets it compute the volume rendering equation exactly, EVER avoids the blending inconsistencies and popping artifacts that splatting-based methods like 3DGS exhibit, as the toy example below illustrates.
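
To make the popping failure mode concrete, here is a minimal illustration (my own toy construction, not code from either paper) of how per-primitive depth sorting produces a discontinuity. EVER's exact integral over the actual overlap varies continuously in the same situation:

```python
import numpy as np

def composite_sorted(prims):
    """3DGS-style alpha compositing in order of primitive center depth."""
    color, transmittance = 0.0, 1.0
    for _, alpha, c in sorted(prims, key=lambda p: p[0]):
        color += transmittance * alpha * c
        transmittance *= 1.0 - alpha
    return color

# Primitive A (white) slides past primitive B (black): the sort flips at depth 2.0.
for depth_a in (1.99, 2.01):
    prims = [(depth_a, 0.6, 1.0), (2.0, 0.6, 0.0)]  # (depth, alpha, color)
    print(depth_a, composite_sorted(prims))
# ~0.6 vs ~0.24: a visible jump from an imperceptible camera change.
```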

EVER is able to render scenes with true volumetric consistency, similar to NeRF. Unlike 3DGS, which uses view-independent opacity to approximate scene radiance, EVER maintains a physically accurate density field. This enables proper blending of primitive colors according to the physics of volume rendering, producing sharper and more accurate images.
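
This is the standard volume rendering integral, transcribed here in common notation (my transcription, not the paper's exact formulation):

```latex
C = \int_0^\infty T(t)\,\sigma(t)\,c(t)\,dt,
\qquad
T(t) = \exp\!\left(-\int_0^t \sigma(s)\,ds\right)
```

With piecewise-constant density along a ray, every segment of this integral has a closed form: a segment of density \sigma and length \Delta t contributes opacity \alpha = 1 - e^{-\sigma\,\Delta t}, with no quadrature error.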

EVER follows a process similar to Gaussian Splatting by starting with a set of posed images and a sparse point cloud representing the 3D scene. However, instead of Gaussian primitives, EVER initializes ellipsoids with constant density and view-dependent color. These ellipsoids represent distinct parts of the scene and offer more precise control over how light interacts within the scene, surpassing the capabilities of Gaussian primitives.
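
As a rough mental model, a primitive and its initialization might look like the sketch below. Field names, layout, and default values are my own assumptions, not taken from the paper:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Ellipsoid:
    center: np.ndarray     # (3,) world-space position
    scale: np.ndarray      # (3,) semi-axis lengths
    rotation: np.ndarray   # (3, 3) orientation matrix
    density: float         # constant density inside the ellipsoid
    sh_coeffs: np.ndarray  # spherical harmonics for view-dependent color

def init_from_sfm(points, colors, density=0.1, radius=0.02):
    """Seed one small isotropic ellipsoid per sparse SfM point."""
    return [
        Ellipsoid(center=p,
                  scale=np.full(3, radius),
                  rotation=np.eye(3),
                  density=density,
                  sh_coeffs=np.concatenate([c, np.zeros(45)]))  # degree-0 SH = base color
        for p, c in zip(points, colors)
    ]
```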

Unlike 3DGS’s rasterization approach, EVER relies on ray tracing to compute exact intersections between rays and ellipsoidal primitives. The use of a Bounding Volume Hierarchy (BVH) is key here, organizing the ellipsoids into a tree structure that accelerates ray tracing by quickly determining which ellipsoids a ray intersects, without needing to check all primitives individually. This approach ensures that large, complex scenes are rendered efficiently, without sacrificing accuracy.
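
The intersection test itself is straightforward once the ray is mapped into the frame where the ellipsoid becomes a unit sphere. A minimal sketch using the Ellipsoid fields from above (the BVH traversal that would cull candidates first is omitted):

```python
import numpy as np

def ray_ellipsoid_intersect(origin, direction, ell):
    """Entry/exit ray parameters against one ellipsoid, or None on a miss."""
    # World -> canonical frame: undo the rotation, then divide by the semi-axes,
    # so the ellipsoid becomes the unit sphere |x| = 1.
    o = ell.rotation.T @ (origin - ell.center) / ell.scale
    d = ell.rotation.T @ direction / ell.scale
    # Solve the quadratic |o + t d|^2 = 1.
    a = d @ d
    b = 2.0 * (o @ d)
    c = o @ o - 1.0
    disc = b * b - 4.0 * a * c
    if disc <= 0.0:
        return None
    sq = np.sqrt(disc)
    t_in, t_out = (-b - sq) / (2.0 * a), (-b + sq) / (2.0 * a)
    return (max(t_in, 0.0), t_out) if t_out > 0.0 else None
```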

Once ray-ellipsoid intersections are determined, EVER analytically solves the volume rendering integral, sidestepping the numerical quadrature approximations that often lead to artifacts in splatting-based techniques. As rays traverse the scene, their interactions with ellipsoids are computed, and the color and density contributions of each primitive are integrated to produce the final image. This precise blending process allows EVER to capture the fine details and complex lighting that other methods, such as 3DGS, often fail to resolve.
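
Putting the pieces together, here is a toy version of that sweep, reusing the intersection helper above. It is an illustrative sketch of the math as described, not the authors' implementation: entry/exit events are sorted along the ray, the total active density is tracked, and each constant-density segment is integrated in closed form:

```python
import numpy as np

def eval_color(ell, direction):
    # Placeholder: degree-0 SH only; a real renderer evaluates the full basis.
    return ell.sh_coeffs[:3]

def render_ray(origin, direction, ellipsoids, background=0.0):
    """Exact volume rendering over constant-density ellipsoids (toy sketch)."""
    # Collect entry/exit events along the ray.
    events = []
    for ell in ellipsoids:
        hit = ray_ellipsoid_intersect(origin, direction, ell)
        if hit is not None:
            t_in, t_out = hit
            col = eval_color(ell, direction)
            events += [(t_in, ell.density, col), (t_out, -ell.density, col)]
    events.sort(key=lambda e: e[0])

    # Sweep in depth order, tracking the total density currently active.
    color, transmittance = 0.0, 1.0
    sigma, weighted_col, t_prev = 0.0, 0.0, 0.0
    for t, d_sigma, col in events:
        if sigma > 0.0 and t > t_prev:
            # Closed-form segment opacity: no quadrature, no sorting approximation.
            alpha = 1.0 - np.exp(-sigma * (t - t_prev))
            color = color + transmittance * alpha * (weighted_col / sigma)
            transmittance *= 1.0 - alpha
        sigma += d_sigma
        weighted_col = weighted_col + d_sigma * col
        t_prev = t
    return color + transmittance * background
```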

A further advantage of EVER lies in its use of adaptive density control techniques, which dynamically optimize the scene representation. The system can clone, split, or prune ellipsoids based on their visibility and importance, ensuring computational resources are focused where they matter most. This dynamic optimization allows EVER to balance performance and quality, even in large-scale scenes, by refining the scene without overloading the rendering pipeline.
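
In code, one such pass might look like the following. The thresholds and heuristics are placeholders patterned on 3DGS-style densification, not values from the EVER paper:

```python
import copy
import numpy as np

def split(ell, shrink=1.6):
    """Replace one stretched ellipsoid with two smaller, offset children."""
    a, b = copy.deepcopy(ell), copy.deepcopy(ell)
    offset = ell.rotation @ (ell.scale / 2.0)       # nudge children apart along the axes
    a.center, b.center = ell.center + offset, ell.center - offset
    a.scale, b.scale = ell.scale / shrink, ell.scale / shrink
    return [a, b]

def adapt(ellipsoids, grads, contribs,
          grad_thresh=2e-4, scale_thresh=0.05, min_contrib=1e-3):
    """One hypothetical clone/split/prune pass over the primitives."""
    kept = []
    for ell, g, contrib in zip(ellipsoids, grads, contribs):
        if contrib < min_contrib:
            continue                                 # prune: rarely visible
        if np.linalg.norm(g) > grad_thresh:
            if ell.scale.max() < scale_thresh:
                kept += [ell, copy.deepcopy(ell)]    # clone small primitives in place
            else:
                kept += split(ell)                   # split over-stretched ones
        else:
            kept.append(ell)
    return kept
```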

Because EVER is built on ray tracing, it naturally supports complex optical effects like defocus blur, refractions, reflections, and lens distortions (e.g., from fisheye cameras). These effects are difficult to achieve with rasterization-based methods like 3DGS, but are handled efficiently by ray tracing based methods like EVER and 3DGRT.
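
To see why this falls out naturally, consider ray generation: a rasterizer is wedded to a pinhole projection, while a ray tracer can bend or jitter each ray at will. The toy camera-space models below (an equidistant fisheye and a thin-lens aperture, both my own simplifications) would feed directly into a renderer like the render_ray() sketch above:

```python
import numpy as np

def fisheye_ray(px, py, width, height, fov=np.pi):
    """Equidistant-fisheye ray direction: view angle grows linearly with radius."""
    x = 2.0 * px / width - 1.0
    y = 2.0 * py / height - 1.0
    r = np.hypot(x, y)
    if r > 1.0:
        return None                                  # outside the image circle
    theta, phi = r * fov / 2.0, np.arctan2(y, x)
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def defocus_ray(origin, direction, focus_dist, aperture, rng):
    """Jitter the ray origin across a thin-lens aperture for defocus blur."""
    focal_point = origin + focus_dist * direction    # stays sharp at this depth
    lens = aperture * (rng.random(3) - 0.5)          # crude square-aperture sample
    lens[2] = 0.0                                    # stay on the lens plane
    new_origin = origin + lens
    d = focal_point - new_origin
    return new_origin, d / np.linalg.norm(d)
```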

Training times are a bit longer than Gaussian Splatting's, coming in at 1-2 hours, but that's a sacrifice I would be willing to make for higher-fidelity reconstructions. As with 3DGRT, the authors kept their implementation close to the original 3DGS, so EVER should be plug and play with the vast majority of research efforts from the last year, including MCMC and Bilagrid.

It also might not be surprising to learn that EVER leverages Slang.d: co-author George Kopanas previously created a Slang.d implementation of the Gaussian Splatting rasterizer, in addition to being co-first author of the original Gaussian Splatting paper.

As more advanced techniques emerge, it's important to focus on the larger trends at play, particularly the ongoing refinement of Radiance Field representations, which form the backbone of NeRFs, Gaussian Splatting, and now ray tracing based approaches alike. Things are moving quickly in this space, and we can expect to see even more exciting developments on the horizon.

The EVER Project Page contains more comparisons between reconstruction methods. There are plans for the code to be released, but the timeline remains to be seen.
