Rounding out a busy February for NeRFs, Google Research has announced another paper: BakedSDF.
This method also allows NeRFs to be displayed on common hardware, such as a web browser or a phone, with a photorealistic, high-resolution mesh. Additionally, BakedSDF maintains a high frame rate as a user explores the scene. In the demo below, the scene runs at frame rates above 60 fps, while Instant-NGP only manages just over 10 fps. MobileNeRF can render at ~50 fps, but its underlying mesh is clunky and doesn't represent the scene cleanly.
BakedSDF, however, captures the best of both worlds, achieving a high frame rate while representing the underlying mesh cleanly. The method also supports diffuse and specular colors and allows for accurate relighting of the scene. While only briefly mentioned, there is also the ability to edit the appearance, which is quite underrated, especially as people continue to explore ControlNet and Gen-1.
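For context on how that diffuse/specular split works in practice: BakedSDF-style viewers bake appearance onto the mesh as a per-vertex diffuse color plus a small set of spherical Gaussian lobes, which the shader evaluates against the view direction at render time. Below is a minimal NumPy sketch of that per-vertex computation; the function and parameter names are illustrative, not taken from the released code.

```python
import numpy as np

def shade_vertex(view_dir, diffuse, sg_means, sg_lambdas, sg_colors):
    """Evaluate a baked diffuse + spherical Gaussian (SG) appearance model
    at one mesh vertex (a rough sketch of BakedSDF-style shading).

    view_dir:   (3,) unit vector from the camera toward the vertex
    diffuse:    (3,) view-independent RGB
    sg_means:   (N, 3) unit lobe axes
    sg_lambdas: (N,)   lobe sharpness values
    sg_colors:  (N, 3) lobe RGB amplitudes
    """
    # Each SG lobe contributes c * exp(lambda * (dot(d, mu) - 1)),
    # peaking when the view direction aligns with the lobe axis.
    cos_angles = sg_means @ view_dir                   # (N,)
    weights = np.exp(sg_lambdas * (cos_angles - 1.0))  # (N,)
    specular = weights @ sg_colors                     # (3,)
    return diffuse + specular

# Example: a single lobe aimed along +z, viewed head-on, so the
# specular term is at its maximum and adds fully to the diffuse color.
color = shade_vertex(
    view_dir=np.array([0.0, 0.0, 1.0]),
    diffuse=np.array([0.2, 0.3, 0.4]),
    sg_means=np.array([[0.0, 0.0, 1.0]]),
    sg_lambdas=np.array([8.0]),
    sg_colors=np.array([[0.5, 0.5, 0.5]]),
)
```

Because everything above is a per-vertex dot product and exponential, it maps directly onto an ordinary fragment shader, which is what makes this kind of representation fast enough for browsers and phones.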
This is yet another contribution to the field, no pun intended, from the Google Research team.
The official video of BakedSDF
I highly suggest trying out the 8 demos for yourself to get a feel for how most people will be interacting with NeRFs in the near future.
My sources have been telling me that this upcoming week will bring updates and announcements considered "major," ones that will catapult NeRFs forward. Stay tuned here for the upcoming news and developments.
While several of the most recent developments have been published without listed authors, NeRF co-creator Matt Tancik explained why below: