Google Publishes Blog on Immersive View NeRFs

Michael Rubloff

Jun 14, 2023

Google Product NeRFs

As we have reported a couple of times this year, Google's release of Immersive View in Google Maps utilizes NeRFs. This revolutionary feature aims to provide photorealistic, multi-dimensional, and interactive experiences to users, giving them a virtual yet authentic feel of various locations like restaurants, cafes, and other venues.

While exteriors still rely on other methods, author Marcos Seefelder describes the workflow behind the NeRF interiors of venues.

Right now, Google is using a combination of Mip-NeRF 360, Block-NeRF, and NeRF in the Wild.

Google then creates a specific camera path and enables an interactive 360-degree view for users. Google cites dimly lit venues, such as bars and nighttime events, as a challenge. However, filmmaker Jake Oleson has recently shown that it is possible to capture high-fidelity NeRFs in low-light scenarios, including locations such as a crowded nightclub.

Google plainly describes the strength of NeRFs as:

"Once a NeRF is trained, we have the ability to produce new photos of a scene from any viewpoint and camera lens we choose."

Seefelder is not exaggerating here, and it's one of the reasons why I am personally so excited about NeRFs. Once a NeRF has been generated, the options available to the user border on infinite and allow for tailored editing for each specific use case.
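
To make that concrete: a trained NeRF is simply a learned function mapping a 3D position and viewing direction to volume density and color, and a new photo is produced by integrating that function along the rays of whatever camera you choose. The standard volume rendering equation (notation from the original NeRF paper, not anything specific to Google's pipeline) is:

$$\hat{C}(\mathbf{r}) = \int_{t_n}^{t_f} T(t)\,\sigma(\mathbf{r}(t))\,\mathbf{c}(\mathbf{r}(t), \mathbf{d})\,dt, \qquad T(t) = \exp\!\left(-\int_{t_n}^{t} \sigma(\mathbf{r}(s))\,ds\right)$$

Because the density σ and color c are queried rather than stored, the camera pose behind each ray r(t) = o + t·d, and even the lens model that generates those rays, remain free choices at render time.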

While that all sounds great, it's important to understand the actual workflow of how Google ends up with NeRFs in Maps. Below is a video outlining exactly that.

Google claims to be able to fully capture a venue within an hour, but I would be curious to see their method, as that is significantly longer than I would expect. It's also possible that they're capturing large structures, but on average it takes me roughly 20 seconds to 2 minutes, depending on the size of a room. They are using DSLR cameras which, while offering the advantage of high-resolution images, can often slow down the capture process and introduce the challenge of composing every shot through a viewfinder.

They build their NeRFs on top of Mip-NeRF 360, which shouldn't be a surprise to anyone. It's a robust NeRF pipeline developed by Google and, in my opinion, was long considered the gold standard of methods (until Google's Zip-NeRF). It's no secret that NeRFs can suffer from artifacts, and for a final product seen by end users, Google must ensure that its NeRFs are clean and artifact-free. That's why they bring in low-dimensional generative latent optimization (GLO) vectors from NeRF in the Wild: these vectors embed appearance information for each image and allow for a cleaner final output. Finally, they take exposure conditioning from Block-NeRF to help with some of the dimly lit venues.
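
For a rough sense of how those two conditioning signals typically enter a NeRF's color branch, here is an illustrative PyTorch-style sketch based on the NeRF in the Wild and Block-NeRF papers. This is my own reconstruction, not Google's code; the module names and dimensions are assumptions.

```python
import torch
import torch.nn as nn

class ConditionedColorHead(nn.Module):
    """Illustrative color branch: density comes from the main NeRF MLP;
    color is additionally conditioned on a per-image GLO appearance
    embedding (NeRF in the Wild) and an exposure value (Block-NeRF).
    Sizes and structure are assumptions, not Google's implementation."""

    def __init__(self, num_images, feat_dim=256, appearance_dim=32):
        super().__init__()
        # One learnable GLO vector per training image, optimized jointly
        # with the NeRF weights to absorb per-photo lighting variation.
        self.appearance = nn.Embedding(num_images, appearance_dim)
        self.color_mlp = nn.Sequential(
            nn.Linear(feat_dim + 3 + appearance_dim + 1, 128),  # +3 view dir, +1 exposure
            nn.ReLU(),
            nn.Linear(128, 3),
            nn.Sigmoid(),  # RGB in [0, 1]
        )

    def forward(self, features, view_dirs, image_ids, exposure):
        # features:  [N, feat_dim] per-sample features from the main MLP
        # view_dirs: [N, 3] unit viewing directions
        # image_ids: [N] index of the source image for each ray
        # exposure:  [N, 1] scalar exposure signal, as in Block-NeRF
        app = self.appearance(image_ids)  # [N, appearance_dim]
        x = torch.cat([features, view_dirs, app, exposure], dim=-1)
        return self.color_mlp(x)
```

At render time you can pick (or average) an appearance embedding and choose a target exposure, which is how a dim bar interior can be rendered with consistent, clean lighting across the whole camera path.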

While outdoor NeRFs are yet to come for Google Maps, they did tease a small walk-through of an outdoor path.

"In doing so, we'd unlock similar experiences to every corner of the world and change how users could experience the outdoor world."

The result is stunning and showcases how well it handles the large number of reflections you would find in a city. To me, this represents an exciting preview of what is to come and how we might find ourselves interacting with the world at large within the next year.

Google has said that Immersive View will continue to roll out and expand through the course of this year.

I tend to have option paralysis when it comes to picking a coffee shop. LA takes its coffee (and food in general) very seriously, and this results in amazing venues to choose from. Having Immersive View give me all the information about each coffee shop ahead of time will speed up my decision making and lead to a better overall experience. That is, if the coffee lives up to the NeRF.
