Live 3D Portraits from a Single Image Shown by NVIDIA

Michael Rubloff


May 3, 2023

NVIDIA Realtime Radiance Fields

When NVIDIA first announced Maxine and the ability to have eye contact be redirected on a video call, my own eyes nearly popped out of my skull.

Today I had a similar reaction to Live 3D Portraits, released by NVIDIA, UC San Diego, and Stanford University, which renders a photorealistic 3D representation from a single photo. What!? It goes without saying that this opens doors for vast improvements in video conferencing, AR, and VR. I want Apple's Animoji, but with photorealistic representations through the Live 3D Portrait.

The method also runs at 24fps, fast enough that it shouldn't look jittery or out of place, and it avoids the expensive GAN inversion process.

Look at this video; look at it! How is this possible? You may have guessed what helps power it: neural radiance fields.
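For readers curious how a radiance field becomes pixels at all, here is a minimal NumPy sketch of the generic volume-rendering step that NeRF-style methods use: samples along a camera ray are alpha-composited by density. This is the textbook technique, not the paper's actual renderer, and all values below are illustrative.

```python
import numpy as np

def composite_ray(densities, colors, deltas):
    """Alpha-composite samples along one camera ray, NeRF-style.
    densities: (N,) volume densities; colors: (N, 3) RGB per sample;
    deltas: (N,) distances between adjacent samples. Returns one RGB pixel."""
    # Opacity of each sample from its density and segment length.
    alphas = 1.0 - np.exp(-densities * deltas)
    # Transmittance: chance the ray reaches each sample unoccluded.
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    weights = alphas * trans  # per-sample contribution to the pixel
    return (weights[:, None] * colors).sum(axis=0)

# Toy ray: a dense red sample in front of a green one; the red one
# should dominate because it blocks most of the light behind it.
densities = np.array([50.0, 50.0])
colors = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])
deltas = np.array([0.1, 0.1])
pixel = composite_ray(densities, colors, deltas)  # mostly red
```

Repeating this for every pixel's ray is what makes these methods view-consistent: the same 3D samples are composited from whatever direction the camera looks.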

I find myself occasionally thinking that the outputs are good but could still be improved, and then I remember this is being generated from a literal single image. It's no great surprise that the authors mention the method can struggle when the input image is a strong profile. To be honest, I'm more impressed that it's able to recover anything coherent from a profile view.

The ablations also show the importance of the Vision Transformer (ViT) layers, which help the model learn minute, view-dependent details such as the spider tattoos below. Additionally, accurate focal length, principal point, and camera roll are all critical for realistic outputs.
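To see why those camera parameters matter, here is a small sketch of a standard pinhole intrinsics matrix (the numbers are illustrative, not from the paper): the focal length and principal point decide exactly where a 3D point lands in the image, so getting them wrong shifts and warps everything the model reconstructs.

```python
import numpy as np

def intrinsics(focal, cx, cy):
    """Pinhole camera intrinsics matrix K: focal length and
    principal point (cx, cy), both in pixels."""
    return np.array([[focal, 0.0, cx],
                     [0.0, focal, cy],
                     [0.0, 0.0, 1.0]])

def project(K, point_cam):
    """Project a 3D point (camera coordinates) to pixel coordinates."""
    p = K @ point_cam
    return p[:2] / p[2]  # perspective divide by depth

# Illustrative values: 500px focal length, 512x512 image center.
K = intrinsics(focal=500.0, cx=256.0, cy=256.0)
uv = project(K, np.array([0.1, -0.05, 2.0]))  # a point 2 units ahead
```

A small error in `focal` or the principal point moves every projected point, which is one intuition for why the paper flags these parameters as critical.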

The video demonstrates that Live 3D Portrait also works with glasses and headphones. While it's not shown, I am curious to see how hats affect it. It appears that Live 3D Portraits are ready to go out of the box, without any calibration or tuning for a person using it for the first time.

But it doesn't stop there.

The Live 3D Portraits add another dimension to these stylized images and the characters that have been created through them. I am excited to see how this helps lower the barrier to entry for animations for short films and monologues. I cannot wait to see what people create from this!

But it doesn't stop there either. Have you ever seen a cat image, where you felt like you wanted to make it three dimensional?

Personally these cat examples just remind me of the last time I had a laser pointer.

We showcase our results on human and cat face categories in this paper, but the methodology can apply to any category for which 3D-aware image generators are available.

Real-Time Radiance Field

It's almost a footnote at the end of the paper, but the authors are planning to extend this technology to reconstructions of real-world objects and interactive 3D visualization from a single picture. This truly is amazing and could be pushing us very quickly toward a future not dissimilar from Harry Potter.

Moreover, I am excited to see other NeRF papers get incorporated into this technology such as Instruct-NeRF2NeRF, so that we can all have real time mustaches during our Zoom calls.
