DroNeRF: Get Cleaner NeRFs with this Capture Path

Michael Rubloff

Mar 18, 2023

DroNeRF Camera Position

One of the biggest challenges in creating NeRFs has been working out the optimal camera capture path. A new paper, DroNeRF, aims to address exactly that. Using their method, the authors increase camera pose overlap, resulting in a much sharper output.

Instead of the standard circle around a subject, they argue that it is more effective to capture from a variety of heights and angles. This results in 15% more effective coverage than the standard capture method.
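To make the intuition concrete, here is a minimal sketch (my own illustration, not the paper's planner) of what a multi-height capture path could look like: viewpoints sampled on several rings at different elevations around the subject rather than on a single circle. The function names and numeric values are assumptions for the example.

```python
import numpy as np

def orbit_ring(center, radius, height, n_views):
    """Camera positions evenly spaced on one horizontal ring around the subject."""
    angles = np.linspace(0.0, 2.0 * np.pi, n_views, endpoint=False)
    x = center[0] + radius * np.cos(angles)
    y = center[1] + radius * np.sin(angles)
    z = np.full(n_views, center[2] + height)
    return np.stack([x, y, z], axis=1)

def multi_height_path(center, radius, heights, views_per_ring):
    """Stack several rings at different heights instead of a single circle,
    so the subject is seen from a wider range of elevations and angles."""
    rings = [orbit_ring(center, radius, h, views_per_ring) for h in heights]
    return np.concatenate(rings, axis=0)

# Example: 24 total views (3 rings x 8 views), matching the image budget mentioned below.
center = np.array([0.0, 0.0, 1.0])  # approximate subject center (assumed)
positions = multi_height_path(center, radius=3.0, heights=[0.5, 1.5, 2.5], views_per_ring=8)
# Each camera would then be oriented to look at `center` before capturing.
```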

This additional 15% makes a huge difference, as reported in their paper, showing improvements in both Peak Signal to Noise Ratio (PSNR) and Structural Similarity Index Measure (SSIM). PSNR measures the difference between the original and reconstructed images: it computes the ratio between the maximum possible pixel value and the mean squared error (MSE) between the original and reconstructed images, so higher PSNR scores indicate lower distortion or noise in the reconstruction. SSIM evaluates the structural similarity between the original and reconstructed images.
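For reference, PSNR can be computed directly from that definition. The sketch below uses the standard formula; SSIM is more involved and is usually taken from a library such as scikit-image rather than written by hand.

```python
import numpy as np

def psnr(original, reconstructed, max_value=255.0):
    """Peak Signal-to-Noise Ratio in dB: ratio of the maximum possible
    pixel value to the mean squared error between the two images."""
    original = original.astype(np.float64)
    reconstructed = reconstructed.astype(np.float64)
    mse = np.mean((original - reconstructed) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10((max_value ** 2) / mse)
```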

DroNeRF represents a new line of thinking on a topic that has not received as much attention as other areas of NeRF research, and the methods it describes will likely help spur additional work in the field. One existing limitation of typical camera paths is that they often miss information, or the program is unclear about what the subject of the NeRF actually is.

DroNeRF addresses this issue by detecting the largest region of interest, computing the corresponding bounding box, and then adjusting the drones to the desired locations accordingly. This allows the drones to capture images containing the central object's most important details, resulting in a better NeRF model.
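The article does not spell out how that detection works, so purely as an illustration, the sketch below uses a simple threshold-and-contour approach to stand in for the region-of-interest step: find the largest object in a frame, compute its bounding box, and hand that box to whatever logic repositions the drones. The detection method and all function names here are assumptions, not the paper's implementation.

```python
import cv2

def largest_roi_bbox(image_bgr):
    """Find the largest region of interest in a frame and return its bounding box.
    A basic Otsu threshold + contour search stands in for whatever detector
    DroNeRF actually uses; the (x, y, w, h) output would then drive where
    the drones reposition themselves."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    _, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    return cv2.boundingRect(largest)  # (x, y, w, h) of the central object
```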

DroNeRF Paper

This method is designed specifically for a drone flying around a subject; however, I do not see why it could not be adapted to any camera. The path and capture information should remain exactly the same to get the results shown.

DroNeRF significantly improves the NeRF quality of the samples. The subjects are much sharper, despite the settings and the camera remaining the same. Shockingly, these were created from only 24 images each, at a resolution of 960x720! This is achieved by parallelizing the optimization algorithm across the individual drones.

Give their capture method a try and see how it works for your NeRF creations! As a note, this only applies to stationary subjects.
