Guide

What's the best Focal Length to take a NeRF?

Michael Rubloff

Jul 10, 2023


As one of the website's readers kindly pointed out, I promised to dive into focal lengths in my post about NeRF camera settings earlier this year. Camera settings obviously make a massive difference for capturing high quality NeRFs, but I believe lens choice is right up there with them.

There is a longstanding photography maxim that you want to marry your lens and trade out camera bodies over time. This holds true with NeRFs, as switching lenses has seriously upped my NeRF game.

So with that in mind, what's worked best for me?

What's the ideal focal length for NeRFs?

In my opinion, the sweet spot seems to be 14mm. Yes, this gets into fisheye territory, but the added field of view means you're giving your NeRF method more data with each input image. The benefits extend beyond coverage, though, as they tie into some core photography principles.
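To put numbers on that extra coverage, here's a quick sketch of the standard rectilinear angle-of-view formula, FOV = 2·atan(sensor width / 2f). The function name and the full-frame 36mm sensor width are my own assumptions for illustration; true fisheye lenses aren't rectilinear, so treat the 14mm figure as an approximation.

```python
import math

def horizontal_fov_deg(focal_mm: float, sensor_width_mm: float = 36.0) -> float:
    """Horizontal angle of view for a rectilinear lens on a given sensor."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

# Full-frame comparison: wider focal lengths capture far more per image
for f in (14, 24, 50):
    print(f"{f}mm -> {horizontal_fov_deg(f):.1f} degrees horizontal FOV")
```

On a full-frame sensor, 14mm covers roughly 104 degrees horizontally versus about 40 degrees at 50mm, which is why each 14mm frame feeds the reconstruction so much more of the scene.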

When I was first told about using a fisheye lens, my head kept thinking about this gif that compares focal lengths.

My sole thought was that I'd been taught never to shoot portraits below 20mm because of the distortion. It took me a few weeks to wrestle with the idea until a friend finally lent me a 14mm lens, and I quickly realized I was wrong. The NeRFs were immediately sharper, with significantly fewer artifacts.

Additionally, I realized there are more benefits to shooting wide.

Wide angle lenses are less susceptible to shaky hands when you hit the shutter button, though don't treat this as a cure-all. Still follow standard photography technique and brace your elbows into a makeshift tripod.

But this doesn't stop at taking NeRFs of people or objects. Imagine how much easier shooting NeRFs indoors becomes when you can suddenly fit significantly more into each image. The wide angle lets you shoot in places you might not otherwise be able to and helps fill in the data gaps.

The final benefit has to do with another photography principle, hyperfocal distance. You see this term pop up a lot in landscape photography and cinematography where it's critical to have as much of the view in front of the lens in focus. BH Photo defines hyperfocal distance as:

The distance when the lens is focused at infinity, at which objects from half of this distance to infinity will be in focus (or “acceptable sharpness”) for a particular lens. Alternatively, hyperfocal distance may refer to the closest distance that a lens can be focused for a given aperture while objects at a distance (infinity) will remain acceptably sharp.

BH Photo

As you might be guessing, the major factors in calculating your hyperfocal distance are your focal length and your aperture. For those who really want to go down the rabbit hole, here's a hyperfocal distance calculator.
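The formula behind those calculators is H = f²/(N·c) + f, where N is the f-number and c the circle of confusion. Here's a minimal sketch; the function name and the 0.03mm full-frame circle of confusion are my assumptions:

```python
def hyperfocal_mm(focal_mm: float, f_number: float, coc_mm: float = 0.03) -> float:
    """Hyperfocal distance H = f^2 / (N * c) + f, in millimetres.

    Focusing at H renders everything from H/2 to infinity acceptably sharp.
    coc_mm is the circle of confusion (0.03mm is a common full-frame value).
    """
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

# 14mm at f/4 versus 50mm at f/4 on full frame
for f in (14, 50):
    h = hyperfocal_mm(f, 4)
    print(f"{f}mm at f/4: H = {h / 1000:.2f} m, sharp from {h / 2000:.2f} m to infinity")
```

This is exactly why wide lenses are so forgiving for NeRF capture: at 14mm and f/4 the hyperfocal distance is only around 1.6 m, so nearly the whole scene lands in acceptable focus, while a 50mm lens at the same aperture pushes it out past 20 m.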

By focusing at or near infinity, you get that deep depth of field and sharp backgrounds. Pair that with Nerfacto or Nerfacto-huge and you have a potential landscape scene ready to go in Unreal Engine.

What's a bad focal length for NeRFs?

This is a tough one that gives the dreaded answer: it depends. Once you get past 50mm, you might want to consider a wider lens. That said, if you're using a rig or dolly as part of your setup, the recommendation changes. While you'll be able to cut out almost all camera shake, you'll need to supplement your dataset with more images to compensate for fewer views.

Also keep in mind, the more zoomed in you are, the wider a circle you'll have to make around your subject to NeRF it. Bokeh, often the marketing buzz for new lenses, will be your enemy in NeRFs. If you'd like to control your aperture in post, both Instant-NGP and Luma offer controls to do so.
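That wider circle follows directly from framing geometry: to fill the frame width with a subject of width S, you need to stand back roughly S·f / (sensor width). A hedged sketch, with the function name and full-frame sensor width being my assumptions:

```python
import math

def orbit_distance_m(subject_width_m: float, focal_mm: float,
                     sensor_width_mm: float = 36.0) -> float:
    """Minimum distance needed to fit a subject across the frame width."""
    half_fov = math.atan(sensor_width_mm / (2 * focal_mm))
    return (subject_width_m / 2) / math.tan(half_fov)

# Orbiting a 1m-wide subject on a full-frame body:
for f in (14, 24, 50):
    print(f"{f}mm lens -> orbit radius of at least {orbit_distance_m(1.0, f):.2f} m")
```

For a 1m-wide subject, a 14mm lens lets you orbit at under half a metre, while 50mm forces you out to nearly 1.4 m, a much longer walk per capture and more room needed around the subject.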

If purchasing a wide angle lens isn't justifiable for you, there are still plenty of great options. A lot of kit lenses start around 24mm, and you might already own one; these can be a great choice for someone looking to use an existing lens. For crop shooters, Canon also makes a great 24mm pancake lens for roughly $130, but make sure it's compatible with your camera body.

Should I use a Prime Lens for NeRFs?

The short answer is yes. Prime lenses have better build quality and offer glass precisely calibrated to their focal length. Zoom lenses sound better in theory, but in practice they do not work as well.

Prime lenses will also often be lighter than their zoom counterparts, and depending on how long it takes you to shoot a NeRF, you might find your forearms happier at the end of the day with a lighter lens.

A lot of prime lenses come with the ability to open wide, but that won't be necessary for NeRFs. As I mentioned in the camera settings article, I would much rather crank up my ISO and shoot at a narrower aperture. If you're going to use this lens specifically for NeRFs, you don't need one that opens to f/2.8 or wider. I try not to go below f/4; it's just not necessary, and the shallow depth of field begins to detract from the NeRF.

This doesn't mean you need to go out and buy a crazy expensive lens. There are great options at most budget levels. Personally, I've been using KEH.com to find used lenses at a discount. This isn't sponsored whatsoever, but it's what helped me pull the trigger on a lens. I also scoped out eBay for a while, but opted to go with KEH because of their warranty and return policy (I was a little skeptical at first about how much of a difference a wide angle lens would make).

Taking NeRFs on a Phone?

There's a decent number of people reading this probably thinking: but I'm just using my iPhone/smartphone. What about me? Well, you're in luck too. Smartphones are amazing for taking NeRFs to begin with: they offer sharp images, deep depth of field, and they fit in your pocket. But that's not why you're reading this. If your phone offers a wide angle mode, use it. You want the widest field of view possible.

For the iPhone, that means using the 0.5x setting. When I'm shooting NeRFs on my phone, it's always set to this.

Keep in mind, nothing has been set in stone yet for NeRF best practices, but for me this has made a massive difference. I don't go anywhere without my lens and have been really happy with the results so far. The smallest dataset I've tried that came out really clean was 24 photos, but I'm looking to see how low I can go.

If something has been working better for you, let me know! I'm really curious to see what works for you!
