3DGS Short Film: Where Did The Day Go?

Carlo Oppermann

Sep 24, 2024

In my recent conversation with filmmaker and director Carlo Oppermann, we dug into his hands-on experimentation with cutting-edge tools in the world of Radiance Fields, specifically Gaussian Splatting. Carlo shared his unique approach to exploring these tools, providing insights into the challenges and triumphs of working with evolving software.

In this interview, we discuss everything from his platform of choice for training and rendering, to the inspiration behind his latest short film Where Did the Day Go?, to the lessons learned from experimenting with rapidly advancing technologies. Please enjoy Carlo’s glimpse into the creative process and the potential of Radiance Field based methods for storytelling.

Can you explain the platform you used to train and render?

Let me preface this by saying I’m a filmmaker experimenting with these tools. Much of this technology is still quite confusing to me, but I try to understand what I need to make a short film with it 🙂

For training the splats, I experimented with various Gaussian Splatting tools to find a reliable and repeatable workflow. I tried Kiri Engine, Luma, XVerse3DGS, and Polycam. Eventually, I discovered Postshot from Jawset via radiancefields.com, and I was sold. It offered a live preview while creating the splat, providing valuable insights into what was working and what wasn't.

I settled on using a 21mm lens stopped down to f/5.6 (to avoid shallow depth of field) on my Sony FX3. The camera's high second base ISO allowed for a fast shutter speed and no motion blur, enabling me to capture images from video footage while walking around. With locked exposure and white balance, I could quickly walk around static subjects and capture 4K videos in just a minute or two per scene. This method yielded around 1,500 pictures, which I then imported into Postshot. The software selected the 500 best images for further processing.
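To make that capture-to-splat step a bit more concrete, here is a minimal sketch of pulling still frames out of a short 4K walkaround clip with ffmpeg before handing them to a Gaussian Splatting trainer such as Postshot. The paths, sampling rate, and file naming are illustrative assumptions, not Carlo's or Postshot's actual settings.

```python
# Hypothetical sketch: extract still frames from a walkaround video clip
# (locked exposure and white balance) so they can be imported into a
# Gaussian Splatting trainer. Requires ffmpeg on the PATH.
import subprocess
from pathlib import Path


def extract_frames(video_path: str, out_dir: str, fps: int = 12) -> None:
    """Sample frames from a video at a fixed rate and save them as JPEGs."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    subprocess.run(
        [
            "ffmpeg",
            "-i", video_path,            # input 4K clip
            "-vf", f"fps={fps}",         # sample N frames per second of footage
            "-qscale:v", "2",            # high-quality JPEG output
            str(Path(out_dir) / "frame_%05d.jpg"),
        ],
        check=True,
    )


# A roughly two-minute clip sampled at 12 fps yields about 1,400-1,500 frames,
# in line with the image counts described above.
extract_frames("scene_01.mp4", "frames/scene_01", fps=12)
```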

Once we identified the best scenes, we handed the Sony footage to our colorist. He performed color grading without typical post-processing effects like grain, halation, and glow. It was crucial to complete the color grading before applying Gaussian Splatting because the footage loses its color depth during the process. We also needed it to match our "regular" footage.

Although I tried plugins for Unreal Engine and Unity, I am most familiar with After Effects. The Gaussian Splatting plugin from Irrealix is impressive, and the artificial motion blur in After Effects worked best for our needs. I think the motion blur is essential in selling the effect and drawing the viewer in.

Can you provide some information about the project?

As a commercial director, I am always looking to develop new skills and styles on “passion projects”. When I stumbled across Gaussian Splatting, I wanted to get ahead of the hype and create a story with it. Usually, I write stories first and then find technologies to tell them, but this time was different. The Gaussian Splatting technology inspired me to focus on something static, like people staring at their phones.

This project was a fun, experimental film that I worked on with a few friends from their production company, Firmament. We aimed to have fun and explore this new technology. However, I intend to apply these skills to a larger project or perhaps something with a client next 🙂

Can you tell us more about the film?

The film reflects my daily experience of excessive smartphone use and endless scrolling. I wanted to portray the feeling of a dopamine-fueled rollercoaster that leads nowhere and makes life pass by. This inspired a visual approach that contrasts static scenes with emotional turmoil.

I often take on small passion projects, so I gathered friends and industry professionals for production, cinematography, color grading, and sound. We shot the film on a €500 budget over two days in Berlin. It’s a classic no-budget creative Berlin project!

Any takeaways from experimenting with the technology?

The main takeaway is to be cautious with beta software in production. It’s not just about crashes: frequent updates and feature changes slowed me down as I kept adjusting my workflow. Unreal Engine and Unity didn’t meet my needs, but Irrealix released an After Effects plugin mid-shoot, causing another shift. With rapidly evolving technology, it’s better, in my humble opinion, to focus on the essentials and use what’s available right now. Just accept that your work might look “dated” in a year.

Check out the short film here!
