

Michael Rubloff
Nov 6, 2025
At this year’s Adobe MAX, one of the most unexpected “Sneaks” wasn’t another Firefly experiment or Photoshop trick: it was something that directly connects to the world we cover here every day. Project New Depths introduces a way to edit radiance fields directly, giving artists the ability to move objects through real 3D space inside what was once just a photo.
This year, radiance field methods, whether in the form of NeRFs or Gaussian splats, have lived at the edge of creative production. They have captivated the research world and mesmerized in demos, but have only just begun to be deployed in the real world. Project New Depths suggests that’s about to be supercharged. Adobe described it as a “radiance field editor” that makes depth manipulation feel as intuitive as adjusting brightness or contrast. In the demo, a tree could be dragged around a field of wildflowers and the rest of the image would adapt in perspective and occlusion, as if the photo itself had volume.
Project New Depths brings intuitive editing tools to “radiance fields” (3D photos), allowing artists to tweak color, shape, and composition in three-dimensional space. It makes editing depth as easy as adjusting brightness—ushering in the next generation of visual storytelling.
Adobe confirmed that the radiance field representation they are working with is Gaussian splatting. There is clearly a ton of inspiration taken from PlayCanvas’s SuperSplat, which has served as the de facto editor for the radiance field world. There have been several plugins, such as Irrealix’s Gaussian splatting plugin for After Effects, that bring splats into parts of the Adobe suite, but to my knowledge, this is the first time Adobe has discussed natively adding a radiance field method.
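Project New Depths’ internals haven’t been published, but editors like SuperSplat hint at what “moving an object” means at the data level: a splat scene is just a big list of Gaussians, selecting an object is picking a subset of them, and dragging it is a transform applied only to that subset, with the renderer handling occlusion and perspective on replay. A minimal, purely illustrative sketch (toy data, no real splat format or renderer involved):

```python
import random

# Toy stand-in for a Gaussian splat scene: each splat reduced to its
# 3D mean. Real splats also carry scale, rotation, opacity, and
# spherical-harmonic color, but object edits move the means the same way.
random.seed(0)
splats = [tuple(random.uniform(-1.0, 1.0) for _ in range(3))
          for _ in range(1000)]

def inside_box(p, lo=-0.2, hi=0.2):
    """Hypothetical 'object selection': everything in a bounding box."""
    return all(lo <= c <= hi for c in p)

# 'Dragging' the selected object = translating only the selected splats.
dx = (0.5, 0.0, 0.0)
edited = [tuple(c + d for c, d in zip(p, dx)) if inside_box(p) else p
          for p in splats]

moved = sum(1 for before, after in zip(splats, edited) if before != after)
print(moved, "splats moved; the rest of the scene is untouched")
```

The point of the sketch is that, unlike a 2D layer edit, nothing needs to be inpainted: the splats behind the moved object were captured too, so the scene stays consistent from any viewpoint.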
With yet another tech giant turning its gaze onto Gaussians, Adobe’s entry suggests a path toward tools that treat spatial scenes as creative materials that are editable, relightable, and remixable, without the friction of technical setup or specialized renderers.
For anyone working in this field, Project New Depths feels like a preview of the next generation of creative software, where volumetric capture and neural rendering sit comfortably alongside brushes and masks. It also hints at new challenges: formats, performance, and the question of whether Adobe will embrace open standards for radiance field assets or build its own.
Still, the symbolism matters. A company whose tools defined the 2D era of digital creativity is now showing radiance field editing on the MAX stage. This is an exciting moment, not just for Adobe, but for the entire ecosystem that’s been building the future of lifelike 3D.
If Project New Depths eventually ships as a real product, it won’t just give creators another effect to play with. It could mark the moment when radiance fields stop being a research novelty and start becoming a default part of visual storytelling.
I apologize for taking so long to cover this, and I’ll make sure a lapse in coverage like this doesn’t happen again.





