
Michael Rubloff
Aug 25, 2025
Walking through the SIGGRAPH Emerging Technologies section, I was stopped by a sight I did not expect to see: a volumetric print of the Instant NGP fox head. As my eyes adjusted to the rest of the exhibit, more familiar shapes came into view: the Lego bulldozer, the ficus plant, even the Stanford bunny.

It was a showcase of DreamPrinting, a new method that bridges radiance field representations and physical fabrication. NeRFs, Gaussian splatting, and other volumetric models have advanced how we capture and render 3D scenes digitally, but until now there has been no straightforward way to bring those same volumetric assets into the real world with high fidelity. The researchers behind the project were also recognized for their work, winning Best in Show.
Their method introduces Volumetric Printing Primitives (VPPs), a material-centric framework that translates radiance field data into instructions suitable for full-color 3D printing. Instead of being limited to surface textures or opaque shells, DreamPrinting accounts for light scattering, density, and translucency inside an object. That means internal details, like the glow of clouds or the softness of fur, can be faithfully reproduced in miniature physical form. The method is also agnostic to the radiance field representation and works with either NeRFs or Gaussian Splatting.
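
To make the idea concrete, here is a minimal sketch of what the first step of such a pipeline might look like: sampling a radiance field onto a voxel grid of density and color, then mixing colored and clear materials per voxel so that low-density regions stay translucent. This is not the actual VPP formulation; the query_radiance_field stub, the material set, and the mixing rule are all illustrative assumptions.

```python
import numpy as np

def query_radiance_field(points):
    """Stand-in radiance field: a soft orange sphere. In practice this would
    be a trained NeRF or 3DGS model; the name and signature are illustrative."""
    r = np.linalg.norm(points, axis=-1)
    density = np.clip(1.0 - r, 0.0, 1.0)                    # denser toward the center
    rgb = np.broadcast_to(np.array([1.0, 0.6, 0.2]), points.shape)
    return density, rgb

def sample_volume(bounds_min, bounds_max, resolution, query=query_radiance_field):
    """Sample the field onto a dense voxel grid of (density, RGB), the kind of
    volumetric data a material-centric print pipeline starts from."""
    axes = [np.linspace(lo, hi, n) for lo, hi, n in zip(bounds_min, bounds_max, resolution)]
    grid = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1)   # (X, Y, Z, 3)
    density, rgb = query(grid.reshape(-1, 3))
    return density.reshape(resolution), rgb.reshape(*resolution, 3)

def to_material_weights(density, rgb):
    """Toy per-voxel material mix: colored-resin shares follow the voxel color,
    while a clear-resin share grows as density falls, so translucent interiors
    stay translucent in the print."""
    opacity = np.clip(density / (density.max() + 1e-8), 0.0, 1.0)
    colored = opacity[..., None] * rgb                      # R, G, B resin shares
    clear = (1.0 - opacity)[..., None]                      # clear resin share
    weights = np.concatenate([colored, clear], axis=-1)
    return weights / (weights.sum(axis=-1, keepdims=True) + 1e-8)

density, rgb = sample_volume((-1, -1, -1), (1, 1, 1), (64, 64, 64))
weights = to_material_weights(density, rgb)                 # (64, 64, 64, 4) per-voxel mix
```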

The team demonstrated their system by taking outputs from TRELLIS, a text-to-3D radiance field generator, and printing them on a Stratasys J850 Prime. The results, just a few centimeters across, captured intricate colors and depth cues that typical surface-based prints miss. Unlike traditional pipelines, which struggle with the mismatch between digital rendering and real pigments, DreamPrinting calibrates pigments with optical models and employs 3D halftoning to achieve voxel-level precision. The prints can be remarkably small, too; I have the iconic bulldozer in a version that spans barely half an inch. At the other extreme, the system can print objects up to approximately 490 × 390 × 200 mm.
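
The 3D halftoning step is where continuous per-voxel material mixes become the discrete material choices a PolyJet-style printer can actually deposit. The sketch below shows the general idea with a naive 3D error-diffusion pass; the diffusion kernel, scan order, and material count are illustrative assumptions, not DreamPrinting's published implementation.

```python
import numpy as np

def halftone_3d(weights):
    """Minimal 3D error-diffusion halftoning sketch. Each voxel must hold one
    discrete material, so we place the material with the largest accumulated
    weight and diffuse the quantization error to the unvisited +x, +y, +z
    neighbors (a simple 3D analogue of Floyd-Steinberg; the kernel shares and
    scan order here are arbitrary choices for illustration)."""
    w = weights.astype(np.float64).copy()
    X, Y, Z, M = w.shape
    labels = np.empty((X, Y, Z), dtype=np.int64)
    for x in range(X):
        for y in range(Y):
            for z in range(Z):
                chosen = int(np.argmax(w[x, y, z]))
                labels[x, y, z] = chosen
                err = w[x, y, z].copy()
                err[chosen] -= 1.0                  # error = desired mix - placed material
                for dx, dy, dz, share in ((1, 0, 0, 0.4), (0, 1, 0, 0.3), (0, 0, 1, 0.3)):
                    nx, ny, nz = x + dx, y + dy, z + dz
                    if nx < X and ny < Y and nz < Z:
                        w[nx, ny, nz] += share * err
    return labels

# Example: quantize a random 16^3 grid of 4-material mixing weights.
rng = np.random.default_rng(0)
mix = rng.random((16, 16, 16, 4))
mix /= mix.sum(axis=-1, keepdims=True)
labels = halftone_3d(mix)                           # integer material index per voxel
```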
In practice, this makes DreamPrinting one of the first workflows that can move seamlessly from radiance fields to tangible, high-resolution prints. It opens up possibilities for artists and educators who want to materialize digital scenes in ways that preserve their volumetric qualities, as well as for researchers who want their datasets printed.
Over the past couple of years, quite a few people have asked me how to print their captures. DreamPrinting finally makes that possible. By aligning volumetric rendering with the physics of printing, it points to a new chapter in digital fabrication, where the rich internal structures of radiance fields are no longer confined to the screen, but can be held in your hands.
In the near future, people will be able to upload either .ingp or .ply files to be printed. For those who don't have a dataset on hand, Cysta will also have many of the pieces shown at SIGGRAPH for sale. In the meantime, anyone interested in printing their own NeRF/3DGS can email the team at hi@cysta.ai. They can work directly from .ingp or .ply files, though raw training data is preferred, since they use a modified 3DGS pipeline to improve print fidelity. More information about DreamPrinting and the larger company can be found at cysta.ai.