Former OpenAI artist-in-residence Alexander Reben was featured in The New York Times this morning for his innovative exploration of AI-based tools. One of his recent projects used OpenAI's video generation model, Sora, to produce footage; Reben then fed the resulting frames into NVIDIA's Instant NGP to reconstruct the scene in 3D.
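For readers curious what "feeding frames into Instant NGP" involves in practice: the tool ingests a folder of still images plus a `transforms.json` file describing the camera for each frame. Below is a minimal sketch of assembling that file in the Blender-style layout Instant NGP accepts. The `camera_angle_x` value and the identity poses are placeholders; in a real pipeline the poses are estimated from the frames with COLMAP (Instant NGP ships a `colmap2nerf.py` helper for exactly this).

```python
import json

def make_transforms(image_names, camera_angle_x=0.69, poses=None):
    """Build a minimal Blender-style transforms.json dict for Instant NGP.

    camera_angle_x is the horizontal field of view in radians (the 0.69
    default here is just a placeholder). If no poses are supplied, each
    frame gets an identity camera-to-world matrix -- real values would
    come from a structure-from-motion tool such as COLMAP.
    """
    identity = [[1, 0, 0, 0],
                [0, 1, 0, 0],
                [0, 0, 1, 0],
                [0, 0, 0, 1]]
    frames = [
        {"file_path": name,
         "transform_matrix": poses[i] if poses else identity}
        for i, name in enumerate(image_names)
    ]
    return {"camera_angle_x": camera_angle_x, "frames": frames}

if __name__ == "__main__":
    # Hypothetical frame names extracted from a Sora-generated clip.
    t = make_transforms(["frames/0001.png", "frames/0002.png"])
    print(json.dumps(t, indent=2))
```

Writing the returned dict to `transforms.json` next to the extracted frames is enough for Instant NGP to load the scene, assuming the poses are valid.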
Once the NeRF was reconstructed, the file was passed to Monumental Labs, whose robots carved the model in marble. This project demonstrates the fascinating potential of Radiance Fields to bridge the digital and physical worlds—what began as a simple text prompt ultimately became a high-quality marble statue.
This isn’t the first time NeRFs have appeared in The New York Times. The publication previously profiled the Radiance Field method in its R&D section.
There’s still no word on when OpenAI's Sora model will be available for the public to try. You can read the full original article here.