Luma AI has been pressing the gas for the last two weeks and has shown no sign of slowing down. Today, they announced a Video-to-3D API priced at just $1 per scene or object.
The output includes an interactive 3D scene that can be embedded directly, coarse textured models for building interactions in traditional 3D pipelines, and pre-rendered 360 images and videos.
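To make that workflow concrete, here is a minimal sketch of what submitting a capture video and retrieving those outputs might look like from code. The base URL, field names, and response shape below are illustrative assumptions rather than Luma's documented API, so treat this as a rough outline and check their API docs for the real contract.

```python
import time
import requests

API_BASE = "https://api.lumalabs.example/v1"  # hypothetical base URL, not Luma's real endpoint
API_KEY = "YOUR_LUMA_API_KEY"                 # issued from your Luma account

headers = {"Authorization": f"Bearer {API_KEY}"}

# 1. Create a capture job and upload the source video (field names are assumptions).
with open("product_turntable.mp4", "rb") as video:
    create = requests.post(
        f"{API_BASE}/captures",
        headers=headers,
        files={"video": video},
        data={"title": "demo-sneaker"},
    )
create.raise_for_status()
capture_id = create.json()["id"]

# 2. Poll until processing finishes (the post cites under ~30 minutes per scene).
while True:
    status = requests.get(f"{API_BASE}/captures/{capture_id}", headers=headers).json()
    if status["state"] in ("complete", "failed"):
        break
    time.sleep(60)

# 3. Fetch the outputs: interactive embed, coarse textured mesh, and pre-rendered 360s.
if status["state"] == "complete":
    print("Embed URL:", status["artifacts"]["embed_url"])
    print("Mesh (GLB):", status["artifacts"]["mesh_glb"])
    print("360 video:", status["artifacts"]["video_360"])
```

In a setup like this, the embed URL could drop straight into a product page, while the coarse mesh heads into a traditional 3D pipeline for interaction work.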
This is another barrier to adoption lowered by Luma AI, and one can start to imagine a world powered by NeRFs. For those wondering about best practices for capturing a NeRF with Luma, check out their tutorial guide here!
Furthermore, the process is fast, taking less than 30 minutes to generate a scene. It's easy to imagine how brands will be able to leverage these new features to drive conversions, and the API gives Luma another revenue stream. While it's still very early days, I see it as inevitable that larger ecommerce platforms such as Amazon and Jet.com will adopt NeRF functionality for their sellers.
Other use cases shown by the Luma team include VFX and Game Development.
For those looking to get started right away, the first ten renders are on Luma; after that, it's $1 per scene or object. There's been no word on whether discounts are available for brands looking to convert their entire inventory, but Luma's inbox is open to questions here.
Luma has begun to set itself apart by shipping new updates and features what seems like multiple times a week. Combined with their fresh capital, it's going to be exciting to follow their development and their goal of democratizing 3D.