Diving into a world where reality can take on the aesthetics of imagination, Geometry Transfer for stylizing radiance fields (Geo-SRF) offers artists and technologists alike the whimsical power to reshape the very fabric of 3D spaces.
Authored by researchers at Seoul National University and Meta Reality Labs, this work introduces Geo-SRF, a novel method that employs depth maps as style guides, stylizing both a scene's appearance and its shape in a coherent manner that better reflects the intended style.
Companies and tools such as Kaiber AI, EbSynth, and RunwayML have produced some of the most fantastical stylized outputs, but they do not edit the underlying three-dimensional space. The significance of this research lies in its departure from existing 3D style transfer methods, which primarily transfer aesthetic qualities such as colors, textures, and brushstrokes while largely neglecting geometry.
By incorporating geometric deformation into the style transfer process, the authors significantly enhance the expressiveness and accuracy of 3D stylization, offering a more comprehensive way to capture the stylistic essence of a reference. The authors put style transfer very simply: "rendering one image in the stylistic manner of another, producing a new image that combines the foundational structure of the former with the aesthetic qualities of the latter."
Geometry Transfer leverages geometric deformation to stylize not just the color and texture of a 3D scene but also its shape. Instead of relying solely on RGB images, it uses depth maps as style guides: a depth map provides detailed information about the shape and structure of the style reference, which is crucial for geometric stylization. The geometric style extracted this way is then applied to the geometry of the radiance field.
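To make the idea concrete, here is a minimal sketch (not the authors' code) of how a depth map could act as a style target: patches of the rendered depth map are compared against patches of the style depth map through second-order (Gram) statistics. The function names and the Gram-matrix loss are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def extract_patches(img, size=8, stride=8):
    """Slice a 2D depth map into flattened square patches."""
    h, w = img.shape
    patches = [
        img[i:i + size, j:j + size].ravel()
        for i in range(0, h - size + 1, stride)
        for j in range(0, w - size + 1, stride)
    ]
    return np.stack(patches)  # shape: (num_patches, size * size)

def depth_style_loss(rendered_depth, style_depth, size=8):
    """Toy geometric style loss: match second-order patch statistics
    (Gram matrices) between a rendered depth map and a style depth map."""
    p_r = extract_patches(rendered_depth, size)
    p_s = extract_patches(style_depth, size)
    gram_r = p_r.T @ p_r / len(p_r)
    gram_s = p_s.T @ p_s / len(p_s)
    return float(np.mean((gram_r - gram_s) ** 2))
```

Minimizing a loss like this with respect to the scene's geometry would push the rendered depth statistics toward those of the style guide.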
The technique uses deformation fields to predict the offset vector for each 3D point, allowing for the accurate stylization of both shape and appearance. This ensures that as the shape of the scene is modified to match the style guide, the appearance (colors and textures) is also appropriately adjusted to align with the new geometry.
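A deformation field of this kind can be pictured as a small learned function from 3D points to offset vectors. The toy MLP below is a hypothetical stand-in for the learned field, just to show the shape of the idea: query the radiance field at the deformed position instead of the original one.

```python
import numpy as np

class DeformationField:
    """Toy deformation field: a tiny MLP mapping each 3D point to an
    offset vector. Illustrative stand-in, not the paper's architecture."""
    def __init__(self, hidden=32, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.1, (3, hidden))
        self.b1 = np.zeros(hidden)
        self.w2 = rng.normal(0.0, 0.1, (hidden, 3))
        self.b2 = np.zeros(3)

    def offsets(self, points):
        """points: (N, 3) array -> (N, 3) offset vectors."""
        h = np.tanh(points @ self.w1 + self.b1)
        return h @ self.w2 + self.b2

def deform(points, field):
    """Positions at which the stylized radiance field is queried."""
    return points + field.offsets(points)
```

During stylization, the field's weights would be optimized so that the deformed geometry matches the style guide, while appearance is rendered at the deformed positions so color stays aligned with the new shape.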
Geometry Transfer further enhances the stylization process by using an RGB-D pair (an RGB image and a depth map) as the style guide. This approach allows for a more expressive stylization that accurately reflects the given style in terms of both shape and appearance. The method introduces geometry-aware matching to enhance the diversity of stylization while preserving local geometry. Patch-wise optimization is utilized to effectively transfer the style of geometry, considering the spatial relationships within the scene.
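One common way to realize patch-wise style optimization is nearest-neighbor feature matching: each rendered patch is pulled toward its closest patch in the style guide. The sketch below assumes patch features have already been extracted; the cosine-similarity loss is an illustrative choice, not necessarily the exact objective used in the paper.

```python
import numpy as np

def nn_patch_loss(content_feats, style_feats):
    """For each content patch feature, find its nearest neighbor among
    the style patch features and penalize the remaining (cosine) distance.
    content_feats: (Nc, D), style_feats: (Ns, D)."""
    c = content_feats / np.linalg.norm(content_feats, axis=1, keepdims=True)
    s = style_feats / np.linalg.norm(style_feats, axis=1, keepdims=True)
    sim = c @ s.T                      # (Nc, Ns) cosine similarities
    best = sim.max(axis=1)             # best style match per content patch
    return float(np.mean(1.0 - best))  # ~0 when every patch has an exact match
```

Because each patch is free to match a different region of the style guide, this kind of objective encourages diverse stylization while still respecting local structure.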
To convey a richer sense of scene depth, Geometry Transfer employs perspective style augmentation, varying the size of the patterns applied to surfaces based on their distance from the viewer. This enhances the perception of depth in the stylized scene, making the result more immersive and realistic.
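A heuristic version of this perspective-dependent scaling might pick the style-patch size as a simple function of surface depth. The helper below is a hypothetical illustration of the idea, with made-up parameter names and clamping bounds.

```python
def patch_size_for_depth(depth, base_size=32, ref_depth=1.0,
                         min_size=8, max_size=64):
    """Heuristic sketch: shrink the style-patch size with distance so that
    far-away surfaces receive smaller, perspective-consistent patterns."""
    size = int(round(base_size * ref_depth / max(depth, 1e-6)))
    return max(min_size, min(max_size, size))
```

With these defaults, a surface twice as far from the camera receives patterns half the size, clamped to a sensible range.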
However, this isn't an all-or-nothing approach: using Panoptic Lifting, they can also stylize just part of a scene. This flexibility allows selective stylization of scene elements, further expanding the creative possibilities of 3D style transfer. I'd also be curious how this might work with something like GARField.
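In spirit, partial stylization amounts to compositing the stylized render with the original one under a semantic mask, such as one lifted into 3D by Panoptic Lifting. A hypothetical sketch of that compositing step:

```python
import numpy as np

def masked_stylize(original_rgb, stylized_rgb, instance_mask):
    """Apply the stylized appearance only where the mask selects an object,
    leaving the rest of the scene untouched. Illustrative sketch only.
    original_rgb, stylized_rgb: (H, W, 3); instance_mask: (H, W) of 0/1."""
    mask = instance_mask[..., None].astype(float)  # broadcast over channels
    return mask * stylized_rgb + (1.0 - mask) * original_rgb
```

In practice the mask would come from the lifted panoptic segmentation rather than a 2D image, so the selection stays consistent across viewpoints.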
The results of extensive experiments demonstrate that Geometry Transfer achieves a broader and more expressive range of stylizations than existing methods, expanding the potential applications of 3D style transfer in content creation for virtual and augmented reality. But it goes even further than that! Like a dimmer on a light switch, you can also interpolate between the original scene and the new style. This allows transition effects that fade into the stylized scene or showcase some kind of evolution in your subject.
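That dimmer-style control can be as simple as a linear blend between the original and fully stylized values, whether those are colors or deformation offsets. The helper below is an illustrative sketch, not the paper's implementation.

```python
import numpy as np

def blend_stylization(original, stylized, alpha):
    """Dimmer-style control: linearly interpolate between the original
    scene values and the fully stylized ones.
    alpha = 0 -> original, alpha = 1 -> fully stylized."""
    alpha = float(np.clip(alpha, 0.0, 1.0))
    return (1.0 - alpha) * original + alpha * stylized
```

Sweeping `alpha` from 0 to 1 over time produces exactly the fade-in transition effect described above.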
There are so many fun and cool applications for this. I can't imagine it being overly difficult to incorporate into existing platforms and let people explore creating fantastical radiance fields.
This research not only opens new avenues for artistic expression within the digital domain but also sets a new benchmark for future explorations in the intersection of art and technology. The integration of geometric considerations into the style transfer process represents a significant advancement in the field, promising to inspire further innovations and applications in the creation and stylization of 3D content.
The code has not yet been published, but we'll be keeping an eye on their GitHub page.