Sony XYN Launches Spatial Capture Solution with Gaussian Splatting

Michael Rubloff
Apr 16, 2026

The LED volumes and real-time rendering engines that drive in-camera VFX are hungry for photorealistic 3D environments, but building those environments at the fidelity the camera demands remains expensive, slow, and dependent on skilled 3D artists reconstructing real-world locations by hand. Sony's answer, announced yesterday in the lead-up to NAB Show 2026, is XYN (pronounced "Zin"), a spatial capture solution that turns real-world spaces into production-quality 3DCG assets using 3D Gaussian Splatting, purpose-built for the LED wall pipeline from capture to final display.
Sony has been building toward this for some time. Sony PCL demonstrated NeRF-based virtual production workflows as early as 2024. The XYN platform, first previewed in 2025, is where that R&D has landed as a productized offering. The Spatial Capture Solution, expected to be available to professional customers in the US starting summer 2026, comprises three integrated tools that together cover the full pipeline from on site capture to on wall rendering.
XYN Spatial Scan Navi is the capture front end: a smartphone application that pairs with a Sony Alpha camera and uses AR overlays to guide operators through the spatial capture process. XYN Spatial Scan is the cloud processing backend, where captured data is transformed into high-quality 3DGS assets using Sony's proprietary algorithms. The pipeline produces assets with high resolution, accurate color reproduction, and HDR expression.
The third piece, and the one that distinguishes XYN from general-purpose Gaussian splatting tools, is the XYN Spatial Renderer Plugin. This is the component designed explicitly for the LED wall context. Sony says it delivers stable, high-quality display regardless of camera distance or angle, along with HDR support, depth-of-field adjustment, lighting matching, localized color correction, and CG overlay capabilities: the controls a virtual production DP and VFX supervisor need when integrating a scanned environment with live-action elements in real time. A separate plugin for Disguise, one of the dominant media server platforms in LED volume virtual production, has also been announced.
The Gaussian splatting foundation is what makes this technically interesting. Sony's rendering algorithm uses 3DGS to reproduce the texture and depth of real environments with the parallax accuracy and view-dependent appearance that LED volumes require. The advantage over mesh-based photogrammetry is the same one the research community has been documenting for years: higher visual fidelity at lower reconstruction cost, with continuous detail that meshes approximate but splats capture natively.
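Sony has not published the internals of its renderer, but the view-dependent appearance that distinguishes splats from fixed-texture meshes is easy to illustrate. In standard 3DGS, each splat stores spherical harmonics (SH) coefficients rather than a single color, so the shade it contributes changes with viewing direction. A minimal sketch of degree-1 SH evaluation, following the conventions of the original 3DGS reference rasterizer (the coefficient values and helper names here are illustrative, not Sony's implementation):

```python
import numpy as np

# Degree-0 and degree-1 SH basis constants, as used in the
# reference 3DGS rasterizer.
SH_C0 = 0.28209479177387814
SH_C1 = 0.4886025119029199

def sh_to_rgb(sh_coeffs, view_dir):
    """Evaluate degree-1 SH coefficients into an RGB color for one
    viewing direction.

    sh_coeffs: (4, 3) array, one DC term plus three degree-1 terms,
               each an RGB triple.
    view_dir:  unit 3-vector from camera toward the splat center.
    """
    x, y, z = view_dir
    color = SH_C0 * sh_coeffs[0]
    color -= SH_C1 * y * sh_coeffs[1]
    color += SH_C1 * z * sh_coeffs[2]
    color -= SH_C1 * x * sh_coeffs[3]
    # Shift by 0.5 and clamp, matching the reference implementation.
    return np.clip(color + 0.5, 0.0, 1.0)

# The same splat viewed from two directions yields two different
# colors; this is what lets splats capture glossy or reflective
# surfaces that a single baked texture cannot.
coeffs = np.array([
    [0.2, 0.1, 0.0],   # DC (base color)
    [0.0, 0.0, 0.0],
    [0.3, 0.0, 0.0],   # red component that grows with view z
    [0.0, 0.0, 0.0],
])
front = sh_to_rgb(coeffs, np.array([0.0, 0.0, 1.0]))
side = sh_to_rgb(coeffs, np.array([1.0, 0.0, 0.0]))
```

A mesh-plus-texture asset would return the same color for both directions; the splat does not, which is one reason LED volume operators care about the representation when the tracked camera sweeps across a scanned environment.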
Sony has stated that XYN's roadmap extends well beyond virtual production. Future plans include gaming, animation, architecture, manufacturing, digital twins, and cultural heritage archiving, with plugins and services designed to integrate with standard 3DGS production tools and game engines. But the initial release is tightly focused on the VP use case.
The XYN Spatial Capture Solution will be exhibited at Sony's booth (C8401) at NAB Show 2026, April 19-22 in Las Vegas.
