
For all the attention 3D Gaussian splatting has attracted as a capture and visualization technology, its role in interactive applications has remained underexplored. Splats produce photorealistic reconstructions of real spaces, but they lack the collision geometry, navigable surfaces, and dynamic agents that transform a scene into a place someone can inhabit. A new tutorial from PlayCanvas engineer Iakov Sumygin changes that, walking through a complete pipeline that turns a Gaussian splat scan into a fully playable game environment, with NPCs, navmeshes, and collision physics.
The starting point is a scan of an abandoned location captured by Christoph Schindelar. That raw PLY file contains millions of Gaussians encoding the visual texture of the space, but no game engine can do much with it directly. Sumygin's pipeline bridges the gap through a sequence of transformations that extract spatial intelligence from the photometric data. The first step converts the PLY to the streamed SOG format using PlayCanvas's splat-transform CLI tool, which reorganizes the Gaussian data for efficient streaming; the final 68 MB build cold-loads in seconds. The same CLI then generates a collision mesh automatically: running splat-transform with a single flag produces a .collision.glb file that closely approximates the geometry of the scanned environment without any manual modeling.
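The article doesn't describe how splat-transform derives collision geometry internally, but one plausible approach is to treat the Gaussian centers as a point cloud and build a voxel occupancy grid, which a surface extractor such as marching cubes could then turn into a mesh. A minimal sketch of the occupancy step, with hypothetical names and a toy "scan" standing in for real splat data:

```python
import numpy as np

def occupancy_from_centers(centers, voxel_size=0.25):
    """Bin Gaussian centers into a voxel grid; occupied voxels
    approximate the solid surfaces of the scanned space.
    (Illustrative only; not the actual splat-transform algorithm.)"""
    mins = centers.min(axis=0)
    idx = np.floor((centers - mins) / voxel_size).astype(int)
    occupied = set(map(tuple, idx))  # sparse set of occupied cells
    return occupied, mins

# toy "scan": a flat 4 m x 4 m floor of Gaussian centers at y = 0
xz = np.stack(np.meshgrid(np.linspace(0, 4, 40),
                          np.linspace(0, 4, 40)), -1).reshape(-1, 2)
centers = np.column_stack([xz[:, 0], np.zeros(len(xz)), xz[:, 1]])
occupied, origin = occupancy_from_centers(centers)
```

A real pipeline would follow this with surfacing and simplification to produce something like the .collision.glb the tutorial describes; the point here is only that collision geometry can fall out of the same positional data that drives rendering.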
Lighting is trickier. Gaussian splats bake illumination directly into each Gaussian's spherical harmonics, which looks stunning for static viewing but becomes problematic in games, where characters and objects need to respond to environmental light. Sumygin solves this by baking a lightness grid from the splat itself. At 16x16 resolution, the system renders cube maps at probe locations using an offscreen render target, then converts the results to Rec. 601 luminance values. Game objects sample this grid to inherit the ambient lighting of whatever part of the scanned space they occupy, producing convincing integration between the photorealistic background and the engine-rendered foreground.
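The Rec. 601 conversion is the standard luma formula Y = 0.299R + 0.587G + 0.114B. A sketch of the bake-and-sample idea, shown in 2D for brevity (the tutorial's grid presumably spans the full volume, and the probe layout here is an assumption):

```python
import numpy as np

REC601 = np.array([0.299, 0.587, 0.114])  # standard Rec. 601 luma weights

def bake_lightness_grid(probe_rgb):
    """probe_rgb: (16, 16, 3) average cube-map color per probe.
    Returns a (16, 16) grid of scalar lightness values."""
    return probe_rgb @ REC601

def sample_grid(grid, u, v):
    """Bilinear lookup in [0, 1]^2 so an object can inherit the
    ambient light level at an arbitrary position in the space."""
    x = np.clip(u * (grid.shape[1] - 1), 0, grid.shape[1] - 1)
    y = np.clip(v * (grid.shape[0] - 1), 0, grid.shape[0] - 1)
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, grid.shape[1] - 1)
    y1 = min(y0 + 1, grid.shape[0] - 1)
    fx, fy = x - x0, y - y0
    top = grid[y0, x0] * (1 - fx) + grid[y0, x1] * fx
    bot = grid[y1, x0] * (1 - fx) + grid[y1, x1] * fx
    return top * (1 - fy) + bot * fy

grid = bake_lightness_grid(np.ones((16, 16, 3)))  # uniformly lit probes
level = sample_grid(grid, 0.3, 0.7)               # ambient level at a point
```

Interpolated sampling matters here: with only 16 probes per axis, nearest-neighbor lookup would make a character's brightness pop as it crosses cell boundaries.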
With collision and lighting solved, the pipeline moves into more familiar game development territory. A Recast navmesh, the same pathfinding substrate used by countless commercial titles, is computed over the collision geometry, giving characters the spatial graph they need to navigate the environment. Then comes the most personality-driven element: eight NPC characters, each driven by a behavior tree that governs how they explore, react, and interact with the space.
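The article doesn't show the NPCs' actual trees, but the core behavior tree machinery is small. A minimal sketch with hypothetical node types and NPC state (real trees also add a RUNNING status for actions that span multiple frames):

```python
from enum import Enum

class Status(Enum):
    SUCCESS = 1
    FAILURE = 2

class Sequence:
    """Ticks children in order; fails on the first failure."""
    def __init__(self, *children): self.children = children
    def tick(self, npc):
        for child in self.children:
            if child.tick(npc) is Status.FAILURE:
                return Status.FAILURE
        return Status.SUCCESS

class Selector:
    """Tries children in order; succeeds on the first success."""
    def __init__(self, *children): self.children = children
    def tick(self, npc):
        for child in self.children:
            if child.tick(npc) is Status.SUCCESS:
                return Status.SUCCESS
        return Status.FAILURE

class Condition:
    def __init__(self, fn): self.fn = fn
    def tick(self, npc):
        return Status.SUCCESS if self.fn(npc) else Status.FAILURE

class Action:
    def __init__(self, fn): self.fn = fn
    def tick(self, npc):
        self.fn(npc)
        return Status.SUCCESS

# hypothetical NPC logic: greet the player when near, otherwise wander
tree = Selector(
    Sequence(Condition(lambda npc: npc["player_near"]),
             Action(lambda npc: npc.update(state="greet"))),
    Action(lambda npc: npc.update(state="wander")),
)

npc = {"player_near": True, "state": "idle"}
tree.tick(npc)  # npc["state"] becomes "greet"
```

Ticked once per frame, a tree like this plus a navmesh path query is enough to make an agent feel purposeful inside the scanned space.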
What makes this exciting beyond the technical cleverness is the modular character of the pipeline. Each step (streaming format, collision extraction, lighting bake, navmesh, behavior trees) is independently useful, and the whole thing is published as an open-source PlayCanvas project. Developers who need only one piece of the stack can extract it without adopting the rest. The VS Code extension integration and GitHub versioning guidance Sumygin includes make the workflow reusable.
The collision mesh extraction in particular is striking. Rather than requiring artists to retopologize a mesh by hand, it derives navigable geometry from the same source data that produces the visual representation. The abandoned location Schindelar scanned could have remained a passive visual experience; instead it becomes a space characters can traverse, a place where the lighting is accurate and the floors are solid and the world feels inhabited.
The full PlayCanvas project is publicly available. The tutorial is a twelve-minute read and covers every step in enough detail to reproduce independently.






