
Michael Rubloff
Jan 2, 2026
More than a year after its initial debut, the Julia-based Gaussian splatting project has picked up its release cadence and has now shipped GaussianSplatting.jl v1.2.0. The update brings native support for Apple GPUs via Metal.
With this release, the project now runs across NVIDIA, AMD, and Apple hardware using Julia’s backend-agnostic GPU stack. By integrating Metal.jl, GaussianSplatting.jl extends its original vision of solving the “two-language problem” beyond CUDA-centric workflows, allowing Apple Silicon users to train and visualize splats directly on device through the same Julia codepath.
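For readers unfamiliar with how one Julia codepath can target several GPU vendors, the sketch below illustrates the general pattern; it is not taken from GaussianSplatting.jl’s source, and the `axpy!`/`run_axpy!` names are hypothetical. It uses KernelAbstractions.jl, the abstraction layer that Metal.jl, CUDA.jl, and AMDGPU.jl all plug into, so the array type passed in decides which backend the kernel compiles for.

```julia
# Minimal sketch of a backend-agnostic kernel (illustrative, not project code):
# one kernel definition, with the array type deciding where it runs.
using KernelAbstractions

@kernel function axpy!(y, a, x)
    i = @index(Global)          # global thread index
    y[i] = a * x[i] + y[i]      # identical body on every backend
end

function run_axpy!(y, a, x)
    backend = get_backend(y)    # CPU(), MetalBackend(), CUDABackend(), ...
    axpy!(backend)(y, a, x; ndrange = length(y))
    KernelAbstractions.synchronize(backend)
    return y
end

# On an Apple Silicon machine (assumed setup, not a GaussianSplatting.jl API):
#   using Metal
#   x = MtlArray(rand(Float32, 1024)); y = MtlArray(rand(Float32, 1024))
#   run_axpy!(y, 2f0, x)
# Swapping Metal for CUDA or AMDGPU changes only the array constructor.
```

The same idea scales up to the project’s rasterization and gradient kernels: the kernel code is written once against the abstraction, and adding Metal support is largely a matter of wiring in the Metal.jl backend rather than rewriting GPU code.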
From a practical standpoint, nothing about the core algorithm has changed. The update does not alter training behavior, the rasterization strategy, or the project’s automatic differentiation pipeline. Instead, it reinforces the architectural argument that Julia’s GPU ecosystem is mature enough to support a complex, end-to-end Gaussian splatting implementation without resorting to mixed C++/Python stacks.
While GaussianSplatting.jl remains a research-oriented implementation rather than a production tool, v1.2.0 broadens who can experiment with it, especially as people get pulled into the Apple ecosystem thanks to recent releases like SHARP.
More information about the Apache 2.0 licensed project is available here.