
Michael Rubloff
Sep 8, 2025
At this year’s SIGGRAPH conference, amid the buzz of new breakthroughs in computer graphics, I had the opportunity to sit down with Richard Kerris, Vice President of Media and Entertainment at NVIDIA. With a career that spans four decades and a front-row seat to some of the most transformative moments in visual technology, Kerris brings both historical perspective and forward-looking vision to the conversation.
From his first SIGGRAPH in 1985, where he likens the experience to Dorothy stepping into Technicolor in The Wizard of Oz, to today’s rapidly evolving world of radiance field representations like Gaussian splatting, Kerris has witnessed firsthand how tools once seen as experimental become indispensable to creators. Our discussion explored the trajectory of these technologies, how they’re shaping film, television, and sports, and why he believes we’re entering a new era of immersion, democratization, and storytelling.
Michael: Thank you so much, Richard, for joining me today here at SIGGRAPH. I’m really excited to sit down and have this conversation with you. I’ve got a couple of surprises related to Gaussian splatting and other radiance field representations. I’m really looking forward to chatting about media and entertainment.
Richard: Me too.
Michael: When did you first come across radiance field representations, and what was that like for you?
Richard: It was probably NeRF. Starting to see what could be done by capturing the world through photographs and turning it into 3D, and wondering where that would lead us. What’s been amazing is how quickly it’s moved along. Ideally, what everybody wants is to sample the world around them, learn from it, and use it.
Michael: You’ve had such a long career in media and entertainment, seeing the progression toward 3D and using it for production. That must be pretty cool for you.
Richard: This is actually my 40th SIGGRAPH. My first was in 1985, and I’ll never forget it. It felt like that scene in The Wizard of Oz when the door opens and the world is suddenly in color. That’s what it was like, seeing what was possible with big computer graphics.
I had done some things on smaller systems, but at SIGGRAPH I saw the birth of Pixar before it became Pixar and met all those people. Ever since, every few years something bigger comes along.
In 2018, real-time RTX rendering became a reality—something we’d dreamed of for years. To see it happen was incredible. It’s so visual you have to experience it.
Michael: That’s why people are so excited about Gaussian splatting. NVIDIA also has 3D Gaussian splatting libraries like 3DGRUT, and people are amazed by how lifelike the results are when 2D images are turned into interactive 3D.
We’ve also seen big industry players—NVIDIA, Chaos, OTOY, SideFX—adopt Gaussian splatting rapidly compared to other technologies. Why do you think that is?
Richard: Because they know their audiences are willing to experiment. At keynotes, we show what’s possible, but the real magic happens when customers and developers get their hands on it.
We like to get it out quickly, sometimes even with “wet paint,” because we know developers will take it to another place. That’s the exciting part—you’re giving the technology life.
Michael: It’s such an exciting time for developers and for audiences who get to enjoy the results. Speaking of keynotes, I want to highlight a couple this year—in 2025 at CES and at GTC. Jensen Huang has been incorporating Gaussian splatting to visualize NVIDIA’s Endeavor and Voyager offices in Santa Clara.
How does NVIDIA see this as a creative tool for bringing people into locations?
Richard: It’s about creating a tool that brings people into the environment you’re showing them. A static image is one thing, but when it comes to life and you can see around it and step inside it, that’s a whole different experience.
That changes media and entertainment—whether it’s films, sports, or at-home viewing. You’ll be able to change angles, follow a favorite player, and really immerse yourself.
When Jensen does a keynote, it’s always a glimpse into what’s possible and where the road is headed. A few years ago it was real-time ray tracing. Soon after, it was in games, then in industry and film. Gaussian splatting will follow the same path—it’ll be everywhere really soon.
Michael: For most of our lives, imaging has been limited to 2D. Now we’re entering a foundational period where we can extend that into lifelike 3D.
Richard: Absolutely. VR has been around for decades with bulky headsets, but now we’re seeing much higher-fidelity devices like the Vision Pro, connected to Omniverse for real-time photorealistic experiences.
When you put that on, you’re in the location. Soon the headsets will get smaller, capture will get quicker, and you’ll be able to go to places you’ve never been with just these technologies. It’s going to be a fun ride.
Michael: It feels like rediscovering imaging all over again.
Richard: Right. Like the shift from black and white to color. Now it’s static 3D to immersive 3D. The holodeck isn’t that far away.
Michael: I think it’s closer than people realize. And in some cases it’s already here. In the new Superman movie, there’s a scene with Superman’s parents that looks like damaged video—but it’s actually fully Gaussian splatting. It looks so perfect it just seems like 2D.
Richard: And that’s when you know it’s working—when the audience doesn’t think, “that’s a visual effect,” but instead just follows the story. The key is for the technology to support the story.
Michael: And Gaussian splatting also gives creators more time and options. The VFX team told me they could revisit shots, try more iterations, and focus on the creative because they captured everything at once.
Richard: That’s the whole idea. In the past, visual effects could take months or years outside the film, and directors were disconnected from the process. The more we bring it into the camera, the more the director can direct live.
It’s like going back to the purity of a director working with actors on stage. With virtual production and splats, instead of three shots a day you can do twelve. Different angles, different locations—all of it supports the story.
Michael: Exactly. PIXOMONDO in Toronto is already using Gaussian splatting for virtual production—capturing high-fidelity spaces with rigs of cameras and bringing them into LED volumes for the crew.
Richard: Right, and we’ll see LED walls evolve into LED objects, floors, and full environments. Directors will be able to change entire settings in real time. It gets us back to directors truly directing on stage instead of actors pretending with green screens.
Michael: Yes. It empowers creativity and lets directors expand into 3D to tell the best story.
Richard: Absolutely.
Michael: I also wanted to ask about NVIDIA’s Media 2 initiative.
Richard: Media and entertainment is in NVIDIA’s DNA. For years we almost took it for granted—the next film would use our tech and look incredible. But we realized we needed to put more energy into it because the landscape is changing so much.
Media 2 is about bringing AI across all media workflows. Better efficiency in location scouting, better optimization of resources, and of course AI in imagery. Part of it is demystifying AI. People shouldn’t fear it.
AI has been in entertainment for years—simulation, rendering, de-aging tools. Generative AI is new and surprising, which makes people nervous. But if you return to the core principle, it’s about the story.
Media 2 isn’t here to demolish Hollywood. It’s here to democratize creation. The more people who can tell their stories, the more content we’ll all get to enjoy.
Think about music. Grammy-winning records have been made on laptops in the Midwest. Why can’t Academy Award-winning films come from anywhere? AI lowers the barrier, gives more people access, and that means more stories.
Of course, we need guardrails—properly trained content, responsibility. But overall it’s something to celebrate.
Michael: I agree. And it could help us experience new locations and perspectives we’ve never seen before.
Richard: Exactly. The industry sometimes falls back on repeating the same successful movie, but democratized tools encourage new stories from independent filmmakers. That’s an exciting turning point.
Michael: You mentioned The Wizard of Oz earlier. Do you see that kind of black and white to color moment happening for 3D films?
Richard: I think we’ll see more immersive, hyper-localized, and personalized content. You might watch something and control elements of it. Films could adapt for different regions.
AI will help shape that so the core story stays intact but is tailored to audiences. We’re on the cusp of this next generation of entertainment. There will always be space for just sitting back and watching a film, just like there’s still space for reading a book. But now there are new doors opening.
Michael: It’s about continuing to innovate while keeping the story at the center.
I’m also curious about sports. Fans love getting close to the action—courtside seats, dugouts, premium experiences. How might sports become more immersive?
Richard: We’re already seeing it. Through our Inception program, startups like Arcturus and Viewpoint Pro are using AI to make games more interactive and immersive. You’ll be able to watch from any angle, gamify the viewing experience, even place yourself anywhere in the stadium.
Coaches and players will also benefit. Imagine reviewing plays from any perspective, giving players specific feedback. Or using observational data to predict outcomes. It’s more information and more access—making the whole experience richer.
Michael: Yes. People connect emotionally with images because they feel like they were there. This technology brings them even closer.
Richard: Exactly.
Michael: Looking ahead, do you think 3D capture will become as simple as taking a photo on your phone?
Richard: Absolutely. Soon you’ll just look around a location with your phone and capture it fully. You won’t just remember a photo—you’ll remember a location. You’ll relive it, walk through it, and share it.
That’s deeply meaningful. Memories are important to people. Imagine reliving a vacation, revisiting your childhood dinner table, or sharing experiences with family. Yes, there’s sports and entertainment, but at the heart of it, this technology connects to human memory and emotion.
Michael: I completely agree. That’s where I’m most excited—the preservation of memories and experiences.
Richard: Imagine if your life were documented—not all of it, but the moments you’d want to revisit. Trips to Italy or Tokyo, shared with your kids. That’s powerful.
Michael: I think we’ll see camera design evolve too.
Richard: We already are. Blackmagic has its immersive camera for the Vision Pro. And as splatting becomes more accessible, these tools will be in everyone’s hands. It’s closer than people think.
Michael: I’m very much looking forward to the future.
Richard: Believe me, it goes by fast—this is my 40th SIGGRAPH. Brace yourself.
Michael: Well, thank you so much for joining me today. I really appreciate the conversation and I’m excited about what’s ahead.
Richard: My pleasure. Happy to do it.