George Lucas was dreaming of a Star Wars live-action TV show long before The Mandalorian got off the ground. One of his collaborators at Industrial Light & Magic, Richard Bluff, remembers Lucas talking about it as far back as 2008. There was just one problem: Going to a galaxy far, far away on a TV budget was nearly impossible. “At the time, he felt he was limited in regards to how he was able to tell the story based upon the vast number of locations and worlds we would need to go to for the small screen,” says Bluff, a visual effects supervisor. “But the audience simply wouldn’t accept a Star Wars without them.”

A decade-plus later, things have changed. For one, Disney bought Lucasfilm—and everything that came with it, including ILM. It also put out a lot more Star Wars movies, let Jon Favreau remake The Lion King in virtual reality, and launched a streaming service, Disney+, where any potential live-action series could live forever. So, when Favreau started working on The Mandalorian in early 2018, making a TV series without sending a whole crew to Jordan—as J.J. Abrams had just done for Star Wars: The Rise of Skywalker—was far more feasible. “Through his experience on Jungle Book and The Lion King,” Bluff says, “he felt very strongly that there had been breakthroughs in game-engine technology that were the key to solving this problem.”

Indeed. Working with Epic Games, the studio behind the Unreal Engine (aka the thing that powers Fortnite), Favreau and his team at Golem Creations developed a new virtual production platform that allows filmmakers to generate digital backdrops in real time, right in front of the camera. Joining them were Bluff at ILM, cinematographer Greig Fraser, who had done a lot of work shooting LED screens on Rogue One, and other tech companies like video card maker Nvidia. The tech, now called StageCraft and available to filmmakers everywhere, allowed the directors of The Mandalorian’s eight episodes to film in every part of the galaxy without ever leaving Manhattan Beach Studios in Los Angeles.
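“Real time” here has a concrete budget: the game engine must finish rendering each backdrop frame before the camera needs the next one. A minimal sketch of that arithmetic, assuming standard 24 fps cinema capture (the article doesn't state the production's actual frame rate), using the 28-million-pixel wall resolution cited later in the piece:

```python
# Frame-time budget for rendering backdrops live, in front of the camera.
# Assumption: 24 fps capture, the standard cinema frame rate; the article
# does not state the production's actual settings.

FPS = 24
budget_ms = 1000 / FPS               # time the engine has to render one frame
pixels_per_frame = 28_000_000        # LED wall resolution cited in the article
throughput = pixels_per_frame * FPS  # pixels the engine must fill every second

print(f"{budget_ms:.1f} ms per frame, {throughput / 1e6:.0f}M pixels/s")
# → 41.7 ms per frame, 672M pixels/s
```

That roughly 42-millisecond window is the whole game: traditional film VFX can spend hours rendering a single frame offline, while a game engine has to hit this deadline every frame, which is why Favreau saw game-engine breakthroughs as the key.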

Favreau achieved what he’d set out to do, but in the beginning not everyone was convinced he could. At the time, there was skepticism that the technology was good enough to do photo-real backgrounds, but “we pushed forward anyway,” Favreau says. His hope was that he would be able to get a few shots for the first season and then improve the tech as the show went on. Eventually, he thought, if the tech got good enough it could be used on other Disney productions, whether Star Wars films or Marvel movies. Lucasfilm honcho Kathleen Kennedy agreed and committed to letting Favreau figure it out. “I came in with The Mandalorian and said, ‘Let this be the North Star we’re going for.’ Maybe we won’t get all the way there the first season, but at least we’ll plant our flag and try to do this,” Favreau says. “It just took one production to see what could be done with the tools we have.”

Here’s how it works. Imagine the scene at the cantina on Tatooine. The bounty hunter is there, and there’s a general hive-of-scum-and-villainy vibe. But only a chunk of it is real. The booth is there, and some of the actors, but the rest is being rendered on a 20-foot-tall, 270-degree semicircular LED video wall. It’s like a traditional Hollywood backdrop, except this one uses the same game engine as Fortnite to place 28 million pixels’ worth of characters and objects exactly where they need to be for the camera to capture them. All told, more than half of The Mandalorian was shot on virtual sets; the rest was done with practical sets and effects on another part of the LA lot.
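The 28-million-pixel figure is about what a wall of those dimensions implies. A back-of-the-envelope sketch, where the wall diameter and the LED pixel pitch (the spacing between pixels on the panels) are hypothetical assumptions, since the article gives neither:

```python
import math

# Rough pixel-count estimate for a curved LED video wall.
# From the article: ~20 ft tall, 270-degree arc, ~28 million pixels.
# Hypothetical assumptions: wall diameter and LED pixel pitch.

FT_TO_M = 0.3048
height_m = 20 * FT_TO_M                      # ~6.1 m tall
diameter_m = 52 * FT_TO_M                    # hypothetical diameter
arc_m = math.pi * diameter_m * (270 / 360)   # length of the 270-degree arc
pitch_m = 0.00284                            # hypothetical 2.84 mm pixel pitch

pixels = (height_m / pitch_m) * (arc_m / pitch_m)
print(f"{pixels / 1e6:.0f} million pixels")  # → 28 million pixels
```

With those assumed numbers the estimate lands right around the article's figure; the point is less the exact panel spec than the scale, namely tens of millions of pixels that all have to update in real time as the camera moves.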

If that sounds like a lot, it is. Fortunately for the filmmakers behind the Disney+ show, a lot of the groundwork was already in place. The StageCraft platform lets filmmakers do much of their previsualization and shot blocking ahead of time in virtual reality, something Favreau had done on The Lion King. So as soon as the show’s concept artists and production designers came up with ideas, those ideas could be built virtually, and the directors—whether it was producer Dave Filoni or Rick Famuyiwa or Bryce Dallas Howard or Taika Waititi—could put on a headset and see the world they would be filming in. That means, Bluff says, a lot of what is normally considered postproduction work actually happens in preproduction. On the day they’re shooting, the directors work with almost fully rendered VFX, capturing everything in-camera with Arri rigs.