“The Mandalorian,” the Star Wars series that became Disney+’s first big breakout hit, brought all kinds of planets and the spaceships that travel among them to the streaming video universe.

And it did it all in California, almost entirely at the MBS Media Campus in Manhattan Beach.

How? Largely through extensive use of a new soundstage technology that employs a horseshoe-shaped wall and ceiling of light-emitting diodes in place of the green and blue screens commonly used for special-effects filming.

“For Season One, we effectively shot the equivalent of two feature films all in Los Angeles,” Richard Bluff, “Mandalorian’s” visual effects supervisor, confirmed. “It’s the first Star Wars show – movie or otherwise – that was shot in its entirety in California.”

The technology that enabled this is called StageCraft, developed by Industrial Light & Magic, the George Lucas-founded, now Disney-owned visual effects outfit, with technology and input from other companies, most prominently Epic Games, maker of “Fortnite” and “Gears of War.” Epic’s Unreal game engine processes and powers the visuals that go up on the 20-by-180-foot LED screen at the Manhattan Beach soundstage, which the “Mandalorian” crew calls the Volume.

The show’s production designer Andrew Jones, who worked with “Mandalorian’s” showrunner Jon Favreau on the director’s earlier, technically groundbreaking feature film “The Jungle Book” (2016), explained how imagery on the Volume’s LED wall could put the foreground players and practical props on the desert planet Tatooine or the forest planet Sorgan without leaving the studio. And how not only the cast and crew but also the camera could move around and capture practically the entire shot live, without guessing what the environment would look like once post-production digital effects replaced an empty greenscreen.

“The whole point of the Volume is that it’s not a flat image like a TV screen,” Jones noted. “It’s a representation of a three-dimensional environment. As the camera moves within that environment, being tracked by motion capture technology, the screen updates to feed the camera what it would be seeing from any point in the Volume. So you have parallax, you have change of position, it updates the environment.”
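The geometry Jones describes can be boiled down to a toy sketch (this is illustrative only, not ILM or Epic code): a virtual point “behind” the wall must be drawn where the line from the tracked camera to that point crosses the wall plane, so as the camera moves, that on-wall position shifts, and the screen has to re-render.

```python
# Toy sketch of why an LED wall must re-render per camera position.
# A virtual point behind the wall is drawn where the camera-to-point
# ray crosses the wall plane; moving the camera shifts that spot,
# which is the parallax the motion-capture-tracked Volume reproduces.

def wall_pixel(camera, virtual_point, wall_z=0.0):
    """Intersect the camera->point ray with the wall plane z = wall_z."""
    cx, cy, cz = camera
    px, py, pz = virtual_point
    t = (wall_z - cz) / (pz - cz)   # where the ray meets the wall plane
    return (cx + t * (px - cx), cy + t * (py - cy))

# A virtual mountain 100 m behind the wall; the camera 5 m in front of it.
mountain = (0.0, 20.0, 100.0)

print(wall_pixel(camera=(0.0, 2.0, -5.0), virtual_point=mountain))
print(wall_pixel(camera=(3.0, 2.0, -5.0), virtual_point=mountain))
# Dollying the camera sideways moves the mountain's on-wall position,
# so the image must update live from the camera track.
```

A flat, pre-rendered backdrop cannot do this, which is the key difference from the rear projection discussed below.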

Such perspective-shifting movement wasn’t possible with traditional rear- or front-projection backgrounds, a century-old Hollywood trick that StageCraft incalculably improves upon.

To look convincing, though, the Volume’s 3-D environments had to be created from photography. Photogrammetry – scanning a real landscape with a camera, then feeding all of those photographs into software that generates a 3-D model of the environment and textures it with that photographic information – produced the photoreal assets from which the virtual environments were built.
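The core geometric step of photogrammetry can be sketched in a few lines (a deliberately simplified illustration, not ILM’s pipeline): two photos of the same feature taken from known camera positions each pin down a direction ray, and intersecting the rays recovers the feature’s 3-D position. Real photogrammetry software solves this for millions of features across hundreds of photos.

```python
import math

def norm(v):
    """Scale a vector to unit length."""
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def triangulate(o1, d1, o2, d2):
    """Midpoint of the shortest segment between rays o1+t*d1 and o2+t*d2."""
    w = tuple(a - b for a, b in zip(o1, o2))
    a = sum(x * x for x in d1)
    b = sum(x * y for x, y in zip(d1, d2))
    c = sum(x * x for x in d2)
    d = sum(x * y for x, y in zip(d1, w))
    e = sum(x * y for x, y in zip(d2, w))
    denom = a * c - b * b
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = tuple(o + t1 * x for o, x in zip(o1, d1))
    p2 = tuple(o + t2 * x for o, x in zip(o2, d2))
    return tuple((u + v) / 2 for u, v in zip(p1, p2))

# Two photos of the same rock feature, taken from cameras 4 m apart;
# each photo fixes the direction from its camera to the feature.
cam1, cam2 = (-2.0, 0.0, 0.0), (2.0, 0.0, 0.0)
ray1 = norm((2.0, 0.0, 10.0))    # direction seen from camera 1
ray2 = norm((-2.0, 0.0, 10.0))   # direction seen from camera 2
print(triangulate(cam1, ray1, cam2, ray2))   # recovers the point (0, 0, 10)
```

Repeating this for every matched feature across a photo survey yields the point cloud that gets meshed and textured into a Volume-ready environment.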

“The idea of the technology is that, instead of having to take a whole film company to various locations and then navigate complex filming and weather conditions there, we just go to that location with a three-person crew and scan it a number of ways, bring all of the information back, process it and load it up,” Jones said.

Those small camera teams took footage in Death Valley, Iceland, Chile, Utah and other striking, natural environments around the world. For Sorgan, they didn’t have to venture far from ILM’s Northern California home.

“Being so local, we were able to send out Enrico Damm (“Mandalorian’s” digital environments supervisor) to photograph a bunch of private land with wonderful redwood trees at various different times of day,” Bluff recalled. “We could, for example, provide the director of photography with 40 different panoramic images showing the light move in a meadow, and he’d choose the six that he wanted to shoot with on the day. Literally one person was able to photograph it, yet the DP was able to scout it on the Volume and look at the light travel throughout the entire day.”

Besides saving the production oodles in travel logistics, expense and time – theoretically, shots at multiple locations could be done in a single day in the Volume, depending on how fast the modular, transportable practical set elements Jones designed could be switched out – the LED screen also generated enough illumination for most shots.

The available light generated by the Volume’s LED wall also automatically solved “The Mandalorian’s” primary lighting problem: its title hero, played by Pedro Pascal, always wears armor and, for lighting purposes, is essentially a silver ball that reflects like crazy. The Volume made those reflections look natural to the eye, rendering movie lights – difficult to calibrate for that situation – all but unnecessary. They were used rarely if at all.

“Once we started doing tests, the DP, Greig Fraser, realized that this tool was more than just a rear projection, it was a lighting tool,” Jones recalled. “He indicated that what was behind us and above us was as important as what was in front of us. It became a 360-degree immersive space.”

For shots that needed harsh, natural sunlight, a small, exterior “backlot” at a nearby railroad yard was used. That yard and the MBS campus were where all of the show’s live action was shot. Much of the traditional post-production work was done in pre-production to make that possible; even digital extensions of massive vehicles such as the Jawas’ sandcrawler were projected on the LED screen, not added with CGI afterward. Post did, of course, contribute such things as alien creatures and starscapes, but thanks to the Volume it usually consisted of little more than color correction and enhancing the mostly puppeteered Baby Yoda.

“If you compare ‘Episode IX’ (“The Rise of Skywalker”) and the vast number of stages they had to hold onto in order to shoot that feature film versus the three stages we had at Manhattan Beach studios plus what we called the backlot about a mile and a half away from the campus, we never left those shooting stages,” Bluff said with a lingering tone of awe in his voice. “We did have a small number of photographers that went far and wide to collect imagery, to allow us to project it on the LED screens, but none of the shooting crew, none of the actors, none of the directors ever left Manhattan Beach studios while shooting Season One.”

And they finished principal photography for “Mandalorian’s” greatly anticipated second season at the same place last week. While MBS’s operators declined to comment about the Volume’s future for this story, Bluff revealed what little he could about what we’ll see when Season Two hits Disney+ in October.

“A lot of the environments for Season Two are places we’ve never been to in Star Wars, which is really exciting,” he said. “Because we had gotten very comfortable by the end of Season One with how the Volume worked, the challenge for all of us was to really push the boundaries out and expand on what we had done. So there are a number of occasions now where not only did we shoot in the Volume, but we actually are going to be shooting in the stage space outside of the Volume, pushing the camera from a practical set all the way into the LED screens, really expanding the physical space that the filmmakers and the actors have available to them.”

Although technology akin to StageCraft has been used to a limited extent in “Rogue One” and in such things as car commercials, “Mandalorian’s” massive, successful reliance on it could clearly have a wide effect on how shows are made from here on out.

“It’s a sea change,” Bluff plainly stated. “For everybody, not only the actors and directors and DP, but also the audience. You can watch a $250 million-plus movie at the theater now, and I think a lot of audience members will look at a set and not believe that that actor is in that environment. That’s because there is an inherent disconnect between the lighting that is on the actor and what should be present with the set extensions done in post-production. Of course with what we’re doing, we’re putting that all on the screen live on the day.”