All throughout the production of the Disney+ series The Mandalorian, I heard rumblings of the groundbreaking technology being used to create the first-ever live-action Star Wars television show. Producer Jon Favreau has talked briefly about the process while doing press, mentioning that the series makes use of new technology that creates virtual backgrounds using large high-resolution screens. The filmmaker would rather have audiences concentrate on the story than the tech, so we likely won't see any real behind-the-scenes features on the technology until after the first season airs in its entirety. (Notably, Favreau took the same approach for The Jungle Book and The Lion King.)

I’ve watched the first two episodes of The Mandalorian multiple times now and tried to find the seams. Where is this tech being used? What is practical, and what is being generated virtually on a performance-capture “volume” in the soundstages next to where James Cameron is shooting his Avatar sequels?

Yesterday, I attended a “Women of Lucasfilm—What Drives You?” panel discussion at the Porsche Experience Center Los Angeles in support of Star Wars: The Rise of Skywalker. It was wonderful to hear stories from the women who are bringing Star Wars into the next decade and beyond.

Late in the panel discussion, the conversation turned to George Lucas’ fearless innovation, which has always been part of the Star Wars DNA. Remember, ILM was created to help bring a galaxy far, far away to life. The female dream team on the panel spoke for over 15 minutes about how this new tech could change television and cinema forever. And director Deborah Chow talked about possibly using this new “Stagecraft” technology in Disney+’s upcoming Obi-Wan Kenobi TV series.

Kathleen Kennedy, president of Lucasfilm, has been involved in several big pieces of cinematic innovation, including Jurassic Park. Lucasfilm has made plenty of advances since, but the biggest might not be seen on the big screen. With The Mandalorian, she had “the foundation of a good screenplay,” which allowed them to “push the technology”:

“Jon Favreau and myself went into Disney and said, this is something that we would like to try and they said, what exactly is it? And we said we’re not exactly sure. We have no idea what this is going to cost, and we hadn’t ever built anything with the technology, which we’re now calling ‘Stagecraft’ inside ILM, but it basically is a projection system on screens, and the real innovation is that when you move the camera inside this space, the parallax changes. So suddenly you’re in an environment that actually begins to behave in the same way it would in an actual 3D environment like this.”

Kennedy told a story about how a Walt Disney Company executive visited the set and didn’t even realize he was in a virtual environment created by this new technology:

“It was really funny as we had an executive from The Walt Disney Company come down early in the process because it’s one of those things that is difficult to explain until you walk into the environment to see how it’s working. And he stepped in, and he looked around, and he said, Jon, I thought you weren’t going to build anything. And he had no idea he was standing in a virtual set. That’s how unbelievable it is.”

This new cinema tech is changing how the future of television and movies will be filmed:

“It means that if you want a big establishing shot in Iceland, and you don’t want to take 700 people, spend four months prepping a set because you only want to do the establishing shot and you can bring everything back to shoot interiors on a stage, that becomes very meaningful on big, huge projects and small projects. So the interesting thing with Mandalorian, the fact that we tested this technology inside of television and not on the big screen was the way we felt that we could take a big risk but not a giant risk.”

Kennedy mentioned The Mandalorian premiere last week in Hollywood as an example of the quality of the technology holding up on the big screen: “We now know this technology works for the big screen as well.” She continued:

“What we refer to as a game-changer, that’s how it happens: you have a story that offers you the opportunity to do it differently, and to push technology that you know is right on the edge. We’re incredibly fortunate with Lucasfilm because we have ILM inside our company. And this defined the company from the time that George created Star Wars. Star Wars created ILM. And so, ILM continues to be at the forefront of this kind of innovation, because we’re constantly telling stories that are content in search of technology. That’s what it is.”

When panel moderator and Star Wars Show host Andi Gutierrez brought up fans questioning where they shot some of the exterior scenes in The Mandalorian, Kennedy quipped, “You can say Iceland. But it was just one person [shooting it].”

Lynwen Brennan, Lucasfilm EVP and General Manager, explained that Stagecraft came from the need to accomplish big scale, larger-than-life scenarios on the small screen:

“Stagecraft really came into being from a need, from an idea of how to approach this differently: the scope and scale of a Star Wars film, but on a TV schedule, and the amount of content that we have to create. And it was built upon some innovations that ILM had done previously. We were taking building blocks of using LED screens for lighting and reflections, etc., on previous films. And with Kathy and Jon’s encouragement, we really pushed it to see how far we could take it.”

It’s not the same tech used on Solo: A Star Wars Story and Rogue One: A Star Wars Story, but the next evolution of it. Kennedy explains:

“There were LED screens being used on some of that in Rogue One, for instance, but we were mainly just using it for lighting, and we replaced the imagery with [CG], and it was more traditional. And then when we went to Solo we were using very high-end laser projection. And this is the next iteration.”

Kennedy adds an interesting little tidbit about the material used to create the screen:

“But I’m going to add one other thing, because I didn’t know anything about this and it’s an interesting little tidbit. You have to grow the crystals for these screens. Who knew? You have to wait five years for the crystals to grow. And the crystals mean a limited number of screens. Not only do you have to grow them, but if you have a volume, it’s important that you have the same batch of LED screens so that all the crystals are growing together. And then, for how they refract the light, they go into a whole pass on the grown crystals to curate which ones are refracting the light in the same way. So it’s quite a process.”

So now the soundstage, a performance capture volume like the one James Cameron used on the Avatar films, is wrapped with these very high-resolution LED screens that present footage either shot on location or “in combination with CG environments.” Brennan explains further:

“And we’re able to have the perspective with cameras, but that means that you can change from Iceland to the desert in one [minute] from setup to setup, so it really changes the flow of production. I think it also helps because actors are not in a sea of green. They’re actually seeing the environments that they’re in. And you add the puppetry to that, and they’ve got characters to perform against in the environments that they are in, and I think it does change.”

This makes the pre-production process longer and more involved, and Kennedy praises director Deborah Chow as the kind of filmmaker it’s built for:

“This really requires [a filmmaker] like Deb who plans. This is not for somebody who’s winging it. You absolutely need to plan ahead because in essence what you’re doing is you’re taking a lot of the post-production, and you’re putting it in pre-production. You need to know your story, you need to know what those limitations are, you need to know what those effect shots already are so that they can be put onto the screen. So, Deb can talk to it practically, but it really does take a plan.”

Deborah Chow became the first solo live-action female Star Wars director with her work on two episodes of The Mandalorian (the upcoming chapters three and seven) and has since been hired to helm Disney+’s Obi-Wan series. She tried to explain the complex process at the panel:

“When I first started, they were still testing a lot; we didn’t even know how much we were going to be able to use or how successful it was going to be. But as a director, it’s a completely different process, because we prevised the entire episode. So I was in there for months beforehand, prevising everything, and we cut it together like it was an actual edit. We had done it for technical reasons, but it was actually an amazing story tool, because we could look at literally the cut and go, that’s not working, or this is working. And then from there, they would take what we had designed in the previs, they would go photograph material, like the environments in Iceland, and then we’d be able to tweak the lighting, tweak the environment, and then on the day of the shoot it would show up on the screen and we’d be shooting it in camera. And a lot of the stuff that was amazing is that we would have the practical elements. So if you wanted to have foreground, or you didn’t have the practical set, oftentimes it was so perfectly aligned that the screen would take it over. So it was kind of amazing. I think one of the most interesting things with that technology, and with where it’s going, is it seems like the future to me, in that it allows you to tell the story in a little bit of a different way; like suddenly we could do magic hour all day long. You can do things you just can’t do. There was a point when I was working with my DP where the danger was that it was becoming too perfect. That we’re controlling it too much and it looks too beautiful.”

So in the future of Stagecraft, does a director even leave the soundstage to scout locations? Is it even needed? Chow joked that it was “the best thing in the world to not have to go in the transpo van ever.” She continued:

“They show you virtual environments and you would pick the lens. I mean literally we would start with concept art. We have Doug Chiang and then we’d have the look and then we have the DP. We were sort of working it out and then we get the photograph and match the art and then we would tweak it. … So basically, say we shot a room in Mandalorian, and Andrew Jones, who’s the production designer, decided that they liked a lot of the references from the LA subway systems. So they just use the texture and whatnot. They sent people down there who do 3D photography at a very high resolution. They have the environment, and then that environment would then come into 3D, and we would start manipulating to start building it from there.”

The result is a virtual creation of the environment in 3D that can be rendered in real time from the perspective of the camera:

“It also allowed us to actually have the environment in 3D so that we could do virtual scouts. So we would have days where it would be the whole team: the production designer, the DP, Jon, Dave Filoni. And we would all be in headsets together flying around the same set, and we could put cameras up and put a lens up and say, okay, this is what I’m thinking, and we could all see it, and we’re actually looking at it on the set together.”

Throwing the press a bone, Kathleen Kennedy asked Chow how the new technology might influence her next project, the Obi-Wan Kenobi TV series. She responded: