Over the years, Planet of the Apes has been many things: a satirical French novel, a landmark science fiction movie, a series of uneven sequels, a disastrous Tim Burton reboot. But ever since the book hit the screen, the most memorable thing about the franchise has been the effects. (OK, Charlton Heston’s ripe line readings are pretty memorable, too.) The latex masks of the original 1968 movie were revolutionary at the time, and are still remarkably effective, but they’re nothing compared to the digital wizardry of the latest series of Apes movies, which began with 2011's Rise of the Planet of the Apes and continues this weekend with the powerful new Dawn of the Planet of the Apes.

Uh oh, here comes trouble. Incredibly well-crafted, digital ape trouble. (Credit: WETA/20th Century Fox)

I checked in with Joe Letteri, the visual effects supervisor for both Rise and Dawn and director of Weta Digital, to find out how he created the film’s remarkably detailed world. Letteri's long resume also includes key work on Avatar and the Lord of the Rings movies. His answers led into a deep, thoughtful exploration of the technology of modern movie-making, along with the unexpected connections between digital effects and zoology, medicine, and even particle physics. I had originally expected I would just quote Letteri in my story, but he proved such an engaging interview that I'm sharing his comments in full.

The remarkable thing about the apes in the new movie is that they seem real--so real that you almost instantly forget you are looking at digital creations. Part of the reason is that they are not purely digital creations. Using “performance capture,” live actors guide the facial and body movements of their ape characters. Dawn uses the technique more extensively than ever before, and for the first time employed it to capture performances on location, not on a set. That necessitated parallel advances in the 3D camera work by 3ality Technica in order to keep the digital and real components of the movie tightly in sync. The other breakthrough in Dawn is the rendering of the characters. The hair movement, skin textures, light reflections—everything in the new movie looks like it follows the same physical rules that you and I do. As Joe Letteri explains in the following exchange, achieving that level of verisimilitude is the most difficult yet exciting part of his job. [For more related news and images, follow me on Twitter: @coreyspowell]

____________________________________

Planet of the Apes is such an iconic franchise. Did you go back and look at the old movies for visual inspiration?

We went back and looked at the old movies before we started Rise of the Planet of the Apes [four years ago], but that was just to get back into the story and think about where we were going. There was no way we could do that with real chimps, so we set out right away to create realistic-looking digital chimps that we could add performance to.

How do you create a whole cast of digital characters--characters that need to look recognizably like apes, and yet need to behave unlike anything we've seen before?

There’s years of research that went into making this work. One of the breakthroughs for us was when we did Gollum [in the Lord of the Rings movies] and came up with the technique of subsurface scattering. That gave us, for the first time, the ability to do soft translucent skin as opposed to hard dinosaur-like skin like in Jurassic Park. That opened the door to doing skin, eyes, lips, light coming through fingers and ears—it gave you the level of visual realism that the characters needed. At the same time, we started working with Andy Serkis and using performance capture to author his character arc.
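The subsurface scattering Letteri mentions models the way light enters skin, diffuses beneath the surface, and re-emerges a short distance away instead of bouncing straight off. Weta's production code is far more sophisticated, but the core "gather" step can be sketched with a simple exponential falloff profile (the profile and all names below are illustrative assumptions, not Weta's implementation):

```python
import math

def diffusion_weight(r, sigma_tr):
    """Toy diffusion profile: light that enters the surface re-emerges
    nearby, with influence falling off exponentially with distance.
    The self-contribution at r = 0 is capped at 1.0 (the true profile
    diverges there)."""
    if r == 0.0:
        return 1.0
    return math.exp(-sigma_tr * r) / r

def subsurface_shade(points, irradiance, sigma_tr):
    """For each surface point, gather light that arrived at neighboring
    points -- this cross-talk is what makes skin look soft and
    translucent instead of hard and opaque."""
    out = []
    for i, (xi, yi) in enumerate(points):
        total, wsum = 0.0, 0.0
        for j, (xj, yj) in enumerate(points):
            r = math.hypot(xi - xj, yi - yj)
            w = diffusion_weight(r, sigma_tr)
            total += w * irradiance[j]
            wsum += w
        out.append(total / wsum)
    return out
```

With a single lit point on a surface, neighboring points in shadow still pick up a fraction of its light, which is exactly the soft glow you see in ears and fingers that hard "dinosaur" shading can't produce.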

Andy Serkis, in performance capture mode and as the ape Caesar. (Credit: WETA/20th Century Fox)

Those two ideas—performance and the visual representation of it—are the things we’ve been using over the years to understand how to create characters who are believable and emotionally engaging. For Rise we thought, wouldn’t it be great if we could capture what Andy [who plays lead ape Caesar] is doing on the set. That gets you to the core dramatic moments you want, then you can animate on top of that and we can add our own cinematography. We took that to an extreme on Dawn of the Planet of the Apes: We took the gear out on location, out in the forest and rain and mud and natural lighting. It gave us the complete package.

What about the anatomy and movements of your advanced apes? How did you create these animals as, literally, fleshed-out characters?

We studied ape physiology. We went below the skin. All our characters have to have believable skeletons, so we create a muscle system that’s physically based. In the old days we did animation like animatronics: we had little bladders inside the CG creatures that would inflate up and down to make it look like muscles moving. Now we actually solve fiber-based muscles. Those drive a layer of tissue, and that in turn drives the skin, and all of that can be driven either by the actor’s movements or by animators or, most of the time, a combination of the two. Our animation team has to take over from the actors to do the final interpretation of the human emotion and human expression into what’s believable for chimp motion. It’s that sense of physical correctness but also artistic interpretation that has to come together and work with the story and setting to make the film complete.
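The layered rig Letteri describes, with muscle fibers driving tissue, tissue driving skin, and activation blended between the actor's capture and the animator's keys, can be caricatured in a few lines. This is a deliberately minimal sketch with invented names and gains; real solvers integrate fiber mechanics over volumetric meshes:

```python
def blend_activation(actor, animator, w):
    """Muscle activation can come from performance capture, from
    keyframes, or (most of the time) a weighted mix of the two.
    The weighting scheme here is hypothetical."""
    return (1.0 - w) * actor + w * animator

def muscle_to_skin(rest_pos, fiber_dir, activation, tissue_gain=0.5):
    """Toy version of the layered rig: muscle fibers contract along
    their direction, the contraction displaces a tissue layer, and
    the tissue layer carries the skin with it."""
    muscle_disp = [activation * d for d in fiber_dir]      # fiber contraction
    tissue_disp = [tissue_gain * m for m in muscle_disp]   # tissue attenuates it
    return [p + t for p, t in zip(rest_pos, tissue_disp)]  # skin follows tissue
```

The point of the layering is that animators and actors both steer the same physical quantity (activation), and everything downstream of it stays physically plausible.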

What were the biggest technical challenges you faced in making Dawn?

Technically there were a lot of things going on under the hood. For example, our fur system needed a complete rewrite [from Rise of the Planet of the Apes]. We do a dynamics-based fur system. Because these scenes have a massive number of apes in them, with a lot of dirt and mud and twigs worked into their bodies, and a lot of action--a lot of rain hitting the fur, running down and getting embedded--we needed a higher level of simulation. We needed a new rendering engine to do global illumination, because you have to simulate not only the dynamics of every strand of fur on every ape but also the way the light bounces through every strand of fur. A lot of fur [in other movies] has been treated as little wires. We have some new software that treats them as plastic coils, which is what hair really is. It uses a method called dynamic rods that we’ve been working with a group out of Columbia University to develop. It gives you a more dynamic and correct representation of fur. There’s a big level of physical simulation of both the dynamics and the lighting.
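Treating each hair as a dynamic rod rather than a passive wire means simulating it as a chain of elements with mass, stiffness, and damping. The toy strand below uses plain stretch springs only; real rod methods also model bending and twisting, which this sketch omits, and every constant here is made up for illustration:

```python
def simulate_strand(n=5, seg=1.0, k=200.0, damping=0.9, g=-9.8,
                    dt=0.01, steps=200):
    """Minimal strand dynamics: a chain of unit point masses joined by
    stiff springs (a crude stand-in for elastic-rod methods). The root
    is pinned; the free end swings and droops under gravity."""
    # Start the strand sticking out horizontally from the root.
    pos = [[i * seg, 0.0] for i in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    for _ in range(steps):
        forces = [[0.0, g] for _ in range(n)]           # gravity on every mass
        for i in range(n - 1):                           # spring between i, i+1
            dx = pos[i + 1][0] - pos[i][0]
            dy = pos[i + 1][1] - pos[i][1]
            length = (dx * dx + dy * dy) ** 0.5
            f = k * (length - seg) / max(length, 1e-9)   # Hooke's law
            fx, fy = f * dx, f * dy
            forces[i][0] += fx; forces[i][1] += fy
            forces[i + 1][0] -= fx; forces[i + 1][1] -= fy
        for i in range(1, n):                            # root (i = 0) stays pinned
            vel[i][0] = (vel[i][0] + dt * forces[i][0]) * damping
            vel[i][1] = (vel[i][1] + dt * forces[i][1]) * damping
            pos[i][0] += dt * vel[i][0]
            pos[i][1] += dt * vel[i][1]
    return pos
```

Multiply this by every strand on every ape, add rain, mud, and wind, and the scale of the rewrite Letteri describes becomes clear.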

I've noticed that a lot of effects-heavy movies are dimly lit, as a way of hiding the places where reflections don't look realistic. Is that why many scenes in Dawn were set in the rain or beneath overcast skies?

Not at all. That’s how [director] Matt Reeves wanted the film to look. We can do full path tracing whether it’s sunlight or cloudy. If you look at some of the shots in the ape village, there are practical light sources in camera—that’s fairly new in CG.

Where did you go with the new movie visually that you haven't been able to go before?

The biggest change was the ability to take performance capture to extremely remote locations. On Rise we could bring it onto the set. Now we’re recording not just the image of a performance but the performance itself. On this film we open with the apes in their own colony in the forest and we got to go really remote. The gear had to be robust, we had to go wireless, with multiple apes captured at the same time. We wanted to get the ability to break out of the studio and say, let’s do performance capture wherever the story takes us.

How do you get the physics right, so all of that movement and action looks believable?

You are seeing a handover of different techniques. We’ll start with some stunt people and do motion capture for them, but they may be on safety wires and not falling as fast as they should be. We take that, apply proper gravity and get them to fall at the correct speed. Then we take that initial fall and blend it in with something key-frame animated to carry through the rest of the fall, and then blend back to an impact so the realism of the stunt performance carries through. It’s a mix of the physics engine and the key-frame animation working together. Animators always have to be mindful of the physics and the gravity. And then within the creatures you have all the dynamic simulations, the tissue dynamics and fur dynamics and wind resistance and all those things that have to work together. If the characters are wrestling with each other you have the additional contacts and impacts that have to be solved. There’s always a mix of the science and the art.
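"Apply proper gravity" has a concrete arithmetic core: a fall from height h should take t = sqrt(2h/g) seconds in free fall, so a wire-slowed capture gets its timeline compressed to that duration while the poses themselves are kept. A sketch of that retiming step, with a hypothetical interface:

```python
def retime_to_gravity(captured_times, drop_height, g=9.8):
    """The stunt performer fell slower than gravity allows (safety
    wires), so compress the captured timeline to the free-fall time
    t = sqrt(2h/g). Poses keep their order; only their timing changes."""
    t_free = (2.0 * drop_height / g) ** 0.5            # true free-fall duration
    t_captured = captured_times[-1] - captured_times[0]
    scale = t_free / t_captured                        # how much to speed up
    t0 = captured_times[0]
    return [t0 + (t - t0) * scale for t in captured_times]
```

For a 4.9-meter drop captured over 2 seconds, free fall actually takes 1 second, so every captured keyframe time is halved; the blend into keyframed animation and back into the impact then happens on the corrected timeline.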

Toby Kebbell transforms into villainous ape Koba. (Credit: WETA/20th Century Fox)

How did the digital effects play into the creation of the different ape personalities in Dawn?

We started by looking at different kinds of apes, trying to see what we could do to make them unique to their species and unique to themselves. For example, there are differences between how gorillas sit and walk and move versus chimps, and big differences between chimps and orangutans. We’d study reference footage of those kinds of things. There’s a great troop of chimps here at the Wellington Zoo, and the zoo’s been great about giving us access to them. You get to understand which apes have which kinds of personalities. Then we work with the actors to bring out their individual personalities and performances. We set up a motion capture volume where they can practice their character, using arm extensions for motion, scratching their noses, and so on. They can see themselves in real time on the monitors, see themselves being apes. The animators study a lot of video reference and we animate to match the reference. We look at a chimp climbing or a gorilla running and try to animate that. Then we take what the actors give us and see where it needs to be modified, so you get the intent of the actor but the locomotion is what an orangutan would be doing, for instance. You need to hold that mirror up to it, because you might think you’re doing the right thing—like how much you’re rolling your shoulders when you’re walking the way chimps do. You can look at it and say, nope, that’s not enough or that’s too much, and you can refine that. All those movements have to become second nature to the actors. They need to know how to move so they can concentrate on their performance.

The new movie has a much broader canvas: many more apes, much more action. How did that influence your visual approach?

In the first movie you had one character. It was really a coming-of-age story, but because it was told through a chimp you looked at it in a new way. Dawn takes a similar arc but now it’s looking at a society.
When you start off you’re basically seeing a Stone Age culture, but during the course of the movie they become acquainted with civilization, they become acquainted with electricity and guns, and you start to see this emergence of culture on top of it. At heart what these stories are doing is leading you along on this journey. It’s by anchoring them in that kind of reality that, as an audience, you can get into the story and go along with it.

On top of everything else, the movie was shot in 3D. Does creating all the effects in 3D add yet another layer of complexity to your work?

It adds complexity on the production side, not the coding side. All the models are done in a fully 3D space anyway. What changes is the live-action photographic aspect of it; you have to match the two cameras simultaneously. There’s a lot of hand work involved. We had shots of our ape actors up on horses. They’re bigger than apes, they have longer legs, they need stirrups to ride. We put the digital apes on the horses, they have shorter legs, no stirrups, you see more of the horse. We had to paint out large pieces of the riders and reveal parts of the horse that weren’t actually photographed. Our paint artists had to paint all that back in where they could. That’s hard to get right in stereo. If we couldn’t paint it we had to revert to doing a 3D model of the horse. Sometimes we had to erase the horses entirely and put in CG horses, so a lot of the horses that you see ridden by the apes are digital.

Where do you turn for inspiration? Do you talk to other digital effects people and try to one-up each other: 'Hey, that was pretty cool but take a look at this'?

The underlying learning typically comes from more academic sources. A lot of the techniques that we develop are being researched at various universities and labs or at other visual effects companies, and everyone publishes their research. You look at that and you take it and build on it. There’s no other way to do it, because there is a lot of science that goes on behind the scenes. Taking the fur as an example: we were interested in what makes fur work, started to look at what we needed to do to accomplish that, and came across some researchers at Columbia. They presented a paper at Siggraph and we started to collaborate with them. It’s that sort of workflow.

I'm surprised there's so much academic connection to your work; I assumed that digital effects were mostly driven by the movie industry.

A lot of our researchers come from academia. We have visiting researchers who spend some time with us; it’s an open door. And we contribute back as well; we publish a number of papers on the techniques we’re doing. We just got an Academy science and technical award for the tissue system that we used for all the muscle and skin dynamics. There’s a back current of academic research that underpins a lot of this. I went to my first Siggraph in ’84 and people still go every year. It’s where the ideas get brought out and everyone looks at what the advances can be. We look to Siggraph and forums like that extensively, because we’re involved in so many different areas. We’re doing physical simulation, we’re doing rendering, including physical simulations of fire and water. We have our own software that we’ve written in house to do that. You see a lot of that in Dawn, in all the battle scenes. To create worlds we have to know how to grow trees, build landscapes, make clouds, and we have to figure out light transport for all of this. There’s really no other way to do it than to understand how nature does it and see if we can build tools to do the same thing.

You've come a long way, baby: The apes back in 1968. (Credit: 20th Century Fox)

Does the process run the other way? Does the work you do find applications in academia?

One of the researchers who did a lot of our initial research into facial simulation is now at the University of Auckland, running a medical lab there where they are trying to create synthetic muscles and looking at facial reconstruction, so it does go back and forth. A lot of [our kind of work] shows up in other applications of graphics itself. You might see a solution being picked up for architectural rendering or automotive design. These are really hard problems to solve; it’s fascinating when you really get down into it. We’ve been writing this new renderer and having a lot of difficult issues with tracing light rays everywhere in our huge scenes. One of our guys started looking around and asking, who else is having this problem? He realized, hey, this is actually a problem in neutron transport. He started digging into the literature on neutron transport, figured out how we could apply those techniques, and managed to publish a paper on that as well. It’s interesting where this all leads you.
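The neutron-transport connection Letteri mentions is natural: photons in a renderer and neutrons in shielding both travel in straight lines between random interaction events, with free-flight distances drawn from an exponential distribution. A toy Monte Carlo version of that shared machinery, estimating unscattered transmission through a slab (illustrative only; the analytic answer is the Beer-Lambert law, exp(-sigma * thickness)):

```python
import math
import random

def transmission_mc(sigma, thickness, n_paths=20000, seed=1):
    """Estimate the fraction of particles that cross a slab without
    interacting. Sample each particle's distance to its first
    interaction, d = -ln(u) / sigma, and count the ones whose first
    interaction lies beyond the far side of the slab."""
    rng = random.Random(seed)
    through = 0
    for _ in range(n_paths):
        # 1 - random() lies in (0, 1], so the log is always defined.
        d = -math.log(1.0 - rng.random()) / sigma
        if d > thickness:
            through += 1
    return through / n_paths
```

For sigma = 1 and thickness = 1 the estimate converges on exp(-1) ≈ 0.368, which is why literature written for reactor physicists could be read straight across into a film renderer.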