How can two vfx studios work on the same character, anyway?

Sharing a character might sound unusual at first, but as mentioned, Marvel does this regularly. Partly, it’s to spread the work around and ensure no single facility is over-burdened. And partly, it’s because it can be done. Digital scans, motion capture data, and cg models can be shared among vfx facilities, even though each may have proprietary techniques for bringing together the final character. Indeed, Digital Domain and Weta Digital do have different approaches, but it was also the job of overall visual effects supervisor Dan DeLeeuw and the Marvel production team to review and compare the cg Thanoses and push each studio towards a similar result.

Before any shots were done, however, Marvel actually had Digital Domain and Weta Digital do a Thanos test. “They shot some random clips of Josh Brolin just being Thanos without any story,” said Digital Domain animation director Phil Cramer. “We got to effectively make a little short film with Thanos, showcasing how close we could go to his face as we transferred Josh’s performance.”

Once actual shooting began, both Weta Digital visual effects supervisor Matt Aitken and Digital Domain visual effects supervisor Kelly Port joined DeLeeuw on set, and continued interacting during post-production. “It was a very collaborative process,” said Aitken. “I guess it’s kind of an overhead for Marvel, because they’re paying for two complete builds. But they would be the first to admit that they could cherry-pick the best of both worlds. So, early on when we were modeling the shape of the mouth, Marvel preferred the work that Digital Domain was doing in that respect. And then some of the work around the rest of his face, say his brows and his eyes and his chin, they liked what we were doing.”

DeLeeuw would send each vfx vendor updates on the other vendor’s progress to help with continuity. “For example,” said Port, “we got some notes on getting more textural detail on the wide shots of his arms because it was looking too smooth and feeling too cg. Or we’d be working on Thanos’ armour and present that to Dan; he thought it was cool, so we’d hand off that pattern to Weta so that they could match it into their armour.”

Facing the future with Thanos

Thanos, of course, is not a one-to-one translation of Brolin, but in creating a cg character, both studios preserved the essence of the actor’s performance. That was made possible, firstly, by digitally scanning him (via Disney Research Zurich’s Medusa process). On set, Brolin then wore a motion capture suit and a head-mounted facial camera system, with his face covered in tracking dots. Brolin also recorded Thanos performances in a separate motion capture volume for particular scenes and to concentrate on certain moments.

Both Weta Digital and Digital Domain have for several years been developing their own facial animation pipelines to handle digital humans and the translation, or re-targeting, of actor facial capture onto some other kind of character or creature. Weta Digital has refined its approach on the various Apes films, and relied on those techniques again for Infinity War.

Weta’s Aitken said that for Thanos, a particularly ‘fleshy’ character who appears in several fight scenes in the film, the studio steered clear of any dynamic simulations on his face. “We won’t do the geometry of the face and solve it, because we want to retain pixel-level, vertex-level control of the movement of the face – that’s very precious to us. We will add dynamics to the face, but we’ll do it by creating target shapes as part of the facial animation system.”
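To make that idea concrete, here is a minimal, purely illustrative sketch of driving a pre-sculpted ‘jiggle’ target shape from head motion rather than simulating the flesh. None of this is Weta Digital’s actual system; all names, sizes, and numbers are invented.

```python
import numpy as np

# Toy sketch: instead of a physics solve on the face, blend in a
# sculpted "flesh lag" target shape whose weight is driven by head
# acceleration. The animator keeps vertex-level control, because the
# dynamics live in an ordinary animatable shape weight.
rng = np.random.default_rng(2)
n_verts = 200
neutral = rng.normal(size=(n_verts, 3))            # neutral face mesh
jiggle_shape = 0.05 * rng.normal(size=(n_verts, 3))  # sculpted delta

# Head height over one second of animation (e.g. during a punch),
# sampled at 24 frames.
t = np.linspace(0.0, 1.0, 24)
head_y = np.sin(2 * np.pi * t)

# Drive the target-shape weight from acceleration, normalized to [0, 1].
accel = np.gradient(np.gradient(head_y, t), t)
weight = np.abs(accel) / np.abs(accel).max()

# Per-frame face: neutral plus the weighted dynamic target shape.
frames = neutral[None] + weight[:, None, None] * jiggle_shape[None]
```

Because the weight is just another animation curve, it can be edited or keyed over by hand, which is the control Aitken describes wanting to retain.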

Meanwhile, Digital Domain introduced a new approach to facial animation with Infinity War centered on machine learning, and led by the studio’s head of digital humans, Darren Hendler. The technique, dubbed Masquerade, takes frames from a helmet-mounted camera system to create a high-resolution actor face scan. Machine learning, trained on previously collected high-resolution tracking data, was then used to turn around 100 facial data points from a mocap session into roughly 40,000 points of facial motion data.

“It’s like training data,” said Port. “Each time you do it, the automatic solution gets better and better. And then if it’s off, you give it a little correction, and over the course of the production the automatic solution gets more and more robust and more and more correct from the very beginning.”
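As a rough illustration of that kind of sparse-to-dense solve, here is a toy sketch that fits a simple linear mapping from tracked markers to a dense mesh on synthetic data. It is far simpler than Digital Domain’s actual machine learning approach; every size and value here is invented, and the mesh is shrunk from the article’s ~40,000 points to keep the example light.

```python
import numpy as np

# Toy sketch only: learn a mapping from ~100 sparse facial markers to
# a dense face mesh, using prior high-resolution frames as training
# data. Real systems use far richer models; this is least squares.
rng = np.random.default_rng(0)
n_markers, n_vertices, n_training = 100, 4_000, 300

# Pretend the dense training frames come from a hidden low-dimensional
# "expression space" (as real facial motion roughly does).
basis = rng.normal(size=(10, n_vertices * 3))
coeffs = rng.normal(size=(n_training, 10))
dense_frames = coeffs @ basis                     # high-res training scans

# The sparse markers are a fixed subset of the dense vertices.
marker_idx = rng.choice(n_vertices, n_markers, replace=False)
def sparsify(dense):
    return dense.reshape(-1, n_vertices, 3)[:, marker_idx, :].reshape(len(dense), -1)
sparse_frames = sparsify(dense_frames)

# "Training": least-squares fit predicting dense motion from markers.
W, *_ = np.linalg.lstsq(sparse_frames, dense_frames, rcond=None)

# "Inference": a new frame where only the markers were captured on set.
true_dense = rng.normal(size=(1, 10)) @ basis
predicted_dense = sparsify(true_dense) @ W
err = float(np.abs(predicted_dense - true_dense).max())
```

The correction loop Port describes would amount to feeding the fixed frames back into the training set, so the solve improves over the course of the production.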

“For us,” added DD animation director Phil Cramer, “this was a crucial breakthrough because it was so important to capture every nuance of Josh Brolin’s performance. That was the first step. You need to get every wrinkle, every tensing on the skin on a very micro level, so we wanted to push the boundaries with this.”

Digital Domain’s other proprietary step in its facial animation process is Direct Drive, which takes the data from Masquerade and transfers it to the creature, i.e. a cg Thanos, by creating a correlation between the actor and the creature. Digital Domain animators still had absolute control over the animation rig for Thanos, where they could make corrections and key decisions, but the combination of the new tools was aimed at getting them to a high level of fidelity to Brolin’s original performance.
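One common way to build such an actor-to-creature correlation is through semantically matched blendshapes; the following toy sketch (not Digital Domain’s actual Direct Drive code, all data synthetic) solves for blendshape weights on the actor and replays them on the creature.

```python
import numpy as np

# Toy sketch of actor-to-creature facial transfer. The actor and the
# creature each have a set of blendshapes with matched meanings
# (smile, brow raise, ...), so weights solved on the actor's face
# can directly drive the creature's face.
rng = np.random.default_rng(1)
n_shapes, n_actor_v, n_creature_v = 8, 500, 900

actor_neutral = rng.normal(size=n_actor_v * 3)
actor_shapes = rng.normal(size=(n_shapes, n_actor_v * 3))        # deltas
creature_neutral = rng.normal(size=n_creature_v * 3)
creature_shapes = rng.normal(size=(n_shapes, n_creature_v * 3))  # matched deltas

# A captured actor frame built from known weights (e.g. smile + brow raise).
true_w = np.array([0.7, 0.0, 0.3, 0.0, 0.0, 0.5, 0.0, 0.0])
captured = actor_neutral + true_w @ actor_shapes

# Solve for the weights that reproduce the captured actor frame...
w, *_ = np.linalg.lstsq(actor_shapes.T, captured - actor_neutral, rcond=None)
# ...then replay those same weights on the creature's rig.
thanos_frame = creature_neutral + w @ creature_shapes
```

In a production rig, animators would then adjust those solved weights by hand, which matches the article’s point that the animators kept absolute control over the final result.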

Purple challenges, and art-directing muscles

In the same way other vfx vendors have had to tackle a green Hulk in the Marvel Cinematic Universe, Weta Digital and Digital Domain had the unusual task of rendering Thanos as a purple character. That, admits Aitken, can be a tough color to achieve realistically on screen.

“I knew that we had an issue there when they brought a maquette that Legacy Effects had made on set for lighting reference,” said Aitken. “Our scenes were on Titan, which was lit to be very orange, and if you bring a purple thing into orange lights, it just looks gray, because there’s no overlap in the spectrum. So we actually had to tweak his color so that it read as being purple under the lights.”
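The effect Aitken describes can be approximated with simple per-channel arithmetic: a surface’s apparent color is roughly its reflectance multiplied, channel by channel, by the light’s color. The values below are invented for illustration; this is a crude RGB stand-in for the spectral overlap he is talking about.

```python
import numpy as np

# Illustrative arithmetic only. Purple needs a strong blue channel;
# an orange key light carries almost no blue, so the per-channel
# product loses the very component that makes purple read as purple.
purple_skin = np.array([0.55, 0.20, 0.60])   # reflectance: red + blue, little green
orange_light = np.array([1.00, 0.55, 0.15])  # orange key: strong red, almost no blue

neutral_lit = purple_skin * np.array([1.0, 1.0, 1.0])
orange_lit = purple_skin * orange_light      # -> [0.55, 0.11, 0.09]

# Under neutral light, blue rivals red and the skin reads as purple;
# under the orange key, blue collapses and it reads as muddy gray-red.
blue_vs_red_neutral = neutral_lit[2] / neutral_lit[0]   # ~1.09
blue_vs_red_orange = orange_lit[2] / orange_lit[0]      # ~0.16
```

Which is why, as Aitken says, the character’s color itself had to be pushed so that it still read as purple under Titan’s lights.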

With purple problems behind them, and facial animation and other issues solved, the two studios had perhaps only one other major stumbling block, and that was how muscly to make Thanos. The character is intended to be the most powerful warlord in the universe, which meant exhibiting some of his strength in his arms and legs. This even got a name: ‘adding hamsters.’

“The arms needed to always be alive,” said Cramer, “because we didn’t want to have the look of concrete shoulders or concrete arms on this huge character. Actually, Dan DeLeeuw always told us, ‘Just imagine he’s making a fist and exercising his hands off-camera to keep it alive.’”

Animators turned to a readily available source of reference for this muscle movement: footage of actor Chris Hemsworth, who plays Thor. “Out of all the people,” said Cramer, “we had so much footage of him without a shirt on or with exposed arms, and that guy just has so much muscle movement happening on every little twitch that he does. A traditional muscle system doesn’t really help you there because when an arm goes down, nothing much is going on. So really we ended up exaggerating Chris Hemsworth and art directed those muscles to taste.”