University of Washington

This rubbery, alien facsimile of Tom Hanks' face may look like a bad waxwork -- but it's actually a computer-generated digital model created by a machine learning algorithm.

The model was created by researchers at the University of Washington who wanted to answer the question "What makes Tom Hanks look like Tom Hanks?" Is it his appearance, his mannerisms, or the way he moves?


To do so, they created a digital model from a large number of images available online -- unlike traditional facial capture technologies, the approach does not require the subject's participation.

With enough visual data, the algorithm can also animate the model to deliver speeches.


The system combines several technologies: 3D face reconstruction, facial tracking and alignment, texture modelling and puppeteering. Thousands of images of Hanks were combined to create a multi-layered, multi-angled '3D puppet'. The most common expression across the photos becomes the model's 'resting' face, with texture overlays recreating the many changes that occur when a face goes from a smile to a frown.
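The 'resting face plus texture overlay' idea can be sketched in a few lines. The snippet below is a loose illustration under stated assumptions, not the researchers' actual pipeline: it averages a stack of aligned face images to approximate a neutral base, stores an expression as a pixel-wise difference from that base, and adds the difference back to 'puppeteer' the face. All array names and shapes are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for thousands of aligned, same-size face crops
# (here: 200 random 64x64 grayscale images in [0, 1)).
aligned_faces = rng.random((200, 64, 64))

# Approximate the most common appearance with the pixel-wise mean:
# this plays the role of the model's 'resting' face.
resting_face = aligned_faces.mean(axis=0)

# An expression is stored as an overlay -- its difference from the
# resting face...
smile_photo = aligned_faces[0]
smile_overlay = smile_photo - resting_face

# ...and reapplied to move the model from neutral to smiling.
reconstructed_smile = np.clip(resting_face + smile_overlay, 0.0, 1.0)
```

In a real system the overlays would be applied to a textured 3D mesh rather than flat grayscale images, but the principle -- a shared neutral base plus per-expression deltas -- is the same.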

And it doesn't just work on Tom Hanks. The software can transfer expressions and movements onto someone else's face -- George Bush's expressions have been mapped onto a variety of celebrities, for example.

In theory, the technology could be used to bring real-world people to life in virtual reality -- such as a distant relative or friend. But as the current system requires thousands of different photos, that's some way off yet.