I recently gave a series of opening keynotes on The Future of Customer Experience as part of a roadshow for omnichannel customer experience platform provider Genesys, which is running a global series of events for its lead customers, including organizations such as News Limited, Vodafone, Western Union, and the Australian Taxation Office.

The central theme of my keynotes was the boundaries and relationship between humans and machines in customer experience.

Today, extraordinary insights from data and analytics enable us to address individuals' unique preferences to an unprecedented degree.

Yet the emotion, empathy and engagement of humans cannot be replaced – we all seek personal connection and a real sense of caring.

Virtual agents mimic humans in providing customer service. Until recently they have been very crude, little more than animated puppets. Yet the state of the art is progressing very rapidly.

The video below shows a real-time rendering of a human face being taken through a range of emotions. Have a look to see how far the state of the art has advanced.

This face takes 2 teraflops of computing power to render. To put that in context, that is the equivalent of roughly 1,000 Cray-2 supercomputers, machines that not so long ago were the extraordinary pinnacle of computing.
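The comparison is simple arithmetic, shown in the sketch below. It assumes the Cray-2's peak performance of about 1.9 gigaflops, a published figure for that machine; the 2-teraflop render cost is the number quoted above.

```python
# Rough comparison: face-render compute vs. Cray-2 supercomputers.
face_render_flops = 2e12   # 2 teraflops, as quoted for the real-time face render
cray2_peak_flops = 1.9e9   # Cray-2 peak performance, ~1.9 gigaflops

# How many Cray-2s would it take to match the render's compute budget?
ratio = face_render_flops / cray2_peak_flops
print(f"Equivalent to roughly {ratio:,.0f} Cray-2 supercomputers")
```

The exact multiple depends on which Cray-2 benchmark you use, but any reasonable figure puts the answer around a thousand machines.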

This is the state of the art, but it won't be long before this kind of technology can render a human face on a video screen that we cannot distinguish from a real one. Speech recognition and conversation that passes the Turing test still have further to go, but it is fascinating that we can already generate a human face that is virtually real.