Science fiction films of the past can offer prescient glimpses of the future. Blade Runner (1982) featured video calls and voice-controlled devices, while 2001: A Space Odyssey (1968) included an uncannily accurate depiction of tablet computers. But when it comes to futuristic advertising, the reference point is always Minority Report (2002), set in the year 2054. John Anderton (played by Tom Cruise) is seen striding through a shopping mall while digital advertising boards call out to him, aware not only of his name and his face, but also of his tastes and desires. The film depicted a future of deep, data-driven connection between humans and a corporate world desperate to find out what we might want to buy.

Minority Report is still science fiction, but the barriers to this kind of precisely targeted advertising are no longer technological; they’re legal and ethical. Digital signage is becoming smarter and facial recognition technology more powerful, and only privacy laws – along with the ethical stances of advertising firms themselves – prevent the two from fusing. “For us, facial recognition is scary,” says Denis Gaumondie of French digital signage firm Quividi. “We’ve been asked to do it many times, but we’ve never accepted it.”

Are we lovin' it?

Two recent events have highlighted the advances being made in this sector. A couple of weeks ago, fast-food giant McDonald’s announced the launch of “dynamic menus” in more than 1,000 of its stores. These take the form of touchscreens that, while taking your order, suggest additional purchases based on the items already in your basket, availability in the kitchen and other factors such as the weather. While this is a modest form of personalisation, Steve Easterbrook, chief executive of McDonald’s, hinted that screens at drive-thru locations could, in the future, display your recent purchases after scanning your car’s licence plate.

A customer views a digital menu at a drive-thru outside a McDonald's restaurant in the US. Bloomberg

A few days earlier, journalist Matthew Brennan tweeted from Chengdu Shuangliu International Airport in China, noting that a smart sign had recognised his face and directed him to the correct gate. It was a prime example of publicly situated, consumer-facing technology that knows who you are and what you want.

Smart signs don’t have to know who you are in order to be smart. Quividi has worked extensively with camera-equipped signage that can tell if it’s being watched, but Gaumondie stresses the importance of distinguishing between facial recognition and what he terms “facial detection”. “Facial detection is estimating the presence of a face,” he says. “So we detect gender, age, the presence of glasses, a beard or moustache. We can detect five states of mood, from very unhappy to very happy, the distance someone is standing from the screen, and for how long they look at it. But this processing happens locally, and images aren’t stored for more than one tenth of a second. We’re very careful about privacy.”

Facial detection, Gaumondie says, enables a more meaningful connection with the viewer without breaching ethical boundaries. Adverts can be tailored to broad demographics, and on-screen characters can respond to viewers’ emotions or movements. But while Quividi has achieved great success with its campaigns – particularly those involving charities – people will always be suspicious of personalisation, says Evan Selinger, professor of philosophy at the Rochester Institute of Technology and a senior fellow at think tank The Future of Privacy Forum. “Studies suggest that people actually don’t like targeted ads,” he says, “at least not Americans. Adults, particularly well-informed ones, tend to wish they weren’t subjected to it.”

Advertisers will talk of the benefits of “relevant” advertising and how it boosts engagement, but Gaumondie also recognises that the industry can’t rock the boat. “You can be in tune with people without being aggressive,” he says. “We’re never going to have a sign saying ‘Hey, you with the glasses, come closer!’”

The pitfalls of personalisation

Establishing a precise level of personalisation that doesn’t alienate the public isn’t easy. Even within the field of facial detection there are problematic areas, such as serving adverts based on ethnicity – perfectly possible, but ethically dubious.

In China, fully fledged facial recognition is part of daily life. In Jinan, Shandong Province, jaywalkers are publicly shamed at busy junctions by a combination of screens and cameras; screens grant admission to buildings, offer personalised guidance and, in one particularly notable example, limit the use of toilet paper in public conveniences. Chinese culture now permits a close connection between the personal and the corporate; 300 branches of fast-food chain KFC offer a “Smile-To-Pay” option, where a face scan can debit your bank account. If the scenes in Minority Report ever transpire in the real world, they will likely happen in China first.

The capabilities of facial recognition are becoming well understood, and this leads to an assumption that precisely targeted advertising of this kind is all but inevitable, given the strength of the business case for it. The legal battleground over our biometric data is fluid and varies by country; Google recently argued, in a case brought against it in the US, that the collection and use of biometric data is only problematic if it causes harm, rather than being inherently bad. Selinger says many factors determine whether facially aware advertising screens pose a risk to us, including the type of information being collected and how aware we are of it happening. “Much depends on the forms the advertising takes and where and when it is delivered,” he says. “These will impact how vulnerable we are to being manipulated.”

Our senses are already engaged in a constant, low-level battle against sales pitches, whether it’s invasively loud television ads or pleasant aromas in malls. Facial detection and recognition give advertisers the upper hand in that battle; the question is whether the display of tailored messages is a personal benefit or a gross imposition. Selinger worries that such technology is the thin end of a dangerous wedge. “They become part of a culture that normalises face surveillance, diminishes expectations of privacy and nudges ever-more-intrusive surveillance creep,” he says.

We can be thankful that existing smart signs, at least for the time being, fall short of the capabilities of the screens in Minority Report, which were described by their creator as being able to “not only recognise you, but also your state of mind”.