The news from Tesco that the supermarket giant plans to install facial recognition devices that can serve up more relevant advertising to customers on miniature screens while they stand at the tills has sparked ferocious debate.


The technology will make educated guesses about customers’ age and sex before generating advertisements aimed at whichever demographic Tesco deems they fall into. Already, questions have been raised about possible legal issues under the Data Protection Act, both with the device itself and with how Tesco processes the information gathered from the cameras.

A spokeswoman for Tesco was quick to insist that there is no risk to customers’ personal information: "No data or images are collected or stored and the system does not use eyeball scanners or facial-recognition technology," she said.

The Data Protection Act governs how organisations process and store personal data. What can be determined so far is what Tesco has told us: the system doesn’t store information and does not process data for the purposes of the Data Protection Act.

If no data is stored in Tesco’s computer system, then on a strict reading of Section 1 of the Data Protection Act 1998 it probably does not count as personal data at all.

However, we have moved on to a more flexible understanding of what personal data means. For the Article 29 Working Party, the test is, in effect: “am I likely to be identifiable from this data?”

For example, a postcode may have a hundred people living within it, and a postcode by itself would not be personal data because it doesn’t relate to any one of those hundred people. However, what if the postcode only had one person living within it? Then the postcode would be considered an identifiable detail and therefore personal data.
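The postcode example can be sketched in a few lines of code. This is a purely hypothetical illustration with invented data, not anything drawn from a real register: the same detail flips from anonymous to identifying once it singles out one individual.

```python
# Hypothetical data: how many people share each (invented) postcode.
residents_per_postcode = {
    "AB1 2CD": 100,  # shared by a hundred people: not identifying on its own
    "ZX9 8YW": 1,    # a single resident: the postcode points to one person
}

def is_identifying(postcode: str) -> bool:
    """A detail amounts to personal data once it singles out one individual."""
    return residents_per_postcode.get(postcode, 0) == 1

print(is_identifying("AB1 2CD"))  # False
print(is_identifying("ZX9 8YW"))  # True
```

The point of the sketch is that nothing about the postcode itself changes; only the surrounding context determines whether it identifies someone.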

The Data Protection Directive makes this clear: if the person is identifiable, then it is personal data even if the data controller itself is unable to identify the individual. For the Article 29 Working Party, it is the principle that matters, so that the definition can keep pace with future technologies that capture personal information.

Why is this important? Because a data controller may collect information on subjects that does not look personal at all. However, if that information were to fall into the hands of someone looking to use it for nefarious purposes, it could suddenly make a person identifiable.

If it isn’t personal data, then none of the collection and processing rules apply; but this isn’t clear from the little we can glean about the technology in present circumstances.

Let’s make some assumptions about how the technology works: a customer stands at the till and a camera scans their face. An algorithm then looks for physical characteristics that commonly distinguish men from women, such as a beard, a moustache, or long hair.

The information is then used to determine which advertisements to place on the screen. One school of thought is that this data would be personal: the features the camera captures are the very features by which a person could be identified in the image Tesco has taken, so the data processed refers to an identifiable person.
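The assumed pipeline above can be sketched as a small rule-based classifier. To be clear, this is not Tesco’s actual system, whose workings are unknown; the feature names and the ad inventory are invented for illustration, and the rules simply mirror the beard/moustache/long-hair examples given above.

```python
# Hypothetical sketch of the kind of rule-based classification described
# above -- NOT Tesco's actual system. Feature names are invented.
def guess_demographic(features: dict) -> str:
    """Map coarse facial features to an advertising demographic."""
    if features.get("beard") or features.get("moustache"):
        return "male"
    if features.get("long_hair"):
        return "female"
    return "unknown"

def pick_advert(demographic: str) -> str:
    # Invented ad inventory, purely for illustration.
    adverts = {"male": "razors", "female": "shampoo", "unknown": "groceries"}
    return adverts[demographic]

demographic = guess_demographic({"beard": True})
print(pick_advert(demographic))  # razors
```

Note that in this sketch no image or identifier is retained: only the coarse classification result drives the advert, which parallels Tesco’s claim that nothing is collected or stored.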

But what if the information is not considered personal data under the Data Protection Act? Well, Tesco can store it all. In fact, there might not even be any engagement with the Article 8 right to privacy in the European Convention on Human Rights. In Naomi Campbell v MGN, Lady Hale made clear that a person who pops out to buy a pint of milk has no reasonable expectation of privacy.

This is good law, and Tesco may have a valid case when arguing that its new ad service doesn’t violate the Data Protection Act.