Claims made that AI can tell your sexuality from your face have raised pressing ethical questions and significant concern in the LGBT community.

Artificial Intelligence is often hitting the news these days, both as the answer to our problems and the bringer of new ones. But Stanford University research claiming that AI can find subtle differences in human faces that can denote sexuality really grabbed the headlines. The New York Post thundered excitedly: “This AI Can Tell If You Are Gay or Straight”.

This claim was very widely reported. Even The Guardian ran with the headline “New AI can guess whether you’re gay or straight from a photograph”. LGBT site Pink News reported “experts” had “found that intimate traits in a face can be picked up by machine, which humans are not capable of spotting”.

But did the Stanford paper actually show this? No, it did not.

The heart of the research was data from “facial images posted on a public US dating website”. The users’ sexuality was established “based on the gender of the partners they were looking for”. To be honest, we don’t need to go much further than this. The idea that a US dating site delivers reliable, verifiable information about sexuality is ridiculous in the extreme.

However, I will go further. As many have pointed out, the scope was hopelessly incomplete: the dating site data was cut off at age 40, only white people were included, and no one living outside the US. Most obviously, defining sexuality as two options and ignoring bisexual people shows the researchers are not serious about their subject. And that is the most generous interpretation.

It is astonishing that, as Ellen Broad pointed out, “the authors make no reference to ethical processes undertaken in designing the study”, and that, because of how the data was gathered, there was no explicit consent from the people whose faces were used.

The Economist, clearly aware of the problems with the work, did try to make the best of it: “The point is not that Dr Kosinski and Mr Wang have created software which can reliably determine gay from straight. That was not their goal. Rather, they have demonstrated that such software is possible.” No. Reading the paper, this was clearly their goal, and they did not demonstrate it was possible in any meaningful sense.

What they did demonstrate is that people are willing to try to develop technology to hunt out LGBTQ people; no one can seriously think the intention was to develop technology to identify straight people.

For this reason it would be foolish to dismiss this story as overblown headlines for a questionable piece of research. LGBTQ people know well the ideology behind it: the idea that we are inherently different from the “normal”, that we are strange, deviant, somehow broken. And that “brokenness” can be detected, in this case through phrenology in 21st-century clothes.

This study has to be set in its context. The story ran in the same week that UK daytime TV invited on an advocate of gay ‘cure’ (that is, eradication) therapy. The same week also saw Stonewall report that hate crime against LGBT people had risen by 80% over four years. Even for those of us who have been ‘out’ for many years, the ability to control how we declare our identity, and the information held about us, is literally a matter of life and death.

The other part of the context is that all our faces are becoming a technological battleground.

The Biometrics Commissioner has warned that the police’s use of more than 20 million facial images exceeds the original intended purpose. The Metropolitan Police used facial imaging to analyse people attending the Notting Hill Carnival.

Equally, companies see our faces as resources: Apple announced that its latest phone can be unlocked using ‘FaceID’ facial recognition, while Facebook, Snapchat and the rest are built on selfie culture, linking our images to intimate details about our lives.

Both governments and companies now see our faces as information they are entitled to use and retain. In this context, ethics-free facial recognition of sexuality is irresponsible, particularly when it is flawed. Mass surveillance and technological targeting are dangerous for LGBTQ people.

Until researchers, companies and governments take on the concerns of the LGBTQ community when developing and regulating technology, we will be at risk and marginalised. And we will recognise this: Big Brother is a straight guy.