By Charles Foster

When you click ‘Like’ on Facebook, you’re giving away a lot more than you might think. Your ‘Likes’ can be assembled by an algorithm into a terrifyingly accurate portrait.

Here are the chances of an accurate prediction:

- Single v. in a relationship: 67%
- Parents still together when you were 21: 60%
- Cigarette smoking: 73%
- Alcohol drinking: 70%
- Drug use: 65%
- Caucasian v. African American: 95%
- Christianity v. Islam: 82%
- Democrat v. Republican: 85%
- Male homosexuality: 88%
- Female homosexuality: 75%
- Gender: 93%

Read the (very accessible) paper in full for details of the methodology. For present purposes, two further and related observations will do. First, most of the predictive power came from connecting pieces of information that, taken individually, would not have been particularly enlightening. And, second, ‘few users were associated with Likes explicitly revealing their attributes.’ Thus, for instance, fewer than 5% of gay users were connected with explicitly gay groups. This second observation might be particularly ethically significant: it suggests that, despite our willingness to disclose a lot to the world, there are some things about ourselves that we would rather not have known.

The conclusions generated by this algorithm are of course likely to be of great interest to many: the security services (they can now target their drug searches more effectively), insurance companies (have you lied on your application form about your smoking?), homophobic bigots (‘he’s been lying to us all along: let’s get him’), politicians (‘We know there are Republicans in that house: make sure they get dragged out to vote’), and retailers. The most benign-sounding of these, retailers, aren’t necessarily benign at all. The authors of the paper discuss the use of shopping records by a US retail network to diagnose the pregnancies of its female customers. The customers thought to be pregnant were then sent targeted offers. Those offers might be welcome, but they might also do great damage. They might, say the authors, reveal (or incorrectly suggest) the pregnancy of an unmarried mother in a culture where such a pregnancy is unacceptable. Or, I suggest, cause great pain to a woman who has just lost a much-wanted child.

Is it unethical to use such an algorithm? It might be said (and Hannah Maslen indeed argues powerfully for this position here) that if you put lots of little bits of information about yourself in the public domain, you can hardly complain if they are merely stuck together to form a more complete picture.

Surely it turns on consent. If it were universally understood that your ‘Likes’ can and will be used this way, there could indeed be no cause for complaint. But most users do not understand that.

If I invite you into my hall and kitchen, I’m not inviting you to look into my bedroom. If, unknown to me, you have x-ray glasses which you use to look up from the hall into my bedroom, I’ve been violated.

The low incidence of gay people ‘liking’ explicitly gay groups surely gives the lie to the suggestion that, by ‘liking’ anything, we’re impliedly giving the algorithm permission to rummage, unrestricted, through our lives.