“The commons is the cultural and natural resources accessible to all members of a society,” quoth Wikipedia, “held in common, not owned privately.” We live in an era of surveillance capitalism in a symbiotic relationship with advertising technology, quoth me. And I put it to you that privacy is not just a virtue, or a value, or a commodity: it is a commons.

You may well wonder: isn’t privacy pretty much definitionally “owned privately”? What does it matter to you, or to me, much less to society as a whole, if some 13-year-old somewhere (and her legal guardians) decide to sell her privacy to Facebook for $20 a month? OK, maybe you think rootcerting a teenager is sketchy — but if an adult chooses to sell their privacy, isn’t that entirely their own business?

The answer is: no, actually, not necessarily; not if there are enough of them; not if the commodification of privacy begins to affect us all. Privacy is like voting. An individual’s privacy, like an individual’s vote, is usually largely irrelevant to anyone but themselves … but the accumulation of individual privacy or lack thereof, like the accumulation of individual votes, is enormously consequential.

As I’ve written before, “This accumulation of data is, in and of itself, not a ‘personal privacy’ issue, but a massive public security problem. At least three problems, in fact.” Those are:

1. The absence of privacy has a chilling effect on dissidence and individual thought. Private spaces are the experimental Petri dishes for societies. No privacy means no experimentation with anything of which society disapproves, especially if it’s illegal. (Which, please note, in recent memory includes things like marijuana and homosexuality; there is a long, long history of “illegal today” becoming “acceptable tomorrow” as societies become less authoritarian.)

2. If privacy becomes a commodity, one that only the rich can afford, then the rich can and will use this information asymmetry to threaten and persecute people who challenge the status quo, thereby perpetuating it.

3. Accumulated private data can and probably will increasingly be used to manipulate public opinion on a massive scale. Sure, Cambridge Analytica were bullshit artists, but in the not-too-distant future, what they promised their clients could conceivably become reality. No less an authority than François Chollet has written, “I’d like to raise awareness about what really worries me when it comes to AI: the highly effective, highly scalable manipulation of human behavior that AI enables, and its malicious use by corporations and governments.”

We may conclude that while our privacy, individually, is usually close to meaningless, collectively it is a critically important commons. Anything that eats away at our individual privacy, especially at scale, is a risk to that commons.

I’m not saying that a single person selling Facebook rootcerted access to everything they do for $20 a month is a major civic problem, or that it’s ethically wrong for them to do so. Selling their privacy may be a perfectly reasonable and justifiable individual decision, in the same way that letting one’s cow graze on Midsummer Common probably makes a lot of sense for both cow and owner.

What I am saying is that selling privacy cheaply isn’t any better for society than letting it be seized without any compensation. In fact, if the commodification of privacy leads to a more rapid degradation of the commons, it’s actually worse. Similarly, individual votes are essentially never that important … but would you think it OK for a company to purchase citizens’ voting rights for $20 per person per month? If we need to defend privacy as a commons — and we do — then we can’t start thinking of it as an individual asset to be sold to surveillance capitalists. It, and we, are more important than that.