What is our privacy worth?

That’s the question we should be asking right now, as we make sense of a policy change that could cast a long shadow over the future of the internet, and indeed, over the nature of the economy itself. Privacy activists have been sounding alarms for the past week over the news that internet service providers (ISPs) will now be able to sell data on their customers’ web browsing and app usage—without so much as asking for consent.

That’s because the U.S. Congress just passed a bill overturning the FCC rule that prevented ISPs like Verizon, AT&T, and Comcast—who provide most consumers’ access to the internet—from selling their customers’ data the way Google or Facebook can. The FCC policy recognized that ISPs have a special relationship with consumers, because they can see literally everything we do online (unless you take heroic measures to prevent that). Once ISPs start collecting data and piecing it together, it will be nearly impossible for any of us to protect our own privacy: after all, you might find a way to shield your activity from your ISP, but as soon as you send me an email, that communication is only as private as my ISP allows.

You might think that such a massive threat to online privacy would have the whole country up in arms. As is so often the case, however, much of the internet is sitting this one out. It’s notoriously hard to get people excited about online privacy issues, and no wonder, since many people don’t really understand the difference between their ISP, their web browser, and Google. In Europe, privacy laws mean that not only ISPs, but all internet companies, are strictly limited in the kind of data they can collect and sell. In America, we’ve just leveled the playing field between ISPs and other online services—but we’ve done it by reducing regulations on everybody, rather than by increasing the level of privacy protection we expect from the companies that trade in consumer data.

That’s consistent with a political culture that assumes that, most of the time, we can trust in a free market to sort things out. After all, nobody has to use Google (Bing, anyone?), just like nobody has to use Facebook (even if it doesn’t feel that way). If we don’t want them to collect and sell our data, we can just stick to the corners of the internet that Facebook and Google haven’t colonized. (I swear, they are out there.) Or we can pay for additional services, like VPNs, that keep Google and Facebook from knowing what we’re up to. See, we’ve got loads of choices!


The same thinking is already popping up around the deregulation of ISPs. If consumers care so much about their privacy, the argument goes, we’ll see ISPs compete with one another based on who offers the best privacy protection. Don’t like the idea of Verizon collecting the list of every pair of shoes you’ve looked at on the internet, and selling that list to Macy’s? No problem—you can just switch to an ISP that promises not to do that. (And at least for the moment, all the big players are promising not to do that…though I will look forward to checking in again next year, when this issue has faded from the headlines.)

But the idea of comparison shopping for privacy rests on the assumption that there’s a viable market of privacy-sensitive consumers. As James Nehf points out in his 2007 article, “Shopping for Privacy on the Internet,” we can’t afford to make that assumption. Nehf sums up the problem very neatly:

A system that relies on individuals to police their privacy rights presumes that individuals can value privacy rights meaningfully. If people do not know what information is being collected, how it could be used, and what harm might result from its collection and use, they have no way to judge how much it is worth to them (in time, money, or other trade-offs). To make an informed choice about whether and how to share personal information, and whether to make an effort to protect it, people need to know what is at stake.

Nehf notes that when making decisions that involve difficult trade-offs, people often try to minimize the emotional discomfort involved in making a hard choice. That’s especially true for choices that involve making apples-to-oranges comparisons, like deciding whether to go with the ISP that saves you $20/month, or the ISP that promises not to sell your data:

Because of the desire to minimize emotional conflict, people may avoid comparing attributes that are dissimilar, especially when asked to put a price on something she intuitively believes should not be compromised. A well informed consumer may learn that a Web site does not retain or sell personal information of any kind, but find it difficult to compare the value of that site’s privacy policy with different benefits (such as lower prices) from another site….What is the value of knowing that the details of one’s life are not sold to third parties? Is it worth giving up the benefits offered by the competing, but less private, alternative? Comparing disparate categories of benefits and costs is extremely difficult in any circumstance, and when making decisions about privacy the attributes we are asked to compare vary widely. The emotional conflict created by the comparison is heightened when a person is asked to put a price on something she believes should not be commodified or traded away. The problem is most acute when people are asked to trade values they view as sacred or protected; for most people privacy is such a value.

Somewhat paradoxically, that kind of emotional conflict makes people more likely to make emotional—rather than rational—decisions. “Rather than struggle to make a difficult comparison, individuals may turn to affect cues (feelings derived from a consumer’s experiences with a particular alternative) as a decision-making guide,” Nehf writes. “As a result, feelings generated from a Web user’s experience interacting with a Web site may affect decisions about sharing personal information, but those feelings can lead to inaccurate decisions. Feelings of confidence and security about a site, for instance, may not correlate with the site’s privacy practices.”

Nehf’s argument neatly underlines the reason we can’t afford to trust that the market will address any privacy concerns that emerge over ISPs selling consumer data. Precisely because privacy is such a fundamental value, it’s hard for people to put a price on it. That means that whenever they’re asked to make a “consumer” decision about what their privacy is worth, they’re likely to shift from rational to emotional decision-making…and to make decisions that put their privacy at risk.

All of this leads to one inevitable conclusion: we can’t expect people to be effective privacy “consumers.” And if people can’t be effective privacy consumers, then we can’t afford to leave privacy to the marketplace. That means we absolutely must regulate the privacy-compromising trade in consumer data, particularly when it comes to the ISPs who constitute the gateway and backbone of the internet itself.

Sadly, Nehf’s own work outlines why it’s hard to be hopeful for that outcome: