The strategy she and Howe chose was to drown out the signal of the user’s search profile in a din of background noise. The first version of TMN randomly picked terms from a static list and fed them into four popular search engines, including Google; to mock the idea of Web surveillance, its creators populated the list with sensitive terms like bomb and HIV. “It freaked people out,” Nissenbaum says. “We got e-mails saying, ‘I like your idea, but I don’t want the FBI to think I am a terrorist.’” In TMN’s more recent versions, the obfuscating terms come from a list that combines recent popular searches and RSS feeds from well-trafficked sites like CNN.com and NYTimes.com. The list evolves over time, uniquely for each user; every once in a while, a randomly selected term is replaced with a new one chosen from the results of a fake search involving that term. And users can tailor the noise by selecting any RSS feed from across the Web as their source for fake terms—an ironic bit of personalization, given the software’s intent.
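The mechanism described above can be sketched in a few lines. This is an illustrative toy, not TMN's actual code: the seed list, class, and method names are invented here, and the real extension draws its terms from RSS feeds and popular-search lists rather than a hard-coded pool.

```python
import random

# Hypothetical stand-in for TMN's term pool; the real extension
# seeds it from RSS feeds and lists of recent popular searches.
SEED_TERMS = ["election results", "weather radar", "flu symptoms", "used cars"]

class NoiseGenerator:
    """Minimal sketch of TrackMeNot-style query obfuscation."""

    def __init__(self, terms, rng=None):
        self.terms = list(terms)
        self.rng = rng or random.Random()

    def next_query(self):
        # Pick a random decoy term to submit to the search engine,
        # drowning the user's real profile in background noise.
        return self.rng.choice(self.terms)

    def evolve(self, old_term, fake_result_words):
        # Occasionally replace a term with a word drawn from the
        # results of a fake search on that term, so each user's
        # list drifts in a unique direction over time.
        if old_term in self.terms:
            idx = self.terms.index(old_term)
            self.terms[idx] = self.rng.choice(fake_result_words)
```

Because each replacement term is sampled from results the previous term produced, no two users' pools stay identical for long — which is the point of the evolving list.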

But while search privacy may be desirable to users, is it a good thing for society? After all, monitoring searches and responsibly mining search logs can further the common good. For example, epidemiologists use Google search data to track the spread of influenza. Google and other companies also claim that records of searches help them improve their search engines and prevent click fraud—the nefarious and sometimes automated clicking of links by those seeking to drive up their advertising revenue.

In fact, some searches could be viewed as a form of dialogue between citizens and their government. Why shouldn’t what constituents are exploring online be the government’s business in a healthy democracy? A spike in searches on “student loans” in New Orleans, for example, could help education officials decide whether to expand local college-aid programs.

TMN’s creators say such uses would be more palatable if search companies committed to a responsible standard for anonymizing, storing, and purging search data. That includes giving users the option of wiping out all trace of their searches, as Ask.com has done by launching its Eraser feature. Howe says the pressure from privacy advocates is bearing fruit. In September, Google announced that it would anonymize IP addresses stored in search logs after nine months instead of the previous 18-month time frame. “TrackMeNot is a mechanism that allows individuals to say that we are not going to just accept all the conditions imposed by larger actors in the online environment,” Nissenbaum says. “However, the world toward which TrackMeNot strives is one in which it is no longer necessary.”
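The anonymization Google announced amounts to discarding part of each logged IP address after the retention window. The sketch below is illustrative only — the companies' actual log pipelines are not public — and assumes the commonly described approach of zeroing an IPv4 address's final octet so a stored entry no longer pinpoints a single host.

```python
def anonymize_ip(ip: str) -> str:
    """Illustrative log anonymization: zero the last octet of an
    IPv4 address, leaving the network prefix for aggregate analysis
    (e.g., flu tracking) while dropping the host-level detail."""
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError("expected a dotted-quad IPv4 address")
    octets[-1] = "0"
    return ".".join(octets)
```

Note the trade-off the article hints at: keeping the /24 prefix preserves the regional signal useful to epidemiologists and fraud detection, while removing the part most useful for identifying an individual.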