(Dado Ruvic/Reuters)

Google has told investors (and the world) that it has access to information about roughly 70 percent of credit and debit card transactions in the United States. As a business matter, this is hugely important, because combined with location data, and the user data it collects from search, YouTube, Android, Chrome, and Google Maps, Google can now tell advertisers how often people who see an ad Google served them then walk into a brick-and-mortar store and actually buy the item.

As a business matter, I do wonder how much good this actually does. Sometimes having so many data points allows you to tell any story you want. Google serves an appliance ad across a tiny portion of your web browsing experience, and then, for a short time, you occupy a store with that item in it — a store that itself occupies a tiny portion of the geographical space around your town. Until Google is wired directly into your brain, I wonder how it can logically connect the two events.

But the article brought to mind an idea I've had: that the social networks and other tech giants like Google are passively collecting so much data on human activity that the data trove they are sitting on is itself a kind of socially radioactive material. I may have picked up this metaphor from programmer and entrepreneur Maciej Ceglowski, who gave a talk on this theme a few years ago.

Basically, Ceglowski’s thesis is that collected and searchable personal data, like nuclear waste, has a dangerous power that may outlive the institutions that created it.

[I]nformation about people retains its power as long as those people are alive, and sometimes as long as their children are alive. No one knows what will become of sites like Twitter in five years or ten. But the data those sites own will retain the power to hurt for decades.

There are fewer and fewer ways to live a normal life and opt out of this information economy. Your car passes cameras that pass on data about your location. Your car will soon be transmitting data itself. We are asked to trust not only all the institutions we knowingly give our personal data to, but all the successor institutions that may acquire it — whether through a buyout, or simply as a condition of doing business. What would some of our Silicon Valley giants do to open up China to their business? Tech firms in China already have agreements to share data with the government.

Even the knowledge that all this data is being collected on us has social costs. As our helplessness to avoid being captured by corporate databases and suffering the consequences of leaks becomes more evident, people may live increasingly as if they are being monitored by a potentially hostile agent. It also has the potential to destroy trust. Data leaks can destroy people not only for what they reveal about their true behavior; messy and incomplete data could merely suggest things about a person's behavior without proving them. Or faked data could be interspersed into batches of real personal data to incriminate.

We need to be thinking much harder about the problems this creates.