Until now, the vast majority of information collected about us has remained untouched — there was just too much to make sense of it all.

What's happening: Artificial intelligence allows data that might once have gone unnoticed to be detected, analyzed and logged in real time. It's already started supercharging surveillance at work, in schools and in cities.

Police are experimenting with facial recognition, either to help solve specific cases or to compare the faces of passersby against a database of suspects. This week, a U.K. court upheld police use of real-time facial recognition.
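The watchlist comparison step can be sketched as a nearest-match search over face embeddings. Everything here is hypothetical for illustration: real systems derive the embedding vectors from a trained neural network, and the names, vectors and threshold below are made up.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.8):
    """Return (name, score) for the closest watchlist entry, or (None, score)
    if no entry clears the threshold.

    `probe` is an embedding for a face seen on camera; `watchlist` maps
    names to stored embeddings. Both are hypothetical inputs.
    """
    name, ref = max(watchlist.items(),
                    key=lambda kv: cosine_similarity(probe, kv[1]))
    score = cosine_similarity(probe, ref)
    return (name, score) if score >= threshold else (None, score)
```

The threshold is the whole ballgame in practice: set it too low and the system produces false matches against innocent people, which is exactly the accuracy concern raised below.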

In New York City, software can detect crime patterns over time and across boroughs, linking events in a way humans never could.
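The kind of pattern-linking described above can be sketched as a pairwise similarity score over incident records, combining how close two events are in space, in time and in method. The field names, weights and decay scales below are hypothetical, not taken from any deployed system.

```python
import math
from datetime import datetime

def incident_similarity(a, b):
    """Score how alike two incident records are (0..1, higher = more similar).

    Records are hypothetical dicts with latitude/longitude, a timestamp,
    and a method-of-entry tag; the weights are illustrative only.
    """
    # Spatial closeness: decays with straight-line distance in degrees.
    dist = math.hypot(a["lat"] - b["lat"], a["lon"] - b["lon"])
    spatial = math.exp(-dist / 0.01)  # roughly 1 km scale at NYC latitudes
    # Temporal closeness: decays with the gap in days.
    gap_days = abs((a["time"] - b["time"]).total_seconds()) / 86400
    temporal = math.exp(-gap_days / 7)
    # Modus operandi: exact tag match.
    mo = 1.0 if a["entry"] == b["entry"] else 0.0
    return 0.4 * spatial + 0.3 * temporal + 0.3 * mo
```

Scoring every pair of incidents this way is what lets software connect events across boroughs and months, a volume of comparisons no analyst could do by hand.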

And in schools, cameras automatically watch for unusual behavior, and microphones listen for "aggressive"-sounding voices.

The big picture: Humans have monitored each other for as long as we've lived in communities, to punish free riders and troublemakers.

But now, cheap, powerful machines are taking the place of human watchers, disrupting a long-held social contract.

Unlike in China, where high-tech surveillance is a tool of fear and control, systems in the West remain decentralized for now, which curbs the scope of data gathering.

And tech companies like Facebook and Google have perfected online versions of automated surveillance for profit, in the form of products we can no longer live without.

Details: Software can identify and track faces, skin color, clothing, tattoos, walking gait and other physical attributes and behaviors. But the technology has been plagued by bias and accuracy problems that disproportionately harm people of color.

From facial expressions and body movements, AI can extrapolate emotions like happiness and anger — a process built on shaky scientific evidence.

The impact: This quiet shift from passive watching to active surveillance is chipping away at our ability to remain anonymous in physical and virtual spaces.

Blending into the crowd is no longer an option if every face in that crowd is captured, compared against a driver's license photo and logged.

Constant AI surveillance threatens to erode the all-important presumption of innocence, says Clare Garvie, a privacy expert at Georgetown Law.
