In a speech at last year’s Re:Publica conference in Berlin, software designer and privacy advocate Aral Balkan asked a provocative question: If slavery is the business of buying and selling physical human bodies, “what do you call the business of selling everything else about a person that makes them who they are apart from their physical body?”

Balkan was referring to what Harvard professor Shoshana Zuboff calls “surveillance capitalism,” the business logic pioneered by companies such as Google and Facebook that has made our personal data the defining natural resource of the 21st century. Consumers have caught on to this trend: A new Pew poll found that 91 percent of American adults agree that consumers have lost control over how companies collect and use their information.

There’s also a deeper, more disturbing dimension to Balkan’s question that’s best illustrated by a sequence from the British sci-fi series “Black Mirror.”

The vignette begins with a woman preparing to undergo a mysterious medical procedure. Moments later, she awakens in an empty white room. A man — communicating with her through a small egg-like device that sits on a kitchen table — tells her the operation was successful. But, as in every episode of the series, the horrible truth quickly comes into view.

It turns out the purpose of the woman’s operation was to create a perfect digital copy of herself, one sharing all her memories, emotions and personality. The woman we’ve been watching in the white room isn’t the woman herself but her simulacrum. She was created, the man explains, to serve as a digital concierge for the “real” woman’s high-tech home — making her food, scheduling her appointments, anticipating her every desire. When she protests, the man — a kind of futuristic home technician — manipulates the egg device to torture the simulated woman, making her experience months of sleepless solitary confinement in a few moments.

The story’s most frightening theme, however, is that simulating humans has consequences for the people being simulated. Later in the episode, police extract a confession from another character by copying him and emotionally manipulating his simulated self inside a virtual environment, which the doppelganger believes to be real. The implication is that anyone in this future can be copied, analyzed and interrogated without their consent.

This particular example of simulated life is, admittedly, a far-fetched dystopian fable. But it’s chillingly prescient in the age of Big Data. Through mass surveillance and data mining, it’s fair to say that anyone who uses the Internet or owns a smartphone has copies of their digital identity made on a daily basis. Advertisers record our every click and track our physical location as we browse the Web, and data brokerage companies such as Acxiom and Experian then assemble this and countless other pieces of personal information (ethnicity, sexual preferences, credit score, family history) into an ersatz simulacrum — a digital shadow invisible to us but accessible to marketers and other unknown entities.