Involuntary surveillance on a large scale—driven by Moore’s Law—arrived shortly thereafter. Its primary instruments are cellphones, smartphones, GPS, and inexpensive cameras. With these devices, users need not be actively involved in creating information about their activities. They get little or nothing in return for involuntarily providing valuable information about themselves. The NSA does not provide services of any kind to cell-phone users in return for their metadata.

Nobody knows how quickly the cost of mass surveillance is declining or at what rate the practice itself is growing. What we do know is that existing participatory and involuntary surveillance technologies are proliferating, and that new ones are being introduced and becoming more effective every day. As costs drop, new frontiers in surveillance open up. Low-cost facial recognition will let governments and retail establishments track us even with our cellphones turned off and our loyalty cards left behind.

As the cost of automated surveillance continues to drop, there will be a rapid increase in surveillance applications. Disparate pieces of our personal puzzle will be brought together in monstrously large databases. Big data analysis tools will combine the bits and pieces to create a full picture of who we are, where we go, what we read and watch, what we do, and what we like. There will be files of facts about us such as our addresses, phone numbers, the calls we placed on our cellphones and where we were when we placed them, and the Internet sites we visited. But there will also be algorithmic predictions about our tastes, behavior, plans, opinions, thoughts, and health. Almost everything about us will be known or predicted. Those predictions may well become the self-fulfilling prophecies that determine our future.

While much of the world’s concern has been focused on NSA spying, I believe the greatest threat to my freedom is being placed in a virtual algorithmic prison. Algorithmic predictions could land me on no-fly lists or target me for government audits. They could be used to deny me loans and credit, screen my job applications, and scan my LinkedIn profile to judge my suitability for a position. They could give potential employers a picture of my health. They could predict whether I will commit a crime or am likely to use addictive substances, and determine my eligibility for automobile and life insurance. They could tell retirement communities whether I will be a profitable resident, and inform colleges’ admissions decisions.

Especially disturbing is the notion that once you become an algorithmic prisoner, it is very difficult to get pardoned. Ask anyone who has tried to get off a no-fly list or correct a mistake on a credit report.