As the debate about National Security Agency powers continues, one of the most widely read expressions of support for the NSA has been David Simon's post, "We Are Shocked…", which makes a muscular case for the basic legitimacy of current surveillance techniques on several rationales, including this one:

"When the government grabs every single fucking telephone call made from the United States over a period of months and years, it is not a prelude to monitoring anything in particular. Why not? Because that is tens of billions of phone calls and for the love of god, how many agents do you think the FBI has? How many computer-runs do you think the NSA can do – and then specifically analyze and assess each result?"

Much of Simon's argument hinges on questions of legitimacy and oversight, but this line of reasoning – that there is some significant limit on technology or human attention that will keep people safe from unconstitutional abuses of state surveillance – is dangerously wrong.

Simon imagines that NSA capabilities can be guessed at by extrapolating from the capabilities of the Baltimore police force in the 1980s. They can't. To understand why, we have to start with cat videos.

In August of 2010, a woman in Coventry threw a neighbor's cat – alive – into a trash can. She was caught on video but otherwise unidentified, until 4chan got involved. 4chan is a community of largely anonymous users (and birthplace, inter alia, of lolcats, rickrolling, and the Anonymous hacker collective). 4chan's users started gathering every possible lead on the unnamed woman available from the video and, within a few hours, uncovered her name, home address, place of work, and phone number. She was later tried, pled guilty, and fined.

Before you can even begin thinking about what the NSA is capable of, you need to understand that a group of amateurs, working with commodity tools, many of them a continent away, can locate a nameless woman from a private security video and produce positive identification, and they can do it in their spare time, in a few hours, for fun. That, not the Baltimore cops, marks the baseline.

Simon's last year as a working reporter was 1995; what else has changed since he got out of the journalism racket?

First, computational power. Moore's Law says transistors per chip (a rough measurement of power) double every two years. In 1995, the most powerful chip held about 5m transistors. Today, it's 2bn. (There are few physical analogs for this sort of change; it is larger than the difference between the explosive force of a .22 pistol and a Saturn V rocket.) In addition, the last two decades have seen an explosion in parallel computing, where problems are broken into smaller parts to be calculated on multiple chips. Parallelism is what allows Google to keep up with constantly rising data and use, because it allows computing power to scale up faster than Moore's Law, so long as the funds are available. (For digital surveillance in post-9/11 America, funds are available.)
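The arithmetic behind those figures is easy to check. A back-of-the-envelope sketch, assuming the article's numbers (roughly 5 million transistors per chip in 1995, a doubling every two years, and "today" meaning roughly 2013):

```python
# Sanity check of the Moore's Law figures in the text (assumed values):
# ~5 million transistors per chip in 1995, one doubling every two years.
transistors_1995 = 5_000_000
years = 2013 - 1995           # the article's span, 1995 to "today" (assumed)
doublings = years // 2        # Moore's Law: one doubling per two years

transistors_today = transistors_1995 * 2 ** doublings
print(f"{doublings} doublings -> ~{transistors_today / 1e9:.1f}bn transistors")
# 9 doublings -> ~2.6bn, in line with the ~2bn chip the text describes
```

Nine doublings compound to a factor of 512 – which is the point: exponential growth quietly turns a modest-sounding rule into a three-orders-of-magnitude change over a working lifetime.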

This has been accompanied by an increase in computational sophistication. To take just one example, the math behind the famous "Six Degrees of Separation" was not well understood until 1998, when Duncan Watts and Steven Strogatz worked out the Small Worlds network pattern. Small Worlds and related insights increase the refinement with which human networks can be measured. In 2009, researchers from MIT demonstrated that, with very simple tools, gay but closeted members of a student community could be outed, just by reading publicly available "friend" links. The work on tools for extracting information from links between people (the social graph) has been amazingly fruitful, and is ongoing.
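To see how little machinery that kind of inference needs, here is a minimal sketch in the spirit of the MIT study: guess a user's hidden attribute by majority vote over friends whose attribute is public. The graph, names, and labels are invented for illustration; the real study was more sophisticated, but the principle is the same.

```python
# Attribute inference from public friend links alone (illustrative sketch).
# All users, links, and labels below are invented.
from collections import Counter

friends = {
    "alice": {"bob", "carol", "dave"},
    "bob":   {"alice", "carol"},
    "carol": {"alice", "bob", "dave"},
    "dave":  {"alice", "carol"},
}
public_label = {"bob": "A", "carol": "A", "dave": "B"}  # publicly visible attributes

def infer(user):
    """Guess a user's hidden attribute by majority vote among labeled friends."""
    votes = Counter(public_label[f] for f in friends[user] if f in public_label)
    return votes.most_common(1)[0][0] if votes else None

print(infer("alice"))  # two of alice's three labeled friends are "A"
```

No content is read anywhere in that loop – the prediction comes entirely from who is linked to whom, which is exactly why "it's only metadata" is cold comfort.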

So how many "computer-runs" can the NSA do, Simon asks? Given the change in chips, parallelism, and algorithms, the NSA has, at a conservative guess, a trillion times more computing power than the Baltimore police (or indeed, any non-intelligence related group) had in the 20th century.

Then, there is the data itself. The rise in the amount of data between 1995 and now beggars plain description: of all of recorded information created in all human history, most of it has been created since 2010. This macro-scale change is driven by a billion micro-scale changes. If you have a smartphone, install a program like Sensor Data or Sensor List, and you can see the data one of your devices is generating: thousands of datapoints an hour, for every hour it is on.

Simon construes constitutional protection from government intrusion as controls against human listening in on phone calls or reading messages. But much of the new data created is metadata – data about people and their communication patterns – and metadata alone is often enough to create a real breach of privacy.

The identification of closeted gay students did not involve reading anyone's Facebook posts; and a query like "List every married man in Atlanta who sent a text message to a woman other than his wife between midnight and 4am" is pure metadata – no one needs to read the messages for the results of that query to feel intrusive. It is possible that the metadata for many phone calls includes an automatically generated written transcript for searching, turning even the contents of the call into metadata.
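That query is simple enough to write down. A sketch over hypothetical call records – every field, name, and row below is invented, and real carrier metadata is richer than this – shows that it touches nothing but metadata:

```python
# The "married man in Atlanta" query from the text, run over invented
# metadata rows. No message body is ever inspected.
records = [  # sender, recipient, sender's city, hour sent (0-23) -- all invented
    {"sender": "m1", "recipient": "w2", "city": "Atlanta", "hour": 1},
    {"sender": "m1", "recipient": "w1", "city": "Atlanta", "hour": 13},
    {"sender": "m2", "recipient": "w3", "city": "Boston",  "hour": 2},
]
spouse = {"m1": "w1", "m2": "w4"}  # marriage records: also just metadata

matches = {
    r["sender"] for r in records
    if r["city"] == "Atlanta"
    and 0 <= r["hour"] < 4                         # midnight to 4am
    and r["recipient"] != spouse.get(r["sender"])  # a woman other than his wife
}
print(matches)  # {'m1'}: texted someone other than his wife at 1am
```

Three filter conditions, all on routing data – and the result is a list of marriages under suspicion, without a single message being read.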

If the government remains interested solely in foreign threats to US citizens, this kind of power will be held in check; we have a National Security Agency to help keep the nation secure, and we should hope it succeeds. But history has few happy stories to tell of unchecked government power. All security comes at a cost, and recent American history reminds us that people focused on routing an enemy can end up destroying the village in order to save it.

Simon's argument that we should trust the NSA to respect Americans' privacy without much in the way of public oversight may carry the day. Certainly, that's the argument we should be having. But he is wrong to suggest – and no one on any side of the debate should believe – that the NSA's powers are anything less than extraordinary.