ACM News

Keeping Government Out of Your Business

New Mexico State University researchers say they have developed a way to automate proactive oversight that prevents privacy violations in government surveillance before they occur, using algorithms and blockchain in the form of what they call the Enforcer software. Credit: Twelveofour

The U.S. Electronic Communications Privacy Act (ECPA) grants government agencies permission to surveil citizens who assume their electronic communications are private. Today such oversight, even by the U.S. Congress, is opaque (summarized and anonymized), and those being monitored are almost never notified until after the fact.

New Mexico State University (NMSU) researchers presented a paper at the ACM Conference on Computer and Communications Security (CCS 2019, London) in which they claimed to be able to automate proactive oversight that prevents privacy violations before they occur, using algorithms and blockchain in the form of what they call the Enforcer software.

Said Stephanie Forrest, director of the Biodesign Center for Biocomputation, Security, and Society at Arizona State University in Tempe, "Electronic surveillance is the single biggest security threat we face today. It comes in many forms, ranging from the surveillance we agree to when we purchase and activate network-enabled electronic devices, to stealthy monitoring of which we are unaware." Forrest said the paper, Scalable Auditability of Monitoring Processes using Public Ledgers, tackled the problem "of how to audit electronic surveillance processes involving multiple actors, focusing on government surveillance, especially agencies and companies, that violate court-sanctioned authorities. Methods such as these represent an important check and balance in our democracy and provide a way to audit government-sanctioned surveillance."

The Federal Bureau of Investigation (FBI), the Central Intelligence Agency (CIA), the National Security Agency (NSA) and other U.S. government agencies (some using secret laws) obtain court orders granting them permission to review large swaths of electronic communications between citizens without their knowledge. Sealed court orders allowing such surveillance prevent its targets from knowing their privacy is being violated, even though it is being done lawfully. Only after the surveillance period is over and the court order is unsealed can a trial take place using the secret information against a citizen. If the surveillance overstepped the limits set by the court order, it is up to the citizen's lawyer to discover that.

The Enforcer software, however, is designed to prevent law-enforcement agencies from overreaching the limits stated in a court order in the first place. For instance, the U.S. Department of Justice (DOJ) Office of the Inspector General (OIG) is responsible for auditing the FBI for possible infractions. Unfortunately, the OIG does not have the resources to monitor all FBI court orders for compliance. On the other hand, if the Enforcer were employed by the OIG to monitor communications between the law-enforcement agency and the company providing the secrets, its cryptographic algorithms could prevent illegal privacy intrusions for all court orders before they happened, according to the Enforcer's authors.

"The Enforcer, in principle, can be completely automated…[it] checks the emails as they go between the agency and the company. This requires the identification/building of the right kind of meta-information/data that needs to be visible to the Enforcer, does not violate anonymity of the users, and which specifies the rules for the checks by the Enforcer. These can be implemented as regular expression checks, or by using other formal methods with pre-defined formats. If you think about email as an application, we have all the meta information in the SMTP [Simple Mail Transfer Protocol] headers," said NMSU associate professor of computer science Satyajayant Misra, one of the paper's authors.

The Enforcer preserves adherence to court order terms using zero knowledge proofs (an interactive methodology in which private knowledge is demonstrated, but not communicated, to the Enforcer), commitment schemes (a cryptographic primitive, in particular the Pedersen commitment, that allows one to commit to a chosen value while keeping it hidden from the Enforcer), hash trees (cryptographic trees, here of the Merkle variety, which do not reveal secrets by virtue of using leaf-nodes that are labelled only with a hash of the secret), and blockchain public ledgers.
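The hash-tree component is the easiest of these primitives to illustrate. The sketch below, a minimal stand-alone illustration rather than the paper's code, builds a Merkle tree whose leaves are hashes of the secret records; the root can then be published (for example, to a ledger), and any single record later proven present via an audit path of sibling hashes, without revealing any other record:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaf_hashes: list) -> bytes:
    """Fold a level of hashes pairwise until a single root remains."""
    level = list(leaf_hashes)
    while len(level) > 1:
        if len(level) % 2:                       # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def audit_path(leaf_hashes: list, index: int) -> list:
    """Sibling hashes (and their sides) needed to recompute the root for one leaf."""
    level, path = list(leaf_hashes), []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        path.append((level[sibling], sibling < index))   # (hash, sibling-is-left?)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify(leaf_hash: bytes, path: list, root: bytes) -> bool:
    for sibling, sibling_is_left in path:
        leaf_hash = h(sibling + leaf_hash) if sibling_is_left else h(leaf_hash + sibling)
    return leaf_hash == root

# Only hashes of the secrets ever enter the tree.
secret_records = [b"record-1", b"record-2", b"record-3", b"record-4"]
leaves = [h(s) for s in secret_records]
root = merkle_root(leaves)

# Prove record-3 is in the tree without disclosing records 1, 2, or 4.
assert verify(leaves[2], audit_path(leaves, 2), root)
```

The audit path contains only sibling hashes, which is why a verifier learns nothing beyond the one record being proven.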

At no time does the Enforcer know the real identities of the people being surveilled, or the unencrypted secrets themselves. The Enforcer also requires the unsealing of surveillance orders at the time designated by the court-order-granting entity, thus allowing the surveillance operation to be verified as lawful. People being surveilled are provided with proof that the court order's mandate was not exceeded, and law enforcement agencies, along with the suppliers of the otherwise privacy-protected secrets, can demonstrate they adhered to the rule of law, according to the Enforcer's authors.


Said Misra, "All the cryptographic operations have been implemented in our codebase already. The parties [law-enforcement agencies and companies supplying the secret information] need to generate and verify the zero-knowledge proofs, Pedersen commitments, and construct (and verify) simple cryptographic data structures like Merkle hash trees, in addition to doing digital signatures."
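Of the primitives Misra lists, the Pedersen commitment can be sketched with modular arithmetic alone. The toy parameters below are for illustration only; a real deployment uses a cryptographically large group in which the discrete logarithm of h with respect to g is unknown to all parties:

```python
import secrets as rng

# Toy Pedersen commitment: C = g^m * h^r mod p.
# Parameters are illustrative, NOT secure choices.
p = 2**61 - 1     # a small Mersenne prime standing in for a large group modulus
g = 3
h = 7             # in practice, h must be chosen so no one knows log_g(h)

def commit(m: int, r: int) -> int:
    """Commit to message m with blinding randomness r."""
    return (pow(g, m, p) * pow(h, r, p)) % p

m, r = 12345, rng.randbelow(p - 1)
C = commit(m, r)

# Hiding: C reveals nothing about m without r.
# Binding: the committer cannot later open C to a different message.
assert C == commit(m, r)
assert C != commit(m + 1, r)

# Homomorphic: commitments to m1 and m2 multiply to a commitment to m1 + m2,
# letting an auditor check aggregates without ever seeing the values.
r1, r2 = rng.randbelow(p - 1), rng.randbelow(p - 1)
assert (commit(5, r1) * commit(9, r2)) % p == commit(14, r1 + r2)
```

The homomorphic property in the last line is what makes commitments useful for auditing: sums over hidden values can be checked against a committed total with no decryption.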

To prove the validity of the Enforcer and its ability to scale up to handle the number of surveillance orders in force in the U.S. today, the authors implemented it using four desktop computers, each with an Intel Core i7-6700K central processing unit and eight gigabytes of random-access memory. Each computer ran an entire SAMPL (Scalable Auditability of Monitoring Processes using Public Ledgers) framework, including the Enforcer application. In the authors' test, random user data for 500 people was generated and stored in a structured query language (SQL) database which was accessed by email interchanges over a date-restricted period of 120 days.

The results showed that execution time was very short (under two seconds) for each Enforcer transaction between the court-ordered agency's request and the data providers' compliance acknowledgement. By grouping requests for the same people into batches, the efficiency of the Enforcer increased geometrically; increasing the number of people being surveilled increased the Enforcer's execution time linearly.

The authors claim such automated systems could revolutionize the tedious, overreach-prone, human-only-based protocols in place today. Each type of surveillance will require modification of the specific criteria by which personal records are stored. Sender and receiver names and their IP addresses will have to be hashed separately and stored along with date, time, and other metadata that can be verified by the Enforcer without decrypting the actual data.
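The record layout the authors describe — names and IP addresses hashed separately, stored beside metadata the Enforcer can check in the clear — might look like the following. The salt handling, field names, and helper function are hypothetical illustrations, not the paper's schema:

```python
import hashlib

def pseudonymize(value: str, salt: bytes) -> str:
    """Salted hash, so identical identifiers map to the same pseudonym."""
    return hashlib.sha256(salt + value.encode()).hexdigest()

SALT = b"per-court-order-salt"   # illustrative; would be managed per order

def audit_record(sender, sender_ip, receiver, receiver_ip, date, time):
    """Hash each identifier separately; leave checkable metadata in the clear."""
    return {
        "sender":      pseudonymize(sender, SALT),
        "sender_ip":   pseudonymize(sender_ip, SALT),
        "receiver":    pseudonymize(receiver, SALT),
        "receiver_ip": pseudonymize(receiver_ip, SALT),
        "date": date,
        "time": time,
    }

rec = audit_record("alice", "198.51.100.7", "bob", "203.0.113.9",
                   "2019-03-15", "14:02:11")
# The same sender in two records yields the same pseudonym, so an auditor can
# correlate records with a court order's targets without learning real names.
assert rec["sender"] == pseudonymize("alice", SALT)
assert rec["sender"] != "alice"
```

Hashing each field separately, rather than hashing whole records, is what lets the auditor match individual fields (a date, a target pseudonym) against a court order's terms.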

The main downside to using the Enforcer is its inability to perform ad hoc surveillance audits for information needed dynamically during real-time surveillance, as is routine today. To extend the Enforcer for ad hoc audits, such as searches in an encrypted document, would require searchable encryption algorithms that could find specific data records by keywords. While that can be accomplished, this extension could leak some secret data to the Enforcer (unless cryptologists can invent a workaround not available today), making it possible for bad actors to hack the Enforcer to discover leaked secrets.

R. Colin Johnson is a Kyoto Prize Fellow who has worked as a technology journalist for two decades.
