Last week I participated in the third International Cryptographic Module Conference (ICMC), organized by the Cryptographic Module User Forum (CMUF) and concerned with the validation of cryptographic modules against government and international standards. You may think of cryptographic module validation as a dry topic, but it was quite an exciting conference, full of technical and political controversy. The technical controversy resulted from the fact that the standards are out of sync with current technology, and it is not at all clear how they can be fixed. The political controversy resulted from the fact that, after Snowden’s revelations, it is not at all clear who should try to fix them. The organizers signalled that they were not afraid of controversy by inviting as keynote speakers both Phil Zimmermann, creator of PGP and co-founder of Silent Circle, and Marianne Bailey, Deputy CIO for Cybersecurity at the US Department of Defense, besides well-known expert Paul Kocher of SSL fame. I enjoyed an exchange between Zimmermann and Bailey on the imbalance between defense and offense at the NSA and its impact on cybersecurity.

Besides the excitement, I liked the technical depth of the conference, and was pleasantly surprised by the gender and ethnic diversity, which was striking by contrast with Silicon Valley. The next ICMC will take place earlier in the year, May 18-20, 2016, in Ottawa, and the call for papers is already out.

Background

The conference was concerned with the design, implementation and validation of cryptographic modules that conform with the FIPS 140-2 and ISO 19790:2012 standards. FIPS 140-2 is a Federal Information Processing Standard issued by the US National Institute of Standards and Technology (NIST). ISO 19790:2012 is an international standard derived from FIPS 140-2, much of which is word-for-word identical to FIPS 140-2. Cryptographic modules are validated as conforming to FIPS 140-2 by the Cryptographic Module Validation Program (CMVP), which is jointly managed by NIST and the Communications Security Establishment Canada (CSEC) and relies on an international network of Cryptography and Security Testing (CST) labs spanning four continents. There is also a Cryptographic Algorithm Validation Program (CAVP) for the validation of algorithms rather than modules.

Historically, NIST has been an internationally acknowledged leader in cryptographic engineering. It organized international cryptographic competitions that resulted in the specification of AES as the worldwide de facto standard for symmetric encryption and SHA-3 as an alternative to the NSA-designed family of hash functions SHA-2. CMVP has been one of the successes of NIST in cryptographic leadership. CMVP certification is widely sought after, not only for modules used by Federal Agencies in the US and Canada, but also for modules used by other governments and modules used for commercial purposes unrelated to any government.

An obsolete standard

But the Snowden revelations regarding Dual EC DRBG, which Zimmermann referred to in his keynote as having “humiliated” NIST, have had an impact on the trust that the international cryptographic community places in NIST, and given impetus to the development of national cryptographic standards by countries other than the United States. It will be difficult for NIST in the future to specify cryptographic standards that are accepted worldwide. Hence the question of who should fix the cryptographic module standards, which are now widely seen as obsolete.

FIPS 140-2 is a revision of FIPS 140-1, which is itself a successor of FED-STD-1027, dated April 1982. FED-STD-1027 was renamed FIPS PUB 140 before being revised and issued in January 1994 as FIPS 140-1. Federal Information Processing Standards (FIPS) are usually revised every five years. FIPS 140-2 was issued in May 2001, with minor changes over FIPS 140-1; but an attempt at making more radical changes in FIPS 140-3 was abandoned after drafts in 2007 and 2009.

Besides being old, FIPS 140-2 is technically obsolete.

In the eighties, cryptography had to be implemented in hardware, because software was too slow. FED-STD-1027 was only concerned with hardware implementations. FIPS 140-1 and 140-2 allow software implementations, but only as an afterthought. They require a cryptographic module to be contained within a physical boundary, defined in FIPS 140-2 as “an explicitly defined continuous perimeter that establishes the physical bounds of a cryptographic module and contains all the hardware, software, and/or firmware components of a cryptographic module”. Two sentences in FIPS 140-2 attempt to reconcile the concept of a physical boundary with the concept of a software cryptographic module, but come up short:

If a cryptographic module consists of software or firmware components, the cryptographic boundary shall contain the processor(s) and other hardware components that store and protect the software and firmware components. Hardware, software, and firmware components of a cryptographic module can be excluded from the requirements of this standard if shown that these components do not affect the security of the module.

The concept of a cryptographic boundary is ill-defined for a software module, and this is a source of insecurity in validated products. In his presentation, Ashit Vora said that the cryptographic boundary may be shrunk to exclude key management from validation, or to claim compliance of OpenSSL with FIPS 140-2 even though the implementation of TLS by OpenSSL has not been validated. I repeatedly heard conference attendees refer to applying FIPS 140-2 to a software module as “fitting a square peg into a round hole”.

Yet the first cryptographic module certified by CMVP was a software module (the Entrust Cryptographic Kernel, V1.9, “For use in PCs”, certified on October 12, 1995). FIPS 140-1 was already obsolete in 1995, and FIPS 140-2, which made only minor changes, did not bring it up to date.

Since then, the standard has become more and more out of sync with technology, in spite of efforts to keep it up to date by issuing implementation guidance. Here are some examples. JavaScript programs are interpreted by a web browser and run on any processor and operating system on which the browser runs, yet an algorithm or module implemented in JavaScript can only be certified for a specific processor and operating system. Similarly, as pointed out by Apostol Vassilev of NIST in response to a question after his presentation, a cryptographic module implemented by software running on a virtual server can only be certified for a particular version of the hypervisor and a specific physical processor on which the hypervisor runs. Mobile devices rely on encryption for protection of data and keys, yet FIPS 140-2 does not allow encryption to be relied upon to increase the security level. Mobile devices derive encryption keys from a hardware root of trust or a cloud root of trust, but FIPS 140-2 only provides the option of deriving encryption keys from a password.

Also, the module validation process has not kept up with the accelerating pace of technology evolution. As pointed out by Vassilev in his presentation, certification review cycles are now much longer than product cycles.

Benefits are not real

NIST speakers at the conference made it clear that they want to replace FIPS 140-2 with ISO 19790 as the standard against which CMVP validates cryptographic modules used by US Federal Agencies. Michael Cooper of NIST, during his joint presentation with other NIST speakers, said that NIST expects to submit a recommendation to that effect to the Commerce Secretary as early as next month, after finishing a review of comments on a Federal Register Notice proposing the change.

At first glance, replacing FIPS 140-2 with ISO 19790 seems to be a good solution to the problems facing cryptographic module validation. It appears to provide two benefits:

ISO 19790 is much more recent than FIPS 140-2, its second edition being dated 2012-08-15. (The Federal Register Notice refers to 19790:2014, but there is no such thing. There is a 19790:2006, a.k.a. 19790 1st Edition, a 19790:2012, a.k.a. 19790 2nd Edition, and a Technical Corrigendum to 19790:2012 dated 2015-09-25.) Hence replacing FIPS 140-2 with ISO 19790 would appear to bring up to date the standard used by CMVP.

ISO 19790 is an international standard. Hence using it in the CMVP program would appear to solve the problem that vendors of cryptographic modules and testing laboratories will face as national cryptographic standards proliferate.

But neither benefit is real:

Although the second edition of ISO 19790 is dated 2012, it only makes minor modifications to FIPS 140-2 (which itself only made minor modifications to the 1994 FIPS 140-1 standard, itself derived from the 1982 FED-STD-1027). ISO 19790 is as out of sync with current technology as FIPS 140-2.

Although ISO 19790 is an international standard, NIST speakers said that NIST intends to specify its own cryptographic algorithms in US annexes and issue its own implementation guidance, expecting other countries to do the same. This will negate, for vendors and labs, the benefit of ISO 19790 being an international standard.

And while the benefits are not real, ISO 19790 has real disadvantages: it is bad policy, and it repeats a mistake that has been blamed for the inclusion of Dual EC DRBG in SP 800-90A.

Bad policy

ISO charges a fee for downloading standards. Each document costs hundreds of dollars, and each standard includes many ancillary documents. In his presentation, Randall Easter, who holds positions both at NIST and ISO, went over the ancillary documents related to ISO 19790; they include: ISO 17825, 18367, 19896-1, 19896-2, 20085-1, 20085-2, 20540, 20543, 24759 and 30104. In a comment after the talk, an attendee estimated that his organization would have to spend thousands of dollars on the documents. If ISO 19790 replaces FIPS 140-2, testing labs and vendors of cryptographic modules would collectively have to pay millions of dollars to ISO. But development of the ISO standard has been paid for with US taxpayer money, since much of it is word-for-word identical to FIPS 140-2, and where the two standards differ the changes have been made with NIST participation. So US taxpayers would end up paying twice for the ISO standard, first by financing the work of NIST on FIPS 140-2 and ISO 19790, and then by bearing the cost of the documents, which labs and vendors will pass along to US Federal Agencies that buy cryptographic modules conforming to the standard.

Repeating the Dual EC DRBG mistake

NIST included a deterministic random bit generator (DRBG) known as Dual EC DRBG, invented by the NSA, in the original version of SP 800-90A, and removed it after stories in The New York Times, ProPublica, and The Guardian, referring to internal NSA memos leaked by Edward Snowden, suggested that the NSA had planted a back door in the algorithm.

Dual EC DRBG is a very simple algorithm, and the method for planting a back door inside it is also very simple. They are described by diagrams in slides 6 and 9 of a presentation by John Kelsey of NIST. P and Q being points of an elliptic curve, pseudorandom bits are derived from sQ, where s starts out as the seed and is replaced with sP each time the DRBG is used. If P and Q are supplied by an adversary instead of being random, the adversary can plant a back door by letting P be aQ, where a is the back door known to the adversary. The adversary can later compute the new value sP assigned to s from sQ, since sP = s(aQ) = a(sQ). The adversary will not be able to simply observe sQ as the algorithm is used, because the algorithm only outputs a subset of the bits of the binary representation of the abscissa of the point sQ. But only 16 of the bits are not included in the subset, and the adversary can find their values by making up to 2^16 guesses. To test a guess, the adversary combines the guessed bits with the output bits to obtain the abscissa of a point R of the curve, computes the ordinate using the curve equation, computes S = aR, drops 16 bits from the abscissa of S, and checks if the remaining bits agree with the next output of the algorithm.
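The attack just described can be sketched end to end in Python. Everything below is illustrative, not the real algorithm: a tiny curve (p = 239) stands in for the NIST P-256 curve, the back door a and the seed are arbitrary values of my choosing, and only 4 bits are truncated instead of 16, so the adversary needs at most 2^4 guesses per output.

```python
# Toy sketch of the Dual EC DRBG back door. Illustrative parameters only:
# a tiny curve y^2 = x^3 + A*x + B (mod PRIME) stands in for NIST P-256,
# and 4 bits are truncated instead of 16.
PRIME, A, B = 239, 2, 7
TRUNC = 4
MASK = (1 << TRUNC) - 1

def inv(x):
    return pow(x, PRIME - 2, PRIME)          # modular inverse (PRIME is prime)

def add(P1, P2):                             # point addition; None = infinity
    if P1 is None: return P2
    if P2 is None: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % PRIME == 0:
        return None
    if P1 == P2:
        m = (3 * x1 * x1 + A) * inv(2 * y1) % PRIME
    else:
        m = (y2 - y1) * inv(x2 - x1) % PRIME
    x3 = (m * m - x1 - x2) % PRIME
    return (x3, (m * (x1 - x3) - y1) % PRIME)

def mul(k, Pt):                              # scalar multiplication, double-and-add
    R = None
    while k:
        if k & 1:
            R = add(R, Pt)
        Pt = add(Pt, Pt)
        k >>= 1
    return R

def sqrt_mod(v):                             # valid because PRIME % 4 == 3
    r = pow(v, (PRIME + 1) // 4, PRIME)
    return r if r * r % PRIME == v % PRIME else None

def find_point():                            # any point with y != 0 serves as Q
    for x in range(1, PRIME):
        y = sqrt_mod((x ** 3 + A * x + B) % PRIME)
        if y:
            return (x, y)

Q = find_point()
a = 37                                       # the back door, known to the adversary
P = mul(a, Q)
while P is None:                             # make sure aQ is usable on this toy curve
    a += 1
    P = mul(a, Q)

def drbg_step(s):
    """Simplified Dual EC step: emit the low bits of x(sQ), set s to x(sP)."""
    R, S = mul(s, Q), mul(s, P)
    if R is None or S is None:               # degenerate state on this toy curve
        return None
    return R[0] & MASK, S[0]

def attack(out1, out2):
    """Recover candidate states from two consecutive outputs, using a."""
    candidates = set()
    for hi in range(1 << TRUNC):             # guess the truncated high bits
        x = (hi << TRUNC) | out1
        if x >= PRIME:
            break
        y = sqrt_mod((x ** 3 + A * x + B) % PRIME)
        if y is None:
            continue                         # this abscissa is not on the curve
        T = mul(a, (x, y))                   # S = aR, i.e. a(sQ) = s(aQ) = sP
        if T is None:
            continue
        step = drbg_step(T[0])
        if step is not None and step[0] == out2:
            candidates.add(T[0])             # consistent with the next output
    return candidates

# Run the generator; skip the rare degenerate seeds of the toy curve.
seed = 5
while True:
    first = drbg_step(seed)
    if first is not None:
        out1, s1 = first
        second = drbg_step(s1)
        if second is not None:
            out2, _ = second
            break
    seed += 1

assert s1 in attack(out1, out2)              # the hidden state was recovered
```

With the real parameters the search is at most 2^16 candidates per output, still trivial for the adversary, whereas recovering a from the public P and Q alone would require solving an elliptic-curve discrete logarithm.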

The back door a is a secret known only to the adversary. It is infeasible to compute the scalar a such that P = aQ even though P and Q are public parameters, because discrete logarithms are supposed to be hard to compute in the group of points of the elliptic curve. (If you are wondering what discrete logarithms have to do with this, if the group operation were metaphorically called multiplication instead of addition, aQ would be written Qa and called exponentiation instead of scalar multiplication. With the multiplicative notation, when P = Qa, the exponent a is said to be the logarithm of P in base Q, “discrete” because the group is finite.)
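To make the notational point concrete, here is a minimal sketch of a discrete logarithm in multiplicative notation, in the group of nonzero integers modulo a small prime. The prime, base, and exponent are toy values of my choosing; exhaustive search succeeds only because the group is tiny.

```python
# Discrete logarithm in multiplicative notation: given g and h = g^a mod p,
# find a. Toy parameters; at cryptographic sizes this search is infeasible.
p, g = 1009, 11
a_secret = 357                 # plays the role of the back door a
h = pow(g, a_secret, p)        # public value

def brute_force_dlog(g, h, p):
    """Exhaustive search for an exponent e with g^e = h (mod p)."""
    x = 1
    for e in range(p):
        if x == h:
            return e
        x = x * g % p
    return None                # h is not a power of g

recovered = brute_force_dlog(g, h, p)
assert pow(g, recovered, p) == h
```

Note that the search returns the smallest valid exponent, which equals a_secret modulo the order of g; either value serves the adversary equally well.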

How could such a simple back door make its way into SP 800-90A? It’s a long story with many twists and turns. An insider account can be found in another presentation by John Kelsey. Additional information can be found, among other places, in a blog post by cryptographer Matthew Green, in a web site dedicated to Dual EC DRBG, and in the Wikipedia page on the subject.

But the root cause is clear. Dual EC DRBG was originally standardized as part of the X9.82 standard developed by working group X9F1 of X9 with NIST participation. X9 is an ANSI-accredited organization that develops financial services standards in the US and submits them to ISO for international standardization. (X9.82 eventually became ISO 18031.) Like ISO and other ANSI-accredited standards organizations, X9 is financed by membership fees and the sale of standards documents. Consequently, its working groups work behind closed doors, and its resulting standards receive little public scrutiny. This business model for standards development works well for most kinds of business and technical standards, but it does not work well for cryptography. Cryptography requires transparency and extensive public scrutiny to avoid the intentional or unintentional introduction of weaknesses in standards. The international cryptographic competitions organized by NIST to specify AES and SHA-3 are good examples of cryptographic standards development with transparency and public scrutiny.

Here is what John Kelsey had to say about the development of the Dual EC DRBG standard within X9, in slide 15 of his insider account:

Doing standard in X9 made it harder to get feedback

Copies not available for review except by paying

Few universities are X9 members, so academic cryptographers usually can’t be involved

Limited number of members with the right background (e.g., RNGs)

Public review not very public

NIST made a mistake by adopting a standard that was developed in a manner open to manipulation. John Kelsey adds in slide 46:

Many reasons why Dual EC DRBG should not have been in standard…

Performance, Bias, Potential Trapdoor

…but it had a champion on X9.82 editing committee

Common way for weak algorithms to get into standards.

Transferring responsibility for the cryptographic module standard to ISO would mean repeating the mistake.

Towards a solution with real benefits

That the ISO business model is not suitable for the development of cryptographic standards does not mean that we should give up on an international solution. An international cryptographic module standard would be very beneficial to vendors, labs and government agencies. Algorithms and implementation guidance should be part of the standard and the same for all countries that accept the standard. International cryptographic harmonization will be difficult to achieve, but it should be a goal.

Another difficult goal to achieve will be to bring the cryptographic module standard up to date, when we do not even have a clear concept of what a software cryptographic module is. In his talk, Apostol Vassilev said that NIST intends to create a working group with representatives from government, industry, laboratories and academia. That would be a good step towards that goal.