Primer

Anytime there’s an advancement in quantum computing, like Intel’s 49-qubit chip at last year’s 2018 CES show, or IBM’s 20-qubit commercial quantum computer at this year’s show, blockchain news outlets are always there to assuage worries with statements like “No, [this] Quantum Computer won’t break bitcoin”, often accompanied by the sentiment that the looming quantum computing threat is being blown right out of proportion.

What they usually fail to note is that, once again, quantum computing is hitting milestones ahead of schedule. Quantum computing was previously expected not to reach commercial application until the mid-to-late 2020s. This is not to say that quantum computers have only ever outpaced expectations, but rather to contextualize the strength of today’s quantum computers, not just against the static metric of being able to break ECDSA, but against a development curve.

While it is true that IBM’s commercial quantum computer won't break Bitcoin, this is not the mindset we want to have toward the cryptographic fundamentals of something that underpins billions of dollars' worth of value (in cryptocurrency alone), and hopefully much more in the future.

To put a visual metaphor on waiting until it’s too late: a cannonball may not penetrate your fortress today, but one will someday, and recent developments make it prudent to fortify sooner and be proactive rather than reactive. Being reactive means your fortress will already have been compromised, or, in the case of blockchain, your private keys already stolen, and what’s the point of fortifying then?

In this way, the frame of the conversation needs to be proactive, rather than reactive. Decentralized networks, by their very nature, are censorship resistant, and while that means many wonderful things for end-users generally, in terms of security it means that it can be very difficult or impossible to functionally undo transactions. Therefore, we cannot afford to be reactive; we must be proactive in order to meaningfully prevent negative, irreversible security outcomes.

IBM’s Commercial Quantum Computer

At CES, IBM announced a commercialized version of its 20-qubit computer that could be shipped out to institutions and businesses looking to start building on real quantum computers.

How powerful is it?

Unlike the bits in your computer, which are binary and scale linearly, qubits scale exponentially: every qubit added to a system doubles the size of the state it can represent. For a 20-qubit system, we’re looking at 2²⁰ states. To give some context, this is something that an average notebook can simulate perfectly well, and is exactly what Microsoft does for its Quantum Development Kit. The breakdown of what it takes to simulate a system classically is:

2²⁰ Laptop: Commercially, we’re here!

2³⁰ Desktop

2⁴⁰ Cloud compute service (For Microsoft, this is Azure)
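To see why these tiers line up with ordinary hardware, note that simulating an n-qubit system classically means storing 2ⁿ complex amplitudes, roughly 16 bytes each. A minimal sketch (the function name here is ours, purely for illustration):

```python
# Memory needed to hold a full n-qubit statevector on a classical machine:
# 2**n complex amplitudes, 16 bytes each (two 64-bit floats per amplitude).
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n, host in [(20, "laptop"), (30, "desktop"), (40, "cloud")]:
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:g} GiB ({host})")
```

At 20 qubits that is only about 16 MiB, at 30 qubits 16 GiB, and at 40 qubits 16 TiB, which is why the jump from desktop to cloud compute happens where it does.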

So, if it’s not that powerful yet, who is it for?

IBM’s commercial quantum computer is a demonstration that it can produce a reliable, upgradeable commercial quantum computer. Clients who purchase a 20-qubit system today could upgrade their systems down the line, having already familiarized themselves with IBM’s platform. This would appeal to researchers in particular, as having a quantum computer on hand would increase uptime for experiments.

What does this mean for blockchain?

As aptly penned by The Verge, this is a symbolic development meaning that noisy intermediate-scale quantum (NISQ) era quantum computers are here, commercialized, today, and with IBM, Microsoft, Google, Intel, Lockheed Martin, Rigetti, and more all in the ring competing, it’s clearly not just in the lab.

The blockchain community needs to accept that quantum computers are here, today. While the timeline for when they can threaten blockchain security is uncertain, whether that day will come is no longer a matter of debate. No one wants to see a catastrophic event where large swaths of the cryptocurrency ecosystem are wiped out, but that is sadly a possibility if we do not act in a timely manner.

When do we need to worry, anyway?

This isn’t an easy question to answer, but working off our “how many qubits” list from above, we can expand it to get a number needed to run Shor’s algorithm.

2⁵⁰ Average supercomputer: We have already passed this! Currently, the largest stable quantum computer in the world is at 2⁷².

2⁸⁰ Quantum supremacy: Quantum computers beat the fastest supercomputer on a synthetic test. A milestone Microsoft anticipates passing this year.

2³⁰⁰⁰ Quantum computers can run Shor’s algorithm, which can break the 256-bit ECDSA that Bitcoin uses. We want to upgrade well before this point.
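For a sense of where a figure on this order comes from: one published resource estimate (Roetteler et al., 2017) puts Shor’s algorithm against an n-bit elliptic curve at 9n + 2⌈log₂ n⌉ + 10 logical qubits. A quick check, keeping in mind these are error-corrected logical qubits, so physical qubit counts would be far larger:

```python
import math

# Logical-qubit count for Shor's algorithm against an n-bit elliptic curve,
# per the 9n + 2*ceil(log2(n)) + 10 estimate of Roetteler et al. (2017).
# Note: logical (error-corrected) qubits, not raw physical qubits.
def shor_ecc_logical_qubits(n_bits: int) -> int:
    return 9 * n_bits + 2 * math.ceil(math.log2(n_bits)) + 10

print(shor_ecc_logical_qubits(256))  # 2330 logical qubits for a P-256 key
```

Roughly 2,330 logical qubits, on the order of the 3,000 quoted here.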

Looking back, we can take the development trajectory of the number of qubits a system has had over the last 4 years and attempt to extrapolate:

2015: 4-qubits

2016: 9-qubits (IBM)

2017: 17-qubits (Intel), 50-qubits (IBM)

2018: 49-qubits (Intel), 72-qubits (Google)

This indicates growth somewhere between roughly 5.5x in the best single year and 18x over the full four-year span; assuming a similar trajectory, it will be between 2.2 and 5.3 years until we hit 3,000 qubits. However, there is more work to do than just stacking up qubits. There is the lingering issue of error correction, which currently stands as a barrier to proper, accurate calculations. There is also the issue of scaling, which is being addressed in a number of ways, but has not been solved.
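The extrapolation above can be sketched as a back-of-the-envelope calculation, assuming constant exponential growth from today’s 72-qubit systems at the two growth rates quoted:

```python
import math

# Years until a target qubit count, assuming constant exponential growth.
def years_to_target(current: int, target: int, growth_per_year: float) -> float:
    return math.log(target / current) / math.log(growth_per_year)

fast = years_to_target(72, 3000, 5.5)         # best single-year rate: 5.5x/yr
slow = years_to_target(72, 3000, 18 ** 0.25)  # 18x over 4 years ~= 2.06x/yr
print(f"{fast:.1f} to {slow:.1f} years")
```

This prints a range of roughly 2.2 to 5.2 years; small differences from the figures in the text come down to rounding. Of course, this assumes qubit counts keep growing at historical rates, which is exactly the uncertainty the error-correction and scaling caveats point at.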

On the other side, Shor’s algorithm was merely the first quantum algorithm to usher in interest in quantum computing. Advances continue toward breaking elliptic curve cryptography on a variety of levels, through newer, more efficient algorithms that can run on today’s non-error-corrected quantum computers.

Those are pretty big variables, which is why you’ll see estimates of everything from 2 to 30 years before ECDSA P-256 is broken. In some ways the exact range doesn’t matter; there’s no question that we want to have everything secure well ahead of it.

Update 2019–09–22: In our more recent blog we also cover this in more detail by outlining QCCalc, a model you can run on your own computer (with MATLAB or Octave) that takes into consideration the yearly increase in qubits, yearly error-rate improvement, yearly algorithmic improvement, required runtime, maximum acceptable risk, and more.

The importance of preparation

If you’ve ever dealt with keeping systems secure in any capacity, you know how important it is to keep up to date. Generally, this means following Common Vulnerabilities and Exposures (CVE) notices and other security bulletins, and checking the state of your hash functions and cipher suites.