Breaking in, one number at a time

A new record has been set for the largest encryption key ever cracked – but your secrets should be safe for now.

Long strings of numbers are essential to the encryption that keeps our online data safe. One widely used form of encryption called RSA cryptography relies on the fact that it is extremely difficult to find the prime numbers that multiply together to yield very large numbers.
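The asymmetry the article describes can be sketched in a few lines of Python. This is a toy illustration, not real cryptography: the primes below are tiny placeholders, whereas real RSA keys use primes hundreds of digits long.

```python
# Toy illustration of the asymmetry RSA relies on: multiplying two primes
# is instant, but recovering them from the product requires search.

def trial_division(n: int) -> tuple[int, int]:
    """Recover a factor of n by brute-force search -- feasible only for small n."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n is prime

p, q = 104729, 104723     # two small primes (placeholders)
n = p * q                 # the "public" number -- easy to compute
print(trial_division(n))  # recovering the primes already takes ~100,000 steps
```

Real factoring records like RSA-240 use far more sophisticated algorithms than trial division, but the gap between "easy to multiply" and "hard to factor" is the same idea at vastly larger scale.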

The inventors of the RSA algorithm published a list of RSA keys and challenged people to find the original primes, as a way of tracking how secure the encryption is against modern computers.

Now Emmanuel Thomé at the National Institute for Research in Computer Science and Automation in France and his colleagues have broken the record for the largest key cracked so far.

The team factored RSA-240, an RSA key that is 795 bits in size, with 240 decimal digits. The previous RSA record was set in 2010, with a key of 232 decimal digits and 768 bits.

“We were actually faster than the previous record, even though we computed something larger,” says Thomé.

The team also computed a discrete logarithm of the same size – these are essential for secure communications over computer networks, such as when a computer connects to a website securely using HTTPS.
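The discrete-logarithm problem has the same one-way flavour as factoring, and a minimal sketch makes that concrete. The parameters below are illustrative assumptions, far too small for real security: going from the exponent to the result is one `pow()` call, while going back requires search.

```python
# Toy discrete-logarithm problem: given a generator g, a prime modulus p
# and a value h, find x such that g**x % p == h.

def discrete_log(g: int, h: int, p: int) -> int:
    """Brute-force search for x with g^x = h (mod p)."""
    value = 1
    for x in range(p):
        if value == h:
            return x
        value = (value * g) % p
    raise ValueError("no solution")

p, g = 101, 2       # small prime modulus and generator (placeholders)
x = 57              # the secret exponent
h = pow(g, x, p)    # easy direction: one modular exponentiation
print(discrete_log(g, h, p))  # hard direction: recovers 57 by trying every exponent
```

Protocols such as the key exchange behind HTTPS depend on this gap; the record computation showed how large the parameters must be before the hard direction is genuinely out of reach.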

Thomé and his colleagues ran computations across clusters of computers in France, Germany and the US. The total computation was equivalent to a single computer core running for 35 million hours, or almost 4000 years.

It took 8 million core hours to crack RSA-240, and computing the discrete logarithm was even more time-intensive, taking 27 million core hours.
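The figures quoted above can be checked with simple arithmetic: the two computations sum to the 35 million core-hour total, which converts to roughly 4000 single-core years.

```python
# Back-of-envelope check of the core-hour figures in the article.
rsa_240_hours = 8_000_000    # factoring RSA-240
dlog_hours = 27_000_000      # same-size discrete logarithm
total_hours = rsa_240_hours + dlog_hours

hours_per_year = 24 * 365
print(total_hours)                   # 35,000,000 core-hours in total
print(total_hours / hours_per_year)  # about 3995 years on a single core
```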

The RSA keys most commonly used by ordinary computers today are larger in size, around 2048 bits, so the calculation isn’t a threat to computer security.

We would expect larger and larger RSA keys to be cracked as computing power improves. A rule of thumb known as Moore's law predicts that computing power doubles roughly every 18 months, and it can be used to estimate when key sizes should fall, given how long previous records took to set, says Thomé.
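The extrapolation Thomé describes reduces to simple arithmetic. The 18-month doubling period comes from the article; the horizons below are illustrative assumptions, not predictions about any specific key size.

```python
# Sketch of the Moore's-law extrapolation: if computing power doubles every
# 18 months, a computation costing some number of core-hours today needs
# only a fraction of that effort in the future.

DOUBLING_PERIOD_YEARS = 1.5  # 18 months, per the rule of thumb

def relative_cost(years_from_now: float) -> float:
    """Fraction of today's effort needed after the given number of years."""
    return 0.5 ** (years_from_now / DOUBLING_PERIOD_YEARS)

for years in (0, 3, 6, 9):  # illustrative horizons
    print(years, relative_cost(years))
```

By this rule, the same computation costs a quarter of today's effort after three years and a sixteenth after six, which is why records on larger keys keep arriving on a fairly predictable schedule.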

This time, the team managed to do it faster than expected, he says. “We provided a new data point to make people able to determine how hard it should be now and, in the future, to compute things.”