Multiplying large numbers is hard. In 1971, two German professors predicted that a much faster algorithm should exist, but no one could prove it.

Until now.

Mathematicians from Australia and France say their algorithm can make multiplication and other kinds of arithmetic much more efficient.

From grade school onward, multiplying big numbers has been a headache. But an associate professor from the University of New South Wales Sydney in Australia has developed a new method for multiplying giant numbers together that's more efficient than the "long multiplication" so many are taught at an early age.

“More technically, we have proved a 1971 conjecture of Schönhage and Strassen about the complexity of integer multiplication,” associate professor David Harvey says in the video below.


The Schönhage–Strassen algorithm, developed by two German mathematicians, was the fastest known method of multiplication from 1971 until 2007. A faster method was developed in 2007, but it's rarely used today.

Schönhage and Strassen predicted that an algorithm multiplying n-digit numbers using n * log(n) basic operations should exist, Harvey says. His paper is the first known proof that it does.

Harvey picks the example of 314 multiplied by 159. A sizable calculation, sure, but much larger ones are carried out every day in real-life scenarios. To solve the problem, most people are taught to multiply each pair of digits together and then add up the partial products:

9 is multiplied by 4, 1, and 3; then 5 is multiplied by 4, 1, and 3; and so on. The process ends up requiring 9 digit-by-digit products.

This method is called n², or n squared, because multiplying two n-digit numbers this way requires about n × n digit-by-digit products. It gets the correct answer, but Schönhage and Strassen designed a faster method, one that moved beyond n² into something smaller. Still, getting all the way down to n * log(n) remained the challenge.
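The schoolbook method described above can be sketched in a few lines of Python. This is only an illustration (the function name and the digit-product counter are ours, not part of any real multiplication library), but it shows where the 9 digit-by-digit products, and more generally the n × n count, come from:

```python
# A minimal sketch of schoolbook ("long") multiplication, counting the
# digit-by-digit products to show why it takes n * n of them for two
# n-digit numbers. The function name is illustrative, not a standard API.
def schoolbook_multiply(a: int, b: int) -> tuple[int, int]:
    """Multiply a and b digit by digit; return (product, digit products used)."""
    a_digits = [int(d) for d in str(a)][::-1]  # least-significant digit first
    b_digits = [int(d) for d in str(b)][::-1]
    total = 0
    count = 0
    for i, da in enumerate(a_digits):
        for j, db in enumerate(b_digits):
            total += da * db * 10 ** (i + j)  # shift by the digit positions
            count += 1
    return total, count

product, ops = schoolbook_multiply(314, 159)
# product == 49926, ops == 9 (three digits times three digits)
```

Doubling the number of digits quadruples the count of digit products, which is exactly the n² growth that makes this method impractical for enormous numbers.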

For some, seeing a word in the middle of a math problem might be enough to make their eyes glaze over like they did in algebra class. As a refresher: log is short for logarithm, which answers the question of what exponent produces a given number, whether that number is squared, cubed, or something higher. So 2⁵ is 32, but expressed logarithmically, it would look like log₂(32) = 5. While it may seem like a mouthful, logarithms are crucial when numbers get much larger.

The Schönhage-Strassen method is very fast, Harvey says. If a computer were to use the squared method taught in school on a problem where two numbers had a billion digits each, it would take months. A computer using the Schönhage-Strassen method could do so in 30 seconds.
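To get a feel for why the gap is so dramatic, here is a rough back-of-the-envelope comparison for billion-digit numbers. It treats the schoolbook method as n² operations and the fast method as n * log(n) operations, as the article does; this is an illustration of the growth rates, not a benchmark of either algorithm:

```python
import math

# Rough operation counts for multiplying two billion-digit numbers,
# treating the methods as n^2 and n * log2(n) operations respectively.
n = 1_000_000_000

schoolbook_ops = n ** 2        # 10**18 digit products
fast_ops = n * math.log2(n)    # about 3 * 10**10 operations

print(f"schoolbook: {schoolbook_ops:.1e} operations")
print(f"n*log(n):   {fast_ops:.1e} operations")
print(f"speedup:    roughly {schoolbook_ops / fast_ops:,.0f}x")
```

The tens-of-millions-fold difference in operation counts is what turns months of computing into seconds.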

But if the numbers keep rising into the trillions and beyond, the algorithm developed by Harvey and collaborator Joris van der Hoeven at École Polytechnique in France could find solutions faster than the 1971 Schönhage-Strassen algorithm.

“It means you can do all sorts of arithmetic more efficiently, for example division and square roots," he says. "You could also calculate digits of pi more efficiently than before. It even has applications to problems involving huge prime numbers.

“People have been hunting for such an algorithm for almost 50 years,” he continues. “It was not a foregone conclusion that someone would eventually be successful. It might have turned out that Schönhage and Strassen were wrong, and that no such algorithm is possible.

“But now we know better,” he says.
