Can you remember the last time you did calculus? Unless you are a researcher or engineer, chances are good it was in a high-school or college class you’d rather forget. For most Americans, solving a calculus problem is not a skill they need to perform well at work.

This is not to say that America’s workforce doesn’t need advanced mathematics—quite the opposite. An extensive 2011 McKinsey Global Institute study projected that by 2018 the U.S. will face a shortfall of 1.5 million analysts and managers with the mathematical training needed to work with “large data sets,” the bread and butter of the big-data revolution.

The question is not whether advanced mathematics is needed but rather what kind of advanced mathematics. Calculus is the handmaiden of physics; it was invented by Newton to explain planetary and projectile motion. While its place at the core of math education may have made sense for Cold War adversaries engaged in a missile and space race, Minuteman and Apollo no longer play the prominent role in national security and continued prosperity that they once did.

The future of 21st-century America lies in fields like biotechnology and information technology, and these fields require very different math—the kinds designed to handle the vast amounts of data we generate each day. Each individual’s genome contains more than three billion base pairs, and a quarter of a million genomes are sequenced every year. In Silicon Valley, computers store more than 100 gigabytes of data—more information than was contained in the ancient library at Alexandria—for every man, woman and child on the planet.

Accompanying the proliferation of new data is noise, and a major job for data analysts and scientists is to tease true signal apart from coincidence and noise. Knowing when a result is statistically significant rather than a product of chance requires a firm, advanced grasp of probability and statistics.
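To make the idea concrete, here is a minimal sketch of such a significance check in Python, using only the standard library. The scenario (a possibly biased coin flipped 100 times) and the function name are illustrative, not drawn from the article; the test itself is an exact two-sided binomial test.

```python
from math import comb


def binomial_p_value(successes: int, trials: int, p: float = 0.5) -> float:
    """Exact two-sided binomial test: probability, under the null
    hypothesis (true success rate p), of any outcome no more likely
    than the one actually observed."""
    pmf = [comb(trials, k) * p**k * (1 - p) ** (trials - k)
           for k in range(trials + 1)]
    observed = pmf[successes]
    # Sum the probabilities of all outcomes at least as "surprising"
    # as the observed count (small tolerance guards float round-off).
    return sum(q for q in pmf if q <= observed + 1e-12)


# 60 heads in 100 flips looks lopsided, but the p-value (~0.057) does
# not clear the conventional 0.05 significance threshold.
print(round(binomial_p_value(60, 100), 4))
```

A result like this is exactly the judgment call the article describes: 60 heads out of 100 feels like a signal, yet a proper test shows it is still plausibly noise.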