
Brad Settlemyer had a supercomputing solution in search of a problem. Los Alamos National Lab, where Settlemyer works as a research scientist, hosts the Trinity supercomputer—a machine that regularly makes the internet’s (ever-evolving) Top 10 Fastest lists. As large as a Midwestern McMansion, Trinity’s main job is to ensure that the cache of US nuclear weapons works when it’s supposed to, and doesn’t when it’s not.

The supercomputer doesn’t dedicate all its digital resources to stockpile stewardship, though. During its nuclear downtime, it also does fundamental research.

Settlemyer wanted to expand the machine's scientific envelope. So he set out in search of a problem that even Trinity couldn’t currently solve. What he found was a physicist who wanted to follow only the most energetic particles through a trillion-particle simulation—a problem whose technological solutions have surprising implications for the bomb babysitters at Los Alamos.

Settlemyer and his team—a collaboration with Carnegie Mellon’s Parallel Data Lab—had been working for a while on a way to create huge numbers of files very fast. But they didn’t know how far they could push that capability. How many files, and how fast? “We were working on this tech, and we needed a use case,” he says. “What we really wanted was to find something over the top.”

So they started asking around Los Alamos, and found a lab scientist studying “Fermi acceleration,” a speed-up that happens to the particles in supernovae and solar flares. As particles oscillate back and forth, they gain speed along the way—acting kind of like pinballs bouncing between bumpers. The scientist wanted to simulate a plasma, the fourth state of matter that’s just a stew of dismembered nuclei and electrons, and see if its pinballs accelerated this way.

To do so, however, he needed to find out which few thousand particles—out of a trillion or so—accelerated to the highest speeds. “The problem,” according to Settlemyer, “is you don’t know until the end.” That made the particles essentially untrackable under the existing computing limits.

But maybe he and his team could fix that, if they could gin up files fast enough. They’d use a kind of program called a “vector particle-in-cell,” or VPIC, code—a descendant of the particle-in-cell method invented at Los Alamos back in 1955. This program essentially allows scientists to keep track of individual particles, to see where they go and what they do in a given situation. In nuclear research, scientists often use particle-in-cell codes to understand how plasmas mix.

That mixing matters for Los Alamos because nuclear bombs produce plasma. Scientists don’t explode bombs with abandon anymore to understand them—as they did in the early days, turning islands into holes. Instead, they simulate the bombs, and look back at old test footage to try to reproduce what they see. To date, they haven’t been able to capture all the nuance in the footage. But with slick new simulations, Settlemyer says, maybe they can.

But first, they had to test their file-creation speed limits using the physicist’s Fermi acceleration problem.

Here’s how such a simulation would classically work: The supercomputer would essentially take snapshots of all trillion particles at once, throughout the process. To find the most energetic characters in the final picture, and then rewind through their trajectories, the supercomputer would need to dig through each snapshot (each a couple of terabytes) to pull out the path of the relevant particles. “That was a huge cost,” says Settlemyer. Too huge: It would have crashed Trinity.

Settlemyer’s solution was, instead, to create more files with less information: one file for every particle, tracing each one through the entirety of the simulation. If Settlemyer put those files into a searchable index, the scientist could simply ask the computer, “Which of those particles’ lives ends with the biggest bang?”
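That per-particle layout can be sketched in miniature. The toy below is purely illustrative—the file naming, the JSON format, and the random “acceleration kicks” are all assumptions for demonstration, not the lab’s actual VPIC or file-system code—but it shows the shape of the idea: write one small trajectory file per particle, keep a lightweight index of each particle’s final energy, and answer the “biggest bang” question from the index alone, without rereading any trajectory data.

```python
import heapq
import json
import os
import random
import tempfile

def run_toy_simulation(n_particles, n_steps, out_dir):
    """Toy stand-in for a particle simulation: writes one small
    trajectory file per particle and returns an index mapping
    particle id -> final energy."""
    index = {}
    for pid in range(n_particles):
        energy = 1.0
        trajectory = []
        for _ in range(n_steps):
            # Toy "Fermi acceleration": occasional random speed-up kicks,
            # like a pinball bouncing between bumpers.
            if random.random() < 0.1:
                energy *= 1.5
            trajectory.append(energy)
        # One file per particle, tracing it through the whole run.
        with open(os.path.join(out_dir, f"particle_{pid}.json"), "w") as f:
            json.dump(trajectory, f)
        index[pid] = energy

    return index

def top_k(index, k):
    """Ask the index: which particles' lives end with the biggest bang?
    Only the winners' trajectory files ever need to be opened."""
    return heapq.nlargest(k, index, key=index.get)

if __name__ == "__main__":
    random.seed(0)
    with tempfile.TemporaryDirectory() as d:
        idx = run_toy_simulation(n_particles=200, n_steps=30, out_dir=d)
        print(top_k(idx, 5))
```

The contrast with the snapshot approach is the point: instead of digging through every multi-terabyte snapshot to reconstruct a few thousand trajectories after the fact, the expensive work happens once, at write time, and the final query is a cheap top-k lookup.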