The Department of Energy is releasing a record amount of supercomputing time, 1.3 billion processor hours, which has astrophysicists, biologists and everyone in between drooling in anticipation.

Starting in 2010, some of them will have the chance to run the biggest and most intricate simulations ever, creating experimental galaxies, plasma fusion reactors and global climates to help solve some of science's most complex problems.

They'll be competing for time on the Cray XT system "Jaguar" at Oak Ridge National Laboratory in Tennessee and the IBM Blue Gene/P "Intrepid" at Argonne National Laboratory in Illinois, two of the most powerful supercomputer facilities in the world. Unlike many of the DOE's big machines, they're dedicated to open, unclassified research.

"This is an incredible increase in computing power, which was itself a huge increase from the year before," said DOE spokesman Jeff Sherwood. "It's for research that would not be possible without petascale computing."

In 2009, 900 million processor hours were up for grabs. (A million processor-hours would keep 1,000 processors busy for 1,000 hours, or around 41 days.) Both computers received huge performance boosts this year: Jaguar's processor count has shot up from 31,328 to 180,832, while Intrepid's has grown from 32,768 to 163,840. Jaguar's peak performance is now a blistering 1.64 petaflops (1.64 quadrillion floating-point operations per second), making it the second most powerful supercomputer on Earth.
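The processor-hour arithmetic above amounts to a simple unit conversion, sketched here in Python (the allocation and processor counts come from the article; the helper name is illustrative):

```python
def wall_clock_days(processor_hours, processors):
    """Days of continuous running needed for a given number of
    processors to consume an allocation of processor-hours."""
    return processor_hours / processors / 24

# 1 million processor-hours on 1,000 processors: 1,000 hours, ~41.7 days
print(round(wall_clock_days(1_000_000, 1_000), 1))  # 41.7

# The full 1.3 billion hours, if run entirely on Jaguar's 180,832 processors
print(round(wall_clock_days(1_300_000_000, 180_832), 1))  # ~299.5 days
```

In practice the allocation is split across both machines and among many competing projects, so no single job ties up a whole system for months at a stretch.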

The only number cruncher with more power is IBM's "Roadrunner" at Los Alamos National Laboratory in New Mexico, which runs simulations of nuclear weapons, and sits behind the rather impenetrable firewall of national security.

The boost will allow scientists to tackle traditionally cantankerous problems that involve multiple simultaneous physical phenomena. In enormous, high-resolution simulations, they can tweak an unprecedented multitude of conditions to test their theories.

"These are the only places in the world where you can do these types of simulations," said Bronson Messer of ORNL, whose simulations of core-collapse supernovae were allocated 75 million processing hours in 2009, more than any other project. "In the case of stars and dark matter, there's a lot of physics going on. They're very attractive targets for a big machine like this."

Messer hopes to understand how stars more than 10 times the mass of our sun die. Such deaths are the dominant source of elements in the universe.

Giant stars live much shorter lives than sun-size stars — about 10 million years rather than 5 to 15 billion — and they do not die quietly. The iron core collapses from a couple thousand miles in diameter to about 35 miles in half a second, reaching a density equivalent to the mass of all humans on Earth packed into a sugar cube. The core then rebounds with a blast of neutrinos and a shock wave that rips the star apart.

As the saying goes, these supernovae "live fast, die young, and leave a beautiful corpse," but astrophysicists don't understand some of the basic mechanisms of how the alluring explosions take place.

"People have been trying to attack this on computers for 40 to 50 years," Messer said. "We're using simulations to basically pick a neutrino and ride along as it flies out of the star, and do this with as many neutrinos as possible. We're finally starting to see some hints of what's happening, and we're pretty jazzed about it."

The team has used Jaguar to simulate the supernova up to about 100 milliseconds after the shock wave begins, and they hope to reach half a second. "If we can use 180,000 processors, we'll get much more accurate physics," Messer said.

Another astrophysics project on Jaguar, led by Piero Madau of the University of California, Santa Cruz, will use 5 million processor-hours to explore the invisible halo of dark matter that surrounds the Milky Way.

Madau's simulations, the largest ever performed for the Milky Way, will divide the halo into 30 billion parcels of dark matter and simulate their evolution over 13 billion years. In an earlier simulation on Jaguar, he produced the most detailed picture yet of the Milky Way's halo, showing that small clumps of dark matter from the early galaxy have survived to the present day.

"In fact, the entire galaxy is decidedly clumpy, including our own neighborhood," he said. "Earlier simulations on less powerful systems had shown the dark matter smoothing out, especially in the galaxy's dense inner reaches, because they did not have the resolution to resolve the unevenness."

Astrophysics projects get the biggest allocation of Jaguar's processing time — between 18 and 19 percent of the total. But other areas involving vast scales and multiple physical phenomena, like climate and combustion, also get sizable chunks of processing time.

Jackie Chen of Sandia National Laboratories is using 30 million processor-hours on Jaguar to simulate the combustion of alternative fuels, like biofuel and ethanol. Her modeling of flames, ignition and turbulence can inform engine design, allowing for higher-efficiency, lower-emissions vehicles.

"To understand the underlying physics of what's going on in the internal combustion engines with alternative fuels," said Chen, "we need some of the world's largest calculations."

Other ongoing projects seek to understand how proteins misfold in neurodegenerative diseases, develop thermoelectric materials to capture wasted heat from tailpipe emissions and create high-resolution climate models.

Images: 1) Dark matter simulation. Piero Madau/UC Santa Cruz; Sean Ahern/Oak Ridge National Laboratory. 2) Jaguar Cray XT supercomputer. ORNL. 3) Simulation of an ethylene-air jet flame. H. Yu/Sandia National Laboratories; K.-L. Ma/SciDAC Institute for Ultrascale Visualization.