Featuring over 9,000 CPUs and 27,000 GPUs, Summit is an information processing powerhouse.

Image credit: Oak Ridge National Laboratory

The cache: The "holding area" or short-term memory for pieces of data that the CPU is operating on.

Arithmetic logic units (ALUs): The blocks that perform the basic calculations on data.

The control: The system that communicates between the cache, the ALUs, and the rest of the computer. The control is responsible for decoding instructions and using the ALUs to execute them.

When you think of a scientist, do you imagine a lone figure in a meticulously kept lab coat, working late into the night, combining strange ingredients in a beaker or measuring something with a set of calipers? While it's certainly true that many physicists engage in some sort of hands-on research, in the era of modern science that's only half of the picture. The reality is that the physics of today is increasingly data- and computer-driven. That means that scientists often spend large portions of their time writing and implementing code to help them analyze the results they're measuring in the lab. But some researchers take it even further: they may not have a physical experiment set up in front of them at all!

In many fields, notably astrophysics and quantum physics, it's somewhere between impractical and impossible to perform experiments in the traditional sense. Instead, researchers use fundamental principles and equations to derive theoretical models, then compare the predictions of these models against measurable quantities, such as the light emitted from a star. Once they're satisfied that their mathematical models are reasonable, they can perform experiments by tweaking different parameters and examining the (simulated) results.

However, physicists are constantly developing more and more detailed models, which take more and more computing power to produce meaningful predictions. That's why many labs, including Oak Ridge National Laboratory, concentrate on developing powerful supercomputers to keep up with the ever-evolving models.

The Oak Ridge Leadership Computing Facility (OLCF) was established twenty-five years ago to work on this very problem, and it recently unveiled its newest addition: a 340-ton supercomputer named Summit.
This massive machine weighs more than a large commercial aircraft and takes up floor space equivalent to two tennis courts, but it holds the distinction of being the most powerful supercomputer in the world.

The secret to Summit's power lies in its hybrid CPU-GPU architecture. The terms CPU and GPU may be familiar, but here's a quick refresher: the CPU is the central processing unit of a computer, commonly referred to as the "brains" of the machine. In practical terms, this is where the individual calculations in your computer code are performed. The CPU is composed of three basic units: the cache, the arithmetic logic units (ALUs), and the control.

CPUs usually have a large cache but a limited number of ALUs. This means that they're able to perform calculations on large amounts of data, but they can't do much in the way of parallel processing, instead focusing on sequential instructions. Meanwhile, the control is quite flexible and can handle very complex instructions. The end result is a powerful but potentially slow tool.

The GPU (graphics processing unit), on the other hand, is an increasingly popular processor originally used primarily for 3D game rendering. In contrast to the CPU, it consists of many individual processors, each with its own mini cache, ALUs, and control. This allows the GPU to quickly perform comparatively simple operations on many pieces of data at once.

Clearly, both types of processing units have their own benefits and disadvantages. Summit's solution is to use both CPUs and GPUs, allowing for an efficient exchange of data between the different systems. While it isn't the first to do so (its predecessor at OLCF, Titan, works on the same principles), the process has been streamlined and implemented with even more powerful hardware. This means that at peak performance, Summit is able to perform 200,000 trillion calculations per second (200 petaflops).
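The CPU-versus-GPU distinction can be sketched in a few lines of code. This is a toy illustration, not anything resembling Summit's actual software: the "GPU-style" function splits the data into interleaved slices, standing in for the many simple processing lanes a real GPU runs in hardware.

```python
def scale(x):
    # A deliberately simple per-element operation.
    return 2.0 * x

def cpu_style(data):
    # CPU-style: one powerful core processes elements one after another.
    out = []
    for x in data:
        out.append(scale(x))
    return out

def gpu_style(data, n_lanes=4):
    # GPU-style: many simple "lanes" each handle an interleaved slice
    # of the data. (Real GPUs run thousands of such lanes in parallel.)
    chunks = [data[i::n_lanes] for i in range(n_lanes)]
    partial = [[scale(x) for x in chunk] for chunk in chunks]
    # Reassemble the per-lane results in the original order.
    out = [0.0] * len(data)
    for i, chunk in enumerate(partial):
        out[i::n_lanes] = chunk
    return out

data = [float(i) for i in range(8)]
assert cpu_style(data) == gpu_style(data)
```

Both routes produce identical results; the difference is that the sequential version does one operation at a time, while the sliced version could, on parallel hardware, do all the lanes at once. The per-element operation here is trivial on purpose, which is exactly the regime where GPUs shine.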
To put that in perspective, if every human on Earth performed one calculation per second, it would take 305 days to complete the 200,000 trillion calculations that Summit runs through in a single second!

This computing power comes at an enormous energy cost. To keep the system cool, 4,000 gallons of water are pumped through the cooling system every minute, carrying away 13 megawatts of heat, roughly equivalent to the power consumed by 1,000 homes!

What's the benefit? Here's a quick look at some of the early science projects that have been allocated time on Summit.

One group at OLCF will use the new capabilities to probe supernovae, or exploding stars. These comparatively rare events are responsible for the heavy elements, including precious metals like gold and platinum, that we find here on Earth. Using Summit, astrophysicists will be able to examine the small-scale particle interactions that create these elements in much greater detail, tracking twelve times as many elements as before. "It's at least a hundred times more computation than we've been able to do on earlier machines," Oak Ridge computational astrophysicist Bronson Messer commented in a press release.

Another project will hopefully help scientists develop efficient fusion reactors. As outlined in a recent post, nuclear fusion holds the promise of an enormous source of clean energy, but it's notoriously difficult to harness. CS Chang and his collaborators will use Summit to simulate interactions between plasma (a state of matter in which electrons are no longer bound to individual nuclei) and the interior of the reactor, an extraordinarily complex problem that requires extreme computing power.

Finally, Summit may assist in the development of new materials. One application that models quantum interactions, called QMCPACK, has in the past simulated dozens of atoms to predict the properties of next-generation materials. Now, with Summit online, it will support models made up of hundreds of atoms.
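The 305-day comparison is easy to verify. A quick sanity check, assuming a world population of roughly 7.6 billion (the approximate 2018 figure, which the article doesn't state explicitly):

```python
# Back-of-envelope check of the "every human on Earth" comparison.
# The 7.6 billion population figure is an assumption, not from the article.

peak_rate = 200e15       # 200 petaflops = 200,000 trillion calculations/s
population = 7.6e9       # people, each performing one calculation per second

seconds_needed = peak_rate / population   # time for humanity to match
days_needed = seconds_needed / 86_400     # one second of Summit's work

print(round(days_needed))  # -> 305
```

So humanity working in unison for about ten months matches what Summit does between two ticks of a clock.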
This huge advance may help create even better superconductors, materials that conduct electricity without losing energy, with applications in everything from medical imaging to grid stability.

The launch of Summit is clearly exciting news for supercomputer aficionados, but it also holds great potential for scientists in a huge number of disciplines, from astrophysics to materials research, so chances are you'll be hearing a lot more about the studies done on Summit!