
The Exciting World of Exascale


The fastest high-performance computing system in the world is the National Center for Computational Sciences' "Jaguar," a Cray XT5 supercomputer with a peak performance of 2.33 petaflops — more than 2,000 trillion calculations per second — that came online in 2009. Given that the petascale barrier has only recently been broken, it may come as a surprise that the next jaw-dropping level of HPC is already receiving serious attention. Exascale computing — 10^18, or a quintillion, floating-point operations per second — is a scale of computing that most in the HPC community have only recently fantasized about (and been terrified by).
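To put those numbers side by side, here is a minimal back-of-the-envelope sketch in Python; the figures are the nominal peak ratings quoted above, not measured application performance.

# Rough scale comparison of the peak rates quoted above.
# These are nominal peak figures, not sustained benchmark results.

PETAFLOP = 10**15   # one petaflop/s: a quadrillion floating-point operations per second
EXAFLOP = 10**18    # one exaflop/s: a quintillion floating-point operations per second

jaguar_peak = 2.33 * PETAFLOP   # Cray XT5 "Jaguar" peak rating
exascale_target = 1 * EXAFLOP

print(f"Jaguar peak:     {jaguar_peak:.2e} ops/s")
print(f"Exascale target: {exascale_target:.2e} ops/s")
print(f"Ratio:           {exascale_target / jaguar_peak:.0f}x")  # roughly 430 times Jaguar's peak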

But over the past 12 months, it has become clear that exascale computing is no longer relegated to isolated discussion groups at HPC conferences or the musings of supercomputing fanatics.

In late March, IBM and the Juelich Supercomputing Centre, Forschungszentrum Juelich, penned a deal to launch a joint Exascale Innovation Center aimed at developing hardware and software for exascale computing. The goal of the new center is to produce a prototype of an exascale system by 2015 and complete a fully functional exascale-class machine capable of working on real scientific problems by 2019.

Earlier this year, the G8 Research Councils — made up of seven national research agencies in Canada, France, Germany, Japan, Russia, the United Kingdom, and the United States — announced a call for proposals backed by $13.6 million in funding for projects geared toward exascale computing.

In an effort to start digging into the problem of exascale software development, Jack Dongarra, a professor of computer science at the University of Tennessee, along with Pete Beckman, director of the Argonne Leadership Computing Facility at Argonne National Laboratory, and a slew of other HPC thought leaders from around the globe, launched the International Exascale Software Project last year. The impetus for forming the IESP was a belief that the open-source community, while effective at developing solutions for single projects or specific problems, lacks global planning and coordination, and will simply not be able to give researchers the support they need to meet the challenges of exascale software development or the flexibility required to take advantage of new hardware models.

"It's an activity intended to develop a road map for software that will get to exascale. We're trying to understand what areas of software are critical, what areas of software development need to receive funding ahead of other areas," Dongarra says. "Hopefully, it will help researchers and funding agencies, in terms of the organization at an international level, of putting resources into play that will help us to get to the state at which exascale can be a reality. … It's going to take a coordinated effort among researchers, computer vendors, and funding agencies, to get to there."

But will the life sciences community ever be able to take advantage of this scale of computing? And will they want to? "Absolutely, we're going to have a huge need for it across the board," says Rick Stevens, the associate director for Computing, Environment, and Life Sciences at Argonne. "Right now, molecular models don't take into account the molecular physiology of the organisms — they're abstractions — and as we get more observational data, and as genomes fill out in terms of annotations, and protein function fills out and we have more structural data, we'll start to move from abstract models towards real 3D computational models of cellular processes and I think that will happen over the next five to 10 years." It is probably a good thing that exascale computing won't become a reality for another 10 to 15 years, because it may take that long for most biologists to grasp the potential of computing at that scale.

The frontier

Overall, the bioinformatics community is gaining awareness of what the frontier of large-scale computing looks like, though it's a slow process. Stevens says that one of the first applications of exascale computing in biology that comes to mind is the ability to put things into a significant evolutionary context. "In human medicine, we only have a handful of model organisms and they're pretty far apart from an evolutionary standpoint. I think as we go forward and as sequencing becomes cheaper, you're going to see more and more studies that are trying to take an actual evolutionary position," Stevens says. "Pretty soon we're going to have 10,000 or so complete genomes and you'll be able to ask questions across the phylogenetic space that we can't even conceive of asking today. ... You're going to be able to ask questions across many genomes that are posed in a way that allow you to look at what was evolution doing."

For example, if a researcher wants to reconstruct the metabolic networks of a group of microbes, current computational limitations require that this be done one organism at a time. Consequently, researchers spend a lot of time trying to understand what a single organism is capable of doing; in the future, they could instead ask about the evolution of metabolism across taxa, or about how a particular organism responded to changes in its environment. These are cross-cutting questions that investigators cannot even pose today because the computation is out of reach, Stevens says.
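As a loose illustration of the gap Stevens describes, and only an illustration, the sketch below uses placeholder functions (reconstruct_network and compare_metabolism are hypothetical stand-ins, not real metabolic-modeling tools) to contrast the one-organism-at-a-time workflow with the cross-taxa question that exascale resources might make tractable.

# Hypothetical sketch: reconstruct_network() and compare_metabolism() are
# placeholders standing in for real metabolic-modeling tools, which are not named here.

def reconstruct_network(genome_id):
    """Placeholder for building a metabolic network model from one genome."""
    return {"genome": genome_id, "reactions": set()}

def compare_metabolism(networks):
    """Placeholder for an evolutionary question posed across every reconstruction."""
    return len(networks)

genome_ids = [f"genome_{i}" for i in range(10_000)]  # on the order of the 10,000 genomes Stevens anticipates

# Today's reality: each organism is reconstructed and analyzed on its own.
networks = [reconstruct_network(g) for g in genome_ids]

# The cross-cutting question: reasoning over all reconstructions at once,
# which is where the compute and memory demands outgrow current systems.
answer = compare_metabolism(networks)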

"Many areas of science have the need for advanced computing resources; sometimes people don't realize it today, but the need for HPC comes about because researchers want to get better resolution or fidelity in the solution that you're looking for and the other is that you would like to compute the solution faster," Dongarra says. "In the life sciences community, both would probably apply: you would like to have faster turn-around of some of the computations that are done, and exascale would provide that boost; and also you want to be able to get better resolution of some of the details of a simulation. HPC can help with that."

But while HPC software development for large-scale scientific applications seems to follow a steady trajectory from gigaflops to teraflops to petaflops, this gradual evolution will not continue once exascale systems begin popping up. Issues such as how to approach parallelism on this scale, as well as fault tolerance, networking architectures, and, of course, power and cooling, all present huge challenges.

"The general consensus is that it's going to have radical changes in how we look at programming for these large systems because we're talking about billion-way parallelism on these systems, so billions of threads of execution have to be coordinated and today we really don't have the programming models that can fit that mold," Dongarra says. "There is going to be a path that goes from today to exascale and that path is going to have to have some radical departure from what we're doing at the moment."
