Improving Application Performance at the Petascale

Towards optimal petascale simulations

David E. Keyes
Columbia University

Multiscale, multirate scientific and engineering applications in the SciDAC portfolio possess resolution requirements that are practically inexhaustible and demand execution on the highest-capability computers available, which will soon reach the petascale. While the variety of applications is enormous, their needs for mathematical software infrastructure are surprisingly coincident; moreover, the chief bottleneck is often the solver. At their current scalability limits, many applications spend the vast majority of their operations in solvers, because solver algorithmic complexity is superlinear in the problem size, whereas other phases scale linearly. Furthermore, the solver may be the phase of the simulation with the poorest parallel scalability, owing to intrinsic global dependencies. This project brings together the providers of some of the world’s most widely distributed, freely available, scalable solver software and focuses them on relieving this bottleneck for many specific applications within SciDAC, which are representative of many others outside it. Solver software directly supported under TOPS includes hypre, PETSc, SUNDIALS, SuperLU, TAO, and Trilinos. Transparent access is also provided to other solver software through the TOPS interface.
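The arithmetic behind the bottleneck claim can be sketched briefly. The constants and the solver exponent below are illustrative assumptions, not measurements from any TOPS application: if solver work grows superlinearly (here, as N^1.5) while every other phase grows linearly, the solver's share of total work approaches 100% as the problem size increases.

```python
def solver_fraction(n, solver_exponent=1.5, c_solver=1.0, c_other=50.0):
    """Fraction of total work spent in the solver at problem size n,
    assuming solver work ~ c_solver * n**solver_exponent and all other
    phases (assembly, discretization, I/O) scale linearly as c_other * n.
    All constants here are hypothetical, chosen only for illustration."""
    solver = c_solver * n ** solver_exponent
    other = c_other * n
    return solver / (solver + other)

# Even with a generous linear-phase constant, the solver share climbs
# steadily as the problem is refined toward petascale sizes.
for n in (10**4, 10**6, 10**8):
    print(f"N = {n:>9d}: solver share = {solver_fraction(n):.1%}")
```

Whatever the actual constants, this crossover behavior is why reducing the solver's complexity exponent (ideally to optimal, linear scaling) matters more at the petascale than constant-factor tuning of the other phases.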

The primary goals of TOPS are the development, testing, and dissemination of solver software, especially for systems governed by partial differential equations. Upon discretization, these systems possess mathematical structure that must be exploited for optimal scalability; therefore, application-targeted algorithmic research is included. TOPS software development includes attention to high performance as well as interoperability among the solver components. Integration of TOPS solvers into SciDAC applications is also directly supported by this project.
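One minimal illustration of exploiting discretization structure, sketched here in plain Python rather than with any TOPS library: a Krylov solver (conjugate gradient) applied matrix-free to a 1-D finite-difference Poisson operator, so the sparse matrix is never stored. The function names are hypothetical; production codes would use interfaces such as PETSc's KSP, hypre, or Trilinos instead.

```python
def apply_laplacian(x):
    """Matrix-free action of the 1-D finite-difference Laplacian with
    homogeneous Dirichlet boundaries: y_i = 2*x_i - x_{i-1} - x_{i+1}.
    Exploiting this structure avoids storing the matrix at all."""
    n = len(x)
    y = [0.0] * n
    for i in range(n):
        y[i] = 2.0 * x[i]
        if i > 0:
            y[i] -= x[i - 1]
        if i < n - 1:
            y[i] -= x[i + 1]
    return y

def conjugate_gradient(b, tol=1e-10, max_iter=1000):
    """Solve A x = b for the operator above, touching A only through
    apply_laplacian (i.e., matrix-vector products)."""
    n = len(b)
    x = [0.0] * n
    r = list(b)                      # residual of the zero initial guess
    p = list(r)                      # initial search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        ap = apply_laplacian(p)
        alpha = rs_old / sum(pi * api for pi, api in zip(p, ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new ** 0.5 < tol:
            break
        beta = rs_new / rs_old
        p = [ri + beta * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x

# Solve a small system and check the residual of the computed solution.
b = [1.0] * 50
x = conjugate_gradient(b)
residual = max(abs(ri - bi) for ri, bi in zip(apply_laplacian(x), b))
```

In a production setting the same idea scales up: the application supplies only the action of the discretized operator (and, ideally, a structure-aware preconditioner such as algebraic multigrid), and the solver library handles the Krylov iteration in parallel.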

A diverse body of science poised for breakthroughs—given the ability to resolve more scales, sample larger ensembles, and/or couple together more phenomena simultaneously—is identified in the twin volumes of DOE’s A Science-based Case for Large-scale Simulation (the “SCaLeS” report). However, for every model that is ready today for predictive simulation at the petascale (e.g., lattice gauge theory), there is another (e.g., in biology, subsurface hydrology) that will most rapidly achieve predictive power through petascale experimentation. There is a daunting dichotomy between the steady improvements in the capabilities and price-performance of computer hardware and the difficulty of fully exploiting it faced by the majority of practicing computational scientists, whose expertise lies, instead and appropriately, in their science. The TOPS center responds directly to this dichotomy by developing, demonstrating, and disseminating scalable solver software, particularly in the areas of Accelerator Modeling and Design, Subsurface Reactive Transport, and Quantum Chromodynamics (QCD).

Center for Enabling Technology: Applied Mathematics

Project Title: Towards Optimal Petascale Simulations (TOPS)

Principal Investigator: David E. Keyes
Affiliation: Columbia University

Project webpage:

Participating Institutions and Co-Investigators:
Argonne National Laboratory - Jorge Moré, Barry Smith, Dinesh Kaushik, Matthew Knepley, Lois McInnes, and Todd Munson
Lawrence Berkeley National Laboratory - Sherry Li, Esmond Ng, Parry Husbands, and Chao Yang
Lawrence Livermore National Laboratory - Rob Falgout, Carol Woodward, Barry Lee, Radu Serban, and Ulrike Yang
Sandia National Laboratories - Michael Heroux and Jonathan Hu
Columbia University - David E. Keyes (PI)
University of California at Berkeley - James Demmel
University of California at San Diego - Daniel Reynolds
University of Colorado at Boulder - Steve McCormick, Thomas Manteuffel, Marian Brezina, Xiao-Chuan Cai, and John Ruge
University of Texas at Austin - Omar Ghattas

Funding Partners: Office of Science, Office of Advanced Scientific Computing Research

Budget and Duration: Approximately $3.1 million per year for five years [1]


[1] Subject to acceptable progress review and the availability of appropriated funds

