Centers for Enabling Technologies

Overcoming technical challenges to enable effective use of Terascale and Petascale systems

DOE Program Managers
Computer Science: Lucy Nowell and Sonia Sachs
Applied Math Centers and Institutes: Karen Pao
Distributed Computing: Thomas Ndousse-Fetter
DOE Office of Advanced Scientific Computing Research

Centers for Enabling Technologies (CETs) are interconnected multidisciplinary teams, coordinated with the SciDAC Scientific Applications, that address the Mathematical and Computing Systems Software Environment elements of the SciDAC Scientific Computing Software Infrastructure. This infrastructure envisions a comprehensive, integrated, scalable, and robust high-performance software environment that overcomes difficult technical challenges to quickly enable effective use of terascale and petascale systems by SciDAC applications. CETs address needs for: new algorithms that scale to parallel systems with hundreds of thousands of processors; methodologies for achieving portability and interoperability of complex high-performance scientific software packages; operating system and runtime tools that support application execution performance and system management; and effective tools for feature identification, data management, and visualization of petabyte-scale scientific data sets. CETs also address the Distributed Science Software Environment elements of the SciDAC program.

To foster broad availability and use of CET-developed code, all CET proposals specified the type of open-source license to be used and the mechanisms, including web sites, workshops, and other community-based activities, for disseminating information about CET software.

The SciDAC Centers for Enabling Technologies will focus on:

  • Algorithms, methods, and libraries: algorithms, methods, and libraries that are fully scalable to many thousands of processors with full performance portability
  • Program development environments and tools: component-based, fully integrated, terascale and petascale program development environments and tools that scale effectively and provide maximum utility and ease of use to developers and scientific end users
  • Operating system and runtime software and tools: systems software that scales to hundreds of thousands of processors; supports high-performance application-level communication, interoperability, and optimization; and provides the highest levels of fault tolerance, reliability, manageability, and ease of use for end users, tool developers, and system administrators
  • Visualization and data management systems: scalable, intuitive systems that fully support SciDAC application requirements for moving, storing, analyzing, querying, manipulating, and visualizing multi-petabyte scientific data sets and objects
  • Distributed data management and computing tools: scalable and secure systems for the analysis of large volumes of data produced at experimental facilities, often through complex workflows, and consumed by a large, distributed user community, as well as end-to-end network tools and services to support high-end applications

Enabling Technologies Centers Announced in September 2006

Visualization and Data Management

Seeing the Unsee-able
Visualization and analytics software technology to increase scientific productivity and create new possibilities for scientific insight
    Principal Investigator: E. Wes Bethel (ewbethel@lbl.gov)
    Lawrence Berkeley National Laboratory

Getting the Science out of the Data
Scientific data management to help scientists spend more time studying their results and less time managing their data
    Principal Investigator: Arie Shoshani (shoshani@lbl.gov)
    Lawrence Berkeley National Laboratory

Applied Mathematics

Advancing Science via Applied Mathematics
Applied Partial Differential Equations to develop simulation tools for solving multi-scale and multi-physics problems
    Principal Investigator: Phillip Colella (PColella@lbl.gov)
    Lawrence Berkeley National Laboratory

Bigger and Better Simulations
Interoperable technologies for advanced petascale simulations to improve accuracy and efficiency
    Principal Investigator: Lori Diachin (diachin2@llnl.gov)
    Lawrence Livermore National Laboratory

Improving Application Performance at the Petascale
Towards optimal petascale simulations
    Principal Investigator: David E. Keyes (kd2112@columbia.edu)
    Columbia University

Computer Science

Plug and Play Supercomputing
Common Component Architecture support facilitating software and programming language interoperability, domain-specific common interfaces, and dynamic composability
    Principal Investigator: David E. Bernholdt (bernholdtde@ornl.gov)
    Oak Ridge National Laboratory

Moving Mountains (of Data)
Enabling distributed petascale science
    Principal Investigator: Ian Foster (foster@mcs.anl.gov)
    Argonne National Laboratory

Multi-core Compilers
Center for Scalable Application Development Software for Advanced Architectures
    Principal Investigator: John Mellor-Crummey (johnmc@cs.rice.edu)
    Rice University

Sharing a World of Data
Scaling the Earth Systems Grid to Petascale Data to enable faster, easier sharing of climate change research data
    Principal Investigator: Dean N. Williams (williams13@llnl.gov)
    Lawrence Livermore National Laboratory

 

