Archive of SciDAC Discovery Highlights
VACET, ESG partner for new Climate tools
VACET, a SciDAC visualization project, is working to help bring advanced visualization technology to the climate modeling and analysis community through a partnership with the Earth System Grid (ESG).
During the past several months, VACET has been working closely with ESG to prepare both new tools/technologies and material for presentation at the December 2009 climate meeting in Copenhagen.
Prior to working with VACET, ESG's visualization and analysis tool, the Climate Data Analysis Toolkit (CDAT), offered only 1D and 2D charting and plotting. VACET’s role has been to roll out new 3D/4D visualization technologies that are now included in the CDAT release. This effort is beginning to bear fruit, with the first set of objectives delivered: 3D slicing, isocontouring, multiple linked 3D views, and 3D moviemaking.
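Two of these operations, 3D slicing and isocontouring, can be sketched generically in a few lines. This is a plain-NumPy illustration on a synthetic field; none of the names below come from CDAT.

```python
import numpy as np

def make_field(n=32):
    """Toy 3D scalar field standing in for a climate variable."""
    x, y, z = np.meshgrid(*([np.linspace(-1, 1, n)] * 3), indexing="ij")
    return np.exp(-(x**2 + y**2 + z**2))  # smooth radial bump

def axial_slice(field, axis, index):
    """3D slicing: extract a 2D plane from the volume."""
    return np.take(field, index, axis=axis)

def iso_mask(slice2d, level):
    """Isocontouring, crudely: cells at/above the iso-level.
    The boundary of this mask is the line a plotting tool would draw."""
    return slice2d >= level

field = make_field()
mid = axial_slice(field, axis=2, index=16)  # slice through the volume's center
inside = iso_mask(mid, level=0.5)           # region enclosed by the 0.5 contour
```

A real tool would trace the contour geometry (for example with marching squares) rather than return a mask, but the data flow, volume to slice to iso-level, is the same.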
This image, one frame from the complete video which was shown in the WCRP booth at the climate meeting in Copenhagen, demonstrates the use of the visualization to communicate an important insight. The insight in this example is that the outer layer of the atmosphere is cooling while the lower layer is warming.
The other activity conducted while developing this video was demonstrating that the visualization could be produced in real time while transferring massive amounts of data. This real-time visualization demonstration was a finalist in the Supercomputing 2009 Bandwidth Challenge (November 2009), where the team was recognized for transferring 10TB of climate data and producing the visualization in real time on the receiving end.
New Climate Report Shows Regional Impacts
Evan Mills and Michael Wehner, researchers at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory, contributed to the analysis of the effects of climate change on all regions of the United States, described in a major report released June 16, 2009 by the multi-agency U.S. Global Change Research Program.
For the southwest region of the United States the report forecasts a hotter, drier climate with significant effects on the environment, agriculture and health.
Global Climate Change Impacts in the United States covers such effects as changes in rainfall patterns, drought, wildfire, Atlantic hurricanes, and effects on food production, fish stocks and other wildlife, energy, agriculture, water supplies, and coastal communities.
The report addresses nine zones of the United States (Southwest, Northwest, Great Plains, Midwest, Southeast, Northeast, Alaska, U.S. islands, and coasts), and describes potential climate change effects in each. Some states may fall in more than one zone; California is part of the southwest zone, but also the coastal zone.
The precipitation map shown is one of the projections developed by Wehner. It shows, among other things, a substantial reduction in springtime rains in California, and summertime rains in the Pacific Northwest.
“Even in areas where precipitation is projected to increase, higher temperatures will cause greater evaporation leading to a future where drought conditions are the normal state. In the southwest United States, water resource issues will become a major issue,” says Wehner.
more
ComPASS team helps diagnose a CEBAF issue
At Jefferson Lab, scientists use CEBAF (the Continuous Electron Beam Accelerator Facility) and its three experimental halls to study quarks, gluons, protons and neutrons inside the nucleus. Much like a giant, powerful microscope, CEBAF enables scientists to "see" things a million times smaller than an atom. This unprecedented view of the basic building blocks of ordinary matter and their interactions is allowing us to gain deeper insight into the particles and forces that build our universe.
When researchers encountered an issue with beam breakup (BBU) in the CEBAF accelerator, they turned to Kwok Ko, Cho Ng, and the ComPASS team for assistance. Using the advanced codes developed under the SciDAC program (such as SLAC ACD), the ComPASS team was able to accurately model the superconducting cavity configuration installed in JLab's prototype cryomodule and, from external measurements, "reverse engineer" what must have happened during assembly. The BBU issue was traced to a particular cavity with a non-standard preparation history, one that resulted in a mechanical distortion of its shape. The cavity performed well in normal operating mode, but a higher-order mode (HOM) that should have been damped was tilted away from the coupler designed to extract it.
By using the unique and highly accurate algorithms and solvers developed at SLAC, Volkan Akcelik, Zenghai Li and their colleagues were able to reconstruct the cell-by-cell distortions that must have occurred. They predicted the cavity's distortion, and examination of inspection records corroborated the prediction. Beam-based measurements confirmed the effect of this distortion on the dynamics of the CEBAF linac. With this understanding, JLab is now able to develop improved procedures for checking future cavities to prevent recurrence of this undesirable effect.
Researchers use PFLOTRAN/Jaguar to study CO2 sequestration
The SciDAC project, Modeling Multiscale-Multiphase-Multicomponent Subsurface Reactive Flows using Advanced Computing, led by Peter Lichtner of Los Alamos National Laboratory, is using the PFLOTRAN code to study CO2 sequestration. Injecting supercritical (heated and pressurized) CO2 into subsurface geologic formations is one type of carbon sequestration that has been proposed as a way to mitigate the atmospheric accumulation of greenhouse gases released by the burning of fossil fuels.
The researchers are running simulations based on the CO2 output of a 1,000 MW gas-fired power plant. The simulation domain is a 7 × 7 kilometer area with a thickness of 250 meters, with permeability and porosity typical of sandstone. A CO2 volume corresponding to roughly 75% of the plant's output is injected for 20 years, with the simulation continuing (without further injection) for 300 years. This allows scientists to see how the CO2 might dissipate over time.
Gas-fired plants are an important area of study. In 2000, 19% of the U.S. electricity was provided by gas-fired plants. Of the plants added from 2001-2005, 91% were gas-fired. In 2006 there were more gas-fired generators than coal and hydroelectric combined, accounting for 41% of the total U.S. capacity.
PFLOTRAN, a next-generation reactive flow and transport model, has been demonstrated to scale to 27,580 processors on ORNL’s Cray XT4, "Jaguar". PFLOTRAN is built on the PETSc parallel libraries, which enable highly efficient solution of partial differential equations using domain decomposition. The code demonstrates strong scaling, approaching petaflop performance.
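The domain-decomposition idea behind this scalability can be shown in miniature: the global grid is split into one subdomain per process, and each process also keeps a "ghost" halo of neighboring cells so the PDE stencil can be applied locally. The block partitioner below is a generic sketch, not PETSc's actual algorithm.

```python
def block_range(n_global, n_parts, rank):
    """Contiguous index range [lo, hi) owned by `rank` of an n_global-cell axis."""
    base, extra = divmod(n_global, n_parts)
    lo = rank * base + min(rank, extra)
    hi = lo + base + (1 if rank < extra else 0)
    return lo, hi

def owned_and_ghost(n_global, n_parts, rank, halo=1):
    """Owned range plus the halo of neighbor cells needed by a stencil."""
    lo, hi = block_range(n_global, n_parts, rank)
    return (lo, hi), (max(0, lo - halo), min(n_global, hi + halo))

# e.g. one axis of a grid, 100 cells split over 8 processes
ranges = [block_range(100, 8, r) for r in range(8)]
```

The point of the decomposition is that each process solves only its own block, exchanging just the thin halo with neighbors, which is what lets the cost stay flat as processors are added.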
Blue Gene/P Simulations Shed Light on Key Process in Type Ia Supernovae
In their study of Type Ia supernovae, among the brightest and most powerful exploding stars in the universe, University of Chicago researchers have addressed a critical question about buoyancy-driven turbulent nuclear combustion, a key physical process in these explosions.
Using the FLASH code on the IBM Blue Gene/P supercomputer at the Argonne Leadership Computing Facility, researchers addressed the question, "Is buoyancy-driven turbulent nuclear combustion due primarily to large-scale or small-scale features of the flame surface?" They used more than 40 million processor-hours on the BG/P to run a grid of simulations for different physical conditions. The research team also developed parallel processing tools needed to analyze the large amounts of data produced by the FLASH simulations of buoyancy-driven turbulent nuclear combustion. Preliminary analysis of these results showed that the flame surface is complex at large scales and smooth at small scales.
The results have been published in the SciDAC 2008 conference proceedings. These findings will be used to treat buoyancy-driven turbulent nuclear combustion more accurately in the whole-star, three-dimensional simulations of Type Ia supernovae at the DOE NNSA ASC/Alliance Flash Center, The University of Chicago.
CEDPS success with caBIG and APS beamline
Under the leadership of Ravi Madduri, member of the SciDAC CEDPS project, Argonne software developers have designed and implemented several innovative technologies in the caBIG architecture and have developed a collaborative information network to enable interoperability among biomedical databases and analytical tools.
The tool called gRAVI (grid remote application virtualization interface, pronounced "gravy"), along with the Introduce toolkit (part of the caGrid toolkit), provides a framework for fast and easy creation of Globus-based grid services while hiding all "grid-ness" from the developer. It addresses service support features such as advertisement, discovery, invocation, and security (authentication/authorization). It allows researchers to wrap executables and applications as Grid services without writing a single line of code, reducing the barrier to entry for researchers who want to expose their applications as services and promoting reuse.
The caBIG initiative is a four-year project, funded by the National Cancer Institute, with the mission of linking the more than 60 cancer centers across the U.S. into an integrated distributed-computing system. There are over 900 caBIG participants accessing 45 different services through caGrid.
The new infrastructure for caBIG, called caGrid, uses gRAVI for creating, registering, discovering, and invoking analytical routines as Grid services. Authorized researchers nationwide can invoke these services and compose multiple services into workflows for individual applications. The infrastructure also allows one to create a common gateway service between caGrid and the TeraGrid, which integrates high-performance computers, data storage, and high-end experimental facilities around the country. This new gateway service bridges caGrid authentication and authorization processes to the TeraGrid security services, so users can easily access the resources of the TeraGrid without having to modify their applications.
The toolset is finding fame in application areas outside of SciDAC, from the caBIG cancer researchers, who have given several awards to the Argonne project team, to experimental researchers like Brian Tieman of the Advanced Photon Source.
At a recent workshop on lightweight tools for collaborative science, Tieman reported that use of gRAVI has shortened his development time for new services from over a month to around two days. Tieman is generating services that control a beamline experiment, and the data analysis, visualization and modeling that follow on. In addition to the shorter development time, Tieman says gRAVI provides security for jobs, and tracks when his remote job finishes, “and science just happens”.
A team of researchers from the University of California-Irvine (UCI), working with staff at Oak Ridge National Laboratory's National Center for Computational Sciences (NCCS), reports the largest run in fusion simulation history.
The team, led by Yong Xiao and Zhihong Lin of UCI, used 93 percent of the NCCS's flagship supercomputer Jaguar, a Cray XT4, with the classic fusion code GTC (Gyrokinetic Toroidal Code), the key production code of two fusion SciDAC projects (GPS-TTBP and GSEP).
The researchers discovered, among other things, that a device the size of ITER will exhibit gyroBohm scaling, meaning that the normalized heat transport level is inversely proportional to the device size. In other words, the simulation supports the ITER design: a larger device will lead to more efficient confinement.
"The success of fusion research depends on good confinement of the burning plasma," said Xiao. "This simulation size is the one closest to ITER in terms of practical parameters and proper electron physics."
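The gyroBohm statement above can be written as a one-line formula: the turbulent heat diffusivity scales as chi_gB = chi_Bohm × (rho_i / a), where rho_i is the ion gyroradius and a the device's minor radius, so transport normalized to the Bohm level falls as 1/a. A minimal sketch; the numeric values are illustrative, not GTC or ITER parameters:

```python
def gyrobohm_over_bohm(rho_i_m, a_m):
    """Ratio chi_gB / chi_Bohm = rho_i / a (dimensionless)."""
    return rho_i_m / a_m

# same gyroradius, two device sizes: the larger device has proportionally
# lower normalized transport, i.e. better confinement
small_device = gyrobohm_over_bohm(rho_i_m=0.003, a_m=0.5)
large_device = gyrobohm_over_bohm(rho_i_m=0.003, a_m=2.0)
```

Quadrupling the minor radius cuts the normalized transport by a factor of four, which is the sense in which the simulation favors a larger machine.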
However, the huge amounts of data produced by fusion simulations can create I/O nightmares: in one GTC run the team can produce terabytes of data (in this case, 60TB). To address this potential bottleneck, the team used ADIOS, an I/O library that allows for easy and fast file input and output, developed mainly by the NCCS's Scott Klasky and Chen Jin and Georgia Tech's Jay Lofstead and Karsten Schwan.
For more details see NCCS article on HPCwire's "off the wire"
Cubed-sphere spectral element grid
The SciDAC project “Modeling the Earth System” is focused on creating a first-generation Earth system model based on the Community Climate System Model (CCSM). As these improvements will require petascale computing resources, the project is also working to ensure that CCSM is ready to fully utilize DOE’s upcoming petascale platforms. The main bottleneck to petascale performance in Earth system models is the scalability of the atmospheric dynamical core. Team members at Sandia, ORNL and NCAR have thus been focusing on the integration and evaluation of new, more scalable, dynamical cores (based on cubed-sphere grids) into the atmospheric component of the CCSM. The first model successfully integrated uses a new formulation of the spectral element method that locally conserves both mass and energy and provides positivity-preserving advection.
This dynamical core allows the CCSM atmospheric component to use true two-dimensional domain decomposition for the first time, leading to unprecedented scalability demonstrated on LLNL’s BG/L system. The model scales well out to 96,000 processors with an average grid spacing of 25 km. Even better scalability will be possible when computing with a global resolution of 10 km, DOE’s long term goal (DOE ScaLeS Report, 2004). As part of the project’s model verification work, a record-setting one-year simulation was just completed on 64,000 processors of BG/L. This initial simulation was obtained using prescribed surface temperatures and without the CCSM land and ice models. Coupling with the other CCSM component models is the team’s current focus.
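The quoted 25 km average spacing can be sanity-checked with a little sphere geometry. The element count and polynomial degree below are illustrative guesses, not the model's actual configuration:

```python
import math

def cubed_sphere_points(ne, p):
    """Unique gridpoints of a cubed-sphere spectral element grid:
    6*ne^2 elements of polynomial degree p, with nodes shared between
    neighboring elements."""
    return 6 * (ne * p) ** 2 + 2

def avg_spacing_km(points, radius_km=6371.0):
    """Average spacing if Earth's surface area is shared evenly per point."""
    return math.sqrt(4 * math.pi * radius_km**2 / points)

# e.g. 120 elements per cube edge at degree 3 gives roughly 25 km spacing
spacing = avg_spacing_km(cubed_sphere_points(120, 3))
```

Pushing to the 10 km goal mentioned above multiplies the point count by about six, which is why the two-dimensional decomposition's extra scalability matters.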
In this snapshot of gluon fields from a supercomputer QCD simulation, the gluon fields are started in a nonuniform, chaotic state (left), and quickly diffuse into the full volume of space simulated on the computer (middle and right).
According to the Standard Model of Particles and Interactions, the fundamental constituents of subatomic particles, such as protons and neutrons, are quarks and gluons. The equations governing the forces among quarks have been known for decades. These forces are mediated by particles called gluons, in much the same way that electromagnetic forces are mediated by photons. However, unlike the forces of electricity and magnetism, they become stronger as quarks are pulled apart; this remarkable behavior, which is responsible for the permanent confinement of quarks, is not captured by any other force or field theory. The part of the Standard Model that describes this strong interaction, or color force, between quarks and gluons is called Quantum ChromoDynamics (QCD).

Only large-scale numerical simulations have allowed us to calculate, to high precision, QCD quantities such as the masses and lifetimes of particles containing quarks (e.g., protons and neutrons). In QCD, quark and gluon fields are defined on a four-dimensional space-time grid called a lattice, and the quantum fluctuations of these fields are calculated by Monte Carlo methods.

Under its SciDAC grants, the U.S. QCD Collaboration (www.usqcd.org) has created a unified programming environment (www.usqcd.org/software.html) for large-scale simulations of lattice QCD. With it, they have performed a wide variety of calculations. These include investigations at unprecedented precision of the properties of strongly interacting matter at high temperatures and densities, investigations of the structure and interactions of hadrons, and determinations of the fundamental parameters of the Standard Model, which encompasses our current knowledge of the forces of nature.
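The Monte Carlo machinery described here can be shown in miniature. The sketch below is emphatically not QCD (one dimension, a free scalar field, no gauge links), but it uses the same Metropolis sampling of exp(-S) over a lattice of field values:

```python
import math
import random

def action(phi, m2=0.5):
    """Discretized action of a free scalar field on a periodic 1D lattice."""
    n = len(phi)
    return sum(0.5 * (phi[(i + 1) % n] - phi[i]) ** 2
               + 0.5 * m2 * phi[i] ** 2 for i in range(n))

def metropolis_sweep(phi, m2=0.5, step=1.0, rng=random):
    """One Metropolis update per site; returns the acceptance rate."""
    accepted = 0
    for i in range(len(phi)):
        old = phi[i]
        s_old = action(phi, m2)
        phi[i] = old + rng.uniform(-step, step)   # propose a local change
        # accept with probability min(1, exp(-(S_new - S_old)))
        if rng.random() < math.exp(min(0.0, s_old - action(phi, m2))):
            accepted += 1
        else:
            phi[i] = old                           # reject: restore
    return accepted / len(phi)

rng = random.Random(0)
phi = [0.0] * 16
rates = [metropolis_sweep(phi, rng=rng) for _ in range(50)]
```

Lattice QCD replaces the scalar values with SU(3) gauge links and quark fields in four dimensions, which is where the need for supercomputers and the USQCD software stack comes in.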
Silver (100) surface at T=600K in contact with a Lennard-Jones model liquid
Using an extension of the parallel-replica dynamics method, the surface diffusion of the silver adatom was accelerated by a factor of 6.5 on 8 processors without loss of accuracy. The same approach with more processors will give much larger boost factors for systems at lower temperatures.
The main challenge in the SciDAC project "Hierarchical Petascale Simulation Framework for Stress Corrosion Cracking" is developing a computational methodology that can simultaneously treat the vast range of scales in time (picoseconds to seconds and beyond) and length (angstroms to millimeters) necessary for accurately simulating the technologically critical process of stress corrosion cracking. As part of this multi-institution project (involving University of Southern California, Harvard, Purdue, California State University at Northridge, and Los Alamos and Lawrence Livermore national laboratories), researchers at Los Alamos National Laboratory are developing a method for accelerating molecular dynamics simulations at the solid-liquid interface.
In the parallel-replica dynamics method, time is parallelized to achieve longer simulations for infrequent-event processes, such as the diffusion of atoms on a surface, or, as is relevant to this project, the activated processes that advance a stress-loaded crack tip. Because stress corrosion cracking often involves a liquid phase in contact with the crack tip, the parallel-replica dynamics method is being extended so that it can be used to accelerate the dynamics at a solid-liquid interface. Initial results look promising for obtaining significant parallel speedup in time for this much more complex system, which heretofore was limited to time scales accessible to direct molecular dynamics.
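The bookkeeping behind parallel-replica dynamics can be sketched with a toy model: M replicas of the system run independently, and when the first replica sees a rare event, the system is credited with the simulation time accumulated across all replicas. In this idealized model the boost equals the replica count; the measured 6.5x on 8 processors reflects real overheads (dephasing, finite correlation times) that the sketch omits. All parameters below are illustrative:

```python
import random

def parallel_replica_boost(n_replicas, event_rate, rng, n_events=200):
    """Idealized boost factor: simulated time credited per unit wall time.
    Waiting times to a rare event are exponentially distributed, the
    standard infrequent-event assumption."""
    wall_time = 0.0
    sim_time = 0.0
    for _ in range(n_events):
        # time until each replica's first event; the earliest one "wins"
        t_first = min(rng.expovariate(event_rate) for _ in range(n_replicas))
        wall_time += t_first                  # what we actually wait
        sim_time += n_replicas * t_first      # time credited across replicas
    return sim_time / wall_time

rng = random.Random(42)
boost = parallel_replica_boost(n_replicas=8, event_rate=1.0, rng=rng)
```

The method trades processors for time rather than for problem size, which is why it targets exactly the regime (rare activated events at an interface) that direct molecular dynamics cannot reach.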
Valerio Pascucci of Lawrence Livermore National Laboratory, working with members of the SciDAC Visualization and Analytics Center for Enabling Technology (VACET), has developed the first feature-based analysis of extremely high-resolution simulations of turbulent mixing. The focus is on Rayleigh-Taylor (RT) instabilities, which are created when a heavy fluid is placed above a light fluid and tiny vertical perturbations in the interface create a characteristic structure of rising bubbles and falling spikes. RT instabilities have received much attention because of their importance in understanding many phenomena, ranging from the rate of formation of heavy elements in supernovae to the design of capsules for inertial confinement fusion. However, systematic, detailed analysis has been difficult due to the extremely complicated features found in the mixing region. This novel approach, based on robust Morse theoretical techniques, systematically segments the envelope of the mixing interface into bubble structures and represents them with a new multi-resolution model, allowing a multi-scale quantitative analysis of the rate of mixing based on bubble count. This analysis enabled new insights and deeper understanding of this fundamental phenomenon by highlighting and providing precise measures for four fundamental stages in the turbulent mixing process that scientists could previously only observe qualitatively. more
AORSA simulation of 3D radio-wave electric field
propagating in the ITER plasma
Using his AORSA application on the Cray XT4 Jaguar supercomputer, ORNL physicist Fritz Jaeger has performed 3D simulations of radio-wave heating in fusion reactors. The simulations demonstrate that radio-wave heating should work effectively for both present experiments and the multibillion-dollar
ITER fusion reactor. ITER is being developed as a cooperative effort between nations in Europe and Asia, as well as the United States, to demonstrate the scientific and technological feasibility of fusion power. The reactor will use radio waves to heat the ionized gas (plasma) ten times hotter than the sun, thereby causing atoms in the gas to fuse and release energy. Analytical theory, one- and two-dimensional simulations, and experiments have provided an understanding of the relative success of radio-wave heating on medium-scale experiments and its relative inefficiency on smaller experiments. Jaeger’s simulations verified that radio waves tended to heat the edge of the plasma instead of the center on smaller experiments. However, he also demonstrated that radio-wave heating should work efficiently on the larger ITER reactor,
which measures more than 12 meters across and will hold more than 840 cubic meters of plasma.
SciDAC project page
According to the article, Woosley’s scientific career began when he started mixing chemicals, often with explosive results, as a teenager in Texas. As head of the SciDAC project, Woosley and others now simulate supernova explosions on DOE supercomputers. While supernovae are common, and provide most of the heavy elements in the universe, scientists still haven’t determined exactly what causes the stars to explode.
The article also describes how Burrows, of the University of Arizona, and others are investigating whether sound waves may provide the boost of energy which causes an unstable star to finally explode in a burst brighter than all other stars.
Electronic Density of Water in a Carbon Nanotube
The team is also studying hydration with benzene and hexafluorobenzene. Their 2007 paper in J. Phys. Chem. B reports that the electronic structure of interfacial water molecules differs from that of bulk water, as a result of the interaction with the aromatic solute. These results indicate that the solvation of aromatic species is determined by subtle but important charge transfer and dipole redistribution effects, and cast some doubts on the validity of nonpolarizable models for the study of these systems. These findings also indicate that electronic structure information, as contained in ab initio MD simulations, is an important component in a microscopic description of aromatic hydration.
The flow vectors highlight two strong rotational flows. On the right the flow is moving clockwise along with the shock pattern, whereas at the bottom left the post-shock flow is being diverted into a narrow stream moving anticlockwise, fueling the accretion of angular momentum onto the PNS.
Scientists need a better theoretical understanding of astrophysical processes, particularly the creation of the elements in stars. Engineering applications include the design of next-generation power reactors, reactors to burn nuclear waste, and simulations to obviate the need for nuclear weapons testing. The theoretical methods to be applied will make extensive use of "density functional theory", a tool that has been spectacularly successful in chemistry and in materials science for predicting the properties of molecules and material systems. Because of the many computational challenges in constructing the theory, the project calls for a collaborative effort between computer scientists and nuclear physicists. It is anticipated that this 5-year project will produce a theory and codes that will dramatically improve the accuracy and reliability of predictions of nuclear properties.
The project team is a consortium of 8 universities and 6 national laboratories with funding of $15M. It is led by Professors George Bertsch (far right) and Aurel Bulgac (near right) in the Institute for Nuclear Theory and the Department of Physics at the University of Washington.
more about the project
Understanding the properties and behavior of molecules, or better yet, being able to predict that behavior, is the driving force behind modern chemistry. In principle, quantum mechanics allows all the properties of molecules to be predicted. The problem is that the equations are too complex to solve exactly, even using the most powerful supercomputers. Predicting the behavior of a molecule with just one electron requires about 1,000,000 calculations, while doing the same for an atom with 20 electrons would require about 10^63 calculations (a 1 followed by 63 zeros). And since scientists are typically interested in atomic systems with at least a few hundred electrons, a better method is needed.
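For what it's worth, the two figures quoted are consistent with a brute-force cost growing as 10^(3(N+1)) for N electrons. This model is inferred from the article's numbers, not a statement about any real algorithm:

```python
def brute_force_cost(n_electrons):
    """Hypothetical operation count inferred from the figures in the text:
    10**6 operations for one electron, 10**63 for twenty."""
    return 10 ** (3 * (n_electrons + 1))
```

At a few hundred electrons this exceeds any conceivable computation, which is why the approximate methods described next are needed.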
To meet this requirement, the Advanced Methods for Electronic Structure: Local Coupled Cluster Theory project was designed to develop new methods which strike novel compromises between accuracy and feasibility.
Using methods developed by SciDAC’s Algorithmic and Software Framework for Applied Partial Differential Equations (APDEC), computational and combustion scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory have created an unparalleled computer simulation of turbulent flames. The research was featured as the cover article of the July 19, 2005 Proceedings of the National Academy of Sciences. The result is a three-dimensional combustion simulation of unmatched accuracy, one that closely matches conditions found in laboratory combustion experiments. The code allows the researchers to model a flame about 12 cm in height and consisting of 80 chemical species and more than 300 chemical processes. (MORE) - July 2005
Simulations were computed on the IBM SP at NERSC
Researchers from the Terascale High-Fidelity Simulations of Turbulent Combustion with Detailed Chemistry (TSTC) project are seeking better understanding of inhomogeneous autoignition. Numerical experiments on the effect of thermal stratification on controlling burn rate, under homogeneous charge compression ignition (HCCI) engine conditions, show that increasing thermal stratification promotes more flame-like structures, and that the zonal model deteriorates with increased stratification. MORE - May 2005
A series of 3D hydrodynamic simulations show the flow in a stellar explosion developing into a strong, stable, rotational flow (streamlines wrapped around the inner core). The flow deposits enough angular momentum on the inner core to produce a core spinning with a period of only a few milliseconds. (MORE) - May 2005
Simulations were computed on the Cray X1 in the Leadership Computing Facility at ORNL.