
Colloquium - Loft

An Inconvenient Question: Are We Going to Get the Algorithms and Computing Technology We Need to Make Critical Climate Predictions in Time?
National Center for Atmospheric Research

Over the past 30 years, the use of supercomputers to obtain numerical solutions of the equations governing weather and climate has led, along with a multiplicity of satellite and other observations, to the conclusion that the Earth's climate is changing due to human activity. Recent global simulations by computational climatologists at relatively coarse resolution point toward continental-scale environmental impacts of increasing severity in the coming decades. Decision makers now want specific climate change predictions at scales down to the state and county level. However, the Earth must be understood as a complete system interacting across a wide range of spatial and temporal scales, which demands century-long simulations at high resolution. Providing answers to these vital questions is therefore a monumental computational challenge.

The anticipated availability of massively parallel petascale computers in the next few years offers the climate community a golden opportunity to dramatically advance our understanding of the Earth's climate system and climate change, if they can be harnessed to the task. Unfortunately the fit is not perfect. First, massively parallel systems impose stringent and unavoidable Amdahl's-law requirements on application scalability. Second, the trade-off between resolution and integration rate, both critical factors in climate modeling, is severe. Third, the increasing complexity of petascale systems, e.g. in the number of cores on a chip and the number of chips in a system, increases the tension between architectural trends and programmability. Finally, the size and complexity of climate applications make them difficult to port, adapt, and validate on new architectures; there is no single computational kernel in these models to optimize.
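The Amdahl's-law constraint mentioned above can be made concrete with a short sketch. The formula bounds the achievable speedup of a code whose serial (non-parallelizable) fraction is fixed, no matter how many cores a petascale system provides. The serial fractions and core counts below are illustrative, not figures from the talk:

```python
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Amdahl's law: maximum speedup on `cores` processors for a code
    whose serial (non-parallelizable) fraction is `serial_fraction`."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Even a 1% serial fraction caps the asymptotic speedup at 100x,
# so massive core counts yield rapidly diminishing returns.
for cores in (100, 1_000, 100_000):
    print(f"{cores:>7} cores -> speedup {amdahl_speedup(0.01, cores):.1f}x")
```

This is why an application must be almost perfectly parallel before it can exploit hundreds of thousands of cores effectively.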

This talk will discuss ongoing efforts within the DOE SciDAC and NSF PetaApps programs to both seize this important scientific opportunity and address the increased complexity of petascale systems. Lightweight, incremental, and beneficial scaling improvements to existing climate ocean, land, and sea-ice components will be demonstrated. Similar improvements for the atmosphere will be shown for the High-Order Method Modeling Environment (HOMME), a new dynamical core currently being evaluated within the Community Atmosphere Model (CAM). This progress has improved the scalability and performance of these components to the point that simulations coupling a 50 km atmospheric component to eddy-resolving ocean and sea-ice components are now being attempted at Lawrence Livermore National Laboratory and elsewhere. Further gains are required, and may involve even more complex and far-reaching modifications to our algorithms as well as the use of exotic architectures.

Hosted by Elizabeth Jessup.

Department of Computer Science
University of Colorado Boulder
Boulder, CO 80309-0430 USA
May 5, 2012 (14:13)