Integrating Program Steering, Visualization and Analysis in Parallel Spectral Models of Atmospheric Transport


Overview
The Scientific Problem
Parallelization of the Spectral Transport Code
An Integrated Computational Approach
On-line Program Monitoring and Steering
The Visualization/Analysis Module
How to Operate the System
Looking Ahead
Acknowledgements
Papers and Presentations
Bibliography

Overview

For large scale atmospheric simulations one would like a tight coupling between the simulation, the observational database on which it is based, and the visualization/analysis process by which it is understood. In fact there should be feedback, in the form of steering, from the visualization/analysis process back to the simulation, since this yields much more accurate representations of atmospheric processes and a significantly more focused investigation of the behavior relevant to the questions being asked. Since the data have complicated 3D structures and are highly time-dependent, the visualization approach must handle this dynamic data in a highly interactive fashion.

In this research, we have combined all these aspects into a single, integrated approach. This has required a collaborative, interdisciplinary process involving atmospheric scientists, experts in parallel high performance computing, visualization specialists, and experts in user interfaces. In particular, we find that it is important to have the scientists involved from the beginning in defining the steps of the project and evaluating its results. This constant evaluation allows an iterative refinement of the approach and aids everybody in discovering new aspects of the problem that they did not foresee. We think that the process used here could serve as a template for building highly effective and powerful applications (and tools supporting them), a process where the developer comes away with a deeper understanding of user needs.

The Scientific Problem

Introduction

While ozone constitutes less than one-millionth of the mass of the atmosphere, it plays an important role in heating the stratosphere and in reducing the amount of ultraviolet radiation reaching the ground. Today there is intense, worldwide discussion about the possible destruction of this ozone layer. Man-made halogens can reach the stratosphere and destroy part of it, which would profoundly affect the entire earth-atmosphere system. Therefore 3-D models are used to predict, as well as possible, the concentration of O3 worldwide, and to determine the influence of different scenarios such as fluorocarbon release or aircraft-induced nitrogen oxides.

The maximum columnar amounts are observed in high latitudes and in spring, although the primary production mechanism for ozone begins with the dissociation of molecular oxygen by sunlight to produce atomic oxygen. Transport is clearly important. Observations show annual mean poleward transports (by eddies) of 50-60 metric tons per second across a middle latitude parallel. For comparison, the polar half of each hemisphere contains approximately 8 x 10^8 metric tons of ozone, which is the amount brought into the polar cap by the eddies in about 170 days.
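As a rough consistency check, dividing the polar-cap ozone burden by the mid-range eddy flux quoted above reproduces this time scale:

\[
\frac{8 \times 10^{8}\ \mathrm{metric\ tons}}{55\ \mathrm{metric\ tons/s}} \approx 1.45 \times 10^{7}\ \mathrm{s} \approx 170\ \mathrm{days}
\]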

Global destruction of ozone by chemical processes at the ground has been estimated at 13 to 23 tons per second. In the absence of any significant tropospheric creation or destruction of ozone, this amount must be transported downward across the tropopause level by atmospheric motions, and must equal the net generation of ozone by photochemical processes in the stratosphere.

The Spectral Model

The ultimate goal in climate modeling is the simultaneous simulation on a global scale of physical and chemical interactions in the ocean and atmosphere. This goal is still far from reach since, in addition to the problem's enormous complexity, parameters must be chosen to simulate processes that are not well understood or whose influence can only be approximated at the scale of current models.

Earth and atmospheric scientists at Georgia Tech have developed a global chemical transport model (Kindler, et al, 1996) that uses assimilated windfields for the transport calculations. These models are important tools for answering scientific questions about the stratospheric-tropospheric exchange mechanism or the distribution of species such as chlorofluorocarbons, hydrochlorofluorocarbons, and ozone. This model uses a spectral approach, which is common to global models (Washington, et al, 1986), to solve the transport equation for each species. In a spectral model, all variables are expanded into a set of orthogonal spherical harmonic basis functions. In a typical run our model contains 37 layers, which represent segments of the earth's atmosphere from the surface to approximately 50 km, with a horizontal resolution of 42 waves or 946 spectral values. When one transforms to a grid system, this corresponds to a resolution of about 2.8 degrees by 2.8 degrees per grid cell. Thus in each layer 8192 gridpoints have to be updated every time step. A typical time step increment is 15 simulated minutes, and for the usual annual run the number of grid values generated is over 10 billion. Of course, several variables may be evaluated at each grid point, and one might need many runs at different parameter settings to accurately simulate observed phenomena.
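These sizes follow directly from the truncation and grid resolution. The short C program below is a stand-alone illustration (not part of the model code) that reproduces the counts quoted above for a T42 triangular truncation, a 128 x 64 transform grid, 37 layers, and 15-minute steps over one simulated year:

#include <stdio.h>

int main(void)
{
    const int waves  = 42;              /* triangular truncation T42            */
    const int nlon   = 128, nlat = 64;  /* transform grid, about 2.8 x 2.8 deg  */
    const int layers = 37;              /* surface to roughly 50 km             */
    const int steps  = 365 * 24 * 4;    /* 15-minute time steps for one year    */

    /* number of spectral coefficients for a triangular truncation */
    int spectral = (waves + 1) * (waves + 2) / 2;

    long grid_per_layer = (long)nlon * nlat;
    double grid_values  = (double)layers * grid_per_layer * steps;

    printf("spectral values per layer : %d\n", spectral);        /* 946      */
    printf("grid points per layer     : %ld\n", grid_per_layer); /* 8192     */
    printf("grid values per year      : %.2e\n", grid_values);   /* ~1.1e10  */
    return 0;
}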

Scientific Progress

The three-dimensional, spectral transport model used in the current project was first successfully integrated over climatological time scales by Dr. Guang Ping Lou for the simulation of atmospheric N2O using the United Kingdom Meteorological Office (UKMO) 4-dimensional, assimilated wind and temperature data set. A non-parallel, FORTRAN version of this integration, using a fairly simple N2O chemistry package containing only photo-chemical reactions, was used to verify our initial parallel model results. The integrations reproduced the gross features of the observed stratospheric climatological N2O distributions and also simulated the structure of the stratospheric Antarctic vortex and its evolution. A paper describing this work was presented at the Spring, 1995 AGU meeting (Lou, et al, 1995) and an enlarged version suitable for publication is currently in preparation (Lou, et al, 1996).

Subsequently, Dr. Thomas Kindler, who produced much of the parallel version of our model, enlarged the N2O model chemistry package to include N2O reactions involving O(1D) and also introduced assimilated wind data from NASA as well as UKMO. Initially, transport calculations without chemistry were run using Carbon-14 as a non-reactive tracer gas, and large differences in the transport properties of the two assimilated wind data sets were apparent from the resulting Carbon-14 distributions. Subsequent calculations for N2O, including its chemistry, with the two input wind data sets, verified against UARS satellite observations, refined the transport differences between the two to the point that the model's steering capabilities could be used to infer the climatological vertical velocity fields required to support the N2O observations. During this process, it was also discovered that both the NASA and the UKMO data contained spurious values in some of the higher frequency wave components, leading to incorrect local transport calculations and ultimately affecting the large scale properties of the model's N2O distributions, particularly at tropical latitudes. Subsequent model runs with wind data that had been filtered to remove some of the high frequency components produced much more realistic N2O distributions. A paper presenting these results and data limitations was given at the Fall, 1995 AGU meeting (Kindler, et al, 1995) and, in more detail, in Dr. Kindler's Ph.D. thesis at Georgia Tech (Kindler, 1995).

During the past few months, the UKMO wind data base for a complete two-year period was processed into spectral form for model use. This new version of the input transport data base now includes complete temperature fields as well as the necessary wind data. This was done to facilitate advanced chemical calculations in the parallel model which often depend upon temperature. Additional UKMO data is being added as it becomes available.

How Interactive Steering Contributes To Scientific Applications

By combining on-line visualization and steering tools with our parallel version of a global spectral atmospheric transport model, we gain the unique ability to compare model results with observational data during the model run. Should discrepancies between model results and observations occur, model execution can be stopped and rolled back in time, model parameters can be changed, and the model can then be rerun with the new parameter settings. Our experiences with this new approach to model validation are quite positive. Specifically, when applying these interactive validation methods to the scientifically relevant problem of simulating the global distribution and transport of nitrous oxide (N2O), interesting scientific outcomes result from a comparison of runs using simulated windfields for transport versus runs using assimilated (measured) windfields to drive the transport inside the model. When comparing the results from two different sets of assimilated windfields (NASA, UKMO), the model shows an underprediction of vertical mass transport in the equatorial area with the UKMO windfields and an overprediction of the vertical transport with the NASA winds. Online model interaction permitted us to adjust and "play with" the vertical windfields to investigate in detail the sensitivity of the biased model results to changes in the vertical advection term. As a result, model results compared with observational data were improved significantly for both windfield sets.

Parallelization of the Spectral Transport Code

The original atmospheric model was written in FORTRAN. To facilitate on-line monitoring, it was necessary to rewrite the code in C since the current Falcon system (which supports on-line monitoring) only supports the C programming language. (An effort is underway to develop a version of Falcon that will work with other languages.)

The parallelized model has been targeted for two different computer architectures. The first approach was to target a shared memory machine model. A Kendall Square Research KSR-2 supercomputer was first chosen for this implementation. This shared memory model parallelizes the computations by atmospheric level, by term in the Navier-Stokes equation, and by latitude (actually sin(latitude)). Common data is replicated across all involved processors and is therefore locally accessible. Spectral layer data is shared by all processors dealing with a certain layer. The grid layer data is decomposed along bands of constant sin(latitude) and accessed locally by the processor to which each range has been assigned. As a result, no movement of grid data is necessary during model computation, whereas spectral data is shared frequently. When Kendall Square Research closed its doors, the model was ported to the Power Challenge series of supercomputers manufactured by Silicon Graphics. Both machines run a variant of the UNIX operating system and support the shared memory paradigm, so porting the software required minimal effort.
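To picture the latitude decomposition, each worker within a layer can be thought of as owning a contiguous band of latitudes whose grid data it alone touches, while spectral coefficients are shared. The sketch below is our own illustration of that idea using POSIX threads rather than the KSR/cthreads primitives actually used in the model; the grid array and the do_grid_band routine are hypothetical stand-ins.

#include <pthread.h>
#include <stdio.h>

#define NLAT     64   /* Gaussian latitudes in one layer's grid */
#define NLON     128  /* longitudes per latitude circle         */
#define NTHREADS 4    /* threads assigned to this layer         */

/* hypothetical per-layer grid field; each latitude band has one owner */
static double grid[NLAT][NLON];

struct band { int first, last; };    /* inclusive latitude range */

/* each thread updates only the latitudes it owns, so no grid data moves */
static void *do_grid_band(void *arg)
{
    struct band *b = (struct band *)arg;
    for (int j = b->first; j <= b->last; j++)
        for (int i = 0; i < NLON; i++)
            grid[j][i] += 1.0;       /* stands in for the real grid update */
    return NULL;
}

int main(void)
{
    pthread_t tid[NTHREADS];
    struct band bands[NTHREADS];
    int per = NLAT / NTHREADS;

    for (int t = 0; t < NTHREADS; t++) {
        bands[t].first = t * per;
        bands[t].last  = (t == NTHREADS - 1) ? NLAT - 1 : (t + 1) * per - 1;
        pthread_create(&tid[t], NULL, do_grid_band, &bands[t]);
    }
    for (int t = 0; t < NTHREADS; t++)
        pthread_join(tid[t], NULL);

    printf("grid[0][0] = %g after one update pass\n", grid[0][0]);
    return 0;
}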

The second approach was to target a distributed memory machine model called a message passing model. The machine used to implement this model is a group of three IBM RS/6000 workstations and an 8-node IBM SP-2. We use the MPI message passing library to communicate between the processors. Each processor is assigned work to do by a master process. Each of the slave processors calculates its portion of work, occasionally communicating with other processors to share needed information. Results are sent back to the master processor at the end of each timestep. In addition to the responsibility of gathering information for the entire application, the master processor also is given a portion of work to do. This distributed memory model version parallelizes the application by level only in order to keep communication costs down to the vertical advection term. The model has been developed to deal with varying numbers of available processors; the atmosphere is divided into layer sets according to the number of processors and the power of those processors. Portable binary files (described below) are built before model startup and can be distributed among non-homogeneous and/or non-NFS systems if necessary.
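A minimal sketch of this level-based master/worker scheme using MPI follows; it is our illustration, not the production code, and the block assignment of levels and the per-level computation are stand-ins. Rank 0 plays the master but also takes its own share of levels, and a per-step summary is gathered at the end of each time step:

#include <mpi.h>
#include <stdio.h>

#define NLEVELS 37
#define NSTEPS  4              /* a few illustrative time steps */

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* block assignment of atmospheric levels to processes (rank 0 included) */
    int per   = (NLEVELS + size - 1) / size;
    int first = rank * per;
    int last  = (first + per > NLEVELS) ? NLEVELS : first + per;

    for (int step = 0; step < NSTEPS; step++) {
        double local = 0.0;
        for (int lev = first; lev < last; lev++)
            local += (double)lev;          /* stands in for per-level physics */

        /* master (rank 0) gathers a per-step summary from all workers */
        double global = 0.0;
        MPI_Reduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);
        if (rank == 0)
            printf("step %d: global sum = %g\n", step, global);
    }

    MPI_Finalize();
    return 0;
}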

Parallel Code Performance Evaluation

This work concerns the parallel implementation of a grand challenge problem: global atmospheric modeling. The novel contributions of our work include: (1) a detailed investigation of opportunities for parallelism in atmospheric transport based on spectral solution methods, (2) the experimental evaluation of overheads arising from load imbalances and data movement for alternative parallelization methods, and (3) the development of a parallel code that can be monitored and steered interactively based on output data visualizations and animations of program functionality or performance. Code parallelization takes advantage of the relative independence of computations at different levels in the earth's atmosphere, resulting in parallelism of up to 40 processors, each independently performing computations for different atmospheric levels and requiring few communications between different levels across model time steps. Next, additional parallelism is attained within each level by taking advantage of the natural parallelism offered by the spectral computations being performed (e.g., independently computable terms in the equations).

Performance measurements have been performed on a 64-node KSR-2 supercomputer. However, since the parallel code has been ported to several shared and distributed memory parallel machines, including SGI multiprocessors such as the Power Challenge, the IBM SP-2, and workstation clusters, performance evaluation is an ongoing process.

An Integrated Computational Approach

In order to enable our integrated approach, we have developed Falcon [6], a toolkit that collectively supports the on-line monitoring, steering, visualization, and analysis of parallel and distributed simulations. The general usefulness of the toolkit is demonstrated by its diverse application to areas such as interactive molecular dynamics simulation and interactive simulation of fault containment strategies in telecommunication systems. It is anticipated that the Falcon toolkit will be available for distribution on the WWW in the near future. Falcon tools include:

Sensors, probes, and steering objects inserted in the simulation code are generated from monitoring and steering specifications. Their partially analyzed monitoring information is sent to graphical and visualization displays. Once steering decisions are made by the user, changes to the application's parameters and states are made by Falcon's steering mechanism which invokes the steering objects embedded in the application code.
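To make the sensor idea concrete, a monitoring sensor placed in the transport code might look roughly like the fragment below. This is only a sketch in the spirit of Falcon: the event structure, the falcon_submit_event stub, and the sensor_n2o routine are hypothetical stand-ins for the code that Falcon actually generates from a monitoring specification.

#include <stdio.h>
#include <stddef.h>

#define NPTS 8192                /* grid points per atmospheric level */

typedef struct {
    int    timestep;             /* model time step being reported    */
    int    level;                /* atmospheric level index           */
    double n2o[NPTS];            /* N2O concentrations for that level */
} n2o_event;

/* stub for the routine that would hand an event to the local monitor,
 * which forwards it to the Data Exchange; here it only reports the call */
static void falcon_submit_event(const char *format, const void *ev, size_t sz)
{
    printf("sensor fired: format=%s, %zu bytes\n", format, sz);
    (void)ev;
}

/* the sensor call as it might appear at the end of each level update */
static void sensor_n2o(int timestep, int level, const double *conc)
{
    n2o_event ev = { timestep, level, {0} };
    for (int i = 0; i < NPTS; i++)
        ev.n2o[i] = conc[i];
    falcon_submit_event("n2o_event", &ev, sizeof ev);
}

int main(void)
{
    static double conc[NPTS];    /* would hold the model's grid values */
    sensor_n2o(0, 13, conc);
    return 0;
}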

Falcon's on-line steering component consists of a steering server on the target machine that performs steering, and a steering client that provides the user interface and control facilities remotely. The steering server is typically created as a separate execution thread of the application, to which local monitors forward only those monitoring events that are of interest to steering activities. The steering client receives application run-time information, displays that information to the user, accepts steering commands from the user, and enacts changes that affect the application's execution. Communication between the application and the steering client, and between the steering client and the steering server, is handled by the transmission tool, Data Exchange.

Data Exchange is a transmission tool for routing messages between multiple clients, where clients can be broadly classified as applications, visualization/analysis/steering tools, or other Data Exchanges. Messages are identified by their format names and registered with Data Exchange by both senders and receivers. When a message is received, it is routed to those clients that have registered their interest in receiving that message type. Communication is done either through sockets or file I/O. The exchange server can provide additional functionality, such as event reordering, before data is routed to clients. Data Exchange and PBIO taken together provide a flexible display system for attaching different types of graphical and visualization displays to an application's execution. Graphics intensive clients, which run on high performance front-end workstations to take advantage of better graphics and visualization support, can be dynamically attached to and detached from the display system.

The program steering environment demands speed and compactness of binary data transmission in a heterogeneous environment. These needs are met by Portable Binary I/O (PBIO), a set of services for transmitting binary data between machines in heterogeneous environments. PBIO provides low overhead by not requiring data to be translated into a "standard" or "network" representation, and portability by transferring data between machines despite differences in byte ordering, datatype sizes, and compiler structure layout.

Though PBIO uses a metaformat in which the actual formats of binary records can be described, the representation of the metadata is hidden. Writers of data provide a description of the names, types, sizes, and positions of fields in records through calls to the PBIO library. Readers provide similar information. No translation is done on the writer's end; meta-information describing the sender's format is sent in the PBIO data stream. On the reader's end, the format of the incoming data is compared with the format the reading program expects. Where discrepancies exist, PBIO performs the appropriate translations.
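In spirit, the writer's side of this exchange amounts to declaring a C record's layout field by field and letting the library carry that description along with the data. The sketch below conveys the flavor only; the field_desc structure and register_format call are hypothetical and should not be read as the actual PBIO interface.

#include <stdio.h>
#include <stddef.h>

typedef struct {
    int    timestep;
    int    level;
    double mean_n2o;
} level_summary;

/* a writer describes each field: name, type, size, and offset in the record */
typedef struct {
    const char *name;
    const char *type;
    size_t      size;
    size_t      offset;
} field_desc;

static const field_desc level_summary_fields[] = {
    { "timestep", "integer", sizeof(int),    offsetof(level_summary, timestep) },
    { "level",    "integer", sizeof(int),    offsetof(level_summary, level)    },
    { "mean_n2o", "float",   sizeof(double), offsetof(level_summary, mean_n2o) },
};

/* stand-in for registering the format with the I/O library */
static void register_format(const char *name, const field_desc *f, int nfields)
{
    printf("format %s:\n", name);
    for (int i = 0; i < nfields; i++)
        printf("  %-8s %-7s size=%zu offset=%zu\n",
               f[i].name, f[i].type, f[i].size, f[i].offset);
}

int main(void)
{
    register_format("level_summary", level_summary_fields, 3);
    return 0;
}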

On-line Program Monitoring and Steering of the Transport Application

As mentioned above, monitoring is accomplished by inserting sensors into the actual code at compile time. Sensors are declared in advance by the scientist and programmer and are used to gather interesting information about a program's state at a particular moment. At run time, when the code encounters a sensor, the sensor gathers whatever information it needs and sends that information to the Data Exchange. The Data Exchange is a program, usually running on another machine, that gathers information and stores or forwards it to other applications as necessary. For our system, the visualization system connects to the Data Exchange to request the monitoring information about the application. Although it is theoretically possible to have the application communicate directly with the visualization, the Data Exchange provides more functionality: it can offload work to another processor, and it allows an arbitrary number of clients to connect and access the same monitoring information without affecting the application in any way. The Data Exchange also allows us to easily provide a communications interface for the steering function that will accommodate multiple steerable components without unduly affecting the running time of the application.

To steer the application, special sensors are inserted into the application that check for steering commands from the Data Exchange. If a command is waiting to be received from a Data Exchange, the sensor interprets the steering command and modifies the application state accordingly. In most cases, steering is accomplished by first stopping the application from proceeding further in its simulation, changing the program state, and then restarting the application so that it may continue with the new state information. This stop/modify/restart sequence is done to ensure that all parts of the application are synchronized and have the necessary state information, so that the model is in a consistent state and thus the calculations are consistent as well.
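A minimal sketch of that stop/modify/restart protocol, as it might appear at a time-step boundary, is shown below. It is our own illustration: the command codes, the poll_steering_command stub, and the windfield array are hypothetical stand-ins for the commands actually delivered through the Data Exchange.

#include <stdio.h>

#define NPTS 8192

typedef enum { CMD_NONE, CMD_STOP, CMD_SET_W, CMD_GO } steer_cmd;

static double w[NPTS];                 /* vertical windfield for one level */
static int    paused = 0;

/* stub: would read a pending command from the Data Exchange socket */
static steer_cmd poll_steering_command(double *value)
{
    static int n = 0;
    steer_cmd script[] = { CMD_STOP, CMD_SET_W, CMD_GO, CMD_NONE };
    *value = 0.05;                     /* new vertical velocity, cm/s */
    return script[n < 3 ? n++ : 3];
}

int main(void)
{
    for (int step = 0; step < 3; step++) {
        double value;
        steer_cmd cmd;

        /* drain steering commands; stay paused until a GO arrives */
        do {
            cmd = poll_steering_command(&value);
            if (cmd == CMD_STOP) paused = 1;
            if (cmd == CMD_GO)   paused = 0;
            if (cmd == CMD_SET_W && paused)     /* modify only while paused */
                for (int i = 0; i < NPTS; i++) w[i] = value;
        } while (paused || cmd != CMD_NONE);

        printf("step %d: w[0] = %g\n", step, w[0]);   /* advance the model */
    }
    return 0;
}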

Our applications include monitoring sensors for the wind fields and for the concentrations of various (single) chemical constituents. A selectable 2D or 3D interactive visual interface allows the scientist to move through the data at each timestep using various projections as desired. Steering sensors allow the scientist to evaluate and test new values for the wind fields in conjunction with simple checkpoint and restart facilities which are required to assure stable and accurate simulation behavior. As mentioned above, the spurious wind values were discovered through the use of this interface.

The Visualization/Analysis Module

In the initial version of this system, we integrate the Glyphmaker visualization system, including modules developed within the Iris Explorer environment, with the Falcon steering system and the atmospheric model. As the model generates timesteps, the visualization is updated in an on-line fashion. Additions to the visualization capability include modules to immediately display the data or to pass it along to PV-Wave for alternative visualizations and analysis.

(Note: For a number of reasons, including size and performance limitations of Iris Explorer, this method is being abandoned and we are in the process of integrating the SGI Open Inventor system with the Falcon steering system and the atmospheric model. While the Glyphmaker/Open Inventor connection is not yet in place, we describe Glyphmaker's purpose here for completeness and for future reference.)

By direct manipulation steering we mean that we can interact directly with visualizations of atmospheric simulations to alter the future course of the simulations. We do this, for example, by scaling, rotating, translating, or inputting data for graphical objects bound to the data. Thus we could use the conditional box (a tool from Glyphmaker) to define spatial regions in the data where one could change chemical concentrations or other parameters. We could also employ data probes from Glyphmaker to locate localized behavior of interest and to adjust parameter values where desired. We have extended the rendering module in Glyphmaker to support these direct manipulation capabilities. Our direct manipulation techniques involve interactions with both 3D and 2D representations of the data. This hybrid approach is attractive because it recognizes that while new and innovative methods are necessary to explore spatially complex and multidimensional data or to control simulations that produce these data, familiar tools such as 2D plots are succinct ways of expressing user intent.

In our current version of the visualization/analysis tools, we have added a graphical steering mechanism to our Glyphmaker visualization system. The system allows the user to select from a set of geometric forms. The user can then deform the geometry to encompass a desired spatial region, within which one can change parameters in the atmospheric model. In addition to interactive control of the position and deformation of the geometric steering objects, the user receives visual feedback from both the steering object's geographic position and the model data-structure indices. The visual feedback is enhanced by allowing the user to choose from a variety of projections (spherical or flat), with the graphical attributes of the geometric form adjusted to the type of projection.
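As one concrete example of such a region (the Distribution Sphere described in the operation section below), grid points inside a user-positioned sphere can be blended toward a new concentration with a Gaussian falloff toward the sphere's edge. The sketch below is our own illustration of that weighting on a small 2D grid; the grid size, sphere placement, and falloff width are assumptions made for the example.

#include <stdio.h>
#include <math.h>

#define NX 8
#define NY 8

int main(void)
{
    double field[NY][NX] = {{0}};       /* stands in for one model level     */
    double cx = 3.5, cy = 3.5, r = 2.0; /* steering sphere centre and radius */
    double new_value = 1.0;             /* value entered on the control panel */

    for (int j = 0; j < NY; j++) {
        for (int i = 0; i < NX; i++) {
            double d = hypot(i - cx, j - cy);
            if (d <= r) {
                /* Gaussian falloff from the centre toward the sphere's edge */
                double wgt = exp(-(d * d) / (2.0 * (r / 2.0) * (r / 2.0)));
                field[j][i] = (1.0 - wgt) * field[j][i] + wgt * new_value;
            }
        }
    }

    for (int j = 0; j < NY; j++) {
        for (int i = 0; i < NX; i++)
            printf("%5.2f ", field[j][i]);
        printf("\n");
    }
    return 0;
}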

The direct manipulation steering approach is a new and powerful way to control spatially complex and dynamic simulations, such as those from atmospheric models. It allows the user to probe and analyze correlations in the data side by side while redirecting the simulation in a spatially intuitive way, leading to a better understanding of how the physical processes evolve. It requires the capability for direct, quantitative probing of data, which we have built into Glyphmaker through elaborate data structures that always connect the visual representations to the original data, allowing investigation down to the individual datum. The data structures change and expand dynamically as new bindings between visual representations and data are made. The steering also requires a close coupling with the steering control and data transfer mechanisms provided by Falcon. The first stages of this integration have been completed; future development will require updating the Falcon system to respond to new needs arising from enhancements of the visual interface, as well as modifying the modes of visual interaction as the Falcon system improves.

The flexibility of the Glyphmaker system allows the use of the steering objects for analysis as well. The data elements within the region could be reclassified with their own glyphs (e.g., with different shapes or colors than the surroundings) so that their behavior could be highlighted and followed in detail. We have added the capability to take these selected data and list any values or show their distribution in 2D plots. This is the process of mixing 3D visualizations with 2D quantitative analyses that we mentioned above.

We have also instrumented the atmospheric model with a mechanism for deferred steering. Our design allows model changes to be scheduled rather than applied immediately. This is necessary because the parallelized execution is kept efficient by minimal synchronization. Additionally, we have focused on steering the vertical windfields. The windfields are an important transport mechanism and are derived from observed data. Our steering system permits both human interactive control and automated input from weather data sources (satellites, etc.).
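A sketch of what deferred steering amounts to: a queued change carries a target time step and is applied only when the model reaches that step, at the time-step boundary, so that no extra synchronization is imposed on the parallel execution. The structure and names below are hypothetical illustrations, not the model's actual code.

#include <stdio.h>

#define MAXCMD 16

typedef struct {
    int    apply_at_step;     /* time step at which to apply the change  */
    int    level;             /* atmospheric level to modify             */
    double w_scale;           /* scale factor for the vertical windfield */
} deferred_cmd;

static deferred_cmd queue[MAXCMD];
static int          nqueued = 0;

static void schedule(int step, int level, double scale)
{
    if (nqueued < MAXCMD)
        queue[nqueued++] = (deferred_cmd){ step, level, scale };
}

int main(void)
{
    double w[37];                          /* per-level vertical velocities    */
    for (int l = 0; l < 37; l++) w[l] = 0.10;

    schedule(2, 13, 0.5);                  /* e.g., halve w at level 13, step 2 */

    for (int step = 0; step < 4; step++) {
        /* apply any change whose target step has arrived, at the boundary */
        for (int c = 0; c < nqueued; c++)
            if (queue[c].apply_at_step == step)
                w[queue[c].level] *= queue[c].w_scale;

        printf("step %d: w[13] = %g\n", step, w[13]);  /* advance the model */
    }
    return 0;
}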

How to Operate the System

(Note: This section assumes that the reader has basic familiarity with Silicon Graphics' Open Inventor.)

To show our VA tools in action, we chose a simulation of N2O. The distribution of N2O in the atmosphere has a rich structure and is significantly affected by horizontal and vertical wind fields. We focus on the correlations between the horizontal and vertical wind fields (taken from satellite observations) and the changes in the N2O distribution. These correlations are hard to see using traditional visualization methods, but they can be critical in assessing the accuracy of the model and in understanding the processes by which species spread through the atmosphere.

First, start the data exchange on a particular host with

exchange
This will return the messages
Registering group "exchange_6"
Data Exchange server listening at Inet host/port (hostname portnumber)
where hostname is the machine on which the data exchange is running and portnumber is the number that will be used to link the data exchange with the hosts running the interface and the model.

Next, start the interface on another host with

si hostname portnumber
If you do not enter a hostname or portnumber, the default hostname is slick.cc.gatech.edu and the default portnumber is 65535. A file called INPUT specifies the location of all maps; if you don't want to have a worldmap displayed, set the number of maps to 0 in the first line.

Finally, start the model running on another host (in our cases, probably a parallel system) with

tp np nl -cthread_steering -cthread_monitor_socket hostname portnumber
where np is the number of processors to use and nl is the number of levels to simulate. The right-hand window from the following figure will appear.

By pressing the R button on the right of your screen you can bring up a Read Data Hub shell window. This window will display each timestep as it has completed. You can choose any timestep by highlighting it and pressing the "Show this time step" button. A Control_Panel window will pop up allowing you to select specific longitudes (X), latitudes (Y), or levels (Z). Press the button on the left to enable the particular display. Sliding the bar associated with each dimension will change the value of that dimension. In this example, we are displaying the N2O values at longitude number 16 (about 90°E), latitude number 14 (about 79°N), and level number 13 (about 17.5 km). The figure can be rotated as desired. You can select any object(s) (any combination of level, latitude, and longitude in our case). Notice that when you select an object, it will appear in the lower left window. In that window (which is called the detail examiner) you can rotate and change the size of the object, but you cannot move it.

The Steering menu allows you to select from several steering options including Cuboid Region (uniform distribution within a selected cube), Spherical Region (uniform distribution within a selected sphere), Distribution Sphere (Gaussian distribution within a selected sphere), and Isospheres. (At this point in time, only the Distribution Sphere has a fully implemented interface for steering mode.) Consider the figure below. To select steering mode, select Distribution Sphere from the Steering menu. This will cause a spherical cloud to appear on the main display. Now, while pressing the Steering button, select DSphere. At this point, you may want to choose a manipulator. The manipulators are Handlebox (which is pictured below), Trackball, Jack, Centerball, or Tab Box, and can be selected either through the Manips menu or the buttons T_B, H_B, C_B, or cleared with Clr.

Whenever you select a particular mode a corresponding option menu will appear. In this figure, the steering control panel appears on the lower right. The model can be stopped by pressing the "Stop Steer Command" button. This causes the model to pause and allows you to issue a steering command. When you press the "Enter Data" button an input line will appear on the screen allowing you to enter a scalar value for the new concentration level. Press the "Send Steer Command" button to perform the selected distribution of this new value. Finally, press the "Send Go Command" button to resume the model run.

Several menus are available to help you customize your views. The Editors menu will allow you to edit materials, colors, and various translations. The Lights menu will allow you to change your lighting source. You can also customize some Xwindows values for the steering interface on your machine. The following are the initial defaults in the .Xdefaults file:

si*XmForm.shadowThickness: 0
si*shadowThickness: 2
si*XmToggleButton.shadowThickness: 0
si*Background: gray
si*FontList: 7x14

Known problems:

The data exchange must be started first or you will get a segmentation fault.
If you enter an incorrect hostname or portnumber, you will get a segmentation fault.
If the data exchange dies unexpectedly, the model is likely to hang; this is due to the Xt monitor socket mechanism. This will be corrected in a future version.
After running the model and the interface, if you terminate the interface, you will need to restart the data exchange before restarting the interface.

Future additions:

Roll back command
Checkpoint command
Additional steering commands (cuboid region, spherical region, isospheres)

Looking Ahead

The Science and High Performance Computing

A more direct method of transforming the UKMO and NASA data to spectral form is being developed that will not require linear interpolation to "move" data from one grid system to another for spectral transformation. Although the interpolation process used to date is not thought to contribute in any important way to the introduction of spurious high frequency waves into the data, in view of the now-known existence of such waves in the wind data base, it is prudent to eliminate any potential high frequency noise that may be introduced numerically in preparation for the transformation. The distribution of energy as a function of spatial resolution for the transformed assimilated data base will then be compared with observational data in order to delineate the frequencies that contain spurious values.

A major upgrade of the parallel model that is currently under way involves the simultaneous integration with a number of atmospheric species and the inclusion of the necessarily complex chemical packages that will be required. For this purpose, we propose to make use of a substantially modified version of a large atmospheric chemical model obtained from the "Laboratoire de Physique et Chimie de l'Environment", CNRS, Orleans, France. This model is to be included as a separate module linked and interacting with the current parallel transport model and should thus permit state-of-the-art simulations of stratospheric mixes of important atmospheric constituents.

Minor changes to the parallel model that are planned for the next few months include the installation of a new fourth-order numerical scheme for the spectral vertical diffusion calculations and the introduction of wind data at the lowest model levels to better simulate the effects of the Earth's surface boundary layer.

The infrastructure grant concerns tool development and distribution, focusing especially on steering and its use in scientific applications, with extensions of these tools to address entire distributed laboratories.

The Visualization

We are extending Glyphmaker in ways that increase its power for the analysis of atmospheric simulations, which will grow significantly in size and complexity as the parallel approaches are scaled up. It will be necessary to manage levels of detail in the visualizations so that we can retain highly interactive exploratory analysis as the data grow. This will require both automatic and user-directed methods, since the user will not know at the outset what the data contain but will want to direct and refine the visualization process. We are working on general methods for detail management that are based on an understanding of the nature of physical data and that include approaches for both 3D and 4D (including time) pattern recognition and for feature recognition and extraction. These approaches will allow a natural organization of the data for further study, including higher level visualization (e.g., surfaces and volumes) of general unstructured or scattered data. They will also permit us to represent the data with visual abstractions at multiple levels of complexity. We will work closely with application scientists so that the visual abstraction process matches the physical abstraction process that they use to simplify and then understand their data.

We plan to extend the visual representations and interactions for steering. One extension will allow the user to specify distribution functions with a few parameters so that, for example, more physically accurate concentration profiles can be inserted into the simulation. Thus the user can easily specify how model changes are distributed within the extent of the steering object. Also, we are incorporating the ability to acquire steering specifications from the visualization output. For example, if an isosurface specification produces a surface in the visualization, we will be able to use the surface as a spatial parameter for steering. We plan to write a paper shortly on our present and some of our new steering capabilities.

In order to achieve our ultimate goal of real-time exploratory visualization, steering, and control of simulations, regardless of the size of the data output, we must investigate alternatives to our present visualization approach. Among other things, this means looking at tools other than SGI Iris Explorer and Inventor. The reason is that we must have fast rendering of thousands of potentially independent objects; neither Explorer nor Inventor is optimized for this case. We are considering, for example, the use of the CAVE libraries from NCSA. These are built for scientific visualization in immersive virtual environments. They are thus built for real-time use, have been employed on big data, and have some tools for exploratory navigation built in. By integrating the CAVE libraries with Open Inventor, we can retain several of our interaction and direct manipulation tools. As an alternative, we are also considering building our own renderer using OpenGL. This would give us optimal efficiency and control over visualization capabilities. However, we would have to rebuild most of our interaction capabilities and some of our visualization techniques.

Whichever path we take for our rendering tools, we will move them from GL to OpenGL. This coupled with use of libraries like Open Inventor will make available a large number of platforms for use by our system.

Collaborative Steering

We plan to incorporate support for collaborative work in the monitoring/steering infrastructure beyond the simple example of replicating the pixels of a visualization on several workstations' screens. Support will be needed to allow the collaborators to have different views of a single visualization (or possibly different visualizations of the same data) and to coordinate the steering interactions and feedback among the views. For rendering the visualizations we use an object-oriented graphics library which allows one to arrange objects into a tree structure to describe a scene. This library includes several objects which respond to user input (mouse, keyboard, etc.) which we use for steering. To support collaboration we add a mechanism to this library which maintains consistent copies of the scene tree structure on two or more machines.

Acknowledgements

This work is supported in part by the NASA AISRP Program under contract number NAGW-3886 and by NSF under grant number NCR-90000460.

Papers and Presentations

This section provides a summary of papers and presentations given at a number of conferences and meetings around the country. Abstracts and descriptions are provided for detail and clarification.

AGU Spring, 1995 Meeting

Reference:

Guang Ping Lou, Fred Alyea, and Derek Cunnold, "3-D Simulations of N2O Transport and Antarctic Vortex Evolution", presented at AGU 1995 Spring Meeting, Baltimore, MD, May 30-June 2, 1995, paper no. A51B-5.

Abstract:

3-D Simulations of N2O Transport and Antarctic Vortex Evolution

This study focuses on three areas: (a) the structure of the stratospheric Antarctic vortex and its evolution; (b) the transport of N2O and dynamical forces that dominate these processes; (c) the climatology of the N2O mixing ratio distribution and its driving factors. A 3-dimensional spectral chemical transport model was employed to simulate N2O transport and study the driving forces that affect the processes. The dynamical driving fields are from the UKMO 4-dimensional assimilated data set. UARS CLAES N2O mixing ratios are used for the N2O initial conditions. Model results show that the N2O distribution and transport closely resemble the CLAES measurements, especially at high latitudes. The correlation coefficients between CLAES N2O and temperatures, and between model N2O and temperatures, are remarkably similar in terms of their meridional distributions. Diagnostic study and model simulation results reveal that while large-scale Eulerian mean vertical motion fields are upward inside the vortex, the mean residual circulation vertical velocity is downward. The monthly mean maximum sinking residual velocity is -0.40 cm/s at about 1.5 mb and -0.07 cm/s in the 30-9 mb layer inside the Antarctic vortex in September. The vortex first breaks in the upper stratosphere during September. Then the breaking process propagates downward to the 3-10 mb level in the middle of October. At the lower levels, 10-20 mb, the vortex breaks up in early November. These breaking processes continue to penetrate to lower levels at about 20-30 mb by late November. In the meridional transport of N2O, eddy transport is the chief process. Especially at higher altitudes, there seems to be persistent eddy mixing going on at the middle latitudes during the early spring. However, the residual circulation transport dominates the long term vertical mixing. The bulge of the elevated N2O mixing ratio in the tropical stratosphere is determined by the uplifting of mass by the residual circulation. During the Southern Hemisphere summer, the uplifting of N2O by the residual circulation reaches above 1 ppb/day. The downward transport inside the vortex can exceed 2 ppb/day in the winter hemisphere. The climatological distribution of the N2O mixing ratio follows the seasonal variations of the solar radiation. The bulge of the elevated N2O shifts toward the summer hemisphere by up to 15 degrees in latitude. The slopes of the N2O mixing ratios are sharper in the winter hemisphere and the surf zone is well defined in the middle latitudes on the zonal mean plots.

Presented at AGU 1995 Spring Meeting, Baltimore, MD, May 30-June 2, 1995, paper no. A51B-5.

AGU Fall, 1995 Meeting

Reference:

Kindler, T.P., D.M. Cunnold, F.N. Alyea, G.P. Lou, and W.L. Chameides. "A Comparison of CLAES N2O Simulations using 3D Transport Models Driven by UKMO and GSFC Assimilated Winds", presented at AGU 1995 Fall Meeting, San Francisco, CA, December 11-15, 1995, paper no. A52D-9.

Abstract:

A Comparison of CLAES N2O Simulations using 3D Transport Models Driven by UKMO and GSFC Assimilated Winds

A three dimensional chemical model has been developed. The model has a vertical resolution of approximately 1.25 km (one-half a UARS layer) and is spectrally truncated at T21. In this paper we will compare N2O simulations from two calculations in which the model is driven by the windfields provided by the assimilation models of UKMO and GSFC. The calculations were initialized on September 1, 1992 with a distribution based on UARS CLAES N2O measurements and were run for 13 months. The zonal mean gradients of N2O are found to steepen using the GSFC wind fields whereas they flatten out using the UKMO fields (as we have previously reported). Consequently the calculated atmospheric lifetime of N2O changes from 180 years initially to less than 100 years and longer than 200 years, respectively, using the GSFC and UKMO winds. The budgets of N2O in the two calculations will be compared in terms of contributions by the residual mean circulation and mixing along isentropes. The degree of isolation of the polar vortices and the extent of interaction between the tropics and the extratropics will also be examined using area mapping analyses.

Presented at AGU 1995 Fall Meeting, San Francisco, CA, December 11-15, 1995, paper no. A52D-9.

Supercomputing '95, GII Testbed

Reference:

M. C. Trauner, V. C. Martin. "A Parallel Spectral Model for Atmospheric Transport Processes", GII Testbed and HPC Challenge Applications on the I-Way, Virtual Environments and Distributed Computing at SC '95, Supercomputing '95 Conference, San Diego, CA, December 3-8, 1995, project no. 13.

Abstract:

A Parallel Spectral Model for Atmospheric Transport Processes

Earth and atmospheric scientists at Georgia Tech have developed a global chemical transport model that uses assimilated windfields for the transport calculations. These models are important tools to answer scientific questions about the stratospheric-tropospheric exchange mechanism or the distribution of species such as chlorofluorocarbons, hydrochlorofluorocarbons, and ozone. This model uses a spectral approach common to global models to solve the transport equation for each species.

Ideally, in large-scale atmospheric simulations, the observational database should be closely coupled to the visualization/analysis process. In fact, there should be feedback in the form of steering between the latter and the simulation in order to yield more accurate representations of atmospheric processes and a significantly more focused investigation. Because the data have complicated 3D structures and are highly time-dependent, the visualization approach must handle this dynamic data in a highly interactive fashion.

In this project, the researchers have combined all these aspects into a single, integrated approach. This has required a collaborative, interdisciplinary process involving atmospheric scientists and experts in high-performance parallel computing, visualization, and user interfaces. The process used here could serve as a template for building highly effective and powerful applications (and tools supporting them), a process where the developer comes away with a deeper understanding of user needs.

Discussion:

This application was accepted for execution over the GII testbed and visualization on the I-Way Wall.

A working prototype of the distributed memory model with visualization and minor steering was exhibited. The atmospheric transport model was running on 32 nodes of the IBM SP-2 supercomputer at the Cornell Theory Center. On-line monitoring data was shipped over a dedicated ATM network to San Diego to an SGI Challenge server which acted as a centralized resource manager and router (the Data Exchange). The custom visualization was running on an SGI Onyx connected to the Wall.

The prototype allowed the user to interactively (while the model was running) view both the wind fields and the N2O concentrations at any part of the globe in a variety of interesting formats, including spherical levels extending from the earth's surface into the stratosphere, a flat map Cartesian view with strict longitudinal and latitudinal planes, and simple x-y plots. The data viewed could be chosen via explicit selection or relative position using sliding bars and dials.

Penn State University

Reference:

Karsten Schwan, "Interactive High Performance Programs: From On-line Scientific Applications to Operating Systems", Penn State University, College Park, Dec. 1994.

Workshop on Debugging and Performance Tuning for Parallel Computing Systems

Reference:

Karsten Schwan, Weiming Gu, Greg Eisenhauer, Jeffrey Vetter, "Interactive Parallel Programs: The On-line Steering of Large-Scale Parallel Codes", invited lecture at the Workshop on Debugging and Performance Tuning for Parallel Computing Systems, Cape Cod, Oct. 1994.

Bibliography

  1. Thomas Kindler, Karsten Schwan, Dilma Silva, Mary Trauner, and Fred Alyea, "Parallelization of Spectral Models for Atmospheric Transport Processes", Concurrency: Practice and Experience, to appear 1996.

  2. W.M. Washington and C.L. Parkinson. An introduction to three-dimensional climate modeling. Oxford University Press, 1986.

  3. Derek M. Cunnold, Fred Alyea, N. Phillips, and R. Prinn. "A three-dimensional dynamical-chemical model of atmospheric ozone." J. Atmos. Sci., 32:170-194, 1975.

  4. Derek M. Cunnold, Fred Alyea, and R. Prinn. "Preliminary calculations concerning the maintenance of the zonal mean ozone distribution in the northern hemisphere." Pure Appl. Geophys., 118:329-354, 1980.

  5. Karsten Schwan, Harold Forbes, Ahmed Gheith, Bodhisattwa Mukherjee, and Yiannis Samiotakis. "A cthread library for multiprocessors." Technical report, College of Computing, Georgia Institute of Technology, Atlanta, GA 30332, GIT-ICS-91/02, Jan. 1991.

  6. Weiming Gu, Greg Eisenhauer, Eileen Kraemer, Karsten Schwan, John Stasko, Jeffrey Vetter, and Niru Mallavarupu. "Falcon: On-line Monitoring and Steering of Large-Scale Parallel Programs." Proceedings of The Fifth Symposium of The Frontiers of Massively Parallel Computation, McLean, VA, February, 1995.

  7. William Ribarsky, Eric Ayers, John Eble, and Sougata Mukherjea. "Using Glyphmaker to Create Customized Visualizations of Complex Data." IEEE Computer, July, 1994.

  8. Lou, Guang Ping, Fred Alyea, and Derek Cunnold, "N2O transport dynamics and its climatology", in preparation, 1996.

  9. Kindler, Thomas Paul. "The Development of Supercomputing Tools In a Global Chemistry Transport Model (CTM) and Its Application to Selected Problems in Global Atmospheric Chemistry", Ph.D. thesis, Georgia Institute of Technology, December 1995.

  10. Yves Jean, Thomas Kindler, William Ribarsky, Weiming Gu, Gregory Eisenhauer, Karsten Schwan, and Fred Alyea, "An Integrated Approach for Steering, Visualization, and Analysis of Atmospheric Simulations", Report GIT-GVU-95-15, Proceedings Visualization '95, pp. 383-387.

  11. Martin deBoer, Yves Jean, William Ribarsky, Gregory Newton, Frits Post, and Robert Sumner, "A New Data Organization for Analysis in Visualization Systems", Report GIT-GVU-96-10, to be submitted to Transactions in Visualization and Computer Graphics.

  12. Yves Jean, Thomas Kindler, William Ribarsky, Weiming Gu, Gregory Eisenhauer, Karsten Schwan, and Fred Alyea, "Case Study: An Integrated Approach for Steering, Visualization, and Analysis of Atmospheric Simulations", Visualization '95, Oct. 1995.

  13. Weiming Gu, "On-line Monitoring and Steering of Parallel Programs", Ph.D. thesis, August 1995.

  14. Weiming Gu, Greg Eisenhauer, Karsten Schwan, and Jeffrey Vetter, "Falcon: On-line Monitoring and Steering of Large-Scale Parallel Programs", GIT-CC-94-21, journal submission, Dec. 1994, revised Aug. 1995.

  15. Karsten Schwan, Fred Alyea, William Ribarsky, and Mary Trauner, "Integrating Programming Steering, Visualization, and Analysis in Parallel Spectral Models of Atmospheric Transport", NASA Science Information Systems Newsletter, Issue 36, July 1995.

  16. Weiming Gu, Jeffrey Vetter, and Karsten Schwan, "An Annotated Bibliography of Interactive Program Steering", ACM SIGPLAN Notices, Sept. 1994.

  17. Greg Eisenhauer, Weiming Gu, Karsten Schwan, and Niru Mallavarupu, "Falcon -- Toward Interactive Parallel Programs: The On-line Steering of a Molecular Dynamics Application", High Performance Distributed Computing (HPDC-3), San Francisco, CA, Aug. 1994.

  18. Weiming Gu, Greg Eisenhauer, Eileen Kraemer, Karsten Schwan, John Stasko, and Jeffrey Vetter, "Falcon: On-line Monitoring and Steering of Large-Scale Parallel Programs", Frontiers 95, Feb. 1995.

  19. Greg Eisenhauer, Weiming Gu, Dilma Silva, Karsten Schwan, Jeffrey Vetter, Eileen Kraemer, "Opportunities and Tools for Highly Interactive Parallel Computing", Proceedings of the Workshop on Debugging and Performance Tuning for Parallel Computing Systems: Toward a Unified Environment, Los Alamos National Laboratories, IEEE PRESS, May 1996.

  20. Jeffrey Vetter, "Interactive Steering of Large-Scale Parallel Applications", Ph.D. thesis (in progress).

  21. Jeffrey Vetter and Karsten Schwan, "Progress: A Toolkit for Interactive Program Steering", 24th International Conference on Parallel Processing, IEEE, Aug. 1995.

  22. Vernard Martin, "Performance of Heterogeneous Parallel Applications", Ph.D. thesis (in progress).


