Distributed Laboratories

HPC end users in distributed computational laboratories interact via visual displays to solve problems collaboratively by manipulating local, remote, and shared computational tools. In such settings, the resources available to users may differ substantially: from high-end visualization environments like CAVEs to low-end environments more suitable for homes, like browser-based visualizations. Similarly, our experience indicates that scientists prefer to interact with and steer applications from multiple interfaces, from 2D and 3D plots of grid data to graphs summarizing observed behavior.
The Distributed Laboratories project addresses interactivity in a computationally diverse environment consisting of complex, multi-model scientific applications, information brokers, and clients such as visualization tools. It does so through lightweight online steering and monitoring mechanisms, as well as decision mechanisms for controlling and optimizing data flow. Our steering and monitoring tools include a stream-based steering framework, Falcon, and a newer object-based steering framework, MOSS. We have applied our work to a parallel atmospheric global transport model that we recently coupled with a parallel chemical model. The more complex chemical model enhances the research value of the transport model by allowing the simulation of reactive species like ozone. The figure above shows an isosurface used to inject changes to species concentrations back into the application.
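The steering pattern described above, in which a client injects changes to species concentrations back into a running simulation between timesteps, can be sketched as follows. This is a minimal generic illustration, not the actual Falcon or MOSS API; the class and field names (`SteerCommand`, `TransportModel`, the 1-D grid) are hypothetical stand-ins.

```python
import queue

# Hypothetical steering command; names and payload are illustrative,
# not the real Falcon/MOSS interfaces.
class SteerCommand:
    def __init__(self, species, region, concentration):
        self.species = species
        self.region = region          # (lo, hi) index range on the grid
        self.concentration = concentration

class TransportModel:
    """Toy stand-in for the transport/chemistry code: a 1-D grid of
    concentrations per species, steered asynchronously by clients."""
    def __init__(self, size):
        self.grid = {"ozone": [0.0] * size}
        self.commands = queue.Queue()  # filled by remote clients

    def apply_steering(self):
        # Between timesteps, drain pending commands and inject the
        # requested concentration changes into the simulation state.
        while not self.commands.empty():
            cmd = self.commands.get()
            lo, hi = cmd.region
            for i in range(lo, hi):
                self.grid[cmd.species][i] = cmd.concentration

    def step(self):
        self.apply_steering()
        # ... advance the transport/chemistry computation here ...

model = TransportModel(size=10)
model.commands.put(SteerCommand("ozone", (2, 5), 0.8))
model.step()
print(model.grid["ozone"])  # cells 2..4 now hold 0.8
```

In a real deployment the command queue would be fed by monitoring and steering clients over the network, for example from an isosurface selection in a visualization tool.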
The multiple views required by scientists, and the potential resource limitations of their environments, drive research into decision mechanisms for controlling data streams: active interfaces, time-based steering, and database cost functions. Active interfaces are visual interfaces that improve interactions with computational tools by exporting selected state to information brokers or agents. For example, an active interface can relay its resource limitations to an information broker, which can use that information to tailor the data stream. Time-based steering works with active interfaces to tailor the data stream to a client's needs: user-specified temporal queries are transformed into operations that brokers apply to the data stream. Database cost functions, which estimate the efficiency of specific operations at the nodes of a query plan, are used in the sophisticated search strategies of query optimizers to determine the best query plan. A set of components can likewise be viewed as a query plan, analogous to the plan a database management system generates as the computational equivalent of a user's query. Cost functions can then be used to determine an optimized ordering of the components. These algorithms are complemented by mechanisms that carry out dynamic component reordering.
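The cost-function idea above can be made concrete with a small sketch: treat each stream-processing component as a filter with a per-item cost and a selectivity (the fraction of data it passes downstream), and search for the ordering that minimizes expected work. The component names, costs, and selectivities here are invented for illustration, and brute-force permutation search stands in for the more sophisticated strategies a real query optimizer would use.

```python
from itertools import permutations

# Hypothetical component descriptions: (name, per-item cost, selectivity).
# Selectivity is the fraction of the stream a component passes downstream.
components = [
    ("decimate",   1.0, 0.10),  # cheap, drops most data
    ("isosurface", 8.0, 0.50),
    ("annotate",   3.0, 0.90),
]

def plan_cost(order):
    """Expected per-item cost of running filters in this order:
    each stage only sees data that survived the stages before it."""
    surviving, total = 1.0, 0.0
    for _name, cost, selectivity in order:
        total += surviving * cost
        surviving *= selectivity
    return total

best = min(permutations(components), key=plan_cost)
print([name for name, _, _ in best])  # cheap, selective filters run first
```

Running the cheap, highly selective "decimate" stage first minimizes the data volume the expensive stages must handle, which is exactly the intuition behind cost-based component reordering.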
The Distributed Laboratories Project, under the direction of Karsten Schwan, has received substantial funding from agencies including NSF, NASA, and DARPA, and from companies including Intel and Microsoft. The project originated in 1996 as a broad research effort involving several GT faculty, funded by a National Science Foundation Research Infrastructure grant (see NSF-funded DL).
Beth Plale, Greg Eisenhauer, Karsten Schwan, Jeremy Heiner, Vernard Martin, and Jeffrey Vetter, "From Interactive Applications to Distributed Laboratories," IEEE Concurrency, vol. 6, no. 2, 1998.
Last modified: Wed Jan 23 12:44:58 EST 2002