Future Computing Environments

The Cyberguide Project

At the College of Computing of the Georgia Institute of Technology, we are developing handheld intelligent tour guides to explore and demonstrate future computing environments. For example, at monthly GVU demo days each visitor will be given a personal mobile computing interface, or cyberguide, to help navigate around the various demo locations. The GVU Cyberguide is a proof-of-concept prototype that we intend to extend to other touring applications.

PERSONALIZATION:

Because the cyberguide is interactive rather than a playback device, it can be personalized for the individual being served. The cyberguide will personally welcome each visitor, describe potential demos, talks, and posters, and encourage the creation of a personal tour agenda. The tour can be changed at any time to reflect new discoveries or insights. Tours can be taken backwards, in a random order, or as a search for the next most exciting activity. We will use machine learning techniques to discover user preferences, and we will attempt to filter and structure the possible choices to make the interaction more effective and streamlined. One test of whether we have succeeded is whether a human participant will trust the cyberguide to the extent of saying to the machine, "Why don't you go ahead and pick what we do next?"
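As a rough sketch of the kind of preference learning we have in mind, the Python fragment below weights demo topics by a visitor's past ratings and ranks the remaining demos accordingly; the topic tags, update rule, and function names are illustrative assumptions rather than the actual Cyberguide design.

    # Illustrative sketch only: rank unseen demos by topic weights
    # learned from the visitor's ratings of demos already attended.

    def update_preferences(prefs, topics, rating, rate=0.3):
        """Nudge each topic weight toward the rating the visitor gave."""
        for topic in topics:
            old = prefs.get(topic, 0.0)
            prefs[topic] = old + rate * (rating - old)

    def rank_demos(prefs, demos):
        """Order demos by the average learned weight of their topics."""
        def score(topics):
            return sum(prefs.get(t, 0.0) for t in topics) / max(len(topics), 1)
        return sorted(demos, key=lambda d: score(d[1]), reverse=True)

    prefs = {}
    update_preferences(prefs, ["animation", "graphics"], rating=1.0)
    update_preferences(prefs, ["databases"], rating=-0.5)
    agenda = rank_demos(prefs, [("Query optimizer", ["databases"]),
                                ("GVU animation", ["animation"]),
                                ("VR walkthrough", ["graphics", "vr"])])
    print([name for name, _ in agenda])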

LOCALIZATION:

Cyberguide will constantly measure its position and orientation, allowing maps and directions to be oriented with respect to physical rather than display coordinates. Cyberguide will be able to indicate its position within a room to meter resolution, and outdoor positions at GPS resolution. Because Cyberguide will know its physical location and where it is pointing, it will be able to describe to other Cyberguides where it is and what it is doing, making many cooperative mobile applications possible. It will also allow enhanced images to be created. For example, the cyberguide may provide an "X-ray" view of what is behind a closed door or beyond a wall. Images showing what a view looked like in the past or will look like after future changes can be registered with the actual gaze direction.
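To make the idea of orienting a map to physical rather than display coordinates concrete, here is a small sketch that rotates a demo's offset from the visitor into screen coordinates so that the direction the device is pointing is always drawn as "up"; the heading convention and function name are assumptions for illustration, not the Cyberguide implementation.

    import math

    def to_display_coords(east_m, north_m, heading_deg):
        """Rotate an (east, north) offset in meters into screen (x, y)
        so that the device's heading maps to 'up' on the display.
        Heading is assumed to be degrees clockwise from north."""
        h = math.radians(heading_deg)
        x = east_m * math.cos(h) - north_m * math.sin(h)
        y = east_m * math.sin(h) + north_m * math.cos(h)
        return x, y

    # A demo 10 m due north, with the device facing east (heading 90),
    # should be drawn to the left of the screen center:
    print(to_display_coords(0.0, 10.0, 90.0))   # approximately (-10.0, 0.0)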

COMMUNICATION:

Reservations and scheduling arrangements will be negotiated across a mobile computing network between the cyberguide and scheduling agents for various events. Changes in scheduling and waiting times for unreserved events will also be updated across the net. Communication will also be possible with other tour participants seeing different events. A participant will be able to tell friends immediately, "Wow, there is this really cool animation stuff!" and the cyberguides can supply the missing details of where and when. Tour participants will also be able to send email messages to the person in charge of a particular event, so questions that come up before or after an event can still be addressed.
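As one hedged sketch of what a reservation exchange between a cyberguide and an event's scheduling agent might look like, the fragment below passes a small request message to a toy agent that either confirms a seat or reports an estimated wait; the message fields and the agent's behavior are assumptions for illustration, not a specified protocol.

    # Illustrative only: a toy reservation exchange over the mobile net.
    request = {
        "type": "reserve",
        "event": "robotics-lab-demo",
        "visitor": "badge-0042",
        "slot": "15:30",
    }

    def scheduling_agent(msg, seats_left):
        """Confirm the reservation if a seat remains, else waitlist."""
        if msg["type"] == "reserve" and seats_left > 0:
            return {"type": "confirm", "event": msg["event"], "slot": msg["slot"]}
        return {"type": "waitlist", "event": msg["event"], "estimated_wait_min": 20}

    print(scheduling_agent(request, seats_left=3))
    print(scheduling_agent(request, seats_left=0))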

COLLABORATION:

Often a demo (like a classroom lecture or lab exercise) is a passive event for most of the group. Only a few people are close to the action, and typically only one person can actually affect the outcome of the demo. Handheld collaboration technology can change this. All of the viewers of a demo can have the contents of a computer screen or live video transferred to their own screens, and each user can have her own focus of attention. Users can all make choices with their own selection devices; for some demos a voting scheme may be used, while in others each participant follows an individual path through the demo. Each participant can have the exciting and motivating experience of figuring out how to do something, and of sharing that with a neighbor or the group.
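As a rough illustration of how such a voting scheme might work, the sketch below tallies the participants' handheld selections by simple plurality; the choice names and tie-breaking rule are illustrative assumptions.

    from collections import Counter

    def tally_votes(votes, choices):
        """Pick the next demo step by simple plurality; ties go to the
        choice that was offered first."""
        counts = Counter(v for v in votes if v in choices)
        return max(choices, key=lambda c: (counts[c], -choices.index(c)))

    # Selections collected from the participants' handhelds (illustrative):
    votes = ["zoom", "rotate", "zoom", "pan", "rotate", "zoom"]
    print(tally_votes(votes, ["pan", "zoom", "rotate"]))   # "zoom"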

TOURING CYBERSPACE:

A new opportunity provided by this technology is to tour places that don't exist. Imagine touring a virtual archaeological dig. Tourists can see what is at their current layer in the dig, can move laterally by walking around, and can move vertically using selection input to the cyberguide. Imagine taking a walking tour of the internet. One could walk the major links, find an interesting site, and wander along the local network links, watching the packets whiz by. By mapping cyberspace to real space, a more intuitive and much more immediate exploration is possible.
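To illustrate how physical movement and selection input might combine when touring a place that does not exist, the sketch below maps the visitor's measured floor position to a cell in a virtual dig, with an explicitly selected layer supplying the vertical coordinate; the grid size, cell contents, and data layout are purely assumptions for illustration.

    # Illustrative only: walking moves you laterally through the dig,
    # while a selection on the cyberguide changes the excavation layer.
    CELL_M = 2.0   # assumed size of one dig cell in meters

    def dig_cell(east_m, north_m, layer):
        """Map a measured floor position and a chosen layer to a cell."""
        return (int(east_m // CELL_M), int(north_m // CELL_M), layer)

    artifacts = {(1, 2, 0): "pottery shard", (1, 2, 1): "bronze coin"}

    # Visitor standing ~3 m east, ~5 m north, looking one layer down:
    print(artifacts.get(dig_cell(3.0, 5.0, 1), "nothing here"))   # "bronze coin"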

MULTIMEDIA:

Ultimately we expect handheld multimedia, including audio communication, video playback, and image capture and analysis. Perhaps a full-size or mini CD will be needed for local storage of information on the handheld device.

FUTURE COMPUTING ENVIRONMENTS:

This is part of a larger program of research on future computing, in which embedded computing and communication facilitate a greatly enhanced and integrated human-machine interface. Users should be able to freely move between handheld devices, workstations, and large wall displays. We see personal interfaces as a way to deal with the potential complexity of such a rich environment.

