Helping Firefighters with Virtual Reality Technology

November 2, 2003

Each year more than 3,900 people die in fires, and property loss due to fire totals more than $9.6 billion, according to the National Fire Protection Association. Tragically, firefighters too often lose their lives in the line of duty: on average, about 102 firefighters die each year, roughly a 7 percent increase since 1990 (U.S. Fire Administration, FEMA). Consequently, firefighters need the best possible training to react to these emergencies effectively.

In an effort to achieve that goal, the Atlanta Fire Department approached Georgia Tech about developing a fire command training simulator to better prepare their officers to react in emergencies. Collaborating with the Atlanta Fire Department, Georgia Tech researchers are refining a training application using virtual environment technology—immersive computer-generated experiences—to better train fire commanders directing teams of firefighters.

“The key here is the safety of the firefighters,” says Captain W.G. May, special projects coordinator, Atlanta Fire Department. “By reducing the dangers involved in training, we can greatly lower the chance of a firefighter injury.”

This application simulates the progress of a fire in a single-family home and responds to the orders given by the fire commander on the scene. The virtual environment allows the user to navigate around the fire scene and view a house on fire from any angle; to direct firefighters and watch them execute commands; and to see realistic fire and smoke behavior reacting to changes in the environment, such as the opening of windows.


“The world that firefighters work in is incredibly complex. Every fire and every situation is different, so a virtual environment, which can be changed fairly easily, is a good fit for this type of training,” says Dr. Chris D. Shaw, senior research scientist in Georgia Tech’s College of Computing and faculty member of the Graphics, Visualization and Usability (GVU) Center, who leads the project.

The Firefighter Command Training Virtual Environment is designed as a training tool to be used by the fire company officer, who usually commands a four- to eight-person company of firefighters who respond to fire emergencies. The officer usually has a number of years of experience as a firefighter and has trained to be an officer in the classroom and by practicing command procedure at the fire department’s training ground.

However, these training methods are limited. First, not all fire companies encounter every type of emergency equally often; some companies may see many more fires than others. The overall number of fires has declined over the years -- only about 3 percent of calls to the Atlanta Fire Department are for fires. Second, practice training always takes place at the training ground in exactly the same fireproof building, so realism and the element of surprise are limited. The virtual environment, on the other hand, can provide a variety of scenarios in a more realistic way and with less risk and expense than training with real fires.

“When I came to Georgia Tech for graduate school, I was interested in working in computer graphics and with virtual reality, so this project was a good fit,” said Tazama St. Julien, a third-year computer science Ph.D. student. “The visit to the actual fire training ground and seeing fires up close and personal was pretty interesting and fun.”

In the prototype application, Shaw and his team of students created a virtual environment with a furnished one-story house with a garage, a fire truck, firefighters, tools, and a fire hydrant. The user, the fire company officer, sees the house on fire on a computer screen or a head-mounted display and gives verbal commands as he would at a real fire. A system operator then types the officer’s commands into the computer as coded text. The project team chose this approach over a voice recognition system because such systems are unreliable across multiple users. Having an operator input the commands also lets the user concentrate on evaluating the situation and making decisions, and it allows the operator to set up mistakes or traps for the user, again creating a more realistic experience. The officer then sees animated firefighters reacting to his commands, such as laying hoses or climbing onto the roof to cut a hole over the fire. Every 15 seconds, the visuals of the smoke and fire change in reaction to the officer’s commands.
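The operator's role described above amounts to translating spoken orders into a fixed set of machine-readable codes. A minimal sketch of such a command-dispatch table might look like the following; the command codes and action descriptions here are hypothetical illustrations, not the project's actual command set.

```python
# Hypothetical sketch of the operator's command-entry step: a typed code
# is mapped to a scripted firefighter action in the simulation.

COMMANDS = {
    "LAY_HOSE": "Firefighter lays a hose line from the hydrant",
    "VENT_ROOF": "Firefighter climbs to the roof and cuts a vent hole",
    "OPEN_WINDOW": "Firefighter opens a window, changing airflow to the fire",
}

def dispatch(code: str) -> str:
    """Translate a typed command code into a scripted firefighter action."""
    action = COMMANDS.get(code.strip().upper())
    if action is None:
        # Unrecognized input is reported back to the operator rather than
        # silently ignored.
        return f"Unknown command: {code!r}"
    return action

print(dispatch("vent_roof"))
```

A table like this also makes it easy for the operator to deliberately mis-enter or delay a command, producing the "traps" the article mentions.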

“Due to the number of factors involved, the project has proven technically challenging. The Atlanta Fire Department told us that accuracy is important. If the fire in our virtual environment doesn’t respond like a real fire would to a door opening, for example, then it’s not very useful as a training tool. So we’ve concentrated on accuracy in the amount of smoke and fire produced, for example, which is a huge amount of data to calculate,” said Shaw.

Significant improvements have been made since the original prototype was created. Originally, the animated firefighters moved like robots; now the application includes motion scripts to make the firefighters’ movements more realistic.

“It has been amazing to see this project develop. The early stages were simply cylinders representing firefighters that hopped through a house with little candle flames sprouting from the floor. Now, we have firefighters that can walk, climb ladders, ventilate a roof, spray water, etc. The fire is very realistic, not only in the way it looks but in its behavior as well. For example, if the house has a limited oxygen supply, the fire will smolder and burn slower,” says May.

Due to the complexity of calculating the amount of smoke and fire produced, the team turned to the National Institute of Standards and Technology (NIST), which studies why firefighters die and compiles extensive data on this problem. The Georgia Tech team is using NIST’s Fire Dynamics Simulator to compute realistic physical fire and smoke behavior. Because accurately computing the volume of fire and smoke takes so long, the team pre-computed the data for the entire house at one-second increments, and the system uses the pre-computed data to visualize and animate the fire and smoke in the virtual environment. On a current PC, the Fire Dynamics Simulator takes about eight hours to compute one minute of data, making it impossible to calculate the smoke and fire in real time.
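The pre-computation strategy above separates simulation from display: the expensive physics runs offline, and at training time the system only indexes into stored one-second snapshots. A minimal sketch of that playback step, with made-up frame data standing in for the simulator's volumetric output, could look like this:

```python
# Sketch of real-time playback over pre-computed fire/smoke snapshots
# stored at one-second increments. The frame values are hypothetical
# stand-ins for the volumetric data a fire simulator would produce offline.

def frame_for_time(frames, t_seconds: float):
    """Select the pre-computed snapshot for elapsed time t.

    Because computing smoke and fire takes hours per simulated minute,
    the simulation is run offline; the viewer simply looks up the frame
    for the current second, clamping to the last available snapshot.
    """
    index = min(int(t_seconds), len(frames) - 1)
    return frames[index]

# One hypothetical snapshot per second: average smoke density in a room.
frames = [0.00, 0.05, 0.12, 0.20, 0.31]
print(frame_for_time(frames, 2.7))  # snapshot for second 2
```

In practice each "frame" would be a full 3-D grid of smoke and temperature values rather than a single number, but the lookup pattern is the same.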

“The majority of my work has been on the volume renderer,” said St. Julien. “I worked with the Fire Dynamics Simulator (FDS) to simulate the fire and smoke data, then render or draw a visualization of that data in the virtual environment. This took learning the input and output file format for FDS, learning how to use FDS, and learning how to efficiently render the data. My other main contributions are path finding, the hose animation, control of the firemen, and the fire simulation.”
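The volume rendering St. Julien describes typically works by marching along each viewing ray and compositing the density samples it passes through. The following is an illustrative sketch of that core compositing step, not the project's actual renderer; the density values are made up.

```python
# Sketch of front-to-back compositing along one ray through a smoke
# volume -- the core operation of a volume renderer. Densities here are
# hypothetical samples, not FDS output.

def composite_ray(densities, opacity_per_unit=0.5):
    """Accumulate opacity along one ray through a smoke volume."""
    transmittance = 1.0  # fraction of light still passing through
    accumulated = 0.0    # total opacity seen so far
    for d in densities:
        alpha = min(1.0, d * opacity_per_unit)
        accumulated += transmittance * alpha
        transmittance *= (1.0 - alpha)
        if transmittance < 1e-3:
            # Early ray termination: once almost no light gets through,
            # further samples cannot change the result visibly.
            break
    return accumulated

print(round(composite_ray([0.2, 0.8, 1.0]), 3))
```

A real renderer repeats this per pixel and also accumulates color, but the front-to-back loop with early termination is the standard efficiency trick for dense smoke.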

Other challenges for this project include the need to develop compression techniques to make the huge data files manageable. The number of possible choices and conditions -- such as opening doors or spraying water -- grows combinatorially, resulting in an exponential increase in data. The team also had to create realistic-looking 3-D visuals of fire and smoke that accurately convey to the officer the amount of soot, heat, and smoke. At the scene of a real fire, officers look for these factors to determine the cause and type of fire to guide their decisions.
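The article does not say which compression techniques the team developed, but volumetric smoke data is a natural fit for simple schemes such as run-length encoding, since most cells in a room are smoke-free for much of a simulation. As one illustrative possibility (not the project's actual method):

```python
# Run-length encoding sketch for a flat list of quantized smoke-density
# cells. Long runs of identical (often zero) values collapse to
# (value, count) pairs.

def rle_encode(cells):
    """Run-length encode a flat list of quantized density values."""
    encoded = []
    for value in cells:
        if encoded and encoded[-1][0] == value:
            encoded[-1][1] += 1  # extend the current run
        else:
            encoded.append([value, 1])  # start a new run
    return encoded

def rle_decode(encoded):
    """Invert rle_encode back to the flat cell list."""
    return [value for value, count in encoded for _ in range(count)]

grid = [0, 0, 0, 0, 3, 3, 0, 0, 7]
packed = rle_encode(grid)
assert rle_decode(packed) == grid
print(packed)  # [[0, 4], [3, 2], [0, 2], [7, 1]]
```

For a one-second snapshot dominated by empty cells, this kind of scheme can shrink the file dramatically while remaining trivially fast to decode during playback.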

“The firefighter project had a compelling blend of the technical challenges I’m interested in: graphics and artificial intelligence, as well as concrete, real-world applicability,” said Dan Cunning, a senior in computer science. “I am currently working on creating a more realistic-looking fire simulation, exploring the possibilities of using different textures and transparency for different parts of the fire, and possibly using fragment and vertex shaders, a fairly cutting-edge technology. Most undergrads have no clue how easy it is to start working with one of the research groups on campus.”

The Firefighter Command project has provided hands-on experience to a number of computer science seniors and graduate students. Typically, a student is assigned to work on a specific component of this complex project. The team continues to refine the technical aspects of the application including developing a more complex path selection of the various choices a commander might make.

Housed in the new Technology Square Research Building, the Graphics, Visualization and Usability (GVU) Center, an interdisciplinary research center at Georgia Tech, fosters collaborations in computing and information technology research among Georgia Tech faculty and students. With more than 40 faculty and 150 affiliated students from the disciplines of Computing; Psychology; Architecture; Literature, Communication and Culture; and Electrical and Computer Engineering, GVU has gained international recognition in the research areas of graphics, animation, virtual reality, human-computer interaction, ubiquitous computing, augmented reality, wearable computing, 3-D compression, robotics, perception, collaborative web spaces and online communities.