The Future of Artificial Intelligence in Firefighting

By Kirk McKinzie

On September 19, 2018, the Cosumnes Fire Department responded to a series of fully involved structure fires in Elk Grove, California. The crews were met with smoke pushing out around the doors and signs that the fire was ventilation limited. Command assigned crews to open the roof, creating a flow path that quickly changed the tenability of the compartment. The hose crew stood by at the door awaiting further instructions. This was anything but a typical bread-and-butter structure fire and anything but a typical training evolution. The “structures” were flashover containers filled with ordinary and modern combustibles and outfitted with a complex array of thermocouple sensors, advanced thermal and visual imaging devices, and even heat-resistant 360° cameras. Unlike typical training evolutions, it was not the fire crews who were the students; it was AUDREY, in full learning mode.

Test burn 1

(1) John Merrill, Director of First Responders and Detection at DHS S&T, and the multi-national team of first responders near ignition in test burn #1.

The Assistant for Understanding Data through Reasoning, Extraction and sYnthesis, or AUDREY for short, is a software application being developed by the National Aeronautics and Space Administration (NASA) Jet Propulsion Laboratory (JPL) with funds from the Department of Homeland Security (DHS) Science and Technology Directorate (S&T). It performs data fusion and provides tailored situational awareness to first responders. This series of “test burns” was intended to provide mountains of data, sensor readings, and imagery to begin training AUDREY on fire growth, flow paths, and flashover. Leveraging cutting-edge technology from the newest field of artificial intelligence (AI), artificial general intelligence (AGI), AUDREY is being taught fire behavior and the risks firefighters face so it can help firefighters, incident managers, and dispatchers keep personnel safe. Each scenario was aimed at teaching AUDREY how fast a fire develops; how the contents of the fire affect the heat and the fire’s growth; and how much time responders have to do their job before conditions are no longer tenable.

Calibration fire

(2) Engineers began with a propane-fueled sensor array “calibration fire.” The vertical iron “tree” is equipped with thermocouples and IR sensors, while 360° VR cameras sit on the floor and on tripods (two of the four cameras are protected with air-cooled glass covers).

Cosumnes Fire Department played host to a team of leading scientists, technologists, and federal officials to push the boundaries of modern technology and fire service research. “AUDREY takes video data and sensor data and fuses it together to determine what the fire is going to do next. The data we are capturing will be fed into these models, machine learning will take place, and we will get some understanding of what’s happening and [how to] develop prediction technologies [so] we can look into the future,” explained James Mullins, Ph.D., of FLAIM Systems, who also has many years of service as a firefighter with the Country Fire Authority (CFA) in Victoria, Australia.
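The fusion-and-prediction idea Dr. Mullins describes can be sketched in miniature. This is purely an illustration, not AUDREY’s actual pipeline: it averages co-located temperature channels into one fused reading per time step, then naively extrapolates the trend forward. The sensor values and horizon are hypothetical.

```python
# Toy "sensor fusion + prediction" sketch in the spirit of Dr. Mullins's
# description. AUDREY's real models are far more sophisticated; the channel
# counts, readings, and look-ahead here are invented for illustration.

def fuse_and_extrapolate(readings, horizon=2):
    """Average co-located sensor channels at each time step, then linearly
    extrapolate the fused temperature `horizon` steps into the future."""
    fused = [sum(step) / len(step) for step in readings]
    if len(fused) < 2:
        return fused[-1] if fused else None
    slope = fused[-1] - fused[-2]       # simple one-step trend
    return fused[-1] + slope * horizon  # naive look-ahead

# Three thermocouple channels sampled at four time steps (degrees C):
history = [(120, 118, 122), (180, 177, 183), (260, 255, 265), (360, 356, 364)]
print(fuse_and_extrapolate(history))  # projects the accelerating climb: 560.0
```

Even this crude linear look-ahead captures the point of the test burns: given enough fused sensor history, a model can warn crews about where conditions are heading rather than only where they are.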

The goals of the project are undoubtedly ambitious. “When first responders are attending an emergency, AUDREY can be an extra set of eyes and ears with a perfect memory and the ability to see through smoke, fire, and heat to augment their capabilities and their natural senses. The more AUDREY learns, the better equipped she is to be a guardian angel for first responders on the site,” said Charles Boyle, Co-founder and Managing Director of Signal Garden, a technology company that has made a name for itself helping blind and visually impaired people navigate the world around them—a scenario not far removed from firefighters navigating smoke-filled environments.


Nursery burn

(3) The “nursery” burn provided graphic visual content, with heat flux exceeding 50 kW/m² (more than 15,000 BTU/hr per square foot), about twice the energy needed for rapid full-compartment involvement.
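The caption’s two figures are the same flux in different units, which a quick conversion confirms (the conversion factors are standard; the caption’s BTU figure is per hour per square foot):

```python
# Unit check for the caption: 50 kW/m^2 expressed in BTU/hr per square foot.
BTU_PER_HR_PER_KW = 3412.14  # 1 kW = 3412.14 BTU/hr
SQFT_PER_SQM = 10.7639       # 1 m^2 = 10.7639 ft^2

def kw_per_m2_to_btu_hr_ft2(flux_kw_m2):
    return flux_kw_m2 * BTU_PER_HR_PER_KW / SQFT_PER_SQM

print(round(kw_per_m2_to_btu_hr_ft2(50)))  # ~15850, consistent with "15,000+"
```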

Test burn 1

(4) A screen capture of the NASA IR sensor output, with temperatures in Celsius.

The sensors and instruments in the test burns were collecting information pertinent to more than just predicting fire behavior. Each of the participants, such as NASA JPL, FLAIM Systems, W.S. Darley and Co., Signal Garden, Reax Engineering, and Qwake Technologies, will use the collected data and imagery (temperature readings, thermal imagery, heat flux, and 360° video) to educate AUDREY, which in time will provide firefighters with useful and actionable information. For instance, NASA JPL explored the use of low-cost infrared (IR) sensors to determine whether body-worn or device-mounted sensors could provide accurate and useful temperature readings of the surrounding environment. Reax Engineering set up extensive thermocouple trees to measure heat flux at various levels within the compartment. FLAIM Systems captured high-resolution 360° video to incorporate into its advanced virtual reality environments for training firefighters.

AUDREY runs on some of the most powerful computers in the world. But as computer technology advances and AUDREY becomes more optimized, AUDREY is expected to come closer and closer to the edge where people interact with technology, according to Boyle. “First of all, it is going to be perhaps on the fire engines itself. Then it will be on the first responder equipment. Perhaps for people it is in their house and all around them. In the future, AUDREY can learn from internet of things that is developing nowadays.”

Inside the flashover cans

(5) 360° cameras, IR cameras, and other instruments were secured to iron piping at various positions in the “flash cans.” Insulation protected (for a period of time) sensitive and costly instruments, including sapphire-windowed radiometers.

Qwake Technologies, a San Francisco-based technology company, participated in the test burn to test out its C-Thru product. C-Thru is a heads-up display that combines thermal imagery with augmented reality to improve situational awareness and sensory perception. “By taking information from AUDREY, we can display it to a firefighter in [a] way that reduces their stress level and cognitive load and optimizes their performance,” said Sam Cossman, co-founder of the company.

Testing of AR system

(6) Captain Kirk McKinzie tested Qwake Technologies’ C-THRU system in an IDLH environment during the DHS S&T/NASA test burn at the Cosumnes Fire Department. C-THRU is an augmented reality (AR) vision system designed to improve vision and cognitive performance in complex, zero-visibility conditions.

The wider potential of AUDREY was not lost on the firefighters in attendance. “AUDREY was designed by NASA to help their launch program and to hear multiple conversations at a time. The human ear and the brain don’t necessarily have the ability to comprehend those fine pieces of communication that are happening when lots of stuff is going on. AUDREY is not affected by those influences. She has the ability to reduce noise, separate conversations [and] key terms like Mayday, emergency…and [is] able to send that directly to the incident commander and dispatcher,” according to Battalion Chief Rick Clarke of the Cosumnes Fire Department.
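The keyword-routing behavior Chief Clarke describes can be sketched very simply once radio traffic has been transcribed. This is a hypothetical illustration, not AUDREY’s audio pipeline: the channel names, messages, and term list are invented, and real systems work on noisy speech rather than clean text.

```python
# Hypothetical sketch of flagging safety-critical terms in transcribed radio
# traffic for the incident commander. AUDREY's real audio processing (noise
# reduction, speaker separation) is far more capable than this text scan.
KEY_TERMS = {"mayday", "emergency", "trapped", "evacuate"}

def flag_traffic(transcripts):
    """Return the (channel, message) pairs containing any key term."""
    alerts = []
    for channel, message in transcripts:
        words = {w.strip(".,!").lower() for w in message.split()}
        if words & KEY_TERMS:  # any overlap with the key-term set
            alerts.append((channel, message))
    return alerts

traffic = [("TAC-2", "Engine 72 making entry, side alpha"),
           ("TAC-2", "Mayday, mayday, firefighter down division 2"),
           ("TAC-3", "Water supply established")]
print(flag_traffic(traffic))  # only the mayday transmission is flagged
```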

The final fire was in a purpose-built, 10 x 10 wood-framed room. The room was fitted with painted drywall, a window, working electrical fixtures, carpet, a smoke alarm, and all the furniture and clothing typically found in a nursery. Seeing how quickly that fire developed, the firefighters and public educators in attendance noted that the data would also prove useful in community outreach campaigns. As Chief Clarke noted, “Public service announcements are a great way to reach out to the public and give them a first-person view of how to reduce fire risk at home by using smoke detectors, by closing their doors at night and being able to contain smoke, and keep survivable space available just by using the features of their home. [With] the information we have been given with the 360° video along with the scientific data, thermal imaging cameras inside and helmet cams, we will be able to build a video program that can encapsulate all of that in a safety message.”

It isn’t just the general public that stands to benefit. Chief Clarke is also hoping to use 360° imagery collected during the test burn to train firefighters. “The modern fuel package is not something we can put firefighters in and teach them how to fight fire in. It is too dangerous. Our gear is not built for that. With that being the case, our firefighters are missing key elements to help them do their job. We can take this information and give them a first-person view of how fire will grow, how quickly they need to react.”

Much of the technical development will occur back in more traditional laboratory settings. But the data collected is indispensable to train AUDREY accurately. “As humans, if we see a ball coming towards us, we can move our hand in a certain way to catch it. To teach a machine [to] do that is tricky because it doesn’t know what a ball is or know how to move. There is this new science of being able to give a whole different set of data to a robot, i.e., what a ball looks like and how fast it is moving, so it can try to catch it. If it fails, it will learn from that experience and do something different. Over time, it will become perfect. We learn from behavior and improve outcome each time,” noted Dr. Mullins.
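The trial-and-error learning Dr. Mullins describes can be reduced to a toy loop: a “robot” adjusts a single parameter after each miss until it succeeds. This is purely illustrative, with invented numbers; it is not AUDREY’s learning algorithm.

```python
# Toy learn-from-failure loop in the spirit of Dr. Mullins's ball-catching
# example. Each miss produces an error signal that nudges the next attempt.
def learn_to_catch(ball_position, attempts=20, step=0.5):
    hand = 0.0
    for trial in range(1, attempts + 1):
        error = ball_position - hand
        if abs(error) < 0.1:      # close enough: caught it
            return trial, hand
        hand += step * error      # learn from the miss, try again
    return None, hand             # never converged within the budget

trials, final = learn_to_catch(ball_position=3.0)
print(trials, round(final, 2))  # converges on the ball after a few misses
```

Each failed attempt shrinks the error, which is the essence of the point: the machine does not need to be told where the ball is, only how wrong its last try was.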

Involvement of various groups

(7) Many hands remain involved, from line personnel to start-up technology firms, engineers, academia, and the federal government.

For the firefighters in attendance, it was an opportunity to glimpse the future of cutting-edge technology and to occasionally share a bit of real-world context to ensure the technologists’ world of “what if” or “what could be” isn’t too detached from the world of “what is.” For Chief Clarke, a firefighter with nearly 30 years’ experience, hosting this test burn in his training facility was a significant step toward getting this cutting-edge technology into the hands of firefighters.

Tactical camera

(8) The Bounce Imaging throwable tactical camera streamed 360° video to the hardened Sonim phone, enabling Battalion Chief Clarke (burn boss) to see the interior in real time in VR.

“As we get into bigger technologies such as better maps, better tracking of firefighters, communications tools, AI apps, better mics, virtual reality that seem so far out of touch…it will take a group of individuals similar to those who showed up here and the kind of collaboration we experienced. We need more time and a little bit of vision to get us to that point, because while it may work on a computer screen or while it may work in labs or at a test burn, when you put it on a fire engine with a firefighter who has to rely on that technology to make a decision in 30 seconds, and that 30 seconds will make the difference in saving a life, their technology has to work. A theory is theory until you prove it.”

S&T’s Next Generation First Responder Apex Program has been working over the last four years to prove that technology can help our first responders be better protected, connected and fully aware. S&T has been collaborating with state and local agencies, international partners, national labs, universities and technology firms for testing and evaluation of new research and development within the public safety space.

For more information or questions about AUDREY, please contact NGFR@HQ.DHS.GOV.

Kirk McKinzie is a captain with the Cosumnes (CA) Fire Department and a SMART technologist working with teams around the world pushing the boundary of the possible. Cosumnes Fire Department is home to 911GO, a SMART digital solution project. For more, e-mail

