By Dane Carley and Craig Nelson
Our last discussion, as illustrated below, centered on HRO principle #2: a reluctance to simplify. We discussed firefighters’ need for routine in their tasks (e.g., pulling a preconnect) but noted that routine in decision making leads to complacency. Therefore, seeking out differing points of view is an important part of maintaining a reluctance to simplify. With the nuggets of knowledge we learned from principle #2, we move on this month to HRO principle #3: a sensitivity to operations. If you’re picturing firefighters sitting in a circle holding hands and singing or engaging in a morning hug at changeover, you can stop worrying. For those of you who subconsciously stopped breathing during that last part, you’ll be happy to know that we are referring to operational sensitivity, which is much different.
A short example reveals a subtle but important difference between principle #2 and #3. In HRO principle #2, a firefighter starting a positive-pressure ventilation (PPV) fan follows the proper procedure for starting a fan and setting up the most effective operation. In HRO principle #3, that same firefighter also considers how his or her actions will affect others operating on the fireground when that fan is turned into the vent point (e.g., endangering a crew or victim if used inappropriately).
Where We Are: HRO Principle #3, a Sensitivity to Operations
A sensitivity to operations boils down to those making decisions listening to those operating on the street and those on the street providing only pertinent and accurate information to those making the decisions. Personnel operating on the street can provide a more accurate picture of what is actually happening as long as they are well trained and understand when it is important to pass information up to the decision makers. Decision makers can be anyone: firefighters, captains, chiefs, incident commanders, paramedics in charge, and so on. Regardless of who is making the decisions, they must seek out, listen for, and trust the information from those around them. As stated above, though, the information for the decision makers must come from well-trained personnel if the information is to be accurate.
For example, have you ever seen firefighters who don’t always understand what is going on at the top of the department? Often, in this case, a lack of information leads to rumors, misinformation, and false assumptions, all of which affect morale. Meanwhile, the top of the department may not always know what is going on in the street because each member of the administration has a role that keeps the focus on specific tasks. The administration makes decisions based on the information available to it. Just as a firefighter in an outlying station may be missing pieces of information about an administrative decision, the administration may be missing operational awareness because of its separation from the street.
One well-documented tendency fueling this commonality across the fire service is that we seek out information supporting our point of view while tending to ignore conflicting information. This is called “cognitive dissonance.” Once a person has made a decision about what he thinks is happening, he gathers information to support his decision while ignoring information that conflicts with his point of view. An example of cognitive dissonance provided by Gordon Graham (2011) is the ongoing debate about Fords and Chevrolets. A person who decides to purchase and drive a Ford immediately notices more positives about Fords while noticing the negatives of Chevrolets, even though one is not necessarily better than the other.
Let’s look at a fictitious example of writing specs for a new apparatus. In this case, administrators prefer an apparatus similar to what the department already has because it reduces maintenance and purchasing costs and satisfies certain ISO and strategic considerations. The decision about the “administrative stuff” has already been made, so consideration of alternatives is already limited. Yet these requirements mean tall, less functional hosebeds; less interior cab space; and an increased turning radius (the “firefighting stuff”). Do you see the cognitive dissonance coming from both sides? Being sensitive to operations means the administration listens to the operational staff asking for modifications that simplify tasks, make the job safer, or generally improve company effectiveness. Likewise, the operational staff takes the time to understand the strategic considerations. This way, both parties find common ground to achieve all the goals as closely as possible.
The same processes carry over to the emergency scene. Let’s return to the PPV example at the beginning of this column. A decision maker orders a firefighter to turn the PPV fan into the building and begin ventilation. However, the firefighter, being sensitive to operations (awareness, training, understanding), recognizes that doing so endangers a crew searching a room near the vent point. The firefighter then communicates the important details of not being able to vent as ordered. The decision maker (being sensitive to operations), on hearing this, recognizes that the new information is important to successful operations, terminates the request, and develops an alternative solution.
Weick and Sutcliffe (2007) developed an audit (p. 97) to provide some measure of a department’s sensitivity to operations. The audit contains nine questions revolving around how familiar department members are with other aspects of the department’s operations, how much discretion employees have to solve problems, and how accessible resources are on a moment’s notice. Additionally, Weick and Sutcliffe (2007, pp. 60-61) identify three threats to a sensitivity to operations, which HROs counter in the following ways:
- HROs treat experience and education equally because both are equally important to success.
- HROs do not work mindlessly or treat tasks as routine (if you’re not sure what this means, think of your C-shifters).
- HROs do not treat the avoidance of an incident or an injury from a close call as proof of safe operations but rather as evidence of a potential failure in the future (if you’re looking for an example of this, watch your B-shifters. Eventually, they’ll blow something up but call it safe if no one was hurt).
Case Study
The following case study is from www.firefighternearmiss.com. The near miss report, 11-0000094, is not edited. We were not involved in this incident and do not know the department involved, so we make certain assumptions based on our fire service experience to relate the incident to the discussion above.
Event Description
Note: Brackets [] denote reviewer de-identification.
Engine [1], [2], [3], Rescue [1], Tower [1], Battalion [1] and Medic [1] were dispatched to a two-story residence for light smoke showing from an upstairs window. We arrived first on Engine [1] to find light smoke showing from a barely open window in a two-story four-plex of lightweight construction. Civilian accountability was attempted, and all units were believed to be empty. Upon arrival of other crews with RIT lines ready, we made entry to find zero-visibility conditions and low heat. Using the thermal imager, we made our way through the first floor to the back door, which we opened for natural ventilation. The camera showed the kitchen area to be warm but not hot. In my haste, I rushed the team through the downstairs area, completely missing a door. We made out no victims or fire, so we proceeded to the stairs behind us and went to the second floor. Upstairs was also near-zero visibility conditions with low heat. As we vented windows upstairs and eventually tried hydraulic ventilation, the door that I had missed downstairs started to burn through. The RIT team consisted of a lieutenant and one fire rescue officer who were positioned at the front door. They witnessed this progression and attempted to contact us by radio, which we failed to hear. The hydraulic ventilation obviously did not help conditions downstairs and caused most of the kitchen area to flash, pulling much of the fire upstairs. I was attempting to come down the stairs when this suddenly occurred, and I was engulfed in a rollover that took up half the stairwell. I stumbled up the stairs to my team, who had no idea of what was happening just beneath us. The RIT team immediately took action, sweeping the ceiling and pushing the fire back. The conditions improved very fast, and my crew and I were able to retreat to the first floor due to the swift actions of the RIT. I hope by sharing this it doesn’t happen to anyone else. It was a close call for sure, and I can’t thank the RIT team enough for keeping cool and taking care of business.
Lessons Learned
Discussion Questions
1. Based on pieces of information in the case study about not hearing radio traffic and the recommendation to check portables, it is likely the reporter was on the wrong radio channel. How can the routine of morning checks lead to mindless actions contributing to near misses at an incident later in the day?
2. The rapid intervention team (RIT) acted on behalf of the initial attack group without direct orders from the attack group to do so. Discuss how this relates to the concept of comparing what is actually happening to what is supposed to be happening.
3. Thermal imaging cameras (TICs) are becoming common fire service tools. Many of us have experienced using them, yet there is less education on how they work. Does this contribute to common misunderstandings or misbeliefs about what we can and cannot see in some cases?
4. How does the previous idea relate to the concept of treating experience and education equally?
Possible Discussion Answers
1. It is easy to go through normal morning checks (or any other routine) mindlessly and miss a radio that was on the wrong channel from a previous fire, being bumped, training, and so on. The ramifications of such a simple mistake are potential life safety threats on the fireground.
2. The RIT recognized that what was supposed to be happening was not what was actually happening (sensitivity to operations). It acted properly based on this recognition, which saved the attack group.
3. Being well trained with a TIC (or any equipment) is a basic part of a sensitivity to operations: you must understand your equipment well enough to recognize when it is not working properly and to realize that what you see through the TIC may not be what is actually happening in the environment.
4. Education (classroom training) provides the background, whereas training provides the experience. In this example, education provides the understanding of how a thermal imaging camera works, which helps those using it to better understand the tool. Education also helped the RIT understand what should be happening and recognize that something different was actually happening. Because team members were obviously well trained and had good situational awareness, they acted decisively to intervene. In this case, experience probably also played an equally important role in the decision making.
Where We Are Going
The next installment of this column discusses the fourth HRO principle, a commitment to resilience. We would appreciate any feedback, thoughts, or complaints. Please contact us at tailboardtalk@yahoo.com.
References
Graham, G. (2011, August 8). YouTube. Retrieved August 8, 2011, from GordyGraham.com: http://youtu.be/HjU-vtPS1nw
Weick, K. E., & Sutcliffe, K. M. (2007). Managing the Unexpected: Resilient Performance in an Age of Uncertainty (2nd ed.). San Francisco, CA: Jossey-Bass.
Craig Nelson (left) works for the Fargo (ND) Fire Department and works part-time at Minnesota State Community and Technical College – Moorhead as a fire instructor. He also works seasonally for the Minnesota Department of Natural Resources as a wildland firefighter in Northwest Minnesota. Previously, he was an airline pilot. He has a bachelor’s degree in business administration and a master’s degree in executive fire service leadership.
Dane Carley (right) entered the fire service in 1989 in southern California and is currently a captain for the Fargo (ND) Fire Department. Since then, he has worked in structural, wildland-urban interface, and wildland firefighting in capacities ranging from fire explorer to career captain. He has a bachelor’s degree in fire and safety engineering technology and a master’s degree in public safety executive leadership. Dane also serves as both an operations section chief and a planning section chief for North Dakota’s Type III Incident Management Assistance Team, which provides support to local jurisdictions overwhelmed by the magnitude of an incident.
Previous Articles
- Tailboard Talk: K.I.S.S. or Not? A Reluctance to Simplify
- Tailboard Talk: Mistakes Even Happen to Firefighters: A Preoccupation with Failure
- Tailboard Talk: Why Do We Play the Blame Game? Let’s Turn It on Its Head!
- Tailboard Talk: Near Miss Report
- Tailboard Talk: HROs Use Human Factors to Improve Learning Within the Organization
- Tailboard Talk: Introduction to Higher Reliability Organizations