At the start of the Flawed Situational Awareness program I share a story about my early years as a company officer and subsequently as a command-level officer. Even in those days (more than 30 years ago), I held a deep desire to learn from failure and catastrophe. I read every near-miss and casualty report I could get my hands on, trying to glean every possible lesson so I could keep those working with me, and for me, as safe as possible. And while I was reading a lot, I was learning very little, and the reason comes down to mindset.
It is so easy to judge
In the early days, my efforts to understand involved reading casualty reports (or watching videos) and judging those involved as having done things that were senseless, unexplainable, dumb and in some cases, I even thought their actions bordered on criminal. I had become skilled at passing judgment. I didn’t, at the time, realize how wrong this was. I was judging people I didn’t even know who were working under conditions that I truly did not understand (or appreciate). The more I judged, the better I got at it.
It felt so good
As I reflect back on why I was so judgmental, I now realize why I did it. It made me feel good. It made me feel more competent. You see, when I read a report (or watched a video) and judged those operating at the incident scene as "inexperienced," it was quickly followed by my internal voice saying "But I'm not." Or I would judge them as "incompetent," again quickly followed by my internal voice affirming "But I'm not." Or "They should have never done that," followed by "I know I never would." It felt so good to compare myself to those who failed and to give myself a congratulatory pat on the back to celebrate my greatness. What an arrogant ass I was in those days.
When I started learning more about human error, human factors, mindfulness, situational awareness, resiliency, decision making and biases, my entire perspective changed. I was actually ashamed of myself, and I vowed I would no longer pass harsh judgment on others based on well-documented outcomes — outcomes that, viewed with a judgmental mind, lead to the easy conclusion that someone did something wrong and should have known better.
Learning by asking questions
Some years ago I learned a teaching method whereby an instructor would teach by asking questions. This form of adult facilitated learning has become my favorite instructional methodology. And it is nothing short of amazing what can be learned when you ask the right questions.
In the book, Situational Awareness for Emergency Response (PennWell, 2013), I outline seven questions that help to form understanding. It is no accident that these questions are offered right in the front of the book, because the book addresses issues of barriers to situational awareness (included among them is human error). It would be very easy for a person to start judging the individuals who have been impacted by the barriers. When the judging begins, however, the learning ends.
Flawed situational awareness
Flawed situational awareness is NEVER an acceptable explanation for bad outcomes. Rather, it is simply a label for the symptom (and arguably not a very good label if it compels someone to use it as an explanation for bad outcomes). The explanation for why someone did something, or didn’t do something, can be very complex to understand, especially by those who were not there and who only have the after-action report as their sole perspective of what REALLY happened.
For example, I was recently in a discussion with someone who was sharing his indignation toward the commander of an incident that resulted in a responder casualty. He said the commander should have known better because the information he needed to make the "right decision" was shared by personnel on the scene over the radio. He knew this because the radio transmission log was an appendix to the report.
This prompted me to ask a few questions: “Do you know that the commander actually heard every one of those radio transmissions?” Simply because the transmission was captured on a recording device doesn’t mean the transmission was heard. His response: “He should have heard it!” (It usually doesn’t take long in these discussions for indignation and judgment to surface.)
Then I asked: “Let’s assume the commander did hear the radio transmission. In fact, let’s even go as far as to assume the commander acknowledged the radio transmissions. Does this mean the commander understood the meaning of every one of those messages?” Again, the indignant response: “He should have and if he didn’t understand it, then he should have asked for clarification.”
Then I asked: “Is it possible that when the commander acknowledged the radio transmission he thought, at that moment, he did understand it and it wasn’t until after the fact that he realized there had been miscommunication?” And the indignant response: “Yes, I suppose ANYTHING is possible.” (Indicating his frustration with me because I appeared to be defending a person he clearly had contempt for).
And my final question: “Have you ever been in a conversation with someone where you thought you knew the meaning of what they were saying, only to find out, after the fact, that you indeed did not understand?” And the final indignant response: “Yes, but no one died when it happened to me.”
It is so easy to see flawed situational awareness after the fact. But what good does that do? It doesn't bring back the victim(s). The challenge for all of us is to understand — deeply understand — what was happening in the moment that may have caused the meltdown in awareness.
Seek first to understand
In 2004 I had a fundamental shift in my mindset and, deploying one of the habits of highly effective people that Stephen Covey espoused, I began to seek first to understand. This opened my mind to explanations for human error that I had never before considered (remember, I was too busy judging).
Since then, I have studied over 100 barriers to situational awareness. The barriers fall into a variety of categories and are labeled differently by those who study this stuff. Barriers can be traced to, and sometimes explained by, understanding the complexities that can result when we combine humans, systems, policies, strategies, goals, politics, equipment and environments.
I would encourage all seeking to learn from bad outcomes to avoid getting caught up on nomenclature (call it situational awareness, human error, sense making, resiliency — or call it unicorns if you so desire). It doesn't matter what you call it. In fact, labels explain nothing. The true understanding of error is achieved only when the big labels are deconstructed and the component parts evaluated. There are more than 250 articles here on SAMatters, all dedicated to the mission of deconstructing and explaining the component parts of situational awareness (or whatever label you want to put on it).
The important thing isn't the label; it's that we stop judging and seek to deeply understand why bad outcomes occur, and then learn from them. It is truly amazing what we are capable of learning from others when we come down off our high perch of judgment.
If you are interested in taking your understanding of situational awareness and high-risk decision making to a higher level, check out the Situational Awareness Matters Online Academy.
CLICK HERE for details, enrollment options and pricing.
Share your comments on this article in the “Leave a Reply” box below. If you want to send me incident pictures, videos or have an idea you’d like me to research and write about, contact me. I really enjoy getting feedback and supportive messages from fellow first responders. It gives me the energy to work harder for you.
Facebook Fan Page: www.facebook.com/SAMatters
LinkedIn: Rich Gasaway
iTunes: SAMatters Radio