I am just back from attending a fire conference where the vendors of information management software have coined a new buzzword to sell their wares – “situational intelligence.” A couple of years ago I had a conversation with a software vendor who claimed his software “created” situational awareness. I found this claim both annoying and amusing.
Following a demonstration of his situational awareness creating software, I wrote an article about his claim and how misleading it was. Software does not create situational awareness any more than a book creates intelligence. Software may enhance situational awareness, but software does not create situational awareness.
Now, apparently, the vendors want us to believe they have developed software that will create intelligence. Let’s explore the capabilities and limitations of software to create awareness or, as they are claiming now, intelligence.
What is situational awareness?
Situational awareness is an individual’s ability to perceive and understand what is happening in their environment, in the context of passing time, and then to make accurate predictions of future events in time to avoid bad outcomes.
What is intelligence?
According to Merriam-Webster, intelligence is defined as the ability to learn or understand things or to deal with new or difficult situations. The alternative definition is the secret information that a government collects (but I don’t think that alternate definition applies here).
Applied to software claims, situational intelligence would imply the software ensures the user’s ability to learn or understand. Hopefully, you can see that software is incapable of ensuring a user learns or understands anything. It can provide the data, but the learning, much like the development of situational awareness, happens in the mind of the individual.
What can software and databases do?
I am not a software developer or engineer, nor do I claim to be. So I asked a friend to explain it to me and then did my best to interpret his ramblings. A competent computer person reading my explanations could likely provide better ones. If that is you, feel free to comment in the box at the end of the article and I will make adjustments accordingly.
Software and databases are simply tools to gather, store, sort and retrieve data. The value of the data to the end user – in this case first responders, emergency responders and anyone working in or around a high-risk environment or workplace – depends largely on the quality of the input. The phrase I recall from my computer programming classes in college was: “Garbage in, garbage out.”
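To make the point concrete, here is a minimal sketch of what such software actually does: gather, store, sort and retrieve. All names and records below are invented for illustration; the takeaway is that the software cannot judge accuracy, so the quality of every result is bounded by the quality of the input.

```python
# A minimal, hypothetical sketch of "gather, store, sort, retrieve."
# Note that the software cannot judge the accuracy of what it is given:
# garbage in, garbage out.

records = []  # the "database": nothing more than stored input

def gather(record):
    """Store whatever is entered -- the software cannot validate truth."""
    records.append(record)

def retrieve(sort_key):
    """Sort and return stored data; bad input comes back out unchanged."""
    return sorted(records, key=sort_key)

gather({"unit": "Engine 1", "status": "on scene"})
gather({"unit": "Egnine 7", "status": "enroute"})  # a typo goes in...

for record in retrieve(lambda r: r["unit"]):
    print(record)  # ...and the typo comes back out, neatly sorted
```

The sorting works perfectly; the misspelled unit name is returned just as faithfully as the correct one.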
The interface is the system used to operate the computer and establishes how the end user works with it. The operator uses the interface to navigate through the software. If the interface is intuitive, it may be easy to navigate. However, as many end users know, what is intuitive to a programmer is not always intuitive to the end user. The “intuitive” design of a software interface can prove especially challenging when the end user is under stress.
One of the greatest risks to any incident manager is information overload. Face it, during an emergency there can be dozens, if not hundreds, of pieces of information coming at you all at the same time. Making sense of all this information, and separating the important from the unimportant, can be extremely difficult. Oftentimes it is impossible.
When the brain gets overloaded with input it can start a filtering process that is outside of conscious control. In other words, under stress, what the brain chooses to pay attention to, and remember (at the subconscious level), may not necessarily be in agreement with what you want it to remember (at the conscious level).
The brain can also shut down inputs. Under stress, with changing conditions and massive amounts of information to process, it is plausible a person may not hear or see inputs at all. All of this happens as a mechanism designed to narrow our focus on threats and aid our survival.
What concerns me most is that some software developers believe (and are marketing) that more information is better for the development of situational awareness, err, situational intelligence, sorry. The software demonstrations I have observed show that we can access ten, twenty, even thirty layers of data. It is not for me to judge what data is important. I suppose under varying conditions each layer of the data might serve a valued purpose.
What decision makers under stress really need
Let’s start with what decision makers don’t need. They don’t need to be overloaded with volumes of information. Under stress, a volume of information is not your friend; it’s your enemy. What is really needed is a small quantity of the most important information, prepared and presented in a succinct, easy-to-understand format.
The average person (and that’s most of us) can take in, process, comprehend and remember about seven pieces of unrelated information. After that, the brain starts forgetting and filtering. Keep that in mind when you are being pitched software designed to create intelligence!
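One practical implication for interface design follows directly from that limit: rather than surfacing every available data layer, a display could rank incoming items and show only a handful of the most important at once. A hedged sketch of that idea, with the item names and priority scores invented purely for illustration:

```python
# Hypothetical sketch: cap what a stressed decision maker sees at the
# handful of highest-priority items, instead of every available data layer.

WORKING_MEMORY_LIMIT = 7  # roughly what an average person can track

def top_items(items, limit=WORKING_MEMORY_LIMIT):
    """Return only the highest-priority items, most urgent first."""
    return sorted(items, key=lambda i: i["priority"], reverse=True)[:limit]

incoming = [
    {"msg": "Crew 2 low air", "priority": 10},
    {"msg": "Hydrant pressure normal", "priority": 2},
    {"msg": "Wind shift reported", "priority": 8},
    {"msg": "Apparatus fuel at 60%", "priority": 1},
]

for item in top_items(incoming, limit=3):
    print(item["msg"])  # only the three most urgent are displayed
```

The hard part, of course, is assigning the priorities, and that judgment still lives with people, not the software.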
Heads down on the display
With the concern about information overload noted, one of my larger concerns about software being used by decision makers (under stress and dynamically changing conditions) is that the user might go “heads down” on their display. If this were to happen, the decision maker could become fixated on the task of navigating through screens trying to locate data (often with a fair amount of trial and error because of limited familiarity with the software).
Once the data is located, the interface may require a fair amount of reading (or just scanning the screen) to locate the desired morsel of information. If that information is changing (e.g., tracking air usage of personnel operating in an IDLH environment) it could require constant monitoring of the changing data on the computer screen, potentially contributing even more to a heads-down condition.
The challenge of multitasking
Because there is limited capacity for the brain to process and comprehend, it can be very challenging to hear (and understand) the meaning of radio traffic while reading a display screen. Conversely, it can also be very difficult to read and comprehend the meaning of information on a display screen while attentively listening to a radio. You can test this by having a conversation with someone while simultaneously reading the newspaper or your favorite blog.
For those who attend my live programs, you know it is a hallmark of my presentations to use exercises to demonstrate the challenges we face with developing and maintaining situational awareness. I recently created a new exercise that demonstrates this limitation. I have been pleased with the results as it shows everyone in attendance the challenges we face with concurrent visual and audible processing of information. The attendees see for themselves, firsthand, their own limitations in processing and comprehension.
As you know, commercial airliners have two pilots. What you may not know is that one pilot is designated as the PF (Pilot Flying) and the other is designated as the PNF (Pilot Not Flying). The PF flies the plane. The PNF supports the PF by interpreting the gauges and managing radio traffic. If something goes wrong, the PF flies the plane while the PNF gathers and processes information to solve the problem. This task sharing ensures the PF does not become fixated on computer screens or overloaded with information.
Sharing the load
Arguably there is a need for much of the data gathered, housed and made accessible in incident management software. But it is best retrieved and interpreted by someone other than the decision maker. The person who uses the software should be intimately familiar with how to navigate the screens and how to quickly cherry-pick the morsels of important data from the program.
Finally, no software creates intelligence. That would be the equivalent of saying if you attend a professor’s lecture, you will become more intelligent. That COULD happen… but only if you pay attention to the lecture and you understand what the professor is trying to convey to you. Speaking only from my own experiences in college, that was not always the case. In fact, following some lectures (e.g., a poorly delivered dump of information) it felt as though I actually lost intelligence by sitting in the room.
For a long time, the word “Intel” has been used synonymously with information. Software can provide “Intel” that can aid in the development of awareness, but no software can ensure the end user will be able to UNDERSTAND the meaning of the “Intel” in the context of changing conditions. That requires expert training, expert experience and an environment free of situational awareness barriers, including information overload and frustrating, complicated, difficult-to-navigate displays and databases.
Dr. Gasaway’s Advice
Keep in mind that software vendors have a financial motive for selling you solutions. The better they are at convincing you of the need for their solutions, the more profitable their company will be. Be wary. I am not suggesting that all software is bad for situational awareness. Nor am I suggesting that all software vendors are driven by greed. I know some whose passion for safety and their desire to be helpful is unrivaled.
Before purchasing, secure a demo copy of the software and use it, in the field, during real emergencies, to see how easy it is to navigate. Is the interface truly intuitive? Is the most important data easily accessible? Is it presented succinctly to avoid information overload? Can it be used under conditions of stress, consequence and time compression? These are all valuable questions to ask before making the purchase.
1. Before purchasing software, assess what problems you have with information access or management that the software may be able to resolve.
2. If you have incident management software now, ask the users what they like and don’t like about it. Make sure the new software resolves the bigger issues of your existing software.
3. Discuss a time (on or off an emergency scene) where you found yourself heads-down on a display and lost awareness of what was happening around you. (I see this all the time with people texting on their phones while walking through busy airports. And… yes, I’ve done it too.)
4. Discuss your concerns for using software interfaces and develop a strategy to ensure the software serves as an asset, not a liability.
5. Finally, see software for what it is… a tool that can help enhance situational awareness and aid in the development of end-user intelligence. But also see software as a tool that can adversely impact both situational awareness and end-user intelligence.
6. Become a student of situational awareness and invest some time to learn more about what it is, how you develop it, how you can lose it, and how you can restore it once it is lost.
If you are interested in taking your understanding of situational awareness and high-risk decision making to a higher level, check out the Situational Awareness Matters Online Academy.
CLICK HERE for details, enrollment options and pricing.
Share your comments on this article in the “Leave a Reply” box below. If you want to send me incident pictures, videos or have an idea you’d like me to research and write about, contact me. I really enjoy getting feedback and supportive messages from fellow first responders. It gives me the energy to work harder for you.
SAMatters Online Academy
Facebook Fan Page: www.facebook.com/SAMatters
LinkedIn: Rich Gasaway
iTunes: SAMatters Radio
iHeart Radio: SAMatters Radio
1 thought on “Situational Intelligence – The New Buzz Word”
I really enjoyed your article and agree completely that there are too many software vendors throwing too many features/functions into their situational awareness apps instead of following a proper User Experience (UX) design approach. I made that mistake in the past when designing crisis management software because we were looking at the long ‘wish lists’ of functionality contained in RFPs from agencies that often didn’t really know what they wanted (other than ‘everything’).
So with my latest venture I have spent over eight months working with police, fire and paramedics, on the front lines and back in the command centre and dispatch, during field exercises and live operations to fully understand the critical needs of each user role before even starting to write a line of software code. Our design guideline is: if it can’t be used and “understood” with just a one-minute briefing, then it won’t be used effectively, or at all. And the only way to really prove that it’s working is through a variety of field tests, not in labs or simulated environments.
To address the ‘heads-up/hands-free’ requirement, we continue to experiment with wearable technology to test usability and form factor. Some devices have been readily accepted by the first responders we’ve worked with, but others have not cut the mustard (unacceptable battery life, inappropriate form factor and lack of ruggedization are the common limitations). We are happy to share the learnings with you and your readers if you’d like to get in touch with me: firstname.lastname@example.org.
CommandWear Systems Inc.