It’s not often that topics like software usability find their way into the mainstream news, so it’s worth taking a look at what happened to see what we can learn. As I’m sure you know by now, on January 13 of this year, at 8:07 AM, the people of Hawaii received a disturbing this-is-not-a-drill warning of incoming missiles. The fact that it turned out to be a false alarm probably didn’t do much to undo the bad start to that day.
Widespread media coverage at the time put the blame on a software application that led a poor unsuspecting operator into clicking “not a drill” by mistake. Recent coverage contradicts that, quoting the operator as saying it was a garbled recorded phone message that led him to believe that the attack was, indeed, real and not an exercise.
This may be true, or it may be this poor soul’s way of rationalizing a mistake that caused so much upset. Either way, the coverage of the incident revealed enough detail about the application’s design to convince us that the problem could have been usability-related. Which makes this a good opportunity to once again review the things we all should think about when designing any software application, and especially applications that do Very Important Things.
(What we won’t cover here, although it bears considering, is the ill-considered reliance on a recorded message to deliver such drastic news, leaving no opportunity for an operator to say “Wait, what?!? Can you repeat that?”)
Overview of Hawaii’s Emergency Management Alert System
Two images were released to the media showing the user interface (UI) of Hawaii’s Emergency Management Alert System (HI-EMA). Initially, the governor’s office released a photo of a rather basic UI that quickly drew sharp criticism.
HI-EMA stated the original image was not accurate and released a photo of a similar system instead; a photo of the actual system could not be shared for security reasons.
The image showed a drop-down menu that included options for both “DRILL – PACOM (CDW) – STATE ONLY” and “PACOM (CDW) – STATE ONLY.” The employee selected the option that was not a drill.
As we mentioned earlier, it was originally believed this option was chosen in error. But, it turns out the employee was under the impression that the alert was real. The drill was run during the change of shift and involved a phone recording with conflicting information. The recording included the proper language stating the drill was an exercise, but it also contained words describing a real missile alert. The recording did not adhere to the standard script for a drill. Most of the officers understood this was indeed a drill, but the employee responsible for issuing the alert believed it was a real alert.
HI-EMA confirmed that for the alert to be sent out, the employee also had to click “yes” to confirm the option chosen. No picture of this confirmation screen was shared, but the information provided leads us to believe the confirmation was a simple “Are you sure you want to send this Alert?” message. Since the employee believed the alert was real, he clicked “yes” on the confirmation screen.
HI-EMA immediately recognized this was a false alarm, but the system did not contain an option to send out a “false alarm” message, causing long delays in notifying Hawaii’s citizens. This option has since been added.
It took 38 minutes before the state government issued a correction for the false alarm. The delay was attributed to the government waiting for permission from FEMA to issue the correction, although it turned out permission was not needed.
Hawaii Governor David Ige also took 17 minutes to tweet a “false alarm” message to citizens. The reason for his delay? He simply forgot his Twitter password. Ige has since saved his login information on his phone so he can be more responsive on social media going forward.
Preliminary steps have been taken to prevent this from happening again:
- All future drills are suspended until a full evaluation of the false alarm is completed.
- A two-person activation and verification process is now being enforced for both tests and actual notifications.
- A cancellation command can now be sent out within seconds of an error.
- The employee responsible for sending out the alert has been fired, as it turns out this was not the first time he confused a drill with a real alert.
- The head of HI-EMA has resigned.
These steps address some of the issues, but there are larger usability and design flaws that require attention.
Usability flaws and how they can be fixed
Let’s take a look at flaws with the UI design and how they should be improved to prevent a similar incident from happening again.
- The UI itself is outdated and the use of a drop-down menu is inappropriate. It makes it too easy to select the wrong item by accident.
- The options for “drill” and “real” are not logically organized in the drop-down menu.
- The only difference between the drill and the real alarm is the word “DRILL” in all caps, which is not enough of a differentiator.
- A nondescript confirmation message is used, which contributes to alert fatigue: the ever-growing volume of notifications and prompts has made users numb to alerts in general.
A better approach
- Given the purpose of the UI, options for “drill” and “real” should be placed in totally separate areas. Different fields or different sub-menus would be more appropriate.
- The “drill” option should be placed higher up in the list, above the option for the real alert, as the “drill” option is used more frequently.
- If a drop-down menu is used, the default selection should either be empty or set to the “drill” option to make it harder to choose the real alert.
- The “drill” and “real” options should be renamed to make a clearer distinction between them.
- The confirmation message presented to the user to re-confirm their selection must be specific to the item selected.
- For a real alarm, the confirmation message should force the user to actively specify the emergency a second time.
- The font color (preferably red) and text size (large and bold to grab attention) of the confirmation message should reinforce the severity and realness of the alarm they are about to transmit.
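The confirmation points above can be sketched as a type-to-confirm pattern, similar to the one some services use before destructive actions. This is a hypothetical illustration only; the alert label and function names are assumptions, not the real HI-EMA system:

```python
# Hypothetical sketch of a type-to-confirm dialog for a live alert.
# The label and function names are illustrative, not from the real system.

LIVE_ALERT_LABEL = "PACOM (CDW) - STATE ONLY"

def build_confirmation_prompt(is_drill: bool) -> str:
    """The prompt names the specific action selected, so a drill and a
    real alert cannot both be confirmed by the same reflexive 'yes'."""
    if is_drill:
        return "This is an EXERCISE. Type DRILL to send the test alert."
    return ("WARNING: You are about to send a REAL missile alert to all of "
            f"Hawaii. Type the full alert name to continue: {LIVE_ALERT_LABEL}")

def confirm_live_alert(typed_text: str) -> bool:
    """A real alert goes out only if the operator re-types the exact
    alert label -- a deliberate extra step, not a single click."""
    return typed_text.strip() == LIVE_ALERT_LABEL
```

Forcing the operator to actively re-enter the alert name makes a distracted or confused confirmation far less likely than clicking a generic “yes” button.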
Usage must be considered
This is a UI that will be used in times of stress. Real-world usage and repercussions of an error were not factored into the UI design. The design must account for how users will interact with it under duress.
The current design makes it too simple to inadvertently select and confirm a real alarm. The UI must force a user to make extra clicks to send out a real alarm and present the user with a clear and vivid description of the alarm they are about to send.
The option to immediately correct an error and transmit a “false alarm” message should have been built into a system of this magnitude from day one. Its omission shows a failure of imagination on the part of the design team: the false alarm scenario simply wasn’t considered, so no false alarm message was included.
Given the severity of a real alarm, the design team should have considered the importance of an extra layer of security. If we look at an extreme example of launching nuclear missiles, two keys are required, and for good reason.
The Hawaii false alarm scenario is less extreme, but designers should also consider the possibility that a bad actor may intentionally try to send a false alert. This is a rare, but possible, case, making heightened security elements a judgment call for the designers.
Should approval from a second individual, whether a designated peer or a manager, be required to send out a real alarm? After this incident, HI-EMA chose to enforce two-person activation and verification for both drills and real alarms, showing where it stands on the security question.
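As a minimal sketch of what such a two-person rule might look like in code (the function and operator names are hypothetical, not HI-EMA’s actual implementation):

```python
from typing import Optional

# Hypothetical two-person activation check; names are illustrative only.

def can_send_real_alert(initiator: str, approver: Optional[str]) -> bool:
    """A real alert requires sign-off from a second, distinct person.
    A lone operator -- or someone approving their own request -- is not
    enough to activate the alert."""
    return approver is not None and approver != initiator
```

The same check can be applied to drills, matching HI-EMA’s post-incident policy of two-person verification for tests as well as real notifications.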
Was the Hawaii false alarm caused by human error or usability problems? There is no doubt that human error came into play, given the employee’s confusion. But the poor design of the UI makes this kind of mistake possible – and more likely – even without confusion. And the lack of a “false alarm” message option in the original design prevented HI-EMA from responding swiftly and appropriately. A redesign of this UI should be HI-EMA’s highest priority.
This incident points to the bigger picture that UI design is more than just code. Functionality is not the only requirement. The design and testing process should consider all use cases and the ramifications not only on the end user, but those who might be affected by “flipping the switch.” In this case, all of Hawaii’s residents believed they had minutes to live.
Development teams need to step back and focus on more than just code, to prevent these types of situations from happening. Expertise is required in all applicable areas – programming, product architecture, human factors, user experience, and management.
A properly designed UI can help humans perform their jobs better and decrease the incidence and severity of errors. Usability testing must be conducted during the development process to prevent such simple – yet sometimes grave – mistakes from happening in production.
A formal UI design and testing strategy – one that includes testing the application in real-life conditions – would have identified and corrected the issues that contributed to the Hawaii false alarm.
The incident shows an extreme case of what can go wrong when usability is ignored.
You may be tempted to shrug off such an egregious example, thinking it can never happen to you. Instead, it should serve as a reminder that it can. If you’re not careful with your design, your users will make mistakes. While their mistakes probably won’t terrorize an entire state, ask yourself what will happen. Will their mistakes cost you money? Will you lose customers? You won’t know the answers to these questions unless you think about them during design.