It would be difficult to have missed the news alert from January 13th that residents of Hawaii had received an emergency alert that a ballistic missile, presumably launched by North Korea, was heading for their island state. Thirty-eight minutes later, enough time to create a national anxiety attack, residents received a clarifying alert stating that the first alert had been a false alarm.

As Hawaiian residents emerged from their hiding places, the question of how such a mistake could have been made took over the front page. Conspiracy theories that an actual ICBM had been neutralized began to appear on Twitter, while others wondered how quickly the responsible operator would be fired. As with other public blunders, I began to suspect that poor design lay at the heart of the problem.

There are two major UX issues to address: how this mistake was made in the first place, and why it took so long to correct. In addressing the first, it’s natural (and easy) to blame the human operator, but research shows that human error is almost always the result of poor design. Don Norman’s The Design of Everyday Things addresses exactly this:

“…In my experience, human error usually is a result of poor design: it should be called system error. Humans err continually; it is an intrinsic part of our nature. System design should take this into account. Pinning the blame on the person may be a comfortable way to proceed, but why was the system ever designed so that a single act by a single person could cause calamity? Worse, blaming the person without fixing the root, underlying cause does not fix the problem: the same error is likely to be repeated by someone else.”

In product design, the Japanese term poka-yoke (literally “mistake-proofing”) refers to the practice of making a system or product foolproof. Any design decision that reduces the likelihood of inadvertent behavior is considered part of poka-yoke, which is meant to be applied throughout the design process rather than as a single step. In software design, this practice of anticipating and preventing human error is referred to simply as “defensive design.”
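In code, one way to practice poka-yoke is to make the wrong action unrepresentable in the first place. The sketch below is purely hypothetical (the class and function names are mine, not any real alert system’s API): drills and live alerts are distinct types, so a drill routine cannot be handed a live alert by accident.

```python
# Hypothetical poka-yoke sketch: separate types for drill and live alerts
# so the two can never be confused at a single selection point.

class DrillAlert:
    """A test message; safe to broadcast during an exercise."""
    def __init__(self, message: str):
        self.message = message

class LiveAlert:
    """A real emergency message; must go through its own channel."""
    def __init__(self, message: str):
        self.message = message

def run_drill(alert: DrillAlert) -> str:
    """Accepts only a DrillAlert; anything else is rejected outright."""
    if not isinstance(alert, DrillAlert):
        raise TypeError("run_drill accepts DrillAlert only")
    return f"[DRILL] {alert.message}"
```

A statically typed language would catch the mix-up at compile time; in Python the runtime check plays the same role.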

Designers use a variety of principles to guide our thinking around defensive design. Redundancy, failure transparency, and graceful degradation are examples of principles borrowed from safety engineering that are commonly practiced in interaction design. The most prevalent example is what’s known as a forcing function. From the Glossary of Human Computer Interaction:

A forcing function is an aspect of a design that prevents the user from taking an action without consciously considering information relevant to that action. It forces conscious attention upon something (“bringing to consciousness”) and thus deliberately disrupts the efficient or automatized performance of a task.

The familiar “are you sure you want to take this action?” dialog that appears when deleting files or performing other sensitive operations is one common example, but forcing functions can be seen all over the place. Consider gear shift patterns or speed bumps, design decisions meant to prevent user error. To keep users from accidentally leaving their card behind, most ATMs won’t dispense cash until you’ve removed your debit card. The famous “break glass in case of emergency” protocol is a forcing function that deters the misuse of emergency equipment. In even more sensitive operations, forcing functions like the button shield pictured below prevent both accidental activation and misuse.

A button shield is locked closed in an example of a forcing function
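In software, the same idea often takes the form of a “type it back to confirm” prompt for destructive operations (GitHub uses a version of this for repository deletion). A minimal sketch, with hypothetical names:

```python
def confirm_destructive_action(action_name: str, typed_response: str) -> bool:
    """A software forcing function: proceed only if the user re-typed the
    exact action name, which forces conscious attention on the action
    instead of a reflexive click on "OK"."""
    return typed_response.strip() == action_name

# The operation runs only when the names match exactly.
confirm_destructive_action("delete-archive", "delete-archive")  # True
confirm_destructive_action("delete-archive", "ok")              # False
```

Unlike a plain OK/Cancel dialog, which users learn to dismiss automatically, this disrupts the automatized response and makes the operator restate their intent.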

One of my favorite examples of poka-yoke is the three-pin plug. The longest pin — called the “earth pin” — grounds the circuit. Because this pin is longer than either of the current-carrying pins, it is the first to connect and the last to disconnect, so the device is grounded the entire time current can flow and a fault cannot route through the user’s body instead. Dead-simple design solutions like this are often overlooked, but without this elegant constraint many users would suffer electric shock.

Because of its length, the “earth pin” is the first to connect and last to disconnect, preventing electric shock

Designs like these must account for both safety and ease of use. One of the great challenges product designers face is the trade-off between behavior-shaping constraints like forcing functions and the general ease of use and personalization of a system. In general, however, designers aim to make it easy to do the right thing and difficult to do the wrong thing. While this seems obvious and straightforward, the Hawaiian false-missile calamity is only one of many examples of design failing to prevent an otherwise avoidable catastrophe.

Sure enough, images of the Emergency Alert System interface show no information hierarchy and no first-level forcing functions. How much can we really blame an operator for clicking the wrong string of blue characters in a list of blue characters?