Stanislav Petrov – Human decision making

On 26 September 1983, Stanislav Petrov took a stand against what his systems were telling him, and he may have changed the course of history. Petrov was the duty officer at the command center for the Oko nuclear early-warning system, the place where the Soviets monitored for incoming attacks, much like the US command center you remember from War Games. Earlier that month, the Soviet Union had shot down a Korean commercial jetliner over the Sea of Japan, claiming that it was on a spy mission. 269 people died in that incident, including a US Congressman. Some in the Soviet Union feared a retaliatory strike by the US. Cold War tensions were high.

At the command center, Petrov was getting data indicating that the US had launched five missiles toward the Soviet Union. But instead of just reading that dashboard and acting, he used his own inner analytics system to process the data and decided not to report or react.

Had Petrov reported incoming American missiles, his superiors might have launched an assault against the United States, precipitating a corresponding nuclear response from the United States. Petrov declared the system’s indications a false alarm. Later, it was apparent that he was right: no missiles were approaching and the computer detection system was malfunctioning. It was subsequently determined that the false alarms had been created by a rare alignment of sunlight on high-altitude clouds and the satellites’ Molniya orbits, an error later corrected by cross-referencing a geostationary satellite.[5] Petrov later indicated the influences in this decision included: that he was informed a U.S. strike would be all-out, so five missiles seemed an illogical start;[1] that the launch detection system was new and, in his view, not yet wholly trustworthy; and that ground radars failed to pick up corroborative evidence, even after minutes of delay.[6] – Wikipedia contributors. "Stanislav Petrov." Wikipedia, The Free Encyclopedia. Wikipedia, The Free Encyclopedia, 26 Sep. 2012. Web. 26 Sep. 2012.

I’ve always wondered: if the system he was using had featured a bunch of fancy dashboard elements, like shiny 3D pie charts, moving-average lines, and drill-down reports, would he still have been able to distrust the data? I’ve seen this sort of over-trust of data with data model diagrams. It seems the prettier or more advanced the presentation of the data is, the more people want to believe it is right. In fact, I’ve learned to present draft documents to my teams with hand-written notes and comments on them to "break the ice" and show people that they really are drafts. A modern solution might have included some sort of decision-making guidance that said "Confidence Factor of Attack: 99%" or something like that, highlighted by a red bar showing just how confident the system was based on the data. Bad data, it turns out.

More details about Petrov and his actions are in the video above from History.com.