Unmanned aircraft crash. In fact, they crash a lot. Though recent specific figures are scarce, the Congressional Research Service reported last year that despite improvements, "the accident rate for unmanned aircraft is still far above that of manned aircraft." And while many of those accidents can be attributed to hostile fire or terrible flight conditions, a significant percentage of drone crashes are caused by human error. A December 2004 Federal Aviation Administration (FAA) study of Defense Department drone crashes found human factors to be causal in about a third of the cases the researchers examined.

But as four human factors engineering researchers have found, sometimes the accidents are by design. That is, the design of the systems that operators use to fly the drones is so bad that it invites accidents. A recent Ergonomics in Design article reported that a small but significant number of crashes could be directly attributed to bad ergonomics in ground control station hardware. These factors may also have played a major part in crashes that were attributed to other causes.

Take, for example, one drone crash in 2006. As the operator brought the drone in for a landing, he meant to flip the landing gear button on the control joystick but accidentally hit the nearby ignition switch instead—shutting off the engine in mid-flight. The $1.5 million drone plummeted to the ground, a total loss. On another occasion, glare on a screen was so bad that a drone operator couldn't read an alert and mistook it for a landing signal—again killing the engines before the drone had landed.
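A mishap like the 2006 engine-cut is exactly the kind of error that a software interlock on destructive controls is meant to catch. As a rough illustration only (this is not drawn from any actual ground control station; the class and method names here are hypothetical), a guarded engine-cut might require a second, confirming press while the aircraft is still airborne:

```python
# Hypothetical sketch of a two-step interlock for an engine-cut control.
# Names and logic are illustrative assumptions, not any real GCS design.

class EngineInterlock:
    def __init__(self):
        self.engine_running = True
        self.pending_cut = False  # set after the first airborne press

    def request_engine_cut(self, altitude_m: float, on_ground: bool) -> str:
        """Cut the engine immediately on the ground; while airborne,
        the first press only arms the cut and asks for confirmation."""
        if on_ground:
            self.engine_running = False
            return "engine cut (on ground)"
        if not self.pending_cut:
            self.pending_cut = True
            return "confirm required: aircraft airborne at %.0f m" % altitude_m
        # Second press while armed: treat as deliberate confirmation.
        self.engine_running = False
        self.pending_cut = False
        return "engine cut (confirmed)"
```

With a guard like this, a single accidental press of the ignition switch during a landing approach would produce a confirmation prompt rather than an immediate engine shutdown.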

Unmanned aircraft have been pushed into service so quickly over the last decade that their control systems were often still in development when they arrived on battlefields in Iraq and Afghanistan. Many of those systems are based on technology very similar to the average PC, and the level of automation in drones continues to increase as operations move from flying with a joystick to flying with a mouse. Yet the Department of Defense has still not developed human factors standards for ground control station systems, even as those systems have matured. Considering how much human factors engineering goes into nearly every other aspect of weapons system procurement (and having worked as a contractor at the Army Test Lab at Aberdeen Proving Ground, I can attest that it's significant), that's a bit of a surprise.

The authors of the report were Dr. Qaisar "Raza" Waraich (an engineer at Smartronix who recently completed his PhD at George Washington University) and GWU faculty members Dr. Thomas A. Mazzuchi, Dr. Shahram Sarkani, and adjunct instructor David F. Rico (who has also done UAS design work for the US Navy). They surveyed 20 drone operators about the characteristics of their ground control station systems and found a 98 percent overlap between the input and output devices used by ground control workstations and those used by general-purpose computers. Some devices even drew from the realm of computer and console gaming.

Therefore, they concluded, drone systems could benefit greatly from the application of well-established ergonomic standards for general-purpose computing workstations—specifically, ANSI/HFES 100-2007, the Human Factors and Ergonomics Society's American National Standards Institute-approved standard for computer workstations.

"The IO category of ANSI/HFES 100-2007 specifies the ergonomic shape of auxiliary input devices that best conforms to humans, bodily constraints, and layout," Waraich and his co-authors wrote. If the DOD used the standard as the basis for acceptable drone pilot workstations, with button layouts that take hand and finger movements into account and avoid those that can cause hand fatigue, "many [drone] mishaps may be avoided."

Hopefully, the FAA will take human factors into account before it starts certifying any drones to fly in US airspace.