Software that controls vital human functions should always be open source, else it could prove to be a danger to one's existence, the executive director of the GNOME Foundation says.

Karen Sandler, who was in Melbourne on her way back to the US from the Australian national Linux conference in Ballarat, makes the observation from personal experience: she suffers from hypertrophic cardiomyopathy, a condition in which the heart is much larger than normal, creating a very real risk of sudden death.

When Sandler was made aware of her condition five years ago, her first reaction to being told that she needed a pacemaker was "what is it running?" Her investigation of such medical devices led her, along with three others, to write a 2010 paper titled "Killed by code: software transparency in implantable medical devices."

She finally agreed to have an older model of pacemaker, one without wireless capability, implanted after her doctor told her that the risk she was exposing herself to was huge.

The paper says: "The software on these devices performs life-sustaining functions such as cardiac pacing and defibrillation, drug delivery, and insulin administration. It is also responsible for monitoring, recording and storing private patient information, communicating wirelessly with other computers, and responding to changes in doctors' orders."

Sandler says there is a danger of people hacking into insulin pumps and putting lives at risk. If the code is open, bugs will be found much sooner, and once found, they can be fixed quickly.

At a security conference last year, a hacker who is himself diabetic demonstrated how one could break into an insulin pump, showing exactly how poor the device's security was.

Sandler points out in the paper that the US Food and Drug Administration issued 23 recalls of defective devices in 2010, all of them classified as Class I, meaning there is a "reasonable probability that use of these products will cause serious adverse health consequences or death."

She also points out that the use of software to control functions in vehicles is increasing, another area where closed code is dangerous: it would be relatively simple for someone to take control of a car and cause the driver's death.

Such systems are being compromised with commonly available commercial tools, which makes it all the more scary, Sandler points out. The US Department of Defense itself has said that "continuous and broad peer-review, enabled by publicly available source code, improves software reliability and security through the identification and elimination of defects that might otherwise go unrecognized (sic) by the core development team."