The risk of serious cyber-attacks on nuclear power plants is growing, according to a new report by think-tank Chatham House. If you follow this type of news, then this is probably not a big shocker, but did you know there have been around 50 cyberattacks on nuclear plants?

One unnamed expert quoted in the Chatham report (pdf) claimed, “What people keep saying is ‘wait until something big happens, then we’ll take it seriously’. But the problem is that we have already had a lot of very big things happen. There have probably been about 50 actual control systems cyber incidents in the nuclear industry so far, but only two or three have been made public.” The report claimed that there is limited incident disclosure and a “need to know” mindset that further limits collaboration and information-sharing.

“In a worst-case scenario, cyberattacks could lead to a release of ionizing radiation with potentially disastrous impacts on local populations.” In fact, “something as simple as employees installing a personal device onto a nuclear facility's internal network could open it up to attacks,” Caroline Baylon, a cybersecurity researcher at Chatham House, told Newsweek. She explained, “Let's say the people in the plant want to install a router so they can check their emails. That might all of a sudden open up a vulnerability.”

It is also a “pervading myth” that nuclear power plants are air-gapped – not connected to the Internet – and can’t be hacked. Yet executives controlling the purse strings at some plants are in denial; a source said they consider a cyberattack just “a movie scenario, maybe in the future. They think it is just states against states, not everybody wants to hack us, and also it won’t happen here.”

Some plant operators don’t understand their own connectivity; they are unconcerned about the risks because they are unaware the connection even exists, even though search engines like Shodan or ERIPP (Every Routable IP Project) have proven that nuclear plants are connected to the internet. Others have “undocumented or forgotten connections;” if a contractor or vendor has VPN access and an attacker learns about it, such as in countries that allow remote access to the digital reactor protection systems, then that could serve as an entry point.
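To give a sense of how little effort that discovery takes: this is a rough, hypothetical sketch of how a researcher might hunt for internet-exposed industrial control systems via Shodan. The helper just builds a search filter (port 502 is the common Modbus/TCP port); the commented-out part shows what the actual API call would look like with the `shodan` package and a real API key (`SHODAN_API_KEY` is a placeholder).

```python
# Hypothetical sketch: building a Shodan search filter for exposed ICS
# devices. Port 502 is the standard Modbus/TCP port; a country filter
# narrows results to one region.

def ics_query(port=502, country=None):
    """Return a Shodan search filter string for a common ICS port."""
    query = "port:{}".format(port)
    if country:
        query += " country:{}".format(country)
    return query

# With a valid API key, the actual lookup would look roughly like:
#   import shodan
#   api = shodan.Shodan("SHODAN_API_KEY")
#   for match in api.search(ics_query(country="US"))["matches"]:
#       print(match["ip_str"], match.get("org"))

print(ics_query(country="US"))  # port:502 country:US
```

The point is not the code itself but how low the bar is: no scanning infrastructure is needed, because Shodan has already indexed the devices.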

Some of the more interesting tidbits in the report (pdf) are quotes from anonymous insiders, expert sources from the US, UK, Canada, France, Germany, Japan, Ukraine and Russia that were collected over 18 months for the study. If you can be quoted anonymously, you might be inclined to be extremely honest. The report includes some serious backbiting, not attacks on any specific John Doe, but friction between the professions that all work on nuclear power plants.

Touch it and die…

One source called engineers the “worst offenders” when it comes to bringing in their own laptops to take data off machines. “Lots of times they have introduced viruses in the PLCs when doing tests.” It was suggested under “handling the human factor,” that “engineers should be required to turn in any personal laptops that they bring to the plant.”

There is a serious cultural divide causing friction between nuclear OT and IT personnel. A source “emphasized that OT engineers’ general dislike of IT engineers is a major part of the cybersecurity challenge: The problem is as much cultural and sociological as it is technical. One of the biggest problems we have is that – as in any industry – the operations people dislike IT.”

An IT engineer source “attempted to view the situation from the OT engineers’ perspective: I can understand why nuclear plant managers don’t like us, because they think we are painful. We come in at the end of a procedure that works [and say that all of these cybersecurity measures must be added]. We add in cybersecurity in order to protect them, but from their perspective they don’t see the benefit.”

An OT engineer said of his frustration with working with IT engineers: “I’ve never been convinced that if we ever implemented the [cyber emergency] procedure, the guy was even qualified. Certainly not qualified to the extent I was, where I had to go through schools. He might be the biggest computer wizard in the world, he had no idea how a nuclear plant worked.”

A ‘denial of service’ to an OT engineer could “mean that a 10,000 horsepower main coolant pump in the nuclear plant has shut down” and a physical security dude might interpret ‘intrusion detection system’ as a card reader at the plant gate. IT engineers might roll their eyes at either of those definitions.

Unclear procedures are also at play; very often, nuclear plant personnel don’t understand cybersecurity procedures or what to do in a cyber-related emergency, explained a source. “The procedures are confusing as hell.”

Insecure by design, default passwords, failure to patch

“The nuclear industry’s late adoption of digital systems has resulted in a lower level of cybersecurity experience than in other industries.” The report says, “The use of default vendor login details is everywhere, including in nuclear. You just put these in and you can get access to the networks.”

The report explained that industrial control systems (ICS) are insecure by design, as trying to bolt security on as an afterthought doesn’t work well. ICS is “particularly vulnerable” due to the lack of “authentication and verification.” One source stated, “The field devices accept the message immediately, without asking. The receiving device does not have to authenticate. Control systems are thus very fragile due to man-in-the-middle attacks.”
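That “accepts the message immediately, without asking” failure mode can be illustrated with a toy model. The class below is hypothetical and not any real ICS protocol; it simply captures the design flaw the source describes: the device executes any well-formed command with no authentication and no check on who sent it.

```python
# Toy model (hypothetical) of an ICS field device that is "insecure by
# design": it obeys any well-formed command, with no authentication and
# no verification of the sender -- the weakness the report describes.

class FieldDevice:
    """Simulated actuator, e.g. a pump controller."""

    def __init__(self):
        self.setpoint = 100  # e.g. pump speed, arbitrary units

    def handle_message(self, message):
        # No credentials, no signature, no origin check: whoever can
        # reach the device on the network is obeyed immediately.
        if message.get("cmd") == "write_setpoint":
            self.setpoint = message["value"]
            return "OK"
        return "UNKNOWN"

device = FieldDevice()
# A legitimate operator -- or a man-in-the-middle attacker -- sends:
device.handle_message({"cmd": "write_setpoint", "value": 0})
print(device.setpoint)  # 0: the device obeyed without asking who sent it
```

This is also why the quoted expert says skilled attackers “use features instead” of vulnerabilities: writing a setpoint is not an exploit, it is exactly what the device was built to do for anyone who asks.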

“The flexibility of code means that an attacker can change the logic, or the set of programming instructions, for a piece of equipment in order to cause it to behave differently.” Another source claimed, “You can tell the field device to do whatever you want and it will just say, ‘OK, you command, I’ll do it.’ … The most skilled attackers won’t even bother with finding vulnerabilities, they’ll use features instead.”

Yet another confirmed, “It is almost impossible to protect the system once someone gains access to it.”

To say patching is problematic is being kind; the “unique challenges” of patching mean it is “infrequently used.” Some plants run operating systems so old that Microsoft no longer supports them with patches. Other times there are concerns that if patches are deployed and something goes wrong, the entire plant could go offline. “Some operators consider the risk too high.” Since the process of patching never ends, and patches for “switches, firewalls and embedded devices” need to be tested before being deployed, the “process could at best take weeks or months, but in many cases it could take years.”

As one insider bluntly put it, “Patching is really challenging, and the reality is that very few people are actually installing any patches.” He added, “Operators are not going to be willing to shut a unit down for three days to install a patch for a vulnerability that somebody might or might not exploit.”

Safety vs cybersecurity

Many nuclear plant personnel regard cybersecurity as a “nuisance” forced upon them by evil IT since they don’t believe cybersecurity poses a real threat. One source said, “In terms of safety, it is definitely bullet-proof, but in terms of cybersecurity, not so much.” Another said, “When it comes to breaches, if it were a nuclear safety issue, it would be public for sure. But because it is a nuclear security issue, no one talks about it.”

Some vendors don’t “recognize that cybersecurity is even an issue” and don’t bother to respond to researchers who find vulnerabilities. “The owner-operators aren’t pushing the vendors for greater cyber security. This allows the owner-operators to say, ‘Well, we can’t buy it; the vendors don’t make it.’ And the vendors can say, ‘Well, there is no market demand.’ And so everyone is happy doing nothing.”

It’s all reactive, instead of proactive. There were also numerous quotes from various unnamed experts about microprocessors creating reduced redundancy.

Threat actors and cyberattack scenarios

Nuclear plants are not prepared for “a large-scale cybersecurity emergency” and such an “emergency occurring at night could be particularly dangerous.”

Threat actors posing cyber-risks to nuclear facilities were divided into four categories: “hacktivists; cyber criminals; states (governments and militaries); and non-state armed groups (terrorists).” The report added that ISIS has a “sophisticated use of Facebook and websites for recruiting purposes” and it “could potentially pose such a threat.” An expert said that radical extremism is such a serious risk that “we can consider it at least equal [to a] governmental hack attack. If an attacker really wants to penetrate or infiltrate the network, it is a question of time and money.”

A few of the scary scenarios included cyberattacks that cause a domino effect taking down power plants until the compromised grid is unstable. Losing power seems tame when compared to a fire sale that could involve simultaneously targeting nuclear facilities and “other types of critical infrastructure such as regional water systems, the electrical grid or banking systems.” The report added, “In theory, a cyber-attack on a nuclear plant could bring about an uncontrolled release of ionizing radiation.”

The executive summary (pdf) spells out the main issues amplifying cybersecurity risks to nuclear plants. In a nutshell, these include the increased use of vulnerable ‘off-the-shelf’ software, digitization, lack of executive-level awareness, and search engines like Shodan and automatic cyber-attack packages that have made hacking even easier; add to that an at-risk supply chain, a lack of sufficient cyber investment, and more. There is even more in the main Chatham report, “Cyber Security at Civil Nuclear Facilities: Understanding the Risks” (pdf).