The UK cyber security community is all abuzz about a report in The Guardian alleging that Sellafield in Cumbria, England, the nuclear waste disposal facility with the largest store of plutonium on the planet, was hacked by threat actors linked to China and Russia. Mixed in with the litany of allegations and assertions is the actual story: IT systems at the nuclear waste facility were, at some point, compromised by ‘sleeper’ malware which accessed highly sensitive documents, and no one is certain whether said malware has been removed. Making matters worse, senior officials at Sellafield might have known about the compromise, covered it up, and neglected to inform the regulator along the way. Pretty egregious stuff.
Now, Sellafield has released a complete rebuttal in the form of an official statement which essentially says that there is no evidence of any compromise. This could be read in two ways: either it happened, but they have no way of really knowing the details because their security operations are so lacking that they can’t identify indicators of compromise or perform any sort of post-mortem forensic investigation. Or, much more likely, The Guardian’s article has linked together spurious details across a decade and published some tenuously plausible conjecture.
I guess, if you launch a year-long investigation simultaneously into cyber hacking and toxic workplace culture at any given company, you’ll find at least one disgruntled IT person who’ll tell you some stuff which may or may not have happened. Certainly, if you read The Guardian’s article, it’s short on details and long on vagueness and assumption.
However, there is some worrying stuff in there. Remote access to a nuclear facility’s IT servers – the same servers that a government official has described as ‘fundamentally insecure’ – and exposing passwords and credentials on Countryfile is never a good look. To say the least! Maybe there is no smoke without fire, but what can we really take away from all the hyperbole in the article?
Sellafield has apparently been placed on a diet of ‘special measures’ and ‘improvements required’ by the Office for Nuclear Regulation (ONR), along with special investigations into some specific incidents. How could this happen at such a critical UK infrastructure site? Is it apathy? Is it incompetence? No, probably not. It’s likely due to the acute challenges that we see consistently across UK Critical National Infrastructure (CNI) sectors and sites just like this one. Sellafield has to operate in an environment where much of the Information Technology (IT), Operational Technology (OT), and infrastructure was designed and built a very long time ago. The facility and its systems were designed decades before the concept of cyber security existed, let alone before most of the threats we face today had emerged to attack our operations and data. Sellafield’s “old kit” was not conceived with the ability to cope with cyber threats. Worse still, you can’t bring an old plant up to date by simply swapping out yesterday’s nuclear tech. You can’t power down and replace the machines that maintain conditions around stores of plutonium. You can’t just update and reboot the control systems which ensure that Cumbria doesn’t turn into a radioactive crater.
As you would expect, making changes to such dangerous and critical environments is incredibly difficult and painfully tedious. Unfortunately, the pace of technological improvement cannot keep up with today’s rapidly evolving cyber threats, so dangerous exposure will remain. Despite the bad PR resulting from The Guardian report, Sellafield might be doing the best it can with the resources it’s provided and given the constraints within which it must operate.
The Lesser Evil
The Guardian’s report also asserts that there are a number of worrying vulnerabilities present at Sellafield, citing two key claims: a 2012 document named Critical Security Vulnerabilities, and external contractors using memory sticks without supervision. Again, nothing uncommon here. An 11-year-old report on cyber vulnerabilities hardly seems damning. Even if the report were more recent, it’s fairly commonplace for companies that run legacy systems to identify and accept certain vulnerabilities. In many cases there is nothing that can feasibly be done about them, so the best approach is to assess system risk and implement compensating controls.
A third-party contractor using removable media (a USB drive or ‘memory stick’) is potentially a big risk, but it is also quite commonplace! What is meant by ‘without supervision’? Unsupervised in the sense that no one from Sellafield is watching them at all times? That is (probably) a minor issue. What would be a major issue is if there are no controls at all over the use of memory sticks. What we want to know is: has this contractor been through security awareness training, is the memory stick used for a very limited and controlled purpose, does the device have to be scanned for malware before use, and so on. Vulnerabilities will, unfortunately, always be present in our IT and OT environments, so how we identify them and manage the risks they create is what matters.
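To make the distinction concrete, the kind of controls described above can be sketched as a simple policy gate. This is a minimal, purely illustrative sketch – all names, data structures, and policy values here are assumptions for the example, not Sellafield’s (or anyone’s) actual controls:

```python
from dataclasses import dataclass

@dataclass
class UsbDevice:
    serial: str
    scanned_clean: bool  # result of a prior malware scan on the device

# Hypothetical policy data: registered device serials, and the
# narrow purposes each contractor has been approved for.
REGISTERED_DEVICES = {"SN-0042"}
APPROVED_PURPOSES = {"contractor_a": {"firmware-update"}}

def may_use_device(contractor: str, device: UsbDevice, purpose: str) -> bool:
    """Allow removable-media use only if the device is registered,
    has passed a malware scan, and the contractor is approved for
    this specific, narrowly scoped purpose."""
    return (
        device.serial in REGISTERED_DEVICES
        and device.scanned_clean
        and purpose in APPROVED_PURPOSES.get(contractor, set())
    )
```

Under a regime like this, an unwatched contractor is a minor concern, because the device itself is registered, scanned, and scoped; with no such gate, every memory stick is an uncontrolled entry point.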
In my view, the most important (and interesting) statement in the article is that scrutiny by the regulator, the ONR, has been ineffective for more than a decade. The regulator has known that there are cyber issues at Sellafield for all that time. It has legal power to enforce better cyber standards, and there are other important instruments to draw on, such as the UK NIS Regulations and the NCSC Cyber Assessment Framework. Yet the regulator has not been able to effect the necessary improvements at Sellafield. I am sure this issue will resonate with many in the UK CNI industry and the wider cyber security community.
UK nuclear industry regulators are not cyber experts. They are at best generalists. The ONR is very capable when it comes to moving hazardous nuclear material, or even health and safety, but cyber security is complex and new to them. As any business owner will attest, building a new capability of expert people and mature processes takes a lot of time, and both regulators and operators are feeling this pain. The irony is that UK operators of essential services in the nuclear, energy, and other critical infrastructure sectors are not themselves cyber security companies. They are relatively immature compared to sectors such as banking and pharmaceuticals, and they look to their regulators – who are also not cyber security experts – for advice. What they receive back is a notice that they must do better and try harder, and not a lot of guidance on how to do it. All (memory) stick, no carrot. Both regulators and operators are improving, but not at the pace required.
Perhaps what we need is less enforcement and more incentive and guidance, promoting cyber maturity in a positive way, along with more collaboration between regulators and authorities and their respective industries to mutually develop the capabilities and people needed to get ahead of the cyber curve.
The question I keep asking myself is: “Does The Guardian’s article help?” On one hand, it could be promoting baseless fear-mongering – peddling yet more fear, uncertainty, and doubt. But on the other hand, there is a bona fide issue around cyber security for Britain’s Critical National Infrastructure, as we hear all the time from the bigwigs at the National Cyber Security Centre. Even if the facts of the case turn out to be dubious, does this reporting ultimately help us get closer to what we need to do by awakening government and the general public to the fact that if we do not do more, invest more, and incentivise more in our CNI security posture, we really could have a nuclear disaster?