This is part 3 of a series of blog posts looking at the security of the UK Government’s web infrastructure.

Britain’s National Health Service is riddled with old and insecure WordPress-based websites. Many of these sites have severe flaws including being vulnerable to XSS attacks.

There is absolutely no suggestion that patient data or confidentiality has been put at risk.

These flaws were discovered passively using the information which was returned by the web server following a normal request. I have not exploited any of the holes found.

All these flaws were responsibly disclosed to Department of Health Officials in January 2014. Throughout February I was repeatedly in contact with various NHS officials trying to get them to do something about these problems.

This is a technical look at how I found these flaws. Please buy the latest edition of Computer Active to read the full story.

Step 0 – Was This A Problem In The Past?

In 2009, a security researcher discovered a severe security flaw in one of the NHS’s websites. I wondered if the NHS had improved its web security practices in the last 5 years.

Step 1 – Get All NHS Domain Names

I initially thought there would be a public list of all the NHS’s websites. There isn’t.

Thankfully, Rob Aley had made a Freedom of Information request a year ago which I was able to use.

The list dates from January 2013 – so it doesn’t contain any of the more recent domains. However, as any WordPress site created in the last 12 months should (hopefully) be running a recent release free of the older vulnerabilities, that’s not too big an issue.

So, with 5,000 domains in hand, it’s on to….

Step 2 – Look for Vulnerabilities

There were five main classes of vulnerabilities I was looking for.

Old WordPress versions, server information, directory listings, unsecured login pages, and XSS flaws.

Finding the WordPress version is simple enough. Most sites will spit out a header in the HTML which says:

<meta name="generator" content="WordPress 3.5.2" />

If the administrator is sensible enough to have hidden that header, we can still infer which WordPress release is running by looking at which JavaScript libraries are bundled with the site.
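The meta-tag check can be sketched like this – a minimal illustration in Ruby, not wpscan’s own code, and the sample HTML is invented:

```ruby
# Pull the version number out of the generator meta tag, if present.
# Returns nil when the administrator has hidden the header.
def wordpress_version(html)
  match = html.match(/<meta name="generator" content="WordPress ([\d.]+)"/)
  match && match[1]
end

# Hypothetical response body for illustration.
sample = '<head><meta name="generator" content="WordPress 3.5.2" /></head>'
puts wordpress_version(sample)  # 3.5.2
```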

Server information means we can see if the website is running on old, unpatched software. Directory listings allow us to see all the files on the server. Better hope there’s nothing confidential on there!
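As a sketch of the server-information check: read the `Server` header from a response and flag obviously old software. The version thresholds below are examples only, and the comparison is deliberately naive:

```ruby
# Hypothetical thresholds -- anything at or below these is "old" here.
OLD = { 'Apache' => '2.2', 'nginx' => '1.0', 'PHP' => '5.2' }

def stale?(server_header)
  OLD.any? do |name, old_version|
    # Naive lexicographic compare -- fine for these examples,
    # not for real version arithmetic.
    server_header =~ /#{name}\/(\d+\.\d+)/ && $1 <= old_version
  end
end

puts stale?('Apache/2.2.3 (CentOS) PHP/5.2.6')  # true
puts stale?('nginx/1.4.7')                      # false
```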

Unsecured login pages mean that anyone can guess the address of the login page. Without suitable protection, repeated login attempts can be made until the password is guessed. Unless the site is running SSL (and most aren’t) the username and password are sent unencrypted. Better hope no one is logging on over public WiFi!

Finally, on to XSS. The easiest way to test is to search for an HTML string and see if it is returned.
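A minimal sketch of that check, assuming a response body has already been fetched. The function name and sample responses are invented for illustration:

```ruby
require 'cgi'

# Does the probe string come back unescaped? A strong hint (not proof)
# of a reflected XSS hole.
def reflected_unescaped?(body, probe = '<script>alert(1)</script>')
  body.include?(probe)
end

# Invented responses: the first reflects the probe verbatim,
# the second escapes it properly.
puts reflected_unescaped?('You searched for <script>alert(1)</script>')  # true
puts reflected_unescaped?(
  "You searched for #{CGI.escapeHTML('<script>alert(1)</script>')}")     # false
```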

Testing each one of these manually is possible – although a right pain in the arse – so I turned to…

Step 3 – wpscan

The open source software wpscan is a simple tool – you give it a URL and it checks the site for known WordPress vulnerabilities. It tells you the version, which bugs are present, whether the site is likely vulnerable to XSS, and all sorts of other interesting details.



Right, time to get started!

Step 4 – Scanning

Sadly, wpscan doesn’t have a batch mode. Nor does it play well with parallel processing. That means running it in serial.

Taking a list of NHS domains in a .txt file, it’s relatively easy to extract each one, scan it, then dump the result to a text file with the same name as the domain.

cat nhs.txt | xargs -iURL sh -c './wpscan.rb --follow-redirection --url URL > URL.txt'

In order to stop wpscan asking me every time it couldn’t find the plugins directory, I patched the wpscan.rb file.

unless wp_target.wp_plugins_dir_exists?
  puts "The plugins directory '#{wp_target.wp_plugins_dir}' does not exist."
  puts "You can specify one per command line option (don't forget to include the wp-content directory if needed)"
  print 'Continue? [y/n] '
  #unless Readline.readline =~ /^y/i
  #  exit(0)
  #end
end

With 5,000 records to check, it was bound to take some time. Thankfully, not all the sites run WordPress and wpscan only takes a second to ignore a site it can’t scan.

The scan ran at about 7 URLs per minute – 5,000 domains in roughly 700 minutes – meaning the whole thing was done in less than half a day.

Step 5 – Parsing The Results

Out of the 5,000 domains, 358 were identified as running WordPress.

5 were identified as running the extremely old WordPress 2.X!

How many potential XSS vulnerabilities were found? 597. Several of the sites were identified with multiple potential exploits (I say “potential” because they were not all manually checked).

After running the reports, parsing the data, and summing the number of XSS, privilege escalation, open redirect, and other miscellaneous bugs, I came up with the conservative (if linkbait-friendly) total of over 2,000 security bugs identified.
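The tallying step can be sketched as follows – a hypothetical parser which assumes wpscan flags findings with a ‘[!]’ prefix, as its output did at the time. The sample report lines are invented:

```ruby
KINDS = ['XSS', 'Privilege Escalation', 'Redirect']

# Count finding lines by kind across wpscan report lines.
def tally(lines)
  totals = Hash.new(0)
  lines.each do |line|
    next unless line.include?('[!]')  # wpscan's finding marker (assumed)
    KINDS.each { |kind| totals[kind] += 1 if line =~ /#{kind}/i }
  end
  totals
end

# In practice, feed it every report file, e.g.:
#   Dir.glob('*.txt').each { |f| tally(File.readlines(f)) }
sample = [
  '[!] Title: Example plugin XSS',
  '[!] Title: Example privilege escalation bug',
  '[+] WordPress version 3.5.2 identified',
]
p tally(sample)  # {"XSS"=>1, "Privilege Escalation"=>1}
```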

Step 6 – Calm Down

It’s important to note that these are suspected vulnerabilities. The wpscan software isn’t perfect – some of the flaws it detects may be mitigated by other measures.

Many of the problems are “Privilege Escalation” vulnerabilities. This means that the secretary who updates the opening times may be able to assume the role of an administrator and do some serious damage. Because exploiting them requires an existing account, it is unlikely that an external attacker could use these flaws directly.

Ok, so what about the ~4,500 sites which aren’t running WordPress? Are they secure? No!

Step 7 – Look for Non-WordPress Vulnerabilities

There are a number of sites which don’t run WordPress which are still vulnerable to XSS attacks.

Nearly every single site built by a particular Norfolk-based company had a confirmed XSS vulnerability.

These were found by manually searching for an HTML string and seeing if it was returned unescaped.

After repeated contact, and some hand-holding, they were able to fix dozens of vulnerable sites.

Step 8 – Responsibly Disclose the Problems

It’s really hard to contact the Department of Health to report these issues to them. I’m lucky enough to have some friends in the Civil Service who were able to escalate my concerns – but even then I seemed to hit a brick wall.

I tried contacting individual website owners – who mostly forwarded me on to other people who then ignored me.

I contacted the Department of Health directly and provided screenshots of the problems – no reply was forthcoming.

Finally, I contacted James Temperton, the award-winning journalist from Computer Active. James was the only journalist who responded to my request for a PGP key in order to communicate securely. In the age of Snowden, it seems bizarre that computing journalists don’t take the minimum amount of effort to provide a secure contact channel.

With James’ help, I was able to craft this story and he was able to contact the PR people at the Department of Health. You can read James’ story in the latest issue of Computer Active.

What I Learned

Many Doctors’ Surgeries in an area will all use the same cheap, private sector contractors to build their sites. If there’s a bug in one – that bug is present in hundreds of other sites.

On 12th February, I finally heard back from someone senior within the NHS. They explained that the Department of Health has no central control over NHS websites. As a result, sites fall through the cracks as local teams change. Consequently, in many cases there is simply no way to contact the website owners.

Abandoned Sites

I’ve tried to disclose the flaws to the site owners and directly to the Department of Health. In some cases – such as the following – no one is responsible for the site!

I contacted the designer – he passed me on to the agency commissioned to design the website. The agency passed me on to the NHS group they did the work for – which has since been re-organised. They passed me on to the local government contact who is meant to be responsible. She cannot find out who currently controls the site.

The Department of Health, HSCIC, local government, and NHS Care Commissioning Groups are all abdicating responsibility.

So now we have a situation where the NHS has lost control of its websites. They can be used to host spam and malware, to harvest usernames and passwords, or to scam patients into giving up confidential information.

Recommendations

I love WordPress – this blog runs it, as do many more sites I administer. Like any software, it needs to be kept updated and maintained.

It’s clear that many NHS websites are not being actively maintained. That’s a serious failing. I don’t think it’s an exaggeration to say that looking after a website is as important as cleaning a hospital.

Ok, maybe a bit of an exaggeration. But XSS flaws are especially pernicious when they’re on a trusted domain like nhs.uk.

It’s clear that the fractured nature of the NHS means that private companies are free to exploit small NHS practices. Many of these vulnerable sites have been delivered by private companies with no thought of the public harm they are doing.

Earlier this year, Sam Smith asked a very important question:

Why isn’t there an NHS Digital Service yet? (as in #GDS) — Sam (@smithsam) January 24, 2014

It’s clear that neither tiny NHS practices nor megalithic Trusts have the experience to commission and run simple websites. The ideological desire for “competition” has led to a waste of millions of pounds of taxpayers’ money and resulted in horrendous security flaws throughout the NHS.

Public health is too important to leave to the “invisible hand” of capitalism’s free market. We need a strong, centralised management which can produce and enforce best-practice across the NHS’s web portfolio.

It’s time that the Secretary of State for Health, Jeremy Hunt, stopped trying to undermine the public sector ethos of the NHS and, instead, concentrated on making it stronger. Rather than setting the NHS up to fail via phoney “competition”, he should be ensuring it works together as a community to ensure the security of the NHS’s digital portfolio.

The Official Response

After raising this through multiple channels – including directly to some of the sites involved and to GovCertUK – this is the official reply we got from HSCIC on 18th February.

In relation to nhs.uk sites, the HSCIC’s role is to process applications to use the domain name from NHS organisations and provide permission for its use, where appropriate. However, responsibility for the maintenance and security of sites using the nhs.uk domain sits with the organisation running each website or service. The HSCIC is currently drafting some additional guidance, in support of our existing technical guidance, to be issued to all applicants receiving permission to use the nhs.uk domain. We are grateful to the individuals who have alerted us to these issues so that we can take them into account when drawing up this document.

A Special Message To Tim Kelsey about care.data

If the NHS can’t be trusted to secure their websites – why should I trust them to secure my confidential medical details?

That’s why I’ve opted-out of care.data and you should too.