A New Scientist investigation raises questions about the basis on which an NHS Trust is sharing patient data with Google’s AI firm

We’d appreciate more transparency (Image: EPA/How Hwee Young)

Update, 8 June 2016: We have updated this story to include new information from the Memorandum of Understanding between Royal Free and DeepMind obtained by New Scientist yesterday.

Three weeks ago, New Scientist revealed that Google’s artificial intelligence company DeepMind has access to the identifiable personal medical information of millions of UK patients through a data-sharing agreement with the Royal Free London NHS Foundation Trust.

Now, a New Scientist investigation has found that Google DeepMind deployed a medical app called Streams for monitoring kidney conditions without first contacting the relevant regulatory authority. Our investigation also asks whether an ethical approval process that covers this kind of data transfer should have been obtained, and raises questions about the basis on which Royal Free is sharing data with Google DeepMind.


DeepMind’s partnership with the Royal Free provides it with fully identifiable information – including names, addresses and details of medical conditions – for the 1.6 million patients treated at Barnet, Chase Farm and the Royal Free each year. It also includes in-depth data on all patients treated by the trust in the past five years.

Google and the Royal Free say they are acting in compliance with the rules.

Approval for medical devices

According to guidelines issued by the UK’s Medicines and Healthcare products Regulatory Agency (MHRA), software like Streams counts as a medical device. The MHRA was set up to ensure that such devices are safe and effective.

When introducing new medical devices, as DeepMind has done with Streams at Royal Free, it is a requirement to first obtain a “Notice of No Objection” from the MHRA. This is a document stating that the agency has found no objection to the product being brought to market in its current form. Failing to do so contravenes the Medical Devices Regulations 2002.

Streams has been running on and off in three live tests since December 2015, according to Royal Free. As of 10 May, according to the public list that the MHRA keeps on registered manufacturers – which is updated every week – Google DeepMind was not registered. The MHRA says it prosecutes only as a last resort when a serious offence has been committed, but that prosecution can result in a £5000 fine and/or 6 months’ imprisonment.

“Throughout these tests, Streams has always served as an additional information system for clinicians, tested alongside rather than replacing standard hospital safety systems,” Google says.

In a statement to New Scientist, the MHRA said that it has been in contact with Google since 4 May.

“We and our partners at the Royal Free are in touch with MHRA regarding our development work,” Google said in a statement to New Scientist on 13 May. The app is currently offline.

Data transfer

There are questions around the transfer of data itself from Royal Free to Google DeepMind.

“Under the Data Protection Act 1998, a data controller is required to have a robust contract in place to require a data processor, like Google in this case, to maintain adequate technical and organisational measures to prevent loss or destruction of personal data and to properly govern the processing of that data by the contractor,” Ben Wootton of data protection law firm Pritchetts Law told New Scientist.

Royal Free has said that the “Information Sharing Agreement” which New Scientist revealed in April is the contract which governs DeepMind’s data processing.

“The information sharing agreement you refer to is not simply a data sharing agreement, but is a legally binding contract that includes clear commitments required for compliance with the Data Protection Act and was prepared specifically in line with relevant ICO guidance,” Royal Free said in a statement.

Privacy campaigners medConfidential say the agreement appears to be pasted together from two different template documents – an “information sharing protocol” from the UK Renal Registry and an “information sharing agreement” from University College London’s School of Life and Medical Sciences. Royal Free is one of UCL’s teaching hospitals, while two of DeepMind’s three co-founders – Shane Legg and Demis Hassabis – studied neuroscience at the School of Life and Medical Sciences. The agreement was signed on 29 September 2015 by Mustafa Suleyman, a DeepMind co-founder, and by Subir Mondal, Royal Free’s data protection officer.

“Our agreement with DeepMind is our standard third-party data sharing agreement, with the trust being the data controller and DeepMind being the data processor,” Royal Free said in an earlier statement.

Terms of processing

As well as Streams, the agreement constrains DeepMind to building tools for “real time clinical analytics, detection, diagnosis and decision support to support treatment and avert clinical deterioration across a range of diagnoses and organ systems”.

Sam Smith of medConfidential and data protection consultant Tim Turner both say that these terms are so vague that they may not give Royal Free enough control over how DeepMind processes data.

According to the Information Commissioner’s Office, which provides guidance for data controllers and processors, “the data controller must exercise overall control over the purpose for which, and the manner in which, personal data are processed”.

Minutes from the Royal Free’s board meeting on 6 April make the trust’s relationship with DeepMind explicit: “The board had agreed to enter into a memorandum of understanding with Google DeepMind to form a strategic partnership to develop transformational analytics and artificial intelligence healthcare products building on work currently underway on an acute kidney failure application.”

The Information Commissioner’s Office says it is looking into the data-sharing between DeepMind and Royal Free.

On 24 May, the ICO confirmed that it has no record of any paper or electronic communication from the Royal Free in reference to the DeepMind data-sharing agreement.

Consent

Even if the contract does constrain DeepMind’s data processing, the Royal Free should still have obtained informed consent from the individuals whose data Google DeepMind is processing, says Turner. In Royal Free and DeepMind’s case, this means consent from every one of the millions of patients whose sensitive, fully identifiable data has been shared.

The Health and Social Care Information Centre (HSCIC) is an arm of the Department of Health which deals with digital healthcare. Its FAQ on legally accessing personal confidential information says: “To lawfully process confidential information you must have the consent of the data subject or a statutory basis.”

Royal Free’s own privacy statement says: “We will share information with non-NHS organisations only with your permission if the information is required for purposes other than the provision of care to you.”

Google has said that each individual patient’s consent for their data being shared is implied, because it is providing “direct care” to Royal Free’s patients. On 4 May, DeepMind consultant Hugh Montgomery at University College London told the BBC: “There is an assumption of consent that [the Royal Free] can manage that data and share it with people if it’s for direct clinical patient benefit.”

So the question becomes: is DeepMind providing direct care?

The UK’s Caldicott guidelines for handling healthcare data – most recently revised in 2013 – define “direct care” as: “a clinical, social or public health activity concerned with the prevention, investigation and treatment of illness and the alleviation of suffering of individuals”.

The Streams app is being developed to monitor kidney conditions. In this case, the clinical activity is the potential treatment of individuals who have had blood tests for kidney disease. Streams is not concerned with the treatment of the vast majority of individuals at Royal Free who have not had blood tests for kidney disease. Yet all of those individuals’ data has been shared with DeepMind, without their consent.

The Caldicott guidelines give examples of implied consent with a direct care basis: it might be a clinician giving a patient’s medical files to a colleague who is directly involved in their care, without seeking explicit consent first. Doctors might share confidential information with nurses without obtaining direct consent each time. A physiotherapist may examine a patient’s medical records before meeting them face to face if that patient has accepted a referral.

Consent might also be implied when patient data is shared with a third-party processor that runs a system for an entire hospital, such as an electronic patient records system. In this case, the services that the third-party processor provides are relevant to each individual in the hospital.

“What they’re trying to do is apply the language of doctors and nurses applying treatment to patients to something that has never happened before,” says Turner. “I cannot see how what DeepMind is doing is comparable to the doctor-patient direct care relationship.”

When asked how the trust had obtained patients’ consent before sharing data with Google DeepMind, Royal Free gave New Scientist the following statement: “The trust is sharing patient data for the purpose of direct care only on the basis of implied consent.”

Referring to the privacy statement, which requires patients’ permission for such sharing, the trust said: “The trust’s privacy statement, which provides a summary of how we use patient data and how consent can be withdrawn, is available to view on our website.”

The Caldicott guidelines state that there are four cases in which identifiable data may be processed: when all patients have given consent; through specific laws, such as when the Secretary of State for Health has given consent on their behalf after ethical approval has been obtained; for reasons of public interest, such as during a national health emergency; or, finally, through a court order.

Ethics

There is an ethical approval process which can cover the kind of data-sharing being carried out at the Royal Free. It’s called Section 251 assent, named after a section of the NHS Act 2006 which allows for a patient’s sensitive identifiable data to be shared without their explicit consent for some purposes other than their care. Assent can be obtained through the Confidentiality Advisory Group (CAG), a committee set up by the NHS and run by the Health Research Authority (HRA), designed to ensure that the transfer of sensitive personal medical data is ethical and appropriate.

In situations where consent cannot reasonably be given in practice – in large research projects, for example – the CAG review process allows for sensitive medical data to be shared and processed. Successful applications culminate with consent for data processing being given on patients’ behalf by the UK Secretary of State for Health, currently Jeremy Hunt.

The HRA confirmed to New Scientist that DeepMind had not started the approval process as of 11 May.

“Google is getting data from a hospital without consent or ethical approval,” claims Smith. “There are ethical processes around what data can be used for, and for a good reason.”

“Section 251 assent is not required in this case,” Google said in a statement to New Scientist. “All the identifiable data under this agreement can only ever be used to assist clinicians with direct patient care and can never be used for research.”

The Memorandum of Understanding between Royal Free and DeepMind explicitly states that DeepMind had already assigned research scientists to the AKI project. The document, dated 28 January 2016, was obtained by New Scientist on 7 June through a Freedom of Information Act request.

The Computational Health Informatics lab, led by David Clifton at the University of Oxford, has deployed healthcare algorithms – with full ethical approval – in hospitals run by the Oxford University Hospitals NHS Foundation Trust. Clifton says ethical approval is a key step when sensitive medical data is to be accessed. “We typically budget up to one year for the process of ethics applications, responding to comments from the ethics-review panels,” Clifton says. “While time-consuming, we recognise that the process is there for an important reason, and that the process builds confidence in the technologies that derive from our research.”

“If duty of confidentiality is going to be set aside for the hospital, there should have been a process followed,” says Smith.

Hospital trusts are required to appoint a “Caldicott Guardian” to ensure the guidelines are followed. The NHS defines the position as a “senior person responsible for protecting the confidentiality of patient and service-user information and enabling appropriate information-sharing”. Royal Free’s Caldicott Guardian Kilian Hynes has not responded to questions from New Scientist about whether he reviewed the data-sharing agreement before it was signed and the data transferred to DeepMind.

On its website, the HSCIC describes the role of a Caldicott Guardian as follows: “Acting as the ‘conscience’ of an organisation, the Guardian actively supports work to enable information sharing where it is appropriate to share, and advises on options for lawful and ethical processing of information.”

In a statement, the Royal Free said: “The data sharing agreement was reviewed by the data protection officer prior to signing, in line with trust policy. It has been reviewed by the trust’s Caldicott Guardian.” Royal Free declined to say whether Dr Hynes reviewed the agreement before it was signed.

This article has been updated based on information received from Google DeepMind and the Royal Free London NHS Foundation Trust.

Correction: This article has been updated to clarify the link between University College London and DeepMind, and how much data DeepMind holds on all patients