Google DeepMind has announced its second collaboration with the NHS, working with Moorfields Eye Hospital in east London to build a machine learning system which will eventually be able to recognise sight-threatening conditions from just a digital scan of the eye.

DeepMind, the artificial intelligence research arm of Google, has worked with the NHS once before, but its co-founder, Mustafa Suleyman, says this is the first time the company is embarking on purely medical research. The earlier, ongoing collaboration, with the Royal Free hospital in north London, is focused on direct patient care, using a smartphone app called Streams to monitor the kidney function of patients.

The Moorfields collaboration is also the first time DeepMind has used machine learning in a healthcare project. At the heart of the research is the sharing of a million anonymous eye scans, which the DeepMind researchers will use to train an algorithm to better spot the early signs of eye conditions such as wet age-related macular degeneration and diabetic retinopathy.

Suleyman said: “There’s so much at stake, particularly with diabetic retinopathy. If you have diabetes you’re 25 times more likely to go blind. If we can detect this, and get in there as early as possible, then 98% of the most severe visual loss might be prevented.” Training a neural network to do the assessment of eye scans could vastly increase both the speed and accuracy of diagnosis, potentially saving the sight of thousands.

The collaboration between the two organisations came about thanks to an unsolicited request from one doctor at Moorfields. Pearse Keane, a consultant ophthalmologist, contacted the Google subsidiary through its website to discuss the need to better analyse scans of the eye, and initiated the research project shortly after. “I’d been reading about deep learning and the success that technology had had in image recognition,” he said. He had then come across an article about DeepMind training a machine to play Atari games – the company’s first public success.

“I had the brainwave that deep learning could be really good at looking at the images of the eye. Optical coherence tomography is my area, and we have the largest repository of OCT images in the world. Within a couple of days I got in touch with Mustafa, and he replied.”

DeepMind’s previous collaboration with the NHS led to controversy, after it and its partner, the Royal Free hospital, were accused of not having the proper authority to share the records of patients who would be involved in the trial. At the time, the Royal Free said that the arrangement “is the standard NHS information-sharing agreement set out by NHS England’s corporate information governance department and is the same as the other 1,500 agreements with third-party organisations that process NHS patient data”.

Since the Moorfields collaboration involves anonymised information, the privacy hurdles are much lower. The company has been given permission for access through a research collaboration agreement with the hospital, and has published a research protocol, as is standard practice for medical trials.

The company says the information shared amounts to “approximately 1m anonymous digital eye scans, along with some related anonymous information about eye condition and disease management.

“This means it’s not possible to identify any individual patients from the scans. They’re also historic scans, meaning that while the results of our research may be used to improve future care, they won’t affect the care any patient receives today. The data used in this research is not personally identifiable. When research is working with such data, which is anonymous with no way for researchers to identify individual patients, explicit consent from patients for their data to be used in this way is not required.”

Prof Peng Tee Khaw, the head of Moorfields’ ophthalmology research centre, said that the key to the collaboration was the huge increase in the volume of incredibly precise retinal scans available. “These scans are incredibly detailed, more detailed than any other scan of the body we do: we can see at the cellular level. But the problem for us is handling this amount of data.

“It takes me my whole life experience to follow one patient’s history. And yet patients rely on my experience to predict their future. If we could use machine assisted deep learning, we could be so much better at doing this, because then I could have the experience of 10,000 lifetimes.”

Somewhat oddly, the DeepMind/Moorfields collaboration is actually the second time that Google has looked at using machine learning to detect diabetic retinopathy in eye scans. An earlier, separate project was announced by Google chief executive Sundar Pichai onstage at the company’s annual developer conference, Google I/O, in May.