More than 285 million people worldwide live with some form of sight loss

When Pearse Keane started using optical coherence tomography (OCT) scanners to peer into the back of a person's eye in Los Angeles a decade ago, the machines were relatively crude. "The devices were lower resolution, they had much slower image acquisition speeds," says Keane, a consultant ophthalmic surgeon at Moorfields Eye Hospital and a researcher at University College London. From 2007, Keane spent two years studying scans from OCT machines, learning to diagnose eye conditions in patients and to pick out the minute details that signal sight-threatening disease.

"It was very time-consuming, laborious work," Keane says. OCT scans use light to quickly create high-resolution, 3D images of the back of the eye; modern scanners generate around 65 million data points each time they are used, mapping each layer of the retina. The three-dimensional images have become commonplace for doctors diagnosing eye problems, and across the NHS thousands of the scans are completed each day. "It's dealing with those numbers, that's the issue," Keane says.


An automated algorithm to diagnose eye diseases, developed by Moorfields and Google's London-based artificial intelligence unit DeepMind, may have the potential to cut the amount of time doctors spend diagnosing from OCT scans. New research published in the journal Nature Medicine shows DeepMind's AI being taught to recognise 50 common eye problems – including three of the biggest eye diseases: glaucoma, diabetic retinopathy and age-related macular degeneration.

The AI correctly identified types of eye disease from OCT scans 94.5 per cent of the time. "The algorithm is on a par with expert performance at diagnosing OCT scans," Keane says. "It's as good, or maybe even a little bit better, than world-leading consultant ophthalmologists at Moorfields in saying what is wrong in these OCT scans." In a rarity for AI systems, the algorithm can also explain how it reached a particular diagnosis, and it works on more than one type of OCT machine.


DeepMind co-founder Mustafa Suleyman says the firm and Moorfields are now planning to use the method in clinical trials and will attempt to get a final product approved by regulators. The promise of the work is to cut the time doctors need to manually inspect scans, make diagnoses and refer patients for treatment.



The researchers used two neural networks – algorithms modelled on the brain – to understand the OCT scans and determine what may be wrong with the eye. Both networks were trained, using deep learning, on existing scans. The first neural network was trained to spot different features of diseases in OCT scans. A team of ten expert ophthalmologists and optometrists spent hours highlighting diseases on scans, which were then fed to the neural network to learn from, Keane says. In total, the AI system was trained on 14,884 scans – even though DeepMind had access to one million OCT images.

The second neural network analyses the output of the first and presents doctors with diagnoses and a referral recommendation. The system gives a confidence rating in the form of a percentage. "It doesn't just say this looks like a macular degeneration, the algorithm says here are the specific aspects of the scan which we think indicate the diagnosis. It marks up the scan itself," Suleyman says.
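The two-stage design can be sketched in code. This is a hypothetical illustration, not DeepMind's implementation: the stand-in functions, labels and thresholds below are all invented, and in the real system each stage is a deep neural network rather than a simple rule.

```python
import numpy as np

# Illustrative sketch of the two-stage pipeline: a segmentation stage
# turns a raw OCT scan into a tissue map, and a classification stage
# turns that map into per-diagnosis confidences and a referral decision.
# All names, classes and thresholds here are invented for illustration.

DIAGNOSES = ["normal", "macular degeneration", "diabetic retinopathy", "glaucoma"]
REFERRALS = ["observation only", "routine", "semi-urgent", "urgent"]

def segment(scan):
    """Stage 1 stand-in: label every voxel with a coarse tissue class."""
    return np.digitize(scan, bins=[0.33, 0.66])  # three invented classes

def classify(tissue_map):
    """Stage 2 stand-in: tissue map -> softmax confidence per diagnosis."""
    fractions = np.bincount(tissue_map.ravel(), minlength=3) / tissue_map.size
    scores = np.append(fractions, fractions.std())  # one score per diagnosis
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()  # confidences sum to 1

def diagnose(scan):
    """Run both stages and report the top diagnosis with a percentage."""
    probs = classify(segment(scan))
    best = int(np.argmax(probs))
    return {
        "diagnosis": DIAGNOSES[best],
        "confidence_pct": round(100 * float(probs[best]), 1),
        "referral": REFERRALS[best],
    }

result = diagnose(np.random.default_rng(0).random((64, 64)))
```

The key design point the article describes is the split itself: the first stage produces an intermediate map that the second stage, and a human doctor, can both inspect, which is what lets the system "mark up the scan" to justify its confidence percentages.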

The system was tested on a dataset of 997 OCT scans and compared with diagnoses given by eight doctors. In some cases the doctors had extra information – patient records and further eye images – on which to base their decisions. "Our framework achieved and in some cases exceeded expert performance," the researchers write in the paper. The algorithm was correct 94.5 per cent of the time, equal to retina specialists who used the extra notes as well as the OCT scans. Retina specialists making diagnoses from the OCT scans alone had error rates of 6.7 per cent, 6.8 per cent, 10.9 per cent and 13 per cent.


The research is the latest application of artificial intelligence to medicine. Elsewhere, Stanford University computer scientists have taught a machine learning system to identify deadly and common types of skin cancer; breast cancer has been spotted on digital slides; and one medical device has already been approved by the US Food and Drug Administration to help detect eye problems. The UK government has also included healthcare within its grand plan for AI. But there is still a lot that needs to be worked out.


Some of DeepMind's previous work with the NHS has faced heavy criticism for its approach to privacy. In July 2017, the UK's data protection regulator said the Royal Free NHS Foundation Trust in north London had unlawfully shared details of 1.6 million patients with DeepMind. At the time of the decision, DeepMind's Suleyman wrote in a blog post that the company had "underestimated the complexity of the NHS and of the rules around patient data". The Royal Free and DeepMind have since rewritten their data-sharing agreements.

With Moorfields, DeepMind has been paying some of the costs of the research. A Freedom of Information Act request revealed that, up to August 2017, the Google-owned firm had paid the NHS body £110,000, covering the cost of de-personalising data and manually segmenting the eye scans. Moorfields paid nothing to DeepMind. The company says it had access to more than one million historic, de-identified OCT scans, along with some information about eye conditions and disease management. Eye conditions with fewer than ten cases were removed from the study, as were OCT scans from people who had requested their information not be shared.


Keane first became aware of DeepMind after reading WIRED's 2015 cover feature on the company. He says he paid for one month of LinkedIn's premium membership to send Suleyman a message; a week later he went for initial meetings with DeepMind, and the formal partnership between the two was announced in July 2016. If the eye-scanning work reaches a stage where it is used on real-world patients to help with diagnosis and treatment, Moorfields will be able to use it at its 30 UK sites without paying for five years. Both Suleyman and Keane stress the new findings are only the first stage of research.

The promise of AI's efficiency still has a lot to prove. Doctors working with IBM's Watson system, which has been modified to help treat cancer, have described it as providing "multiple examples of unsafe and incorrect treatment recommendations." (IBM said Watson for Oncology is being used in 230 hospitals around the world and has "supported" care for more than 84,000 patients). Meanwhile, UK startup Babylon Health, which is working with the NHS, Samsung and Tencent, has faced criticism that its AI-driven symptom checker has missed conditions it should have recognised. Babylon has said variable outcomes are possible if different symptoms are selected.

The DeepMind and Moorfields system isn't perfect. Across the 997 scans it was tested on, it made mistakes 5.5 per cent of the time. "There was no case where the AI said that this patient is normal when actually there was something serious going on with them," Keane says, though in two cases the algorithm recommended urgent or semi-urgent care when the eye was normal. This error rate was lower than that of all the experts, except for the retina specialists who also made decisions using extra information.


Suleyman says using two neural networks in the system allowed the results of both stages to be examined closely to see why a particular diagnosis was recommended. "We're trying to go down through the layers of the neural network to interrogate the different representations at each layer and that is a very fundamental priority across all applications of machine learning". This has real-world implications. "A lot of the research on AI and healthcare is not really useful in that regard," Keane adds. "Engineers are making these algorithms that are amazingly accurate but don't have a real-world use case."

DeepMind and Moorfields also tested the AI system on two OCT scanners – one from Japanese firm Topcon and the other from Heidelberg Engineering. "What I think few people appreciate is, individual bits of hardware produce quite a different raw signal for the algorithm to interpret," the DeepMind co-founder says. Suleyman adds that the algorithm's ability to work on two different machines pushes the company closer to its aim of creating general-purpose AI.
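One plausible reason the two-network split helps across scanners – this is an assumption about the architecture, not a description of DeepMind's code – is that only the first, device-specific stage ever sees the raw signal, so the diagnostic stage can be shared. A minimal sketch, with invented names and thresholds:

```python
# Sketch of device decoupling via an intermediate representation: each
# scanner gets its own first stage, which maps its raw signal on to a
# common scale; the diagnostic stage is shared, because it only ever
# sees that device-independent output. Everything here is illustrative.

def make_pipeline(segmenters, classifier):
    def diagnose(device, scan):
        tissue_map = segmenters[device](scan)  # device-specific stage
        return classifier(tissue_map)          # shared diagnostic stage
    return diagnose

# Toy stand-ins: the two scanners emit very different raw intensity ranges.
segmenters = {
    "topcon": lambda scan: [x / 255 for x in scan],        # 8-bit signal
    "heidelberg": lambda scan: [x / 65535 for x in scan],  # 16-bit signal
}
classifier = lambda tissue_map: "urgent" if max(tissue_map) > 0.5 else "routine"

diagnose = make_pipeline(segmenters, classifier)
print(diagnose("topcon", [10, 200, 30]))          # same classifier,
print(diagnose("heidelberg", [500, 60000, 900]))  # different hardware
```

Under this arrangement, supporting a new scanner would mean retraining only the first stage, while the diagnostic stage – and everything learned about disease – carries over unchanged.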

That ambition may still be decades away, but next for DeepMind and Moorfields are plans to run clinical trials of their OCT system. Doctors working with patients needing eye scans will get the opportunity to test the work. "What we do hope is that when this is ready for deployment, which will be several years away, it will end up impacting 300,000 patients per year," Suleyman says.