We won’t have robot doctors for a long time, but the human doctors we have now are beginning to lean on specialized artificial intelligence to help save time.

Google DeepMind just announced a partnership with University College London Hospitals (UCLH) to explore using artificial intelligence to treat patients with head and neck cancers. The goal is to develop tools that automatically identify cancerous tissue to guide radiotherapy machines.

Currently, radiologists employ a manual process called image segmentation: they take CT and MRI scans and use them to create a map of the patient's anatomy with clear guidelines for where to direct the radiation. Avoiding healthy areas of the head and neck requires that map to be extraordinarily detailed; it typically takes four hours to create. Google believes it can do the same job (or better) in one hour.

DeepMind, Google’s research arm, works primarily in deep learning, a form of artificial intelligence that learns to identify patterns by analyzing large amounts of data. In this case, DeepMind researchers will get access to anonymized radiology scans from up to 700 former UCLH patients, and then feed them into algorithms that process the scans to learn the visual difference between healthy and cancerous tissue.

The partnership will allow researchers to train their algorithms on highly specialized, high-quality data, which should theoretically enable the algorithms to perform better than if they had been trained on publicly available scans.

For those concerned about machines making health care decisions, UCLH made it clear in a statement to The Guardian that clinicians will still be in complete control of diagnoses and treatment.

DeepMind isn’t the first to apply deep learning to cancer research. Samsung Medison, the South Korean technology company’s medical-device arm, recently released an ultrasound machine that uses deep learning to quickly assess whether breast tissue is cancerous or benign. The machine’s algorithm was trained on 9,000 breast tissue scans, and it is pending FDA approval in the US.

This is the second DeepMind partnership this year with London-based medical institutions. DeepMind is also using data from Moorfields Eye Hospital to try to detect maladies of the eye, and is working with the UK’s National Health Service to help hospital staff monitor patients with kidney disease.

The logistics of the latter have raised some red flags: a New Scientist investigation uncovered that the NHS shared historical medical records of 1.6 million patients, not just the kidney disease-related data that was initially expected. Although patients were not notified that more information was being released than discussed, DeepMind said the extra information was vital to its work.

In any case, these partnerships stipulate that Google cannot use the health data for purposes other than developing the intended tools, and that the UCLH data must be destroyed once the project is finished.