If there’s ever a time you want to spend less time under the knife, it’s during brain surgery.

Artificial intelligence could help doctors diagnose brain tumors more quickly and more accurately, according to a new study by researchers at the University of Michigan Medical School and Harvard University.

“Our goal is to develop an algorithm that approaches the performance of a neuropathologist at diagnosis during an operation,” said Dr. Daniel Orringer, first author of the study in Nature Biomedical Engineering and an assistant professor of neurosurgery at Michigan Medicine.

Shorter, Safer Surgery

In their experiments on more than 100 brain tissue samples, the researchers used deep learning to detect the presence of a tumor and classify it into one of several broad categories.

The algorithm analyzes tissue imaged with a laser-based technique the researchers developed called stimulated Raman histology, or SRH. Currently, doctors must halt surgery for 30-40 minutes while tissue is sent to the lab to be processed, frozen and stained. SRH cuts the wait to three minutes by letting pathologists diagnose tumors without the tissue ever leaving the operating room.

“Helping patients get diagnosed more quickly means patients spend less time in the operating room, which decreases the risks associated with surgery,” said Orringer, a practicing neurosurgeon.

Better Brain Tumor Diagnosis

The deep learning algorithm identified four categories of tumors in the samples. As researchers collect more samples, Orringer said he wants to expand that to eight categories, which would include most of the tumors neurosurgeons encounter.

The algorithm's accuracy on the 30 tissue samples tested was 90 percent, compared with neuropathologists' accuracy of 90-95 percent in clinical practice, Orringer said.

“We want to bring accuracy rates up so fewer patients are misdiagnosed,” he said. By enabling prompt, consistent and accurate tissue diagnosis during surgery, deep learning could help fix the problem of variability among pathologists’ diagnoses, Orringer said.

Deep learning would not replace pathologists, whose expertise is needed to make the final diagnosis, he added.

To Operate or Not to Operate

Orringer and his team have tested the technique on more than 370 patients and are driving toward 500.

“The more we feed the computer, the more accurate its diagnoses will become,” Orringer said. The research could be applied to tumors beyond the brain, he added.

Deep learning and the SRH imaging technique could help doctors make better decisions about how and whether to operate. Some tumors respond better to chemotherapy and radiation than surgery, Orringer said, so patients could avoid surgery entirely.

Neurologists Without Borders

SRH and deep learning could help small hospitals, or those in remote areas, that lack access to neuropathologists, according to the study. Although 1,400 U.S. hospitals perform brain-tumor surgery, there are only 800 board-certified neuropathologists in the country.

Bringing these technologies to smaller hospitals would extend their capabilities because images can be interpreted remotely, Orringer said.

The researchers trained their neural network on an NVIDIA GeForce GTX 1080 GPU using the CUDA parallel computing platform and cuDNN, with the Theano deep learning framework.
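The study's code isn't reproduced here, but the core task it describes, training a classifier that assigns tissue-image data to one of four tumor categories, can be sketched in miniature. The example below is illustrative only: it uses plain NumPy, random data standing in for SRH image patches, and a simple softmax classifier rather than the deep convolutional network the researchers actually trained. All names and parameters are assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 200 flattened "image patches", 4 tumor categories.
# Real SRH images are far larger and call for a convolutional network.
n_samples, n_features, n_classes = 200, 64, 4
X = rng.normal(size=(n_samples, n_features))
true_W = rng.normal(size=(n_features, n_classes))
y = np.argmax(X @ true_W, axis=1)          # labels separable by construction

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Train a softmax (multinomial logistic) classifier by gradient descent
# on the cross-entropy loss.
W = np.zeros((n_features, n_classes))
b = np.zeros(n_classes)
Y = np.eye(n_classes)[y]                   # one-hot label matrix
lr = 0.5
for _ in range(300):
    P = softmax(X @ W + b)                 # predicted class probabilities
    grad_W = X.T @ (P - Y) / n_samples     # cross-entropy gradient w.r.t. W
    grad_b = (P - Y).mean(axis=0)
    W -= lr * grad_W
    b -= lr * grad_b

accuracy = (np.argmax(X @ W + b, axis=1) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

The same supervised pattern, with a deep network in place of the linear classifier and labeled SRH images in place of random vectors, is what lets accuracy climb as more samples are collected, which is the point of Orringer's remark below about feeding the computer more data.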

“GPUs were a vital part of our tool chest for building this algorithm,” Orringer said.

The next step is a large-scale clinical trial, Orringer said. The prototype SRH system and deep learning algorithms are intended for research only.

All images in this story are courtesy of the University of Michigan School of Medicine.

To learn more about how AI computing is changing industries, subscribe to NVIDIA’s AI Podcast on iTunes http://nvda.ws/2hQ4Leb or Google Play Music http://nvda.ws/2hQaIrh.