There are two main reasons why no heart surgeon is perfect: vision and motion. The naked eye just can't see everything when repairing a heart in an open chest, and even a surgeon's trained hands can't feel everything.

Robert Howe, a professor at Harvard's biorobotics lab, wants to give surgeons a robotic assist. In the lab, Howe and his team are testing a robot called Raven intended to help surgeons see and navigate around the heart, guiding their instruments to the right place to perform repairs. Their approach uses 3D ultrasound imaging to show internal organs in real time. "Until recently, all ultrasound scanners produced 2D 'slice' images that don't show enough of the heart and the instruments to allow surgeons to work effectively," Howe says. "We take the volumetric images as they are produced by the scanner and apply fast image processing software we've written to locate the target tissue and the instrument."

Howe isn't the only scientist trying to use the Raven to change surgery. Blake Hannaford, a University of Washington electrical engineering professor involved with the Raven project, says UW researchers built the original Raven for telerobotic surgery study back in 2005. Now, they've developed a new version, Raven II, which is smaller, has more dexterity in its hands, and can hold surgical tools during operations. UW researchers also created software to work with the Robot Operating System, a popular open-source robotics code, so labs can easily connect the Raven to other devices and share ideas.

This year, biorobotics labs around the country are out to see what Raven II can do: In addition to Harvard, Johns Hopkins University, the University of Nebraska-Lincoln, UCLA, and UC Berkeley recently received Ravens. "They're all tackling different pieces of the problem and they build on each other's work, not starting from scratch and reinventing the wheel every time," Hannaford says.

Better Bots

Howe says robots are already part of the de facto standard of care in the country for some surgical procedures, such as prostate cancer surgery. "The main benefit of the present commercial surgical robot systems is that they allow good dexterity when working through small incisions," Howe says, referring to the console-controlled da Vinci Surgical System, cleared by the FDA in 2000 to perform complex surgery using minimally invasive techniques.

Today's robots, like the da Vinci, are designed to replicate human motion and assist human perception. But Gregory Hager, a computer science professor at Johns Hopkins who specializes in computer vision and robotics, says the Raven research means there's room to grow. His team received the robot system two weeks ago. "The opportunity is to go from what humans can do, to doing things that are really superhuman," Hager says. "And to do superhuman surgery will require robots to have enough intelligence to recognize what the surgeon is doing and to offer appropriate assistance, remotely setting up no-fly zones for safety, superimposing images. All of that is coming down the road."

That's the kind of thing that Raven's builders—and the scientists now testing its capabilities—have in mind. The Raven software runs on a high-performance image-processing computer, similar to those found in a top-of-the-line gaming computer. Howe says the same graphics processors that produce high-quality computer-game images are ideal for real-time medical imaging. "We need the processing power because the data is coming out of the ultrasound scanner at very high rates," Howe says. "And because ultrasound images are inherently noisy, we need to use sophisticated algorithms to track the targets through the noise."
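Howe doesn't detail the tracking algorithms, but a standard way to follow a target through noisy measurements is a Kalman filter. The sketch below is purely illustrative, not Raven's actual software: it assumes a 1D constant-velocity model and made-up noise parameters, and stands a noisy sine wave in for heart-wall motion along one scan axis.

```python
import numpy as np

def kalman_track(measurements, dt=1.0, process_var=1e-3, meas_var=0.05):
    """Track a 1D target position through noisy readings using a
    constant-velocity Kalman filter. Returns the filtered positions."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition: [pos, vel]
    H = np.array([[1.0, 0.0]])              # we only measure position
    Q = process_var * np.eye(2)             # process noise covariance
    R = np.array([[meas_var]])              # measurement noise covariance
    x = np.array([[measurements[0]], [0.0]])
    P = np.eye(2)
    estimates = []
    for z in measurements:
        # Predict the state forward one step.
        x = F @ x
        P = F @ P @ F.T + Q
        # Correct with the new (noisy) measurement.
        y = np.array([[z]]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        estimates.append(float(x[0, 0]))
    return estimates

# Noisy sine wave standing in for heart-wall motion along one axis.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 100)
true_pos = np.sin(t)
noisy = true_pos + rng.normal(0, 0.2, t.size)
filtered = kalman_track(noisy)
```

On a GPU, the same predict-correct arithmetic would run in parallel across many tracked points per ultrasound volume, which is why gaming-class graphics hardware suits the job.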


What a Raven Can Do

At Harvard, Howe and his team use a single fast motorized axis on the lab's robot to match the beating motion of the target heart tissue. When the surgical instrument reaches the tissue, a control loop is closed around the instrument and the tissue so that the instrument automatically moves in tandem with the beating motion of the heart structures. "The surgeon can command motions relative to the beating tissue, without worrying about the fast motion," he says. "Ideally, it's as if the surgeon is working on a stationary heart."
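The closed loop Howe describes can be sketched in a few lines. This toy model is an assumption, not the Raven controller: a proportional servo chases the moving tissue plus the surgeon's commanded offset, so the instrument-to-tissue gap stays nearly constant while the tissue oscillates.

```python
import math

def motion_compensated_servo(tissue_path, surgeon_offset, gain=0.5):
    """Each step, move the instrument a fraction of the way toward
    (current tissue position + the surgeon's commanded offset)."""
    instrument = tissue_path[0] + surgeon_offset  # start on target
    trace = []
    for tissue in tissue_path:
        target = tissue + surgeon_offset
        instrument += gain * (target - instrument)  # proportional step
        trace.append(instrument)
    return trace

# Tissue oscillating +/- 5 mm with a 50-step "heartbeat"; the surgeon
# commands the instrument to hover 2 mm above the tissue surface.
tissue_path = [5.0 * math.sin(2 * math.pi * k / 50) for k in range(200)]
trace = motion_compensated_servo(tissue_path, surgeon_offset=2.0)
errors = [abs(inst - (p + 2.0)) for inst, p in zip(trace, tissue_path)]
```

From the surgeon's frame of reference the tissue appears stationary; only the small servo lag (here under a millimeter) remains.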

Raven also could potentially take the place of heart–lung machines, which circulate blood during invasive surgeries that require stopping the heart. They are a lifesaving technology, Howe says, but there's growing evidence that the device can lead to negative side effects such as an increased risk of stroke and general cognitive decline. "While most people might worry about the large chest incisions, it's stopping the heart that can carry the greater risks," Howe says. "If our technology can enable repairs within the heart while it continues to beat, the heart–lung machine won't be needed and these risks will decrease."

Hager's team is investigating whether surgical robots could make invasive operations such as functional endoscopic sinus surgery safer. The procedure uses a nasal endoscope to find and treat nasal polyps and sinus inflammation, and it's risky because of the sinuses' proximity to key eye and brain structures.

"When you go to do a surgery like that, the sinuses are a highway into the center of the head and the brain structures there," Hager says. "But there's a lot of land mines as you go in. You're passing the optic nerves, you're passing the carotid arteries, you're passing a set of craniofacial nerves that control the movement of the face."

Bleeding and anatomical variations can hinder a surgeon's view, and damaging these vital structures could lead to intracranial hemorrhaging. Research with the Raven system could improve existing CT-navigation systems, which use imaging to identify the 3D location of the endoscope's tip as the surgeon maneuvers through the sinuses.

If this works, Hager says, surgeons wouldn't be constantly looking back and forth between a map of the anatomy and the patient through the endoscope. "You can imagine now if you're doing the surgery, if at some point you need the map for these structures, you dial in that preoperative image," he says. "You superimpose that on the endoscope, and you can identify all these structures. Even though I can't see the carotid artery or optic nerve, they're behind the tissue. I now know where they are. You're fusing the two images in your head."
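The fusion Hager describes amounts to superimposing a registered preoperative image on the live endoscope frame. A minimal sketch, assuming the two images are already aligned (registration is the hard part and is omitted here), is simple alpha blending; the arrays and values below are toy data, not real imagery.

```python
import numpy as np

def overlay_map(endoscope_frame, preop_map, alpha=0.35):
    """Alpha-blend a registered preoperative map onto a live endoscope
    frame so structures hidden behind tissue show through."""
    blended = (1.0 - alpha) * endoscope_frame.astype(float) \
        + alpha * preop_map.astype(float)
    return np.clip(blended, 0, 255).astype(np.uint8)

# Toy 4x4 grayscale frames: the live view is uniform mid-gray, and the
# preoperative map marks one hidden structure with a bright pixel.
live = np.full((4, 4), 120, dtype=np.uint8)
preop = np.zeros((4, 4), dtype=np.uint8)
preop[1, 2] = 255   # e.g., where the map says an artery lies
fused = overlay_map(live, preop)
```

In the fused frame the marked pixel stands out against the tissue, which is exactly the "I can't see it, but I know where it is" effect Hager describes.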

And five to 10 years from now, Hager says, medical students will probably use Raven as a research tool. The system can document expert hands at suturing, dissecting, and other canonical surgical tasks, allowing it to measure and rate new surgeons' abilities as they perform the same procedures.

"Robotics in some form in surgery is here to stay," Hager says. "Exactly how it's going to evolve is hard to predict at this point."
