M. Hossein M. Kouhani

for the Lansing State Journal

Christopher Sanchez asked his wife, Neyda, to push the button.

She invited their son, AJ, to join her. The moment was too big to do it alone.

Together, they turned on a bionic eye called the Argus II.

“What do you see, Dad?” the boy asked.

“I see you, AJ,” his father replied. “Is that you?”

After 20 years of blindness, 47-year-old Christopher Sanchez of Stockton, California, could see shapes and shadows.

And, for the first time ever, his 11-year-old son.

And 2,300 miles away, in the BioElectronic Vision Laboratory at the University of Michigan, James Weiland is working to improve that technology, so that someday Sanchez and others who have lost their vision to degeneration of the retina will be able to read small type and see the faces of their loved ones as more than just light and shadow.

“The roadblock today is the patient seeing (too) simple forms,” said Weiland, a professor of biomedical engineering, who has worked on technology for the Argus II since 1997. “There are certain conditions where patients can read a letter. (However), it takes them ten to thirty seconds … We would like to make it more intuitive.”

The University of Michigan has played an important role in connecting human brains to man-made electronics.

Starting in the early 1980s, researchers borrowed manufacturing techniques from the microelectronics industry to create neural probes capable of measuring and influencing brain activity.

The technology that has evolved from those early probes is now used to treat epilepsy, paralysis and Parkinson’s disease, to give scientists a clearer view of how the brain works and, now, to help blind people see.

Weiland and his colleagues hope to help them see better.

The beginnings

The design and development of retinal prosthetic devices, or bionic eyes, started at the Duke Eye Center. The first attempts to surgically implant the devices in humans took place in the late 1980s.

There were tests at the Wilmer Eye Institute at Johns Hopkins University, where scientists and engineers made a prototype that worked in an eye for an hour. Then it would fail and never work again.

The project was transferred to the Doheny Eye Institute at the University of Southern California in the mid-1990s. That's where Second Sight Medical Products Inc. was founded. There, Argus II co-inventor Dr. Mark Humayun and the Second Sight team refined the technology.

Weiland, who joined the company before it launched in 1998, dedicated four years to helping them create an implant that could last for months, which required creative design using cutting-edge materials that lasted longer in the body.

Now, with his team at U-M, he is exploring techniques to understand and improve the performance of the bionic eye.

They conduct experiments on rodents to learn how electrical stimulation can be better controlled and delivered to the back of the eye, aiming for higher resolution, a wider angle of view and maybe even color imagery.

The most immediate need is to “add more pixels, expand the field of view,” he said.

How is this even possible?

The Argus II is designed to help patients who have lost their vision due to a condition called retinitis pigmentosa, a hereditary disease that causes degeneration of the retina.

In most cases, the outer layers of the tissue are damaged first. The job of the Argus II is to reproduce the functions of those layers, which are mainly composed of cells that convert light into electrical signals.

The device uses an external video camera to capture images, which are processed in real time by computer algorithms inside a miniature chip and then sent to a micro-electrode array that stimulates the retina, producing flashes of light the patient can perceive.

It costs about $150,000 for a unit, plus the cost of surgery and patient training. A number of private insurers, as well as Medicaid and Medicare, cover the cost. The University of Michigan Kellogg Eye Center is one of 13 major centers across the U.S. to offer the implant.

But to make it last longer than a decade, improve the resolution and widen the field of view will require researchers to address a handful of engineering challenges.

The best visual acuity reported for the Argus II is currently 20/1260, meaning a user of the device sees from 20 feet what a person with normal vision sees from 1,260 feet.

That’s partly due to the device’s bulky case. The space it takes up limits the number of electrodes that can interface with nerve cells in the retina.

The bullet-proof sealing that allows a device to remain implanted for more than 10 years also takes up room, so Weiland and his team are working on a diamond-and-metal coating that could make space for more electrodes and help achieve higher resolution.

Elena Della Valle moved from Italy to join the team as a post-doctoral researcher and is trying new approaches in making electrodes sturdier and smaller so that they last longer in the body and so that more of them can be built into a single device, yielding higher resolutions.

“I am working on microfabrication and electroplating process and (trying) out different insulating material and testing its resistance to various temperatures,” she said.

The work is unglamorous, as a lot of basic research is, but it holds the possibility of improving resolution in future versions of bionic eyes.

Seeing clearer

One hypothesis explaining the muddy vision reported by Argus II users is that the electrodes inadvertently stimulate the long fibers of nerve cells, called axons, causing patients to see oblong streaks of light instead of distinct points.

Weiland’s team showed that by tweaking the electrical signal they could diminish the unwanted activation of those axons and potentially improve visual acuity in patients to as much as 20/312, which would be a huge improvement.

A normal healthy eye achieves a 90-degree field of vision side to side, somewhat less up and down. The Argus II provides a visual field of only about 19 degrees in any direction.

That could be improved with a wider electrode array covering a larger area of the retina. The main limitation is the size of the incision through which the electrode array is inserted into the eye.

Weiland and his team have proposed an array that could unfold inside the eye and cover most of the retina, expanding the field of view to as much as 34 degrees.

None of it is easy. Just connecting an electrode to a nerve cell demands immense patience and dexterity.

“You need to be patient. Stay calm, sit tight and record a full day,” says Dr. Wennan Li, a post-doctoral researcher at BioElectronic Vision Lab. “You have to stay here. You cannot walk around if you want good results. You have to sit in front of the equipment; you want to stick to it, concentrate and cherish the time especially when you find good healthy cells.”

Cochlear implants

The bionic eye is an example of what are known as brain-computer interfaces, technology that links the human brain to computers using magnetic waves, sound waves, light, direct electrical stimulation or some combination thereof.

Some brain-computer interface devices help to treat neurological diseases such as epilepsy, paralysis, Parkinson’s, blindness or deafness. Others aim to help scientists understand how information is processed in the brain, so they can design faster computers inspired by the nervous system.

The most developed and most familiar of such devices are cochlear implants, which stimulate nerve fibers in the ear, sending electrical pulses that the brain perceives as sound.

The Food and Drug Administration approved the first cochlear implant for use in adults in 1984. In the United States, roughly 58,000 devices have been implanted in adults and 38,000 in children.

“Cochlear implants are the ones that are furthest along and they are succeeding because people hear somewhat naturally. It is not natural hearing, but they can hear and understand speech. We are not there yet with retinal or brain-controlled prosthesis,” Weiland said.

Second Sight Medical Products is already testing a new device, the Orion, which uses much of the same technology but bypasses the eye, instead placing electrodes on the brain's visual cortex.

If that approach works, it would allow the device to restore vision to a far greater number of patients.

The 1970s

In the 1970s, using technology borrowed from the computer industry, it became possible to record neural activity in the laboratory, though only from one cell at a time.

Industrial electronics technology at the time was crude, and cruder still in academic labs. Just shaping the silicon was more art than automated process.

“It was frustrating because it was just as hard to make the second one as it was to make the first one,” said Kensall Wise, a professor emeritus of electrical engineering and computer science, biomedical engineering and atmospheric, oceanic and space sciences at U-M. “That is not what you want.”

Especially, he said, if you wanted to make an impact on the field of neuroscience.

Beginning in 1981, Wise and collaborators at U-M developed a series of neural probes capable of measuring and influencing activity in the brain at the cellular level.

Their innovation was the use of fabrication techniques borrowed from the quickly growing microelectronics industry.

Made in Michigan and given to neuroscientists around the country to advance brain research, they became known as Michigan Probes.

The technology has expanded and now offers an established tool set for scientific research and clinical applications.

“The exciting thing is that the whole area of BCI is finally above critical mass,” Wise said. The ability to supply large numbers of devices to large numbers of physicians and investigators has shaped the field of neuroscience.

And its uses have expanded.

For instance, in a third of epilepsy cases, seizure medicines either fail to control seizures or cause severe side effects.

Since the 1980s, brain-computer interface technology has provided another option, called neuromodulation. It involves using a device to monitor brain activity and deliver electrical stimulation when it detects the onset of a seizure.

Patients with late-stage Parkinson’s disease are also treated with deep-brain stimulation devices that deliver electric pulses to brain cells to alleviate symptoms.

U-M researchers have developed multiple ways to refine the process, including special imaging tools that help surgeons place the electrodes more accurately.

The technology also allows people who have lost control of their limbs to perform tasks like pouring from a bottle, stirring their coffee or even playing Guitar Hero.

A device smaller than a penny can pick up information from the brain, process it in a computer and send it out to the muscles using long sleeves filled with electrodes that sit on the skin.

“Right now, we are at a point (where) we can create visual perception, create (a) sense of hearing, and we can (train) people to control prosthetic arms with neural signals,” Weiland said.

Once the decoding algorithms that interpret brain signals improve, these devices will be able to perform with much higher accuracy, minimal response time and little daily setup.

Like Geordi La Forge

Christopher Sanchez is just happy that he can see something.

The Argus II has allowed him to see lights, locate walls and windows, follow the lines of the crosswalks across Pacific Avenue, read the large letters on the Peet’s Coffee and Tea sign near his house and find his café latte like a boss.

A camera in his special pair of glasses takes in his surroundings and converts the signal into pixels.

An implant inside his right eye receives the information wirelessly and sends those pixels to the back of his eye in the form of electric pulses. Those electric pulses mimic what a healthy retina would otherwise generate for the brain to process.

“I was told an implant in one eye is enough since I only got one brain,” said Sanchez, who is an Infosys contract engineer with Apple Inc.

He’d spent years dreaming he’d one day have something like the device worn by Geordi La Forge on “Star Trek: The Next Generation.”

Now he does.

M. Hossein M. Kouhani is a research assistant and PhD candidate in the Department of Electrical and Computer Engineering at Michigan State University. He can be reached at hosmaz01@yandex.ru. Reporting on this story was supported, in part, by a grant from the Science and Society at State program at MSU.