
Why bother to type a document using a keyboard when you can write it by simply thinking about the letters you need to type?

A brain wave study presented at the 2009 annual meeting of the American Epilepsy Society shows that people with electrodes in their brains can "type" (input data into a computer) using just their minds. Neurologist Jerry Shih, M.D., and other Mayo Clinic researchers worked with Dean Krusienski, Ph.D., of the University of North Florida in an experiment involving two patients with epilepsy. Both patients were already being monitored for seizure activity using electrocorticography (ECoG), in which a sheet of electrodes is laid directly on the surface of the brain; this procedure requires a craniotomy, a surgical incision into the skull.

Dr. Shih and colleagues hypothesized that feedback from electrodes placed directly on the brain would be much more specific than data collected with electroencephalography (EEG) alone, in which electrodes are placed on the scalp. Most studies of mind-machine interaction to date have used EEG. "There is a big difference in the quality of information you get from ECoG compared to EEG. The scalp and bony skull diffuses and distorts the signal, rather like how the Earth's atmosphere blurs the light from stars," says Dr. Shih. "That's why progress to date on developing these kinds of mind interfaces has been slow."

Dr. Shih's patients at the Mayo Clinic were asked to look at a computer screen containing a 6-by-6 matrix with a single alphanumeric character inside each square. Every time the square with a certain letter flashed, the patient focused on it and a computer application recorded the brain's response to the flashing letter. The computer software calibrated the system with the individual patient's specific brain wave patterns. When the patient then focused on a letter, the letter appeared on the screen. "We were able to consistently predict the desired letters for our patients at or near 100 percent accuracy," Dr. Shih explains. "While this is comparable to other researchers' results with EEGs, this approach is more localized and can potentially provide a faster communication rate."
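The matrix-speller logic Dr. Shih describes can be sketched with a toy simulation. Everything below is invented for illustration: the response model, the noise level, and the repetition count are assumptions, and a real system classifies measured ECoG epochs rather than simulated numbers.

```python
import random
import string

# A 6-by-6 matrix of 36 alphanumeric characters, as in the Mayo study
MATRIX = list(string.ascii_uppercase + string.digits[:10])

def simulated_response(is_target, rng):
    """One flash's measured brain response (arbitrary units).

    Toy model: every flash evokes baseline noise; flashes of the
    attended character add a P300-like positive deflection.
    """
    baseline = rng.gauss(0.0, 1.0)
    return baseline + (2.0 if is_target else 0.0)

def predict_letter(target, n_repetitions=25, seed=42):
    """Average each character's responses over repeated flashes and
    pick the character with the largest mean response."""
    rng = random.Random(seed)
    totals = {ch: 0.0 for ch in MATRIX}
    for _ in range(n_repetitions):
        for ch in MATRIX:  # each square flashes once per repetition
            totals[ch] += simulated_response(ch == target, rng)
    return max(totals, key=totals.get)

print(predict_letter("R"))  # averaging recovers the attended letter
```

Averaging over repetitions is the key trick: the attended letter's response survives averaging while the noise cancels out, which is why accuracy climbs toward 100 percent as flashes accumulate.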

A recent h+ article, "Mind Reading (Neuro Decoding) Goes Mainstream" (see Resources), describes a similar study by Dr. Gerwin Schalk, who worked with patients using ECoG at the Wadsworth Center in Albany, NY. The patients were asked to say or imagine words flashed on a screen while their brain activity was recorded. Schalk's team then used specially designed decoder algorithms to predict the vowels and consonants of the word, using only the pattern of brain activity. They found that both speaking and imagining the word gave roughly the same level of accuracy.
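Schalk's actual decoder algorithms are far more sophisticated and are trained on recorded brain activity, but the basic idea of matching an observed activity pattern against known templates can be illustrated with a hypothetical nearest-centroid sketch. The vowel "signatures" below are invented for illustration.

```python
# Hypothetical ECoG activity "signatures" for each vowel
# (purely illustrative; a real decoder learns these from data)
SIGNATURES = {
    "A": [1.0, 0.0, 0.0],
    "E": [0.0, 1.0, 0.0],
    "I": [0.0, 0.0, 1.0],
    "O": [1.0, 1.0, 0.0],
    "U": [0.0, 1.0, 1.0],
}

def decode_vowel(activity):
    """Nearest-centroid decoding: pick the vowel whose signature is
    closest (squared Euclidean distance) to the observed pattern."""
    def dist(sig):
        return sum((a - s) ** 2 for a, s in zip(activity, sig))
    return min(SIGNATURES, key=lambda v: dist(SIGNATURES[v]))

# A noisy recording that resembles the "E" signature
import random
rng = random.Random(1)
noisy = [s + rng.gauss(0.0, 0.1) for s in SIGNATURES["E"]]
print(decode_vowel(noisy))
```

Because decoding only compares patterns, it does not matter whether the pattern came from speaking or imagining a word, which fits the finding that both gave roughly the same accuracy.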

In addition to the ability to "mind read" vowels, consonants, and individual letters, brain wave applications also include algorithms to turn brain waves into music and even "tweeting" (using the popular Twitter Internet application) by thought alone. Brain music therapy is a form of EEG neurofeedback based on the variable ratio of fast and slow rhythms in a person's brain waves; a computerized mathematical formula turns those rhythms into musical notes. Dr. Galina Mindlin, a neuropsychiatrist with the Brain Music Therapy Center in New York City, brought this technique to the U.S. from Moscow in 2006 as a form of entrainment therapy. Interviewed on NBC's Today Show, she said, "Brain waves are translated into music digitally with a special algorithm. Once the brain waves are converted into musical sounds, they are placed on a CD with a relaxing file and activating file and instructions on how to use them." What does this mind-machine interface sound like? "It sounds like classical piano music," says Dr. Mindlin. Here's a video showing the use of an EEG mind-machine interface to control sampled sound clips on a piano:
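The Brain Music Therapy formula itself is not public, so the following is a purely hypothetical sketch of the general idea: mapping the balance of fast and slow rhythms in each window of EEG onto the notes of a scale. The band powers and the note scale are invented for illustration.

```python
# Toy mapping from a window's fast/slow rhythm balance to a piano note.
# The scale and the power values are invented; the actual Brain Music
# Therapy algorithm is proprietary and not described in detail.

C_MAJOR = ["C4", "D4", "E4", "F4", "G4", "A4", "B4", "C5"]

def note_for_window(slow_power, fast_power):
    """Map the fast/slow power ratio into one of eight scale degrees:
    more fast-wave activity selects a higher note."""
    ratio = fast_power / (fast_power + slow_power)  # 0.0 .. 1.0
    index = min(int(ratio * len(C_MAJOR)), len(C_MAJOR) - 1)
    return C_MAJOR[index]

# A drowsy, slow-wave-dominated window vs. an alert, fast-wave one
print(note_for_window(slow_power=9.0, fast_power=1.0))  # low note: C4
print(note_for_window(slow_power=1.0, fast_power=9.0))  # high note: C5
```

Played window by window, a scheme like this would yield a melody that tracks the listener's shifting brain state, which is the premise behind the "relaxing" and "activating" files Dr. Mindlin describes.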

In addition to his ECoG research at the Wadsworth Center, Dr. Gerwin Schalk and his colleagues also worked with University of Wisconsin-Madison biomedical engineering doctoral student Adam Wilson to develop an interface in which a keyboard displayed on a computer screen interprets brain waves to send Twitter messages (tweets). "We started thinking that moving a cursor on a screen is a good scientific exercise," said Justin Williams, a University of Wisconsin-Madison assistant professor of biomedical engineering and Wilson's adviser. "But when we talk to people who have locked-in syndrome or a spinal-cord injury, their number one concern is communication."

Using the EEG-based interface, "All the letters come up, and each one of them flashes individually," explains Williams. "And what your brain does is – if you're looking at the 'R' on the screen and all the other letters are flashing – nothing happens. But when the 'R' flashes, your brain says, 'Hey, wait a minute. Something's different about what I was just paying attention to.' And you see a momentary change in brain activity."

Wilson was able to tweet by thought alone using EEG. Here’s a video showing the brain-Twitter interface:

Tweeting by thought alone is a somewhat slow process with this prototype technology; for comparison, we speak at approximately 120 words per minute. But, as with texting, users can improve as they practice using the interface. "I've seen people do up to eight characters per minute," Wilson says.
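To put Wilson's best observed rate in perspective, a quick back-of-the-envelope calculation (assuming the common five-characters-per-word convention, which is not stated in the article):

```python
# Rough comparison of communication rates (assumed averages)
SPEECH_WPM = 120      # typical conversational speech, words/min
SPELLER_CPM = 8       # Wilson's best observed rate, characters/min
CHARS_PER_WORD = 5    # common typing-speed convention (assumption)

speller_wpm = SPELLER_CPM / CHARS_PER_WORD
print(speller_wpm)               # 1.6 words per minute
print(SPEECH_WPM / speller_wpm)  # speech is roughly 75x faster
```

Even at its best, then, the prototype speller runs well under two words per minute, which is why practice effects and faster paradigms matter so much for locked-in users.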

Brain wave applications in the laboratory – whether using EEG or the more invasive ECoG – now include the ability to "mind read" vowels, consonants, and individual letters; algorithms to turn brain waves into musical scores; and even twittering by thought alone. Who needs a keyboard when you can simply think about what you want to say (or play musically) and have it recorded and/or communicated?