The research is important for two reasons: it could help show whether a wearable brain-control device is feasible, and it is an early example of a giant tech company getting involved in collecting data directly from people’s minds.

To some neuro-ethicists, that means we are going to need some rules, and fast, about how brain data is collected, stored, and used.

In the report published today in Nature Communications, UCSF researchers led by neuroscientist Edward Chang used sheets of electrodes, called ECoG arrays, that were placed directly on the brains of volunteers.

The scientists were able to listen in, in real time, as three subjects heard questions read from a list and spoke simple answers. One question was “From 0 to 10, how much pain are you in?” The system detected both the question and the spoken answer, from 0 to 10, far better than chance.

Another question asked which musical instrument they preferred, and the volunteers were able to answer “piano” and “violin.” The volunteers were undergoing brain surgery for epilepsy.

Facebook says the research project is ongoing, and that it is now funding UCSF efforts to restore the ability to communicate to a disabled person with a speech impairment.

Eventually, Facebook wants to create a wearable headset that lets users control music or interact in virtual reality using their thoughts.

To that end, Facebook has also been funding work on systems that listen in on the brain from outside the skull, using fiber optics or lasers to measure changes in blood flow, similar to the way an fMRI machine does.

Such blood-flow patterns represent only a small part of what’s going on in the brain, but they could be enough to distinguish between a limited set of commands.

“Being able to recognize even a handful of imagined commands, like ‘home,’ ‘select,’ and ‘delete,’ would provide entirely new ways of interacting with today's VR systems—and tomorrow's AR glasses,” Facebook wrote in a blog post.

Facebook has plans to demonstrate a prototype portable system by the end of the year, although the company didn’t say what it would be capable of, or how it would measure the brain.

Privacy question

Research on brain-computer interfaces has been speeding up as rich tech companies jump in. On July 16, Neuralink, a brain interface company formed by SpaceX founder Elon Musk, said it hoped to implant electrodes into the brains of paralyzed volunteers within two years.

However, the public has reason to doubt whether tech companies can be trusted with a window into their brains. Last month, for example, Facebook was hit with a record $5 billion fine for deceiving customers about how their personal information gets used.

“To me the brain is the one safe place for freedom of thought, of fantasies, and for dissent,” says Nita Farahany, a professor at Duke University who specializes in neuro-ethics. “We’re getting close to crossing the final frontier of privacy in the absence of any protections whatsoever.”

Facebook emphasizes that all the brain data collected at UCSF will stay at the university, but Facebook employees are able to go there to study it.

It’s not known how much money Facebook is providing the university nor how much volunteers know about the company’s role. A university spokesman, Nicholas Weiler, declined to provide a copy of the research contract or the consent forms signed by patients. He said the consent forms list Facebook among several potential sponsors of the research.

While a brain reader could be a convenient way to control devices, it would also mean Facebook was collecting brain signals that could, in theory, reveal much more, such as how people are reacting to posts and updates.

“Brain data is information-rich and privacy-sensitive; it’s a reasonable concern,” says Marcello Ienca, a brain-interface researcher at ETH in Zurich. “Privacy policies implemented at Facebook are clearly insufficient.”

Facebook says it will do better with brain data. “We take privacy very seriously,” says Mark Chevillet, who leads the brain reading project at Facebook.

Correction: A system developed by UCSF decoded speech signals in the brain as people spoke aloud, not from silently imagined speech.