Despite the large crowd it drew, Neurable's booth setup at SIGGRAPH 2017 earlier this year was pretty simple. There were no handheld controllers, mice, motion trackers, or keyboards, no HP Z Backpack computers, no robots ... just a chair, a screen, and a modified HTC Vive VR headset. The game on the screen also looked simple enough, tasking players with picking up and manipulating toys and other objects to escape a room they were trapped in. But the difference was that the players weren't playing the game with controllers, head movement, or even hand gestures ... they were using their thoughts.

In its current iteration, Neurable's EEG headset is an array of seven electrodes that attaches to an HTC Vive headset. (Image source: Neurable)

The demonstration, by Boston-based startup Neurable (a combination of “neuron” and “able”), was one of the company's first official live demonstrations of its brain-computer interface (BCI) technology. And while it could be easy to label it as another sort of high-end gaming peripheral, company founder and CEO Ramses Alcaide said virtual reality (VR) and augmented reality (AR) are only the entry points for Neurable's mission – “creating a world without limits,” where control over any device is a simple thought away.

In an interview with Design News, Alcaide admitted his company's ambitions sound like something out of a William Gibson novel. But he also believes its adoption could come sooner than we think. “Interacting with your brain is the new way in which people will interact with the world,” he said. “Technology comes in three stages: first, the tech is created; second, an interaction method that makes it natural is created; and third is the killer app...we're already at the second phase.”

That second phase comes via Neurable's leveraging of a technology that has been around for over a century: electroencephalography (EEG). EEG has a big presence in medical research and academia, where researchers have used it for everything from medical diagnosis and drug testing to creating brain-computer interfaces that transform the brain's electrical activity into actions and commands in computers and other devices.

There's even an annual award, the BCI Research Award, devoted to BCI projects. The 2016 winner was a team of researchers from the Battelle Memorial Institute in Ohio, who created a system that used an implanted BCI to restore movement to a quadriplegic's wrist and fingers.

Like so many other BCI projects, Neurable began at the university research level. While studying for a PhD in neuroscience and BCIs at the University of Michigan, Alcaide developed the unique machine learning algorithm that forms the basis of Neurable's technology. “[Our] innovation isn't the electrodes, it's the machine learning platform that interprets brain activity,” he said.

EEG picks up what are called event-related potentials (ERPs) in your brain, small electrical signals that occur as your neurons fire in response to some sort of stimulation. “It basically tells you if something changes that a person is interested in and they want to take conscious action on,” Alcaide explained. “Think of how a mouse on a computer works. The clicking motion itself is you consciously wanting to do something. What we do is cut out the middleman, the mouse.”

Part of what separates Neurable's algorithm is its ability to detect event-related potentials even in noisy environments that create a lot of interfering brain activity. The way scientists work with EEG has become highly standardized and has changed little over the decades. Typically, EEG readings are taken in a lab setting and across multiple trials so researchers can extract the signal they want. Other brain activity, along with electromagnetic interference from objects like computers and lamps, creates noise that obscures the signal of interest. In a lab this is not a huge problem, because you can record an EEG over and over until you get a read on what you're looking for. But in the real world, especially in entertainment applications like games, the BCI has to pick up an accurate and reliable signal the first time, even in a noisy and crowded environment.
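To see why single-shot detection is hard, consider a toy numerical example: an ERP-sized bump buried in noise several times its amplitude is invisible in any one trial, but averaging many trials shrinks the noise by roughly the square root of the trial count. This is a minimal sketch with entirely synthetic signal shapes and noise levels (nothing here reflects Neurable's actual data):

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_samples = 50, 200

# Synthetic ERP: a Gaussian bump peaking at sample 100 (e.g., ~300 ms post-stimulus)
t = np.arange(n_samples)
erp = 2.0 * np.exp(-0.5 * ((t - 100) / 10.0) ** 2)

# Each single trial is the ERP plus noise several times larger than the signal
trials = erp + rng.normal(scale=6.0, size=(n_trials, n_samples))

single_snr = np.max(erp) / np.std(trials[0] - erp)
avg = trials.mean(axis=0)
avg_snr = np.max(erp) / np.std(avg - erp)

print(f"single-trial SNR ~ {single_snr:.2f}")    # well below 1: ERP invisible
print(f"{n_trials}-trial average SNR ~ {avg_snr:.2f}")  # noise shrinks ~ sqrt(n)
```

Lab protocols exploit exactly this averaging; a consumer BCI playing a game does not get fifty tries.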

This was the challenge for Alcaide and his team to overcome. “I thought about what ERP means and started to reimagine machine learning pipelines,” he said. Rather than trying to pick up the frequency of an ERP, Alcaide opted to go after the latency and shape of the signal. “The general noise is still there, but instead of frequencies, we used combinations of shapes, so whenever we get noise we're able to be okay,” he said.
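Neurable hasn't published its pipeline, but matching a signal by latency and shape, rather than by frequency content, can be illustrated with simple template matching: slide a known ERP waveform along a single noisy trial and take the best-scoring offset as the detected latency. Everything below (the waveform, the noise level, the latency) is a made-up illustration, not Neurable's method:

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples = 300

# Known ERP template: a short positive deflection (shape invented for illustration)
template = np.exp(-0.5 * (np.arange(-30, 31) / 15.0) ** 2)

# Single trial: the template embedded at a hidden latency, plus broadband noise
true_latency = 120
trial = rng.normal(scale=0.8, size=n_samples)
trial[true_latency:true_latency + template.size] += 2.0 * template

# Slide the template across the trial; score each offset by its dot product
scores = [
    np.dot(trial[i:i + template.size], template)
    for i in range(n_samples - template.size)
]
detected = int(np.argmax(scores))

print(f"true latency: {true_latency}, detected: {detected}")
```

Because the score rewards the whole shape of the waveform at once, broadband noise at any single sample matters far less than it would to a frequency-based detector.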

Neurable's original prototype device used 256 dry electrodes connected to various parts of the scalp. The current version is a seven-electrode headset that attaches to an HTC Vive. Alcaide said the EEG sensors are off-the-shelf with no proprietary modifications.

Alcaide demonstrates an earlier version of Neurable's technology in a video from the University of Michigan.

Alcaide originally hoped for Neurable to emerge as a medical device company. “When I was 8 years old my uncle got into a trucking accident and lost his legs,” he said. “It's what inspired me to try developing this technology. Originally we implemented it into wheelchairs, cars, and things, doing that sort of stuff in real time.”

He said he and his team took a step back after realizing the heavily regulated world of medical devices would be a very slow route. “So we asked ourselves, could we find a consumer application? That's when we thought about AR and VR. When you think about it, the computer has the mouse, the smartphone has the touch screen...but there's nothing for VR and AR.”

The SIGGRAPH demo required only a few moments of training to get users oriented. To calibrate the BCI, users first watched various objects and were instructed to think “grab” each time one moved. After the orientation, players were free to pick up any number of objects around the room, in an effect akin to telekinesis, simply by focusing on an object and thinking about grabbing it.
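Calibration like this amounts to collecting labeled epochs (short EEG windows recorded while the user either did or did not intend to grab) and then fitting a classifier on them. Here is a hypothetical, numpy-only sketch using a nearest-class-mean rule on synthetic epochs; Neurable's actual model is proprietary and undoubtedly more sophisticated:

```python
import numpy as np

rng = np.random.default_rng(2)
n_epochs, n_samples = 40, 100

# Synthetic calibration data: "grab" epochs contain a small ERP bump,
# "ignore" epochs are noise only (all shapes and scales invented for illustration)
t = np.arange(n_samples)
erp = 2.5 * np.exp(-0.5 * ((t - 60) / 8.0) ** 2)

grab_epochs = erp + rng.normal(scale=2.0, size=(n_epochs, n_samples))
ignore_epochs = rng.normal(scale=2.0, size=(n_epochs, n_samples))

# "Training": one template per class, the mean of its calibration epochs
grab_mean = grab_epochs.mean(axis=0)
ignore_mean = ignore_epochs.mean(axis=0)

def classify(epoch):
    """Nearest-class-mean: label by whichever template the epoch is closer to."""
    if np.linalg.norm(epoch - grab_mean) < np.linalg.norm(epoch - ignore_mean):
        return "grab"
    return "ignore"

# Evaluate on fresh, held-out epochs from each class
test_grab = erp + rng.normal(scale=2.0, size=(20, n_samples))
test_ignore = rng.normal(scale=2.0, size=(20, n_samples))
correct = sum(classify(e) == "grab" for e in test_grab) + sum(
    classify(e) == "ignore" for e in test_ignore
)
print(f"held-out accuracy: {correct}/40")
```

A real pipeline would work on multi-channel data with a learned discriminant rather than raw distances, but the calibrate-then-classify loop is the same.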

And while the company's latest demo is a video game, Alcaide doesn't expect Neurable's EEG rig to become the go-to controller for the latest fast-paced action games just yet. Its speed and accuracy, though high, are not quite ready to match those of a gamer using a keyboard and mouse or a game controller. “We're really interested in making user interface and user experience (UI/UX) interactions,” Alcaide said. “We're creating a platform for all devices and we see it as us achieving that even faster by getting there from a UX/UI perspective.”

Neurable's demo at SIGGRAPH 2017. After only a few minutes of training users are able to manipulate virtual objects using their thoughts.

Having raised an initial $2 million in seed funding, Neurable recently completed another round of funding in the wake of its public debut. Future iterations of its headset are planned to have even fewer electrodes. “So hopefully they will become more invisible and smaller over time. It means more work on the machine learning side,” Alcaide said.

The company has also released a Unity-compatible software development kit (SDK) for developers looking to create BCI-controlled VR, AR, and mixed reality experiences. The ultimate aim is to eventually circle back to the company's original goal by getting into medical, automotive, and other devices, while ideally making its EEG rig a standard feature of AR and VR headsets.

“The next step will be pilot programs to see where the technology will enter the roadmap,” Alcaide said. “Right now we're establishing relationships with headset manufacturers so we can push this as a standard and get manufacturers to build electrodes into their headsets.” He also said the company has launched a beta program and will be giving out some early-release headsets to interested developers. “Anyone looking to solve some of the biggest issues with how we interact with technology today.”

Artificial Intelligence: What Will the Future Be?

Intelligent systems and robots will one day help us with routine tasks, handle dangerous jobs, and keep us company. But they could also make decisions that violate our ethical principles, take control of our lives, and disrupt society. Join Maria Gini — accompanied by her AI-enabled humanoid robot — during her keynote address at ESC Minneapolis Nov. 8-9, 2017, and explore state-of-the-art intelligent systems and discuss future developments and challenges.

Chris Wiltz is a senior editor at Design News covering emerging technologies including VR/AR, AI, and robotics.