It does so with sensors constructed from a mix of silver and carbon, which allows them to work through a person's hair while adding minimal noise to the signal. These signals are often contaminated with electrical interference caused by moving or talking, so, to generate a clear reading, the dry EEG's signal is fed into a custom-designed algorithm that separates it in real time into different components. These components are then compared against baseline readings to differentiate the data from the electrical noise.
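The study's actual algorithm is custom and not detailed here, but the basic idea -- split the multi-channel signal into components, then flag components whose activity far exceeds a clean baseline -- can be sketched in a few lines. This is purely illustrative: the separation method (PCA), the threshold, and all function names below are assumptions, not the team's implementation.

```python
import numpy as np

def separate_components(signals):
    """Split multi-channel EEG into uncorrelated components via PCA
    (a stand-in for the real-time source separation described above)."""
    centered = signals - signals.mean(axis=1, keepdims=True)
    cov = centered @ centered.T / centered.shape[1]
    _, vecs = np.linalg.eigh(cov)
    return vecs.T @ centered  # rows are components

def flag_artifacts(components, baseline_var, threshold=3.0):
    """Mark components whose variance far exceeds a clean baseline reading."""
    return components.var(axis=1) > threshold * baseline_var

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, size=(4, 500))   # resting-state channels
noisy = clean.copy()
noisy[2] += rng.normal(0.0, 10.0, size=500)   # injected movement artifact

# Baseline: the largest component variance seen during a clean recording.
baseline_var = separate_components(clean).var(axis=1).max()
flags = flag_artifacts(separate_components(noisy), baseline_var)
print("components flagged as noise:", int(flags.sum()))
```

In a real pipeline the components would come from an online source-separation method rather than batch PCA, but the baseline-comparison step works the same way.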

"This is going to take neuroimaging to the next level by deploying on a much larger scale," Mike Yu Chi, a Jacobs School alumnus and CTO of Cognionics, a startup that helped with the study, told PsyPost. "You will be able to work in subjects' homes. You can put this on someone driving." The team envisions dry EEGs in a variety of futuristic applications -- from mobile sensor networks to mind-controlled phone apps and prosthetics.

They especially want to see it employed in neurological therapies. "We will be able to prompt the brain to fix its own problems," said Gert Cauwenberghs, UCSD bioengineering professor and the study's principal investigator. "We are trying to get away from invasive technologies, such as deep brain stimulation and prescription medications, and instead start up a repair process by using the brain's synaptic plasticity."

The technology is still in its infancy, mind you. We're still likely a decade away from being able to control our phones with our thoughts. Still, the prospects for this new technology are exciting -- enough even for the team's startup, Cognionics, to score a DARPA contract to further develop it.

[Image Credit: Jacobs School of Engineering]