
Glass Brain is a tool that maps the electrical activity of your brain in realtime, creating a 3D visualisation that you can navigate with a gaming controller.

The anatomically realistic 3D brain shows realtime data from electroencephalographic (EEG) signals recorded by a specially designed EEG cap. Each signal is mapped back to its source, i.e. the specific part of the brain that produced it. The underlying brain model is generated from MRI scans, so the EEG data is accurately located on an individual's own brain.


Different colours are assigned to the different signal frequency bands, creating a beautiful interactive artwork that seems to crackle with energy and shows how information is transferred (or at least estimated to be) between different regions of the brain.
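The band-to-colour idea can be sketched roughly as follows. The band names and ranges below are the canonical EEG bands, but the specific ranges and palette are illustrative assumptions; the article does not say which bands or colours Glass Brain actually uses.

```python
# Hypothetical mapping from canonical EEG frequency bands to display
# colours. The ranges (Hz) and palette are illustrative assumptions,
# not Glass Brain's actual choices.
BANDS = {
    "delta": ((1, 4), "blue"),
    "theta": ((4, 8), "green"),
    "alpha": ((8, 12), "yellow"),
    "beta":  ((12, 30), "orange"),
    "gamma": ((30, 80), "red"),
}

def colour_for(freq_hz):
    """Return the display colour for activity at freq_hz."""
    for name, ((lo, hi), colour) in BANDS.items():
        if lo <= freq_hz < hi:
            return colour
    return "grey"  # outside the visualised range

print(colour_for(10))  # a 10 Hz signal falls in the alpha band: yellow
```

A visualiser would call something like `colour_for` per detected oscillation before rendering it onto the 3D model.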

Glass Brain was born out of a collaboration between the Neuroscape Lab at the University of California, San Francisco, and the Swartz Center for Computational Neuroscience at the University of California, San Diego.


The project was spearheaded by Adam Gazzaley, associate professor of Neurology, Physiology and Psychiatry and director of the Neuroscape Lab. He told Wired.co.uk that it grew out of a couple of projects he and his team have been working on in the lab. The first is realtime EEG: Gazzaley's lab has been building immersive 3D video games that take neural data from the person playing and feed it back into the game mechanics. For this closed-loop approach to work, the data must be supplied in as close to realtime as possible.

Secondly, Gazzaley has been working on educating and entertaining audiences about the brain, showing what the organ looks like and overlaying MRI and EEG data. "It's fun for an entertainment piece, but it's full of artefacts. It's a beautiful rendition but it doesn't go further than that," he explains.


Glass Brain builds on the science behind that visualisation and improves the quality of the EEG data: it is corrected for artefacts, delivered closer to realtime, and its sources are mapped more accurately. "The first goal for Glass Brain was still largely to show people for educational purposes and to show what realtime EEG data might look like. Once we have this, it can potentially have a benefit as a realtime diagnostic tool for surgeons and neurologists and perhaps a therapeutic tool," Gazzaley told Wired.co.uk.

Displaying EEG data in close to realtime presents a major challenge: turning noisy raw EEG data into signal -- transforming it into the frequency domain and localising its sources -- adds processing time. Collaborators from the Swartz Center had been building algorithms to speed up EEG processing, so their expertise was combined with Gazzaley's visualisation work in the Unity game engine. "I'm really interested in bringing consumer-facing tech into neuroscience," he says.
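The frequency-domain step described above can be illustrated with a toy sketch. This is a naive discrete Fourier transform in plain Python, assuming a single channel and no artefact correction; the real pipeline uses vastly faster, GPU-accelerated algorithms, and all the names here are hypothetical.

```python
import math

def band_power(signal, fs, lo, hi):
    """Estimate power in the [lo, hi) Hz band via a naive DFT.

    A minimal sketch: sum squared magnitudes of the DFT bins whose
    frequencies fall inside the band. O(n^2), so only for illustration.
    """
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n  # frequency of DFT bin k
        if lo <= freq < hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

# One second of a synthetic 10 Hz "alpha" oscillation at 128 Hz sampling
fs = 128
sig = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
alpha = band_power(sig, fs, 8, 12)  # band containing the 10 Hz component
delta = band_power(sig, fs, 1, 4)   # band with almost no energy
print(alpha > delta)  # True
```

Doing this per channel, per band, per frame -- plus localising each band's sources in 3D -- is what makes the realtime budget so tight.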

To provide some processing muscle, the team also collaborated with Nvidia, which offered up powerful graphics processing units (GPUs) to help shave milliseconds off the data delay. A company called Cognionics developed a 64-channel EEG cap for use with Glass Brain.


The biggest challenge was working out how to convert so much data into an aesthetically pleasing visualisation. "The complexity of the brain is a marvel in itself, but we have to make decisions as to how we convert data into the stuff we see," Gazzaley says.

The next challenge is to correlate events taking place in a game with the activity in the brain of the person playing it. If a sign appears in the game, for example, can the team map the parts of the brain that fire as the person sees it? "How do you filter it in a way that it appears from the background? The brain is such a busy place," he adds.

Once the team manages that, they want to do a closed-loop experiment. "Take that realtime EEG data that's localised to the sources in the brain and feed it into the game algorithms -- so you can learn how to play games that teach you how to regulate your own neural activity."
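In crude outline, a closed loop of this kind might look like the following. Every name, threshold and gain here is an illustrative assumption, not the team's actual algorithm: the game eases off when a chosen neural measure exceeds a target, and tightens when it falls short, nudging the player to regulate their own activity.

```python
def update_game(difficulty, alpha_power, target=30.0, gain=0.01):
    """One tick of a hypothetical neurofeedback loop.

    If the player's measured alpha power is above the target, the game
    gets easier; below it, harder. All parameters are illustrative.
    """
    error = alpha_power - target
    return max(0.0, difficulty - gain * error)

difficulty = 1.0
for alpha in [20.0, 25.0, 35.0]:  # simulated successive EEG readings
    difficulty = update_game(difficulty, alpha)
print(round(difficulty, 2))
```

The hard part, as the article makes clear, is not this control loop but getting a clean, source-localised neural measure fast enough to drive it.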


Gazzaley says that the team is at the early stages of this, but he has already experimented with using the Oculus Rift to navigate the Glass Brain. "If you had a friend or colleague wearing the EEG cap, you can fly in virtual reality into their functioning brain in realtime. You can fly into your own functioning brain if you like!

"The first time I was able to take an Xbox joystick and move it into the brain and see its beauty was amazing. But putting on the Oculus Rift and flying into the Glass Brain and seeing so much neural data -- both structural and functional -- that was an emotional experience."

He suggests that this opens up the possibility of games in which players must travel to different parts of their own brain and try to increase activity in those regions in order to affect gameplay.

The Glass Brain has already been demoed at SXSW and Gazzaley hopes that one day people will be able to explore their brains in science museums.