In March, Elon Musk announced his gambit to merge human and machine with a brain-computer interface (BCI) called Neuralink.

This technology would take the form of an injectable "neural lace," a mesh of electrodes that would augment the human brain, adding a digital layer above the cortex and limbic system capable of communicating with a computer (essentially creating cyborgs). Hypothetically, this creates an upgradable, updatable interface that could be applied in countless ways. Some of these include:

Controlling Computers With Your Mind

Brains and technology both operate on the same vectors: electricity and data. Musk's neural lace would be a system that lets them communicate directly with each other. To borrow a simile from Phillip Alvelda, program manager of DARPA's Neural Engineering System Design (NESD) program, another nascent BCI effort: "Today's best brain-computer interface systems are like two supercomputers trying to talk to each other using an old 300-baud modem […] Imagine what will become possible when we upgrade our tools." Applications could stretch from the remote operation of technology to completely hands-free, voiceless operation of computers. Researchers in Korea have already used a BCI to control turtles.
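To make the "electricity and data" point concrete, here is a minimal sketch of the decoding step at the heart of any BCI: turning an electrical reading from the brain into a discrete computer command. The signal values, thresholds, and command names are invented for illustration; real systems use trained classifiers over many electrode channels, not a single threshold.

```python
# Toy BCI decoding step: map a simulated electrode reading to a command.
# All numbers and names here are assumptions for illustration only.

def decode_command(band_power):
    """Map a motor-cortex band-power reading (arbitrary units) to a command."""
    if band_power > 0.7:
        return "move_right"
    if band_power < 0.3:
        return "move_left"
    return "rest"

# Simulated readings from one electrode, one per time step.
readings = [0.1, 0.5, 0.9]
commands = [decode_command(r) for r in readings]
print(commands)  # ['move_left', 'rest', 'move_right']
```

The bandwidth problem Alvelda describes lives in this loop: today's interfaces extract only a handful of such coarse features per second, where a high-bandwidth lace could, in principle, stream far richer data in both directions.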

Updating Your Mind or Communicating With Someone Else’s

Elon Musk's idea could both initiate brain activity and monitor it. The technology would not have to be a one-way communication stream; it could both send messages out of the brain and create information within it. The high-bandwidth interface could allow you to wirelessly transmit information to the cloud, to computers, or even directly to the brains of other people with a similar interface in their heads. There is also the possibility of downloading content to augment your consciousness: think Neo learning kung fu in The Matrix. While initial tests to improve intelligence haven't been very successful, if brains and computers speak the same language, then computers can impart information to the brain. The technology is currently being used to allow paralyzed people to communicate, but its uses could extend far beyond that.

Bionic Limbs That Feel Like Real Limbs

As part of this two-way communication stream, robotic arms connected to existing nerve structures could relay sensations back to the brain. Rather than forcing the brain to learn how to use a new part of the nervous system, robotic limbs could be quickly and easily integrated into it. This has the potential to revolutionize prosthetic limbs for the disabled, but may also encourage people to rid themselves of their biological arms in favour of mechanical super limbs. Who knows!

Emotionally Aware Technology

As computers and brains would essentially be speaking the same language, emotions could be read as data using electrodes. This would shift technology's perception of humans from basic recognition to complex understanding. Robot helpers would be able to adapt to your emotional state rather than just responding to commands. Photos and videos could also be embedded with emotional metadata, meaning that one could feel what it would be like to be in any given scenario, rather than just trying to imagine it.
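The emotional-metadata idea can be sketched as a data structure: alongside its pixels, a photo carries the recorded emotional state of the person who captured it, which a playback device could one day reproduce. The field names and the 0-to-1 intensity scale are assumptions for illustration; no such format exists today.

```python
# Hypothetical "emotional metadata" attached to a photo.
# Field names and the 0-1 intensity scale are invented for illustration.
from dataclasses import dataclass

@dataclass
class EmotionalMetadata:
    emotions: dict  # emotion name -> recorded intensity in [0, 1]

@dataclass
class Photo:
    pixels: bytes
    emotion: EmotionalMetadata

snapshot = Photo(
    pixels=b"...",  # raw image data elided
    emotion=EmotionalMetadata(emotions={"awe": 0.9, "calm": 0.6}),
)

# A BCI-equipped viewer could read the dominant emotion and replay it,
# rather than leaving the viewer to imagine the scene.
dominant = max(snapshot.emotion.emotions, key=snapshot.emotion.emotions.get)
print(dominant)  # awe
```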

Next Generation Adaptable Gaming

One issue with the lifespan of games is repetition: players become accustomed, know what to expect, or are limited by the programmed narrative. A BCI could change this by letting games respond to what your brain is feeling, staying one step ahead and remaining endlessly varied. This would be most applicable to the horror genre, in which enemies could come at you when and where you least expect them, providing constant shocks, jumps, and thrills. The Black Mirror episode "Playtest" is an hour-long exploration of just how terrifying this could be. With AI already composing music that rivals human work, this reality could be surprisingly close.
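The "one step ahead" idea reduces to a feedback loop: each tick, the game reads the player's (here simulated) fear level and springs a scare only once the player has relaxed. The fear values, threshold, and event names below are invented for illustration, not part of any existing engine.

```python
# Toy BCI-adaptive game loop: spawn scares only when the player is calm.
# Fear readings, the threshold, and event names are assumptions.

def next_event(fear_level, threshold=0.4):
    """Hold back while the player is already scared; strike when calm."""
    return "spawn_enemy" if fear_level < threshold else "build_tension"

fear_readings = [0.9, 0.6, 0.2]  # player calming down over three ticks
events = [next_event(f) for f in fear_readings]
print(events)  # ['build_tension', 'build_tension', 'spawn_enemy']
```

A real implementation would replace the hand-set threshold with a model of the individual player, which is exactly the kind of personalization a high-bandwidth interface would make possible.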