A group of American researchers from MIT, Indiana University, and Tufts University, led by Erin Treacy Solovey, has developed Brainput — pronounced brain-put, not bra-input — a system that can detect when your brain is trying to multitask, and offload some of that workload to a computer.

The idea of using computers to do our grunt work isn’t exactly new — without them, the internet wouldn’t exist, manufacturing would be a very different beast, and we’d all have to get a lot better at mental arithmetic. I would say that the development of cheap, general-purpose computers over the last 50 years, and the freedoms they have granted us, ranks among mankind’s most important advancements. Brainput is something else entirely though.

Using functional near-infrared spectroscopy (fNIRS), which is basically a portable, poor man’s version of fMRI, Brainput measures the activity of your brain. This data is analyzed, and if Brainput detects that you’re multitasking, the software kicks in and helps you out. In the case of the Brainput research paper, Solovey and her team set up a maze with two remotely controlled robots. The operator, equipped with fNIRS headgear, has to navigate both robots through the maze simultaneously, constantly switching back and forth between them. When Brainput detects that the driver is multitasking, it tells the robots to use their own sensors to help with navigation. Overall, with Brainput turned on, operator performance improved, and yet the operators generally didn’t notice that the robots were partially autonomous.
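To make the loop concrete, here is a minimal sketch of how a Brainput-style system might wire a cognitive-state classifier to robot autonomy. Everything here is hypothetical: the real system uses a classifier trained on fNIRS hemoglobin signals, not a simple threshold, and all names and numbers below are invented for illustration.

```python
# Hypothetical sketch of a Brainput-style control loop: estimate the
# operator's cognitive state from fNIRS readings, then toggle each
# robot's sensor-assisted autonomy. Threshold and names are invented.

def classify_workload(fnirs_samples):
    """Crude stand-in for a trained classifier: report 'multitasking'
    when the mean signal activity crosses an arbitrary threshold."""
    mean_activity = sum(fnirs_samples) / len(fnirs_samples)
    return "multitasking" if mean_activity > 0.6 else "normal"

class Robot:
    def __init__(self, name):
        self.name = name
        self.autonomous = False

    def set_autonomy(self, enabled):
        # When enabled, the robot leans on its own sensors to navigate,
        # easing the load on the human operator.
        self.autonomous = enabled

def brainput_step(fnirs_samples, robots):
    """One tick of the loop: classify, then adjust every robot."""
    state = classify_workload(fnirs_samples)
    for robot in robots:
        robot.set_autonomy(state == "multitasking")
    return state

robots = [Robot("robot_a"), Robot("robot_b")]
brainput_step([0.7, 0.8, 0.65], robots)   # high activity: autonomy kicks in
brainput_step([0.1, 0.2, 0.15], robots)   # operator relaxed: autonomy off
```

The point of the design is the one the paper highlights: the operator never flips a switch, so the hand-off between manual control and machine assistance is invisible.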

Now, it’s easy to see how this could be extrapolated out into the real world. We already have steering wheels that detect when we’re falling asleep — with Brainput, your car could automatically drive itself during that split second where you turn around to shout at your kids, or twiddle with various dashboard knobs. The same goes for airplane pilots, or indeed anyone seated behind the controls of a large, dangerous vehicle. As you can see in the picture at the top of the story, fNIRS is lightweight and doesn’t require a lot of hardware — and there are wireless systems available, too.

Moving forward, Solovey wants to investigate other cognitive states that can be reliably detected using fNIRS. Imagine a computer that increases the size of buttons and text when you’re tired, or a video game that slows down when you’re stressed. Your Xbox might detect that you’re in the mood for fighting games, and change its splash screen accordingly. Likewise, Firefox could detect that you’re feeling amorous, and automatically load up Private Browsing mode. Menu buttons could move around and change in size — or disappear entirely. Eventually, computer interfaces might completely remold themselves to your mental state.

Read more at Erin Treacy Solovey’s website (or jump directly to the paper [PDF])