Some hobbyist hackers have rigged up an iPhone 4S to collect brain wave patterns from simple ECG pads, translate them into synthesized speech, and pump that speech through the 3.5 mm headphone jack, where Siri recognizes it as a usable command. Beyond pressing the home key to initiate Siri, all you have to do is think your command, and your iPhone 4S will hop to it. The engineers expect they'll eventually be able to eliminate the home-key press, making the whole process fully automatic. So far, the guys at Project Black Mirror have been able to link 25 brain wave patterns to specific Siri commands. Of course, right now the project is a bulky Arduino test board hooked up to a MacBook, which also occupies the headphone jack and makes the user look like he belongs in A Clockwork Orange, but the team is putting up a Kickstarter page shortly to get funding and turn this thing into a real product.

If you aren’t familiar, Siri is an app bundled with the iPhone 4S that translates natural speech into actions, such as setting your phone’s alarm, sending text messages, booking meetings, or looking up information on Wikipedia or Wolfram Alpha. We’ve got a full review of Siri here if you want a closer look. Here’s some of the nitty-gritty on how they’ve set this up:

1. ECG pads provide raw skin conductivity / electrical activity as analogue data (0–5 V).

2. This is plugged into the Arduino board via 4 analogue inputs (no activity = 0 V, high activity = 5 V).

3. The Arduino has a program burnt to its EPROM chip that filters the signals.

4. Josh trained the program by thinking of the main Siri commands (“Call”, “Set”, “Diary” etc.) one at a time while the program captured the signature brain patterns they produce.

5. The program can detect the signature patterns that indicate a certain word is being thought of. It then waits for a natural ‘release’ in brain waves and assumes the chain of commands is complete and action is required.

6. The series of commands is fed to a SpeakJet speech synthesiser chip.

7. The SpeakJet’s audio output then simply plugs into the iPhone’s microphone jack.
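To make steps 1–3 concrete, here’s a rough sketch of the acquisition side. The actual project runs C firmware on the Arduino, and the window size and smoothing filter here are my assumptions, not theirs — this is just the general shape of reading a 0–5 V analogue channel and low-pass filtering it:

```python
def adc_to_volts(raw, vref=5.0, resolution=1023):
    """Convert a 10-bit ADC reading (0-1023, as on an Arduino) to volts."""
    return raw * vref / resolution

def moving_average(samples, window=4):
    """Simple low-pass filter: average each reading with the readings before it,
    up to `window` samples back. Knocks down spiky electrode noise."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

# Example: one channel idling near 0 V, then spiking toward 5 V.
raw = [10, 12, 9, 700, 1023, 1010, 15, 11]
volts = [adc_to_volts(r) for r in raw]
smoothed = moving_average(volts, window=4)
```

In the real rig this would run once per channel across the four analogue inputs, with the filtered values handed off to the pattern matcher.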
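And here’s one way steps 4–7 could hang together. To be clear, the matching logic below is invented for illustration (the project hasn’t published its algorithm), the signature vectors are made up, and a real build would push phoneme codes to the SpeakJet over serial rather than return strings:

```python
def correlate(window, template):
    """Crude similarity score: negative mean absolute difference,
    so a perfect match scores 0 and worse matches go more negative."""
    return -sum(abs(a - b) for a, b in zip(window, template)) / len(window)

def match_command(window, signatures, threshold=-0.5):
    """Return the best-matching trained command, or None if nothing is close."""
    best_name, best_score = None, threshold
    for name, template in signatures.items():
        score = correlate(window, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

def is_released(window, floor=0.3):
    """A 'release': activity has dropped back near 0 V."""
    return max(window) < floor

# Tiny trained vocabulary -- placeholder signatures, not real recordings.
signatures = {
    "Call":  [4.0, 4.5, 4.0, 1.0],
    "Diary": [1.0, 4.0, 4.5, 4.0],
}

def recognise(stream, signatures, window=4):
    """Scan a filtered voltage stream chunk by chunk, collecting matched
    commands until a release ends the chain."""
    commands = []
    for i in range(0, len(stream) - window + 1, window):
        chunk = stream[i:i + window]
        if is_released(chunk):
            break  # chain complete; hand commands to the speech stage
        word = match_command(chunk, signatures)
        if word:
            commands.append(word)
    return commands
```

Once the chain is complete, each recognised word would be mapped to SpeakJet phoneme codes and played out as audio for Siri to pick up.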

You know we’re one step closer to the robots taking over when they can start reading our minds, right? Don’t get me wrong, the technology here is awesome, but consumer-grade brain wave monitoring gear has been around for a while, and still hasn’t seen much real uptake. If the final product from all of this is bundled as a stylish headband or hat and a classy iPhone 4S case that doesn’t occupy the headphone jack, I could potentially see this working. I’d be really curious to see how extensible this set-up is; maybe over time, this could enable full message transcription, too. The novelty is great, but there are practical applications for the disabled, too.

Here’s a demo of the Black Mirror in action, but I actually love the one after it, which looks like the first time they got it to work. You can follow the project’s progress at their blog over here.