As much as I’d love a brain-computer interface for controlling game characters or avatars in virtual worlds, it will be a few years before such interfaces can analyze our body’s every move and translate it to machines in real time. In the meantime, there are other options: small, cheap sensors show promise for new machine interfaces that monitor your entire body.

I’ve played games with floor-pads as controllers, like Dance Dance Revolution. I’ll be frank: it sucks. I’ve also had a go with the PlayStation’s EyeToy. It sucked too. The reason they suck is the limited freedom they provide. In essence, they’re still stuck in the prison-like dimension of finite button-pushing: the dance-pads are basically a joystick you jump on, with the same monotonous button combinations as a typical handheld controller. Similarly, the EyeToy has an extremely limited understanding of what it’s looking at, so it has to split the video input into sectors, where motion in a given sector amounts to the push of a button. This gives the EyeToy’s entertainment value an eerie resemblance to the satisfaction I get from wiping my laptop screen clean.
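To make the “motion in a sector equals a button press” point concrete, here’s a minimal sketch of that style of detection: frame differencing over a grid of sectors. This is purely illustrative; the EyeToy’s actual pipeline isn’t public, and the grid size and threshold here are made-up values.

```python
import numpy as np

def detect_sector_hits(prev_frame, curr_frame, grid=(3, 3), threshold=30):
    """Split the camera frame into a grid of sectors and report which
    sectors changed enough between frames to count as a 'button press'.
    Frames are 2-D grayscale arrays; grid and threshold are arbitrary."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    h, w = diff.shape
    rows, cols = grid
    hits = []
    for r in range(rows):
        for c in range(cols):
            sector = diff[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            if sector.mean() > threshold:  # enough motion in this sector
                hits.append((r, c))
    return hits

# Wave a hand through the top-left of a 90x90 frame:
prev = np.zeros((90, 90), dtype=np.uint8)
curr = prev.copy()
curr[:30, :30] = 255
print(detect_sector_hits(prev, curr))  # → [(0, 0)]
```

Note how little information survives: an entire video feed collapses into a handful of virtual buttons, which is exactly why it ends up feeling like a gamepad in disguise.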

We Need More Flexible Controls

What I’m looking forward to are interfaces that give you more dynamic control. When I first heard of the Nintendo Wii’s motion controller (Wiimote) I got excited because I envisioned, for example, sword fighting where the slashes of the sword were not pre-determined “action sequences”, but real-time actions performed by the user swinging the stick. Unfortunately, while the controller hardware certainly opens up this option, the software implementation is harder, and so Wii games are still somewhat stuck in the same button-pushing dimension.

But that’s not to say things aren’t moving along. Nintendo made a major splash in the non-conventional controller department, and we’re seeing the effects of this reverberate throughout the commercial world. Portable media devices like the iPod are also giving wearable machine R&D a kick in the ass (something like a head-mounted display suddenly has obvious commercial potential for watching movies on your iPod). And then of course, there’s the constant ongoing research at MIT.

The MIT Whole-Body Sensor Suit

In collaboration with the Swiss Federal Institute of Technology (ETH) and Mitsubishi Electric Research Laboratories, MIT researchers have produced a new kind of suit that can capture the motions of your entire body, and you don’t need a studio or lab environment to use it. It even works for people driving or playing ping-pong. A New Scientist article discusses the suit, explaining how it uses ultrasonic beeps, gyroscopes and accelerometers:

Several sensors measuring about 2.5 centimetres on each side are attached to a person’s legs and arms. The sensors detect movement in two different ways: accelerometers and gyroscopes measure motion, but ultrasonic beeps are also emitted. Tiny microphones mounted on the torso pick up these beeps, allowing a laptop computer, carried in a backpack, to calculate the distance to the sensor. The system is similar to, albeit much simpler than, bats’ ultrasonic echolocation, and together with the motion sensors provides a more accurate overall picture of body movement. The small backpack also holds the batteries that power the system.
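Stripped of the sensor-fusion details, the distance step described above is just time-of-flight arithmetic: a beep travels at the speed of sound, so the gap between emission and arrival at a torso microphone gives the range to that sensor. A minimal sketch (the function name and timing interface are my own, not from the paper):

```python
SPEED_OF_SOUND = 343.0  # metres per second, in air at roughly 20 °C

def sensor_distance(emit_time, arrival_time):
    """Range from a torso microphone to a limb-mounted sensor, computed
    from the one-way travel time of its ultrasonic beep (seconds in,
    metres out). Assumes emitter and microphone clocks are synchronized."""
    return (arrival_time - emit_time) * SPEED_OF_SOUND

# A beep that takes 2 milliseconds to arrive puts the sensor ~0.686 m away:
print(sensor_distance(0.0, 0.002))  # → 0.686
```

With several microphones on the torso you’d get several such ranges per sensor, which is what lets the system pin down position and correct the drift that accelerometers and gyroscopes accumulate on their own.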

Originally designed to allow MIT students to give Clippy a virtual beating, the system could be utilized for various purposes. The New Scientist article mentions its potential for making animated movies more realistic and helping doctors analyze the movements of patients undergoing physical therapy. The following video was uploaded by the wonderful people at CSAIL, explaining the system and showing it in use.

This is awesome. Is there anything more to this? Yes: the price. The NS article quotes Rolf Adelsberger from ETH on that: the suit was made from off-the-shelf components and is much cheaper than similar systems used in the past. Currently it costs about $3,000, but Adelsberger imagines the price could drop to a few hundred dollars if mass-produced.

Combining Almost-On-the-Market Technologies for the Ultimate Control Suit

What I find very interesting is the prospect of integrating some of these groundbreaking new machine interfaces, especially for immersive 3D environments. For example, the MIT suit is excellent work, but complemented with a head-mounted display? Oh man, that’s a winning combination right there. But there’s still something missing … oh yeah, I can’t really use the MIT suit to walk around inside a virtual world: if I can see and use my entire body, an attempt to make my avatar duck the fiery breath of a dragon would probably leave me with one foot in my cat’s food plate and the other on the cat.

So its use is limited to an immobile, Vitruvian sphere, if you will. No worries, let’s combine the suit with Project Epoc from Emotiv, a non-invasive brain-computer interface that detects user intentions. Ah-hah, now we’re getting somewhere: using the suit to detect limb motions, an HMD for visuals and the Epoc headgear to track intentions to walk forward, walk backwards or turn, we’ve got ourselves a damn nifty setup. It also wouldn’t hurt if the Epoc headgear detects facial expressions, as the Emotiv team claims it can (they haven’t released new information for a while; I’m waiting for some answers in the mail). The only thing missing would be force-feedback, but I’m not aware of any plans to make a commercial product out of those.
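The division of labor in that dream rig is simple enough to sketch: the suit drives the avatar’s limb pose directly, while the BCI’s walk/turn intentions drive locomotion. Everything below is hypothetical (names, intent labels, speed constants), just a toy model of how the two input streams could merge each frame:

```python
import math
from dataclasses import dataclass, field

@dataclass
class AvatarState:
    position: list = field(default_factory=lambda: [0.0, 0.0])  # x, y in metres
    heading: float = 0.0                                        # radians
    limb_pose: dict = field(default_factory=dict)               # joint -> rotation

def update_avatar(state, suit_pose, bci_intent, dt=1 / 60):
    """One frame of the imagined combined rig: the motion suit owns the
    limbs, the BCI intent ('forward', 'backward', 'turn_left',
    'turn_right', or None) owns locomotion. Constants are made up."""
    state.limb_pose = suit_pose          # suit data applied verbatim
    speed, turn_rate = 1.5, math.pi      # walk speed (m/s), turn rate (rad/s)
    if bci_intent == "forward":
        state.position[0] += speed * dt * math.cos(state.heading)
        state.position[1] += speed * dt * math.sin(state.heading)
    elif bci_intent == "backward":
        state.position[0] -= speed * dt * math.cos(state.heading)
        state.position[1] -= speed * dt * math.sin(state.heading)
    elif bci_intent == "turn_left":
        state.heading += turn_rate * dt
    elif bci_intent == "turn_right":
        state.heading -= turn_rate * dt
    return state

# One frame of walking forward while raising an arm:
state = AvatarState()
update_avatar(state, {"left_arm": (0.0, 0.0, 1.2)}, "forward")
```

The nice property of this split is that ducking the dragon and crossing the room never compete for the same input channel, which is exactly the problem the suit alone can’t solve.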

Can you imagine the potential such a suit would bring to, say, teleconferencing? Gaming? Virtual world communities? Learning Tai Chi? Cybersex? Hot stuff. And here we are, born about 50 years too early. Waiting for these devices to debut one at a time. Then waiting for the prices to come down. Then waiting for someone to combine all the systems into one. Finally, waiting for the price to come down on the combined system.

Well, I’m being pessimistic. Technology really is improving, and at an ever-increasing rate. Hell, we were using black-and-white laptops 15 years ago. But just in case immersive virtual reality technologies take a while to show up on our doorstep, we’re making headway in life-extension technologies as well, so no worries.

Links and References

Practical Motion Capture in Everyday Surroundings, paper published in SIGGRAPH 2007 on the MIT suit (PDF)

Cheap sensors could capture your every move, New Scientist on the motion suit
