Millions are drinking the Kinect Kool-Aid, jumping around in front of their Xbox and playing games by flailing their bodies. Now a student at MIT’s Personal Robotics Group is putting all that wild gyrating to good use: controlling robots. Philipp Robbel has hacked together the Kinect 3D sensor with an iRobot Create platform and assembled a battery-powered bot that can see its environment and obey your gestured commands. Tentatively named KinectBot, Robbel’s creation can generate beautifully detailed 3D maps of its surroundings and wirelessly send them to a host computer. KinectBot can also detect nearby humans and track their movements to understand where they want it to go, as you can see in the video below. I had a chance to talk with Robbel and learn more about how this “weekend hacking project” might evolve into some really cool applications for rescuing victims after a disaster.



Robbel is a PhD student at MIT, and his research is aimed at creating a team of robots that could work together to find missing or trapped people. He has four of iRobot’s Create machines, which he calls “iPucks”, and four quadrotors. KinectBot is a sort of proof of concept: if he can get the 3D mapping to succeed, we may see it in the rest of his work. Don’t expect Kinect sensors strapped to quadrotors anytime soon (they’re too heavy and need too much power), but the iPucks might be able to build a general map of an area that the flying drones could then use to navigate and explore. This sort of miniature swarm could be a powerful tool in disaster relief. Robbel admits there are concerns over whether a distressed person’s gesture commands should be obeyed, but a KinectBot that found one victim could certainly use their motions to help it find the next one more quickly.

KinectBot’s software is hacked together just as much as its hardware. Robbel used simultaneous localization and mapping (SLAM) code from OpenSLAM.org (specifically GMapping) as well as some visualization packages from the Mobile Robot Programming Toolkit (MRPT). He added his own interaction, human-detection, and gesture code so that KinectBot could follow hand-signaled commands. As all of this is pretty much open (MRPT is GNU GPL, GMapping is Creative Commons, and Robbel’s own code isn’t anything MIT needs to keep proprietary), there’s a chance the software package will be available at some point for you to download. That’s good news for all the hackers out there who’d like to put together a KinectBot of their own.
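To give a flavor of the gesture-following piece, here’s a minimal, hypothetical sketch (this is not Robbel’s actual code, and the function name and gesture model are my own invention) of how a pointing gesture detected by a depth sensor might be mapped to a drive command in the style of the iRobot Create’s Open Interface, which takes a velocity in mm/s and a turning radius in mm:

```python
import math

def gesture_to_drive(shoulder, hand, speed_mm_s=200):
    """Map a pointing gesture to a (velocity, radius) drive command.

    shoulder, hand: (x, y, z) positions in metres from a depth sensor,
    robot-centric frame (+x forward, +y left). Hypothetical helper --
    a sketch, not the KinectBot's real gesture pipeline.
    """
    # Direction the arm is pointing, projected onto the ground plane
    dx = hand[0] - shoulder[0]
    dy = hand[1] - shoulder[1]
    heading = math.atan2(dy, dx)  # radians, 0 = straight ahead

    if abs(heading) < math.radians(10):
        # 0x8000 is the Create Open Interface's special radius
        # code for "drive straight"
        return (speed_mm_s, 0x8000)

    # Steer toward the pointed direction: larger angles get a
    # tighter (smaller) turning radius
    radius = int(max(1, 2000 * (1 - abs(heading) / math.pi)))
    return (speed_mm_s, radius if heading > 0 else -radius)
```

A pointing arm roughly straight ahead yields the drive-straight code, while an arm swung out to the side tightens the turn in that direction; positive radius turns the Create counterclockwise (left), matching a hand extended to the robot’s left.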

On a side note, I think it’s pretty funny that the first iRobot to finally use SLAM (which next-generation robot vacuums already take advantage of) is a hack. Maybe Robbel should be put in charge of improving the Roomba?

*Update 10.17.10 – I should have known not to poke fun at iRobot without ironclad evidence. As many have pointed out, SLAM (or SLAM-like algorithms) has been used on many projects built around iRobot products before now (though I don’t think any of those were commercial products, which was sort of my point).

As with the Nintendo Wii Remotes before it, the Kinect sensor has the potential to become a powerful hacking tool in the years ahead. Sophisticated 3D mapping for $150? Yes, please. We’re likely to see a lot of developers use Kinect for proofs of concept just like Robbel did. It will be cool if Robbel is able to upgrade his KinectBot into a meaningful component of swarm robotics. I don’t know about you, but the idea of an elite band of robot troopers rescuing me from a pile of rubble sounds pretty damn awesome. There’s a movie in there somewhere. And best of all, those robots could obey my gestured commands.

Come, my minions, follow my outstretched arm towards freedom!

[image and video credit: Philipp Robbel]

[source: Philipp Robbel]