What technology does this share from the SL demo you did?

AB: It's the same technology for getting the data from the suit and manipulating all those rotations.

With Second Life I could pretty much pass those rotations to the avatar with almost no modification, but here the situation is very different. The robot has a completely different joint structure: fewer joints, different degrees of freedom, a different centre of mass, and so on. It can do movements that I can't (like rolling its elbow 360 degrees), and I can do many things the robot can't. So we had to find a way to deal with those embodiment and balance dissimilarities, which we did.
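One simple part of handling those dissimilarities can be sketched in code. This is not the interviewee's actual method, just a minimal illustration of the idea of clamping a captured human joint angle into a robot joint's reachable range; the joint names and limit values below are assumptions for illustration (real Nao limits are listed in Aldebaran's documentation and differ per joint):

```python
# Illustrative sketch: map a mocap joint angle onto a robot joint
# with a narrower range of motion, by clamping to the robot's limits.
# The limits below are made-up placeholders, not real Nao values.

ROBOT_JOINT_LIMITS = {
    "LElbowRoll": (-1.54, -0.03),     # robot elbow cannot hyperextend
    "LShoulderPitch": (-2.08, 2.08),  # wide shoulder pitch range
}

def retarget(joint: str, human_angle_rad: float) -> float:
    """Clamp a human joint angle (radians) to the robot's reachable range."""
    lo, hi = ROBOT_JOINT_LIMITS[joint]
    return max(lo, min(hi, human_angle_rad))

# A human pose outside the robot's range is clamped to the nearest limit.
print(retarget("LElbowRoll", -2.0))      # clamped up to -1.54
print(retarget("LShoulderPitch", 1.0))   # within range, passed through
```

A full retargeting pipeline would of course also need to handle balance and centre-of-mass differences, not just per-joint limits, which is where the real difficulty described above lies.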

What equipment are you using (including the robot)?

AB: An Xsens MVN full-body motion capture suit plus an Aldebaran Nao robot. The robot is very popular in research, for example in robot soccer competitions.

Will it be made commercially available?

AB: My university is currently investigating potential patenting and commercial viability. The target commercial focus would be to use such robots in rescue missions and disaster situations. Something like Fukushima is a perfect case: complex tasks like fixing a broken cable, driving a vehicle, or climbing a ladder have to be performed, but no robot can be pre-programmed for all those situations (they aren't known in advance), and manipulating one through joysticks and sliders is a pain. There are plenty of hardware issues to be sorted out before wide-scale commercial deployment is possible (like using shielded chips and reliable communication lines in high-radiation environments). But that's what research is all about.

Will the code to run it be made open source?

AB: That's a tough question. I, personally, don't like the idea of my code being used by the military, so probably not. I realise that sooner or later the army will get something like this working without my help (or maybe already has), but I still don't want to be facilitating military uses of such technology in any way.