Building on its earlier work controlling robots with brain signals and hand gestures, MIT's "RoboRaise" places electromyography (EMG) sensors on a user's biceps and triceps to monitor muscle activity. The system can then estimate the person's arm level, and it can be instructed to raise or lower via discrete up-and-down gestures. In a series of tests involving picking up and putting down mock airplane components, it responded correctly to around 70 percent of all gestures.
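The article doesn't describe RoboRaise's actual signal pipeline, but a common way to turn raw EMG into something usable is to rectify the signal, smooth it into an "envelope" of muscle effort, and compare the two muscles to classify a discrete gesture. The sketch below is a hypothetical illustration of that general approach -- the function names, thresholds, and window sizes are invented, not taken from MIT's system:

```python
# Hypothetical sketch of an EMG gesture classifier -- NOT MIT's RoboRaise
# pipeline. Thresholds and window sizes here are illustrative guesses.

def emg_envelope(samples, window=5):
    """Rectify a raw EMG trace and smooth it with a moving average."""
    rectified = [abs(s) for s in samples]
    env = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)  # trailing window over recent samples
        env.append(sum(rectified[lo:i + 1]) / (i - lo + 1))
    return env

def detect_gesture(biceps_env, triceps_env, thresh=0.6):
    """Label a window 'up' or 'down' based on which muscle dominates,
    or None if neither clears the activation threshold."""
    b, t = max(biceps_env), max(triceps_env)
    if b > thresh and b > t:
        return "up"    # biceps flex -> raise
    if t > thresh and t > b:
        return "down"  # triceps flex -> lower
    return None

# Example: a strong biceps flex against a quiet triceps.
biceps = [0.1, -0.9, 0.8, -0.7, 0.9, -0.8]
triceps = [0.05, -0.1, 0.08, -0.06, 0.1, -0.05]
print(detect_gesture(emg_envelope(biceps), emg_envelope(triceps)))  # -> up
```

A real system would also need per-user calibration, since resting and peak EMG amplitudes vary widely from person to person.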

The system is notable because it taps into the subtle coordination two people fall into when lifting something together -- previously, a machine could carry out this kind of task only within the strict parameters of its pre-programming or under direct third-party control.

"Our approach to lifting objects with a robot aims to be intuitive and similar to how you might lift something with another person -- roughly copying each other's motions while inferring helpful adjustments," says graduate student and the paper's lead author, Joseph DelPreto. "The key insight is to use nonverbal cues that encode instructions for how to coordinate, for example to lift a little higher or lower. Using muscle signals to communicate almost makes the robot an extension of yourself that you can fluidly control."

In the future, the team hopes that adding more sensors or tapping into more muscle groups will increase RoboRaise's degrees of freedom, allowing it to assist with more complex tasks. Eventually, the system could be used in manufacturing and construction settings, or, yes, even as a furniture-lifting assistant around the house.