Manufacturers have begun experimenting with a new generation of “cobots” (collaborative robots) designed to work side-by-side with humans.

To determine best practices for effectively integrating human-robot teams within manufacturing environments, a University of Wisconsin-Madison team headed by Bilge Mutlu, an assistant professor of computer sciences, is working with an MIT team headed by Julie A. Shah, an assistant professor of aeronautics and astronautics.

Their research is funded by a three-year grant from the National Science Foundation (NSF) as part of its National Robotics Initiative program.

Cobots are less expensive than traditional industrial robots and are intended to be easier to reprogram and integrate into manufacturing lines. For example, Steelcase owns four next-generation robots based on a platform called Baxter, made by Rethink Robotics.

Each Baxter robot has two arms and a tablet-like panel for “eyes” that provide cues to help human workers anticipate what the robot will do next.

“This new family of robotic technology will change how manufacturing is done,” says Mutlu. “New research can ease the transition of these robots into manufacturing by making human-robot collaboration better and more natural as they work together.”

Mutlu’s team is building on previous work related to topics such as gaze aversion in humanoid robots, robot gestures, and the issue of “speech and repair.” For example, if a human misunderstands a robot’s instructions or carries them out incorrectly, how should the robot correct the human?

On Rethink Robotics’ blog, founder and chairman Rodney Brooks notes “three exciting and significant trends taking place right now” that he thinks will begin to gain some very real traction in 2015:

1. We will begin to see large-scale deployment of collaborative and intelligent robots in manufacturing.

2. This will be a breakout year for robotics research.

3. Emerging technology will be designed to solve some of the world’s biggest problems.

At MIT, Shah breaks down the components of human-robot teamwork and tries to determine which tasks are best performed by a human and which by a robot.

“People can sometimes have difficulty figuring out how best to work with or use a robot, especially if its capabilities are very different from people’s,” says Shah. “Automated planning techniques can help bridge the gap in our capabilities and allow us to work more effectively as a team.”
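To make the idea of automated planning for task allocation concrete, here is a minimal, hypothetical sketch of the kind of decision such a planner automates: assigning each task to whichever agent, human or robot, can complete it fastest. The task names, duration estimates, and greedy strategy below are all invented for illustration and are not drawn from Shah's actual methods.

```python
def allocate_tasks(tasks, durations):
    """Greedily assign each task to the faster capable agent.

    tasks: list of task names
    durations: dict mapping (task, agent) -> estimated seconds;
               a missing pair means that agent cannot do the task
    """
    assignment = {}
    for task in tasks:
        # Collect only the agents that are capable of this task
        candidates = {
            agent: durations[(task, agent)]
            for agent in ("human", "robot")
            if (task, agent) in durations
        }
        # Pick the capable agent with the lowest estimated duration
        assignment[task] = min(candidates, key=candidates.get)
    return assignment

# Invented example durations for a hypothetical assembly cell
durations = {
    ("fetch part", "robot"): 12.0,
    ("fetch part", "human"): 20.0,
    ("inspect weld", "human"): 8.0,   # the robot lacks this capability
    ("tighten bolts", "robot"): 6.0,
    ("tighten bolts", "human"): 5.0,
}

print(allocate_tasks(["fetch part", "inspect weld", "tighten bolts"], durations))
# → {'fetch part': 'robot', 'inspect weld': 'human', 'tighten bolts': 'human'}
```

Real planners go far beyond this greedy rule, reasoning about task ordering, shared workspace constraints, and human preferences, but the core question is the same: given differing capabilities, who should do what.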

Over the summer, UW-Madison computer sciences graduate student Allison Sauppé traveled to Steelcase headquarters to learn more about its efforts to incorporate Baxter into the production line. She found that perceptions of Baxter varied according to employees’ roles.

While managers tended to see Baxter as part of the overall system of automation, some front-line workers “saw Baxter as a social being or almost a co-worker, and they talked about Baxter as if it were another person,” she says. “They unconsciously attributed human-like characteristics [to it].”