Like any other form of transportation, walking down the street comes with its own codes of conduct. There aren't many laws, but there's a fairly universal code about walking speeds, lanes, avoiding bumping into somebody and the like. MIT has been looking at the unwritten rules of walking for a new autonomous robot with "socially aware navigation."


At MIT's Stata Center, the robot, which the university says "resembles a knee-high kiosk on wheels," was able to avoid collisions while keeping up with the pace of those walking back and forth.

"Socially aware navigation is a central capability for mobile robots operating in environments that require frequent interactions with pedestrians," says Yu Fan "Steven" Chen, who led the work on the project as a former MIT grad student and is the lead author of the resulting study, to be presented in September. "For instance, small robots could operate on sidewalks for package and food delivery. Similarly, personal mobility devices could transport people in large, crowded spaces, such as shopping malls, airports, and hospitals."

Walking down the street may appear easy enough, but MIT's researchers broke it down into four parts: localization (understanding its precise location), perception (awareness of its surroundings), motion planning (determining the ideal path towards any given destination), and control (physically executing that ideal path).
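That four-part breakdown can be sketched as a simple sense-plan-act loop. Everything below is illustrative, not code from MIT's actual system: the class and function names are assumptions, and the "planner" just steps straight toward the goal.

```python
# A minimal sketch of the four-stage loop described above:
# localization, perception, motion planning, and control.
# All names here are hypothetical, not from MIT's codebase.
from dataclasses import dataclass


@dataclass
class Pose:
    x: float
    y: float


def localize(sensor_data):
    """Localization: estimate the robot's precise position."""
    return Pose(sensor_data["x"], sensor_data["y"])


def perceive(sensor_data):
    """Perception: detect nearby pedestrians from sensor data."""
    return [Pose(px, py) for px, py in sensor_data["pedestrians"]]


def plan_motion(pose, pedestrians, goal, step=0.5):
    """Motion planning: choose the next waypoint toward the goal.
    (A real planner would also route around the pedestrians.)"""
    dx, dy = goal.x - pose.x, goal.y - pose.y
    dist = max((dx * dx + dy * dy) ** 0.5, 1e-9)
    return Pose(pose.x + step * dx / dist, pose.y + step * dy / dist)


def control(pose, waypoint):
    """Control: command a velocity that executes the planned step."""
    return (waypoint.x - pose.x, waypoint.y - pose.y)
```

In a real robot each stage runs continuously and feeds the next, so the plan is re-computed many times a second as pedestrians move.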

The biggest difficulties came in trying to predict an individual's route. Humans are notoriously unpredictable, and might stop or switch paths while walking for any number of reasons. "The knock on robots in real situations is that they might be too cautious or aggressive," says graduate student Michael Everett, a co-author of the paper. "People don't find them to fit into the socially accepted rules, like giving people enough space or driving at acceptable speeds, and they get more in the way than they help."

The solution? Machine learning. Chen, Everett and their fellow researchers trained their robot on a variety of simulations, showing it what speeds and trajectories were appropriate. The team also encoded social norms into the training, rewarding the robot when it passed on the right and penalizing it when it passed on the left.
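The pass-on-the-right norm amounts to reward shaping during simulated training. Here is a hedged sketch of that idea; the specific weights and the helper function are illustrative assumptions, not MIT's published reward function.

```python
# Illustrative reward shaping: bonus for passing an oncoming
# pedestrian on the right, penalty for passing on the left,
# large penalty for a collision. Values are arbitrary.

def passing_side(robot_offset):
    """Which side the robot passes on, given its lateral offset
    from the oncoming pedestrian (positive = to the right,
    in the robot's direction of travel)."""
    return "right" if robot_offset > 0 else "left"


def social_reward(robot_offset, collided):
    """Shaped reward used during simulated training."""
    if collided:
        return -1.0
    return 0.1 if passing_side(robot_offset) == "right" else -0.05
```

Over many simulated encounters, a learning agent maximizing this kind of reward tends toward trajectories that both avoid collisions and respect the social convention.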

The test at the Stata Center proved that the lessons had stuck. "We wanted to bring it somewhere where people were doing their everyday things, going to class, getting food, and we showed we were pretty robust to all that," Everett says. "One time there was even a tour group, and it perfectly avoided them."

Moving forward, the hope is that the robot will be able to handle an even larger group of people, and some who might be more surprised than MIT students to see a robot walking around. "Crowds have a different dynamic than individual people, and you may have to learn something totally different if you see five people walking together," Everett says. "There may be a social rule of, 'Don't move through people, don't split people up, treat them as one mass.' That's something we're looking at in the future."

Source: MIT via NewAtlas
