Drones are already taking over seemingly every aspect of life – but a key limitation remains: they cannot sense their environment and adapt to navigate it. That could all be about to change if a team from the University of Sheffield has anything to say about it.

The team of researchers has fitted your average quadcopter – a four-bladed drone popular among enthusiasts – with a sophisticated suite of sensors designed to let it explore and adapt to its environment, coupled with advanced machine learning systems.

The robots are equipped with barometric sensors, which measure atmospheric pressure, as well as ultrasonic apparatus that lets them measure their distance from the ground, much like a bat. They also have twin cameras that work in much the same way as human eyes to judge the distance to objects in front of them, along with depth and perspective. By overlaying different frames from the cameras and selecting key reference points within the scene, each drone builds up a 3D map of the world around it.
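The article doesn't describe the team's actual vision software, but the principle behind the twin cameras can be sketched: the closer an object is, the further its image shifts horizontally between the left and right views (its "disparity"), and that shift converts directly into a distance. A minimal illustration, with the focal length and camera spacing as assumed example values:

```python
# Illustrative sketch only - not the Sheffield team's code. Shows the basic
# stereo-triangulation rule twin cameras rely on: depth is inversely
# proportional to the horizontal shift (disparity) of a point between views.

def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Estimate the distance to a point seen by both cameras.

    focal_length_px: camera focal length in pixels (assumed example value)
    baseline_m: spacing between the two cameras in metres (assumed)
    disparity_px: horizontal pixel shift of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("point must be visible in both views with positive disparity")
    return focal_length_px * baseline_m / disparity_px

# A point shifted 40 px between views, seen by cameras with a 700 px focal
# length mounted 10 cm apart, sits roughly 1.75 m away.
print(depth_from_disparity(700, 0.10, 40))
```

Repeating this for many matched reference points across successive frames is what lets the drone assemble its 3D map of the scene.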

The drones use all of this information to "learn" about their immediate environment, building up a sophisticated picture of their surroundings. In this way, drones can navigate complex situations even when they've never encountered them before.


"We are used to science fiction robots being able to act independently, recognise objects and individuals and make decisions," said Professor Sandor Veres, who headed the team.

"In the real world, although robots can be extremely intelligent individually, their ability to co-operate and interact with each other and with humans is still very limited."

Another key task for these robots is to interact and co-operate with each other without overloading communications networks – a vital ability in emergency situations, where those networks will already be under strain.

Programming developed by the team enables the quadcopters to work out how to 'politely' fly past each other without colliding. The robots start off flying at the same altitude and must then collaborate to decide which will fly higher and which lower so that they can pass safely.

"The learning process the robots use here is similar to when two people meet in the street and need to get round each other," explains ACSE research fellow, Dr Jonathan Aitken. "They will simultaneously go to their left or right until they coordinate and avoid collision."
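The pedestrian analogy above can be sketched as a simple symmetry-breaking loop – a hypothetical illustration, not the team's published controller. Each drone repeatedly proposes a vertical direction; while both propose the same one (mirroring each other like two people sidestepping the same way), they try again, and as soon as their choices differ each commits to its direction:

```python
# Hypothetical sketch of the "polite" pass-by idea, assuming each drone can
# randomly propose a direction and observe the other's proposal. Not the
# Sheffield team's actual algorithm.

import random

def negotiate_pass(rng, max_rounds=100):
    """Return (drone_a_dir, drone_b_dir) once the two drones pick opposite
    vertical directions, so one climbs while the other descends."""
    for _ in range(max_rounds):
        a = rng.choice(["up", "down"])
        b = rng.choice(["up", "down"])
        if a != b:  # symmetry broken: the conflict is resolved
            return a, b
    raise RuntimeError("drones failed to coordinate within the round limit")

a, b = negotiate_pass(random.Random(0))
print(a, b)  # one drone climbs, the other descends
```

Because each retry breaks the tie with probability one half, the deadlock resolves after only a couple of rounds on average, and no central controller or heavy network traffic is needed.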

The team hopes that the new breed of robots will be able to act in rescue and cleanup operations in environments hostile to humans, such as disaster zones or the sites of nuclear accidents.

"As we develop robots for use in space or to send into nuclear environments – places where humans cannot easily go – the goal will be for them to understand their surroundings and make decisions based on that."

"These simple tasks are part of a major research effort in the field of robotics at Sheffield University," says Professor Veres. "The next step is to extend the programming capability so that multiple robots can collaborate with each other, enabling fleets of machines to interact and collaborate on more complex tasks."

They aren't the only ones excited by the prospect of robotics. This summer, the US Navy began testing its humanoid SAFFiR robot, designed to help fight fires in the restrictive environments found in the bellies of ships.


Read more: Nuclear test sites put the next generation of UK robots to the test