Amid concerns about the spread of radiation from Japan's damaged nuclear reactors and the lingering effects of the year-old Deepwater Horizon oil spill, MIT researchers have developed a new algorithm that enables sensor-laden robots to focus on the parts of their environments that change most frequently, without losing track of the regions that change more slowly.

The algorithm is designed for robots that will be monitoring an environment for long periods of time, tracing the same routes over and over.

The algorithm assumes that the data of interest (temperature, the concentration of chemicals, the presence of organisms, and so forth) fluctuate at different rates in different parts of the environment. In ocean regions with strong currents, for instance, chemical concentrations might change more rapidly than they do in more sheltered areas.

The algorithm assumes that researchers already have a mathematical model of the rates at which conditions change in different parts of the environment. The algorithm simply determines how the robots should adjust their velocities as they trace their routes. For instance, given particular rates of change along a route, would it make more sense to make one pass in an hour, slowing down considerably in areas of frequent change, or to make four or five passes, collecting less detailed data but taking more regular samples?
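The trade-off described above can be sketched in a few lines. This is an illustrative assumption, not the researchers' actual formulation: it splits a single pass's time budget across route segments in proportion to each segment's length and local rate of change, so fast-changing stretches get slower, denser sampling. All function and variable names here are hypothetical.

```python
# Hypothetical sketch: allocate one pass's time budget along a fixed route
# so that segments whose data change fastest are sampled most densely.
# The proportional-allocation rule is an illustrative assumption.

def allocate_dwell_times(segment_lengths, change_rates, pass_time):
    """Split the pass's time across segments in proportion to
    (segment length x local rate of change)."""
    weights = [length * rate
               for length, rate in zip(segment_lengths, change_rates)]
    total = sum(weights)
    return [pass_time * w / total for w in weights]

def segment_speeds(segment_lengths, dwell_times):
    """Speed on each segment implied by its time allocation."""
    return [length / t for length, t in zip(segment_lengths, dwell_times)]

lengths = [100.0, 100.0, 100.0]   # metres per segment
rates = [0.5, 2.0, 0.5]           # relative rates of change
times = allocate_dwell_times(lengths, rates, pass_time=600.0)
speeds = segment_speeds(lengths, times)
```

Under this rule, the middle segment, which changes four times as fast as its neighbors, receives four times as much of the robot's time, so the robot moves through it at a quarter of the speed.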

Although the researchers’ algorithm is designed to control robots’ velocity, the first robots on which it was tested don’t actually have velocity controllers. Researchers at the University of Southern California have been studying harmful algal blooms using commercial robotic sensors designed by the Massachusetts company Webb Research. Because the sensors are intended to monitor ocean environments for weeks on end, they have to use power very sparingly, so they have no moving parts.

Each sensor is shaped like an airplane, with an inflatable bladder on its nose. When the bladder fills, the sensor rises to the surface of the ocean; as the bladder empties, the sensor glides downward.

The more rapidly the bladder fills and empties, the steeper the sensor’s trajectory up and down, and the longer it takes to traverse a given distance — so it’s possible to concentrate the sensor’s attention in a particular location.

Working with colleagues in the USC computer science department, the MIT team developed an interface that allows ocean researchers to specify regions of interest by drawing polygons around them on a digital map and indicating their priority with a numerical rating.
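One way such polygon-and-priority annotations could be turned into weights on sample locations is a standard point-in-polygon test; the sketch below is an illustration of the idea, not the USC/MIT interface's actual code, and all names in it are assumptions.

```python
# Illustrative sketch: mapping a sample location to the priority of the
# hand-drawn region containing it, using the standard ray-casting test.

def point_in_polygon(x, y, polygon):
    """Ray-casting point-in-polygon test; polygon is a list of (x, y) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def sample_priority(x, y, regions, default=1.0):
    """Highest priority of any region containing the point."""
    hits = [priority for polygon, priority in regions
            if point_in_polygon(x, y, polygon)]
    return max(hits) if hits else default

# One high-priority square region in map coordinates, rated 5:
regions = [([(0, 0), (10, 0), (10, 10), (0, 10)], 5.0)]
```

A trajectory planner could then score candidate paths by the total priority-weighted time spent in each region, favoring paths that linger where the ratings are high.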

The new algorithm then determines a trajectory for the sensor that will maximize the amount of data it collects in high-priority regions, without neglecting lower-priority regions.

The algorithm currently depends on either a prior estimate of rates of change for an environment or researchers’ prioritization of regions. But in principle, a robotic sensor should be able to deduce rates of change from its own measurements, and the MIT researchers are currently working to modify the algorithm so that it can revise its own computations in light of new evidence.
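A minimal sketch of the kind of online update described here, assuming an exponentially weighted running estimate (the update rule and names are illustrative, not the researchers' method): each time the sensor revisits a spot, it blends the newly observed change per unit time into its stored rate estimate, which a velocity plan could then use on the next pass.

```python
# Hypothetical sketch: revising a region's rate-of-change estimate from
# repeated measurements. The exponentially weighted rule is an assumption.

def update_rate_estimate(old_rate, prev_value, new_value, dt, alpha=0.3):
    """Blend the observed change-per-unit-time into the running estimate.

    alpha controls how quickly new evidence overrides the prior estimate.
    """
    observed = abs(new_value - prev_value) / dt
    return (1 - alpha) * old_rate + alpha * observed

# Example: a region assumed static (rate 0.0) shows a temperature change
# of 2 degrees over 2 hours, nudging the estimate upward.
revised = update_rate_estimate(0.0, prev_value=10.0, new_value=12.0, dt=2.0)
```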

Their research will be presented at the Institute of Electrical and Electronics Engineers’ International Conference on Robotics and Automation in May.