Imagine you had the ultimate surveillance system: a network of sensors on the ground and hovering overhead. All that spying power is useless, however, if you don't know where to point your cameras.

That seems to be the idea behind Persistent Surveillance Automation, an effort by the Office of Naval Research to automate sensor networks to give commanders a better understanding of a battlespace. ONR last week issued a broad agency announcement seeking concepts for the system, which would task surveillance assets so they focus autonomously on areas of interest.

The idea of "persistent stare" is not new: think of the JLENS (pictured here), a sophisticated sensor system that can be perched on towers or on blimps to provide wide-area surveillance. But as unmanned surveillance systems proliferate, commanders will be faced with an overwhelming amount of data. A more autonomous sensor network would – in theory – help provide a more integrated and coherent picture.

It would also do more with less. According to the ONR solicitation, this system would use a combination of hardware and software to "optimize the information within a region constrained by mission and resources, automatically analyze sensor data and associate this data to entities for further classification. It is desired that the system communicate using a simple and specific tasking language allowing the integration of multiple disparate sensors and platforms."

At stake is an initial $750,000 in research funds for Fiscal Year 2010; ONR may issue up to two awards. Concept papers are due on August 7.

[PHOTO: U.S. Army]
