Panopticon

Surveillance cameras can now recognize and identify faces and track people as they move. But they still look the same as ever — in fact, they’ve gotten smaller and harder to spot — so many people fail to recognize the looming surveillance panopticon around them.

That’s the argument of an essay published Tuesday in The Conversation by William Webster, a privacy researcher at the University of Stirling. He argues that the public must be involved in meaningful conversations before any such surveillance tech is deployed, in order to make the whole process more transparent and ethical.

Bit Late, Though

With police body cameras, conventional security cameras, drones, and other cameras with built-in facial recognition software already in use, it’s a bit too late to involve the public in advance.

But Webster says meaningful legislation and regulations could still be written to keep things from getting out of hand and to keep the public in the loop as new AI systems and ever-smaller, better-hidden cameras are developed.

“The direction of travel for surveillance cameras does not need to be towards a defined technological determinism where it inevitably becomes more and more intrusive,” Webster writes.

READ MORE: Surveillance cameras will soon be unrecognisable – time for an urgent public conversation [The Conversation]

More on AI: AI-Aided Video Surveillance Will Watch And Silently Judge Us