Since February, with relatively little scrutiny, Orlando leaders have been experimenting with a powerful new technology that could make our city one of the first in the country to run a real-time mass surveillance program. Much has been written about the free partnership with Amazon through which the Orlando Police Department has been testing Rekognition, a facial recognition software whose capabilities and usage are unprecedented among American law enforcement agencies. But few have explored exactly how the software works and how deeply it will reach into our lives.

The program is currently being tested on a small number of cameras, but if and when it becomes fully operational, city officials hope Rekognition will help them catch criminals, find missing children and identify threats by essentially turning more than a hundred public street cameras into constant face-scanning machines.

In fact, once Rekognition becomes fully operational, the only constraint will be the bandwidth needed to run the program on dozens, or even hundreds, of cameras at once – not any lack of desire on the city's end.

"I can activate as many cameras as I need," says Rosa Akhtarkhavari, Orlando's chief information officer and the driving force behind Rekognition inside City Hall. "But my problem is I need to have that bandwidth to move up. Am I envisioning that we are going to stream all our hundreds of cameras? I don't think that's going be [financially viable]. ... When we need it, we can activate it."

The program will work like this: Orlando Police upload the photo of a "person of interest" into a database. Rekognition analyzes the person's face and converts it into a unique biometric template. Using that template, the software taps into Orlando's network of surveillance cameras and looks for a match. If it finds one, it alerts police.
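The matching step at the heart of that pipeline can be sketched in code. The snippet below is a minimal illustration, not Amazon's actual implementation: it assumes each detected face has already been reduced to a numeric feature vector (an "embedding"), and it shows how a watchlist match might be decided by comparing vectors against a similarity threshold. All names, values and the threshold are hypothetical.

```python
# Minimal sketch of a face-matching pipeline (hypothetical, not Amazon's code).
# Real systems derive face embeddings with a neural network; here they are
# just small hand-written vectors standing in for that step.
import math

def similarity(a, b):
    """Cosine similarity between two face embeddings (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def find_matches(watchlist, camera_faces, threshold=0.9):
    """Compare every face seen on camera against every 'person of interest'.

    watchlist:    {name: embedding} built from uploaded photos
    camera_faces: list of embeddings scanned from the video feed
    Returns (name, face_index) pairs scoring above the threshold --
    the matches that would trigger an alert to police.
    """
    alerts = []
    for name, target in watchlist.items():
        for i, face in enumerate(camera_faces):
            if similarity(face, target) >= threshold:
                alerts.append((name, i))
    return alerts

# Toy example: two faces pass the camera; only one resembles the target.
watchlist = {"person_of_interest": [0.9, 0.1, 0.4]}
feed = [[0.1, 0.9, 0.2],      # a bystander -- scanned, then discarded
        [0.88, 0.12, 0.41]]   # close to the target embedding
print(find_matches(watchlist, feed))  # -> [('person_of_interest', 1)]
```

Note that the bystander's face is scanned and scored just like the target's; it is only discarded after the comparison. That is the property civil liberties advocates object to below.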

Before identifying that "person of interest," however, Rekognition might scan you.

The scan will probably last less than a second. After determining you aren't a match, it will scan the person strolling behind you. And the couple walking their dog across the street. And the business partners headed back to work from lunch. Without being explicitly told, Rekognition will scrutinize everyone around you until it finds a possible match. Amazon, in fact, says its software can detect up to 100 faces in "challenging crowded photos."

Orlando hasn't fully unleashed this technology yet. But city officials see the surveillance software as a tool to keep residents safe – and certainly as nothing out of the ordinary. After all, they say, facial recognition is everywhere. People use it to unlock their phones. Airports screen passengers to verify their identities. Facebook employs it to alert users when someone posts a photo of them.

"Facial recognition technology is not new," city staff wrote in a July 6 memo to Mayor Buddy Dyer and the City Council. "In fact it has become a relatively normal occurrence in our daily lives."

Even so, civil liberties advocates warn that Orlando and Amazon are sailing into dangerous, uncharted waters.

The way Orlando plans to use Rekognition – as real-time surveillance – is completely different from how other law enforcement agencies are commonly using facial recognition, says Jennifer Lynch, a senior staff attorney with the Electronic Frontier Foundation. Every person who passes in front of a camera is scanned, regardless of prior suspicion of a crime. And multiple studies have shown facial recognition systems make mistakes – particularly on dark-skinned women. Misidentification could lead to innocent people being wrongfully detained by police for crimes they didn't commit, or worse, trigger confusion that leads to a deadly encounter.

What's more, although Orlando has been testing Rekognition for months, city officials still haven't developed rules for the program – though they've promised to do so if it moves forward. And right now, there are no state or federal laws to regulate the use of this technology, which has led some tech companies, including Microsoft, and even police departments that use facial recognition to voice concerns about the potential abuse of this software in the hands of the government.

Privacy advocates, meanwhile, worry about it becoming a slippery slope into a dystopian world of government surveillance.

"Everybody is a suspect," Lynch says. "If facial recognition is becoming a new norm in the way Orlando is talking about, then we should all be very afraid."