Flip a coin.

If you’re an American adult, that represents the odds that your photo has been enrolled in a law enforcement face recognition database, allowing you to be identified and tracked as you walk down the street, attend a protest, or visit a rehab center.

This isn’t speculation: It’s the result of a year-long investigation into police use of face recognition technology, published last October by researchers at Georgetown University. In at least 26 states, the report found, merely having a state-issued driver’s license or photo ID allows police to remotely search for and identify your face from photos taken on the street or posted to social media – without a warrant or any court’s supervision. Sixteen states also make their residents’ ID photos available to the FBI, whose own face recognition databases now contain more than 411 million face images. And unlike databases of more traditional biometrics like fingerprints and DNA, which are built from criminal suspects, these face databases consist overwhelmingly of innocent Americans.

In other words, the ability of police and government agents to identify and track people in public is no longer science fiction in 2017. And with the arrival of the Trump administration, ubiquitous face recognition represents a nightmare scenario for activists, journalists, immigrants, and other groups that could be targeted by Trump’s unpredictable hostilities.

Although it’s impossible to say what he’ll do as President, that scenario now looks more plausible than ever. Thanks to the Obama administration’s entrenchment and expansion of surveillance programs started under George W. Bush in the aftermath of 9/11, Trump will inherit the most powerful surveillance machine ever built — and face recognition will almost certainly be a key component.

“I take no pleasure in saying this, but we’ve been warning for the past 8 years that the Orwellian surveillance programs instituted under the Obama administration could fall into the hands of someone that people may trust less,” said Kade Crockford, the director of the ACLU of Massachusetts’ Technology For Liberty program. “We now find ourselves in exactly that situation.”

Trump’s transition team includes multiple people with a financial interest in making pervasive face recognition a reality. Among them is Michael Dougherty, who recently left his position as CEO of the Secure Identity & Biometrics Association, a lobbyist group that represents companies developing face recognition technologies. Also on Trump’s team is John Sanders, who previously served as the TSA’s Chief Technology Officer and now serves on the board of Evolv, a company that uses face recognition and other technologies to provide “real-time, modern threat detection and prevention.”

Police departments across the country have been using similar biometric technology for years, usually without any public notice. In San Diego and elsewhere, secretive police programs have equipped officers with special smartphones designed to capture the faces, irises, and sometimes DNA swabs of people stopped on the street. The devices then ping biometric databases for a match, returning any identifying information or criminal records. Body cameras – originally pitched as a police transparency tool in response to countless high-profile killings of unarmed black people – are also being rapidly adopted, and increasingly support face recognition.

Clare Garvie, an associate with Georgetown University’s Center on Privacy and Technology and a co-author of the recent report, warns there are still no laws limiting these face recognition searches. As a result, police across the country now have the capability to indiscriminately identify people on the street — with or without their knowledge or consent.

“This is concerning when you consider the unique quality of face recognition as a remote biometric search and identification tool,” Garvie told Vocativ. “You may not know a search has happened. And it can be done not just on someone suspected of a particular crime, but on a crowd of people.”

From there, it becomes easy to map out where all this could be headed. With massive databases of faces freely available to police and government agencies, the thousands of CCTV cameras strewn about American towns and cities start to become the nervous system of a massive dragnet surveillance network. Walking past a camera not only allows police to identify you at that specific time and place, but to track your face as it appears across different times and locations.

That surveillance data can then be combined with social media monitoring tools like the CIA-backed Geofeedia, or with stingrays – fake cell phone towers frequently deployed in secret by police – to capture an even more precise portrait of people’s activities and associations.

For an authoritarian-leaning leader like Trump who has a track record of punishing and discrediting his perceived enemies, the possibilities of massive-scale face recognition are endless. A well-known political organizer can be spotted exiting a rehab center or abortion clinic, and subsequently accosted by an army of internet trolls; a journalist’s confidential sources can be easily identified while meeting her at a coffee shop; a protester can be followed back from a demonstration and intimidated by police to stay home next time, if they know what’s good for them.

In Communist East Germany, the infamous Stasi secret police once accomplished this to a much lesser degree using large human networks of informants. Now, with powerful algorithms and ubiquitous cameras, it’s theoretically possible for the ambient physical monitoring of entire populations to be completely automated in any oppressive regime.

In the U.S., there’s virtually nothing legally preventing police from using face recognition searches to conduct this kind of “sustained monitoring” right now, said Garvie. The only significant hurdles are the technical limitations of “legacy” surveillance cameras, which can’t achieve the resolution necessary to clearly and consistently detect and track faces. Computer vision experts say deploying a dependable real-time face recognition system at scale would also be complicated by various other technical factors, like lighting conditions, facial occlusion, and the efficiency of the algorithms being used.
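The resolution problem is easy to see with a back-of-envelope calculation. Using a simple pinhole-camera model (all figures below are illustrative, not drawn from the Georgetown report), a face only spans a handful of pixels on an older low-resolution, wide-angle camera – far fewer than the tens of pixels detection algorithms generally need:

```python
import math

def face_pixels(sensor_px_width, fov_deg, distance_m, face_width_m=0.16):
    """Roughly how many pixels a face spans horizontally in frame.

    Pinhole-camera approximation: the scene width visible at a given
    distance is 2 * d * tan(fov / 2); a face occupies its proportional
    share of that width in pixels. Numbers are illustrative only.
    """
    scene_width_m = 2 * distance_m * math.tan(math.radians(fov_deg) / 2)
    return sensor_px_width * (face_width_m / scene_width_m)

# A legacy 640x480 CCTV camera with a wide 90-degree lens, subject 10 m away:
print(round(face_pixels(640, 90, 10)))   # ~5 pixels across the face
# A modern 4K camera with a narrower 60-degree lens at the same distance:
print(round(face_pixels(3840, 60, 10)))  # ~53 pixels across the face
```

As the second case shows, upgrading the camera hardware, not just the matching software, is what closes this gap – which is why the “legacy” caveat is only a temporary hurdle.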

But while it’s not yet possible to deploy at a large scale, the Georgetown Law report found that five police agencies, including the Los Angeles Police Department, are currently experimenting with real-time face recognition systems. One system used by an intelligence “fusion center” in West Virginia, called “MXserver,” boasts the ability to ping a human operator whenever a face matching a “hot list” of flagged individuals walks by a camera.
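MXserver’s internals aren’t public, but the general “hot list” pattern such systems follow is straightforward: each camera frame is reduced to a numerical face embedding and compared against stored embeddings of flagged individuals, and a human operator is alerted when the similarity crosses a threshold. A toy sketch of that loop, with made-up IDs, vectors, and threshold:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face embeddings (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical watch list: flagged IDs mapped to stored face embeddings.
# Real systems use high-dimensional vectors from a trained network.
HOT_LIST = {
    "subject-042": [0.9, 0.1, 0.3],
    "subject-117": [0.2, 0.8, 0.5],
}

def check_frame(embedding, threshold=0.95):
    """Return hot-list IDs similar enough to trigger an operator alert."""
    return [sid for sid, ref in HOT_LIST.items()
            if cosine_similarity(embedding, ref) >= threshold]

# A face from a camera frame that closely matches subject-042's embedding:
print(check_frame([0.88, 0.12, 0.31]))  # ['subject-042']
```

The threshold is the whole game: set it low and operators drown in false alerts; set it high and matches are missed – a trade-off that gets harder as camera quality drops.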

“While real-time deployments aren’t very effective today, the technology is improving far faster than any effort to regulate its use in even the most pervasive ways,” says Garvie.

Frustrated with the slow pace of legal remedies, some have designed creative tricks to protect against unaccountable face recognition. Berlin-based artist Adam Harvey, who previously designed the CV Dazzle makeup patterns that let wearers hide from face recognition algorithms, recently announced a new anti-face recognition project called HyperFace. Rather than trying to prevent algorithms from detecting a face, the camouflage pattern overwhelms them with decoys: printed template images that register more strongly as a “face” than the wearer’s actual face.

But solutions like these will always be a cat-and-mouse game. The patterns must be designed to defeat specific face detection algorithms, and eventually those algorithms will be replaced by new and improved versions, sending technologists back to the drawing board.

Crockford says that while these creative interventions are cool, the immediate solution to fighting unchecked surveillance will be to fight back against the secret adoption of tools like face recognition at the local level. That will involve journalists and citizens working to uncover the surveillance technologies being quietly bought by their local police before they are adopted. And much like with the concept of “sanctuary cities,” citizens can also petition to stop their local governments from sharing surveillance data with federal agencies – whether that data is face photos from state-issued IDs, iris scans, or something else.

“Clearly this doesn’t mean that the game is over for privacy. Our power is reasonably well-distributed,” says Crockford. “What we really need to do is start approaching police surveillance at the local level differently. We can’t have 15 years of civil liberties violations before it goes to a court.”

What’s the worst that can happen? This week, Vocativ explores the power of negative thinking with our look at worst case scenarios in politics, privacy, reproductive rights, antibiotics, climate change, hacking, and more.