“It’s really strange how many people put gun rights forward as the reason they stand against gun control legislation, but will in the same breath advocate for increased presence of facial recognition cameras and body scanners and gait recognition tools, as if that somehow weren’t an imposition on our protection against unreasonable search and seizure,” says Damien Williams, a technology ethicist at Virginia Tech. “It seems to indicate that folks are perfectly willing to let go of some rights in the name of safety, just not the rights that would actually help curtail the danger at its source.”

The tradeoffs between surveillance and safety aren’t new. Americans have worried for decades about phone tapping and video surveillance carried out by the state in the name of protection. After 9/11, plenty of ink was spilled about these tradeoffs in the context of airport security and the merits of what some scholars call “surveillance theater.” But most of these conversations center on surveillance by the state: the US government tapping your phones, or watching you on camera, or scanning your face in an airport. What Lockport and Taylor Swift are doing is almost the opposite, choosing to surveil their students and fans in the name of safety precisely because the state won’t step in to help, perhaps because they have no faith that it ever will. “I feel like I can’t go anywhere and feel like I’m safe. I walk into a library, I don’t feel safe. I walk into a yoga studio, I don’t feel safe. Walk into a bar, don’t feel safe,” FSU student Ellie Gensch told WCTV after a shooting in California.

Polling suggests that in the past few years, Americans have become more worried about safety and less worried about privacy. Researchers are hesitant to ascribe any one cause to shifts like this, but I don’t think it’s absurd to suggest that the change is at least in part due to the continued string of mass shootings in the United States. Amid a backdrop of constant violence, what’s a little light surveillance? The threat of ubiquitous surveillance seems less real to many people than the threat of being gunned down at work, at a bar, at school, in a hospital, at a concert, or, really, anywhere.

And companies recognize this feeling. The makers of the facial recognition systems being sold to schools across the country are capitalizing on it, and so are the makers of Nest cameras and Ring doorbells, and companies like Flock Safety, which sells outdoor security cameras to homeowners associations. “We’ve seen that there are tech firms that are preying on the fears of school districts to get them to purchase technology that will not work and they do not need,” says Stefanie Coyle, education counsel at the New York Civil Liberties Union. This is a booming area of business because, even though these cameras don’t necessarily make anybody safer, they make people feel that something, at least, is being done in the face of all this danger. And the “something is better than nothing” argument used to justify these surveillance systems is a dangerous one, not just because it erodes privacy, and not simply because it offers a false sense of security, but because these systems disproportionately target nonwhite bodies.

Surveillance systems like the ones set up in Lockport and at the Rose Bowl rely on having “known actors” to identify: they look for faces that match a database. Those databases have to come from somewhere, and they are usually populated by mugshots from law enforcement. In the case of Lockport, the system will supposedly be “used to alert school officials if anyone from the local Sex Offenders Registry enters a school or if any suspended students, fired employees, known gang members or an affiliate enters a school.” When I asked KC Flynn, the head of SN Technologies, which makes the Lockport system, where those “known gang members” might come from, he told me that it was up to individual schools to load those names in.