This fall, as students file into Lockport City schools in upstate New York, they will be watched not just by teachers. For the first time in the district’s history, students will also be monitored by a sophisticated new surveillance system that scans their faces, looking for matches in the school’s security database.

Lockport’s facial recognition program has become both a local controversy and a national test case at the forefront of a wave of similar systems rolling out in American schools. To install its system, the Lockport school district was awarded $4 million through the Smart Schools Bond Act, a New York State fund. While most other schools in the state applied for funding to update computer labs or digitize books, Lockport specifically requested funds for “new cameras and wiring…to provide viewing and automated facial and object recognition of live and recorded surveillance video,” plus “additional surveillance servers…to provide enhanced storage of recorded video and processing,” according to the grant application.

It might sound like dystopian science fiction, but this could be the not-too-distant future for schools across America and beyond. Researchers at the University of California, San Diego, for instance, have already begun publishing models for how to use facial recognition and machine learning to predict student engagement. A Seattle company recently offered up an open-source facial recognition system for use in schools, while startups are already selling “engagement detectors” to online learning courses in France and China. Advocates for these systems believe the technology will make for smarter students, better teachers, and safer schools. But not everyone is convinced this kind of surveillance apparatus belongs in the classroom, that these applications even work, or that they won’t unfairly target minority faces.

Here’s what SN Technologies’ vision for Aegis looks like: A school using the platform installs a set of cameras high-quality enough to detect individual student faces, then determines exactly which faces should set off the system. Crucially, it’s up to each school to supply these biometrics, which it might source from local police and mug-shot databases, or from school images of former students it doesn’t want on its premises. With those faces loaded, Aegis goes to work, scanning each face it sees and comparing it against the school’s database. If no match is found, the system throws that face away. If one is found, Aegis sends an alert to the control room.
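In pseudocode terms, the loop SN Technologies describes is a simple compare-alert-or-discard cycle. The sketch below is purely illustrative, assuming faces are reduced to numeric feature vectors and matched by distance; the function names, threshold, and watchlist entries are all hypothetical, not drawn from Aegis itself.

```python
# Illustrative sketch of the described match-or-discard loop.
# "Embeddings" stand in for whatever biometric encoding the real
# system computes; all names and thresholds here are hypothetical.

import math

def euclidean(a, b):
    """Distance between two face-embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def screen_face(embedding, watchlist, threshold=0.6):
    """Compare one detected face against the school's database.

    Returns the matched identity (an alert for the control room),
    or None -- in which case the face is discarded, not stored.
    """
    best_name, best_dist = None, float("inf")
    for name, known in watchlist.items():
        dist = euclidean(embedding, known)
        if dist < best_dist:
            best_name, best_dist = name, dist
    if best_dist <= threshold:
        return best_name   # match: alert the control room
    return None            # no match: throw the face away

# Hypothetical watchlist, loaded by the school itself.
watchlist = {"expelled-student": [0.1, 0.9, 0.3]}

print(screen_face([0.12, 0.88, 0.31], watchlist))  # close match: alert
print(screen_face([0.9, 0.1, 0.5], watchlist))     # stranger: discarded
```

Note that everything hinges on the threshold and the watchlist: a looser threshold means more false positives for the control room to adjudicate, which is exactly the failure mode critics raise below.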

Earlier this year, the school district announced it would be using tech developed by SN Technologies Corp., the Canadian company behind Aegis, a surveillance platform that comes with both facial recognition software and a tool designed to flag guns that might appear on the camera footage (provided the firearm is in someone’s hand, not in a bag). In the wake of high-profile mass school shootings across the US, Lockport, a small, conservative town of around 20,000 people, has invested in Aegis out of a belief the facial recognition system will help safeguard students, even though there’s no evidence that such a system would be an effective security measure in an active shooter scenario. As this issue went to press, KC Flynn, the head of SN Technologies, told me that 20 other US school districts were considering moving forward with Aegis.

The idea is that the school could get an extra few seconds of warning when an unwanted person arrives on campus, whether that’s an expelled student or an escaped felon. But critics of the system point out that the vast majority of school shooters are enrolled students—individuals who probably wouldn’t be in the facial database.

The New York branch of the American Civil Liberties Union (NYCLU) is worried about just that. It’s well known that facial recognition systems are often biased in ways that disproportionately affect people of color. It’s also unclear what biometric database SN Technologies uses to train the Aegis system to detect faces. Previous cases have shown how much training data shapes the accuracy of these systems: in a recent ACLU test of Amazon’s facial recognition tool, the software incorrectly pegged 28 members of Congress (most of them people of color) as criminals on the basis of images from a mug-shot database. Flynn declined to comment on how Aegis was developed, citing the proprietary nature of the software.

“It is cutting edge,” Dr. Robert LiPuma, the director of technology for the district, told the Lockport Union-Sun & Journal in March. (The district did not respond to repeated requests for comment for this story.) When it comes to school security, LiPuma said Lockport hopes to be “a model.”

To implement its system, Lockport is installing or updating a total of 417 cameras, according to an excerpt of a contract outlining the district’s planned implementation of the surveillance tech (with SN Technologies as a subcontractor, in this case), obtained by Lockport parent Jim Shultz and shared with VICE. The network will cover six elementary schools, one middle school, one high school, and an administrative building.

“The serious lack of familiarity with cybersecurity displayed in the email correspondence we received and complete absence of common sense redactions of sensitive private information speaks volumes about the district’s lack of preparation to safely store and collect biometric data on the students, parents and teachers who pass through its schools every day,” an editor’s note to the NYCLU’s statement on the Lockport documents reads.

Hundreds of documents related to Lockport’s new surveillance program, obtained by the NYCLU in late August through a Freedom of Information Law request, suggest that Lockport did not engage with the community before deciding to move ahead with installing the surveillance network. They also show that a security consultant who taught Lockport’s board about the tech, and was later hired by the district, holds licensing for Aegis through a separate company, CSI. The NYCLU found nothing in the documents outlining policies for who may access data collected by the cameras, or for what faces would be fed to the system in the first place. And based on emails acquired through the same FOIL request, the NYCLU noted, Lockport administrators appeared to have a poor grasp of how to manage access to internal servers, student files, and passwords for programs and email accounts.

In theory, the safeguard against a student of color being misidentified as a felon, for example, is that whoever is in the control room must confirm that a match is indeed correct and not a false positive. That may not be so simple, especially if the security worker is white. And what happens once the system triggers an alert is up to each school to decide.

The Aegis website offers little information about how the system actually works, either. It describes the facial recognition tool as something that “will be used to alert school officials if anyone from the local Sex Offenders Registry enters a school or if any suspended students, fired employees, known gang members or an affiliate enters a school.” As to where such a database of “known gang members or an affiliate” would come from, Flynn said Aegis doesn’t come with preloaded faces, so it’s on the individual school to provide the system with whatever biometrics it thinks should be registered. Individual schools also get to select the duration of data storage, though in most cases, Flynn said, the system won’t be saving individual faces as it scans students moving about the school. Rather, it will attempt to match each one against those registered in the system, discarding any face that turns up no match.

Of course, if a school wanted to put every student’s face in the system to track throughout the school year, theoretically, it could. “That hasn’t been my experience,” Flynn noted, when I raised that possibility. “That’s not how we package the system.”

Meanwhile, Jim Shultz, whose daughter currently attends Lockport High School, has been trying to organize parents to rally against the system. He sees it as not only an invasion of privacy, but a waste of money for a district of around 4,500 students. Of the original $4 million Smart Schools grant, Lockport has spent over $3 million to date, putting its per-pupil spending on the tech at over $550. When Shultz tried to voice his concerns to the school administration and a security consultant working with the district, he told me, the board seemed not to take him seriously.

“Students should think of schools as a welcoming place to learn,” Coyle, of the NYCLU, added. “They shouldn’t have to worry that their every single move is being monitored, and their pictures are going to wind up in some kind of law enforcement or immigrant database just because they decided to come to school today.”

In Lockport, school security officers will be responsible for watching the cameras in a surveillance room, according to Flynn. At any other school, it’s still anyone’s guess who will have access to the surveillance system. This, in turn, leads the NYCLU to wonder whether undocumented students and their parents risk being flagged and turned over to US Immigration and Customs Enforcement for deportation. To complicate matters further, schools can each establish their own protocols and decide for themselves who can access the information. Without knowing how long this data is stored, and by whom, it’s hard to evaluate the potential security risks. It’s also currently unclear whether students, for their part, will be allowed to opt out of facial scanning. In the US, biometric data from students of any age falls under the Family Educational Rights and Privacy Act (FERPA), a law meant to protect the privacy of student education records. But if the surveillance system is controlled by law enforcement, and not the school, then FERPA doesn’t apply.