
A New York school district plans to install facial recognition software in schools to boost security. But the state’s Education Department and advocacy groups are still concerned about how that would affect the privacy of students, parents and teachers.

CNN affiliate WKBW reported that the Lockport City School District will begin implementing the Aegis facial recognition system in schools on June 3 as part of a testing phase. The district hopes to have the system ready by September 1, according to the station.

The district told WKBW that the software would only recognize and store data for sex offenders, suspended students, staff members who were terminated or are on leave, people prohibited from the school district’s property and anyone believed to pose a credible threat. Video would be stored for 60 days before being erased from the server.

The school district’s policy is just one measure in an effort to ramp up security that includes armed guards in several buildings and an increase in the number of social workers and behavior intervention specialists, according to WKBW.

CNN has reached out to the Lockport City School District but has not yet received a response.

The New York State Education Department said in a statement to CNN that it had not yet approved the district’s use of the software.

“We have made it clear, the Department has not approved the testing of the system planned for next week and we told the District not to commence the testing of the technology until we receive information that assures us that student information will be properly protected,” the department said.

And Jim Shultz, whose daughter is a sophomore in the town’s high school, told CNN that the community is largely calling the software “a stupid waste” and “Big Brother in real life.”

“I know people’s politics pretty well at this point and it is the one issue that joins Trump Republicans and liberal Democrats,” he said.

Rights groups raise alarm about the technology

The New York Civil Liberties Union said it was concerned that the software could perpetuate racial profiling and biases, saying that the system would compare faces to police databases that “have large numbers of minorities arrested from profiling.”

“Facial recognition technology is widely inaccurate and has difficulty identifying women and people of color, and there are so many unanswered questions about the technology,” Stefanie Coyle, education counsel for the NYCLU, told CNN. “Why would we use children as guinea pigs?”

NYCLU officials also said they were concerned that data from the technology could be shared with immigration officials, potentially threatening people at risk of deportation.

“It just stands to reason if you are putting every child or parent’s face into a law enforcement or immigration database just by going to school, it creates a disincentive to attend,” added NYCLU Executive Director Donna Lieberman. “It turns into a law enforcement tool and that’s totally inexcusable.”

Shultz said the technology is costly and that the purchase was largely driven by a salesman.

“I think the district was motivated by a well-meaning desire to keep our students safe, and then got bamboozled by a salesman. The cost is $2.7 million (in a district with just 4,800 students) for a system that only works if you know in advance who a school shooter will be, put his photo in a database, and hope he doesn’t buy a $10 ski mask. It just makes no logical sense,” he said.

‘It’s very alarming’

It is unclear whether information from the cameras would be shared with immigration authorities.

Last June, the NYCLU wrote a letter to the district and the state Education Department asking for more information about the plans and policies surrounding the implementation of the facial recognition software.

Coyle said that the district’s response left them with more questions than answers.

“Some of the information even included passwords that we shouldn’t be able to see,” she said. “If they can’t even respond to a routine FOIA request, how will they protect biometric data of children? It’s very alarming.”

Officials from the NYCLU said that they were unsure of the next steps the organization would take but that they planned to fight the policy.

New York state legislators introduced a bill in March that would prevent schools from using facial recognition technology in the 2019-2020 school year and require further study. The bill is currently in committee in both chambers.