"The use of facial recognition in schools creates an unprecedented level of surveillance and scrutiny," says John Cusick, a fellow at the Legal Defense Fund. "It can exacerbate racial disparities in terms of how schools are enforcing disciplinary codes and monitoring their students."

Glaser, who says he is a “card-carrying member of the ACLU,” is all too aware of the risks of facial recognition technology being used improperly. That’s one reason, in fact, why he decided to release SAFR to schools first. “In my view, when you put tech in the market, the right thing to do is to figure out how to steer it in good directions,” he says.

“I personally agree you can overdo school surveillance. But I also agree that, in a country where there have been so many tragic incidents in schools, technology that makes it easier to keep schools safer is fundamentally a good thing.”

RealNetworks began developing the technology underpinning SAFR shortly after Glaser returned from a three-year hiatus. He hoped to reinvent the company, a pioneer of the PC age, to compete in the mobile and cloud computing era. RealNetworks’ first major product launch with Glaser back at the helm was a photo storage and sharing app called RealTimes. Initially, the facial recognition technology was meant to help the RealTimes app identify people in photos. But Glaser acknowledges that RealTimes “was not that big a success,” given the dominance of companies like Google and Facebook in the space. Besides, he was beginning to see how the technology his team had developed could be used to address a far more pressing and still unsolved problem.

Glaser approached the administrators at his children’s school in Seattle, University Child Development School, which had just installed a gate and camera system, and asked if they might try using SAFR to monitor parents, teachers, and other visitors who come into the school. The school would ask adults, not kids, to register their faces with the SAFR system. Once registered, they’d be able to enter the school by smiling at a camera at the front gate. (Smiling tells the software that it’s looking at a live person and not, for instance, a photograph.) If the system recognizes the person, the gates automatically unlock. If not, they can enter the old-fashioned way, by ringing the receptionist.
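The gate flow described above amounts to a simple decision procedure: check for a smile as a liveness cue, then match the face against the registered adults. A minimal sketch, with trivial stand-in checks rather than SAFR's actual computer-vision pipeline:

```python
# Illustrative sketch of the front-gate flow described in the text.
# The visitor dict and detection logic are stand-ins for illustration,
# not RealNetworks' actual SAFR API.

def gate_entry(visitor, registered, unlock, ring):
    """Return the action taken for a visitor at the front gate."""
    # A smile acts as a basic liveness check: a printed photo held
    # up to the camera would not smile on cue.
    if not visitor.get("smiling"):
        return ring(visitor)
    # Only registered adults are matched; everyone else rings in.
    if visitor.get("face_id") in registered:
        return unlock(visitor)
    return ring(visitor)

registered = {"parent-042", "teacher-007"}
unlock = lambda v: f"unlocked for {v['face_id']}"
ring = lambda v: "ring receptionist"

print(gate_entry({"face_id": "parent-042", "smiling": True},
                 registered, unlock, ring))   # recognized, gate opens
print(gate_entry({"face_id": "stranger", "smiling": True},
                 registered, unlock, ring))   # unrecognized, ring in
```

Unrecognized or unregistered visitors simply fall back to the receptionist, which matches the opt-in design: roughly half the parents never registered at all.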

According to head of school Paula Smith, the feedback from parents was positive, though only about half of them opted in to register their faces with the system. The school is approaching the technology with a light touch: it deliberately decided not to let its students, all younger than 11, participate. “I think it has to be a decision that’s very thoughtfully made,” Smith says of using this technology on kids. Today, University Child Development School uses SAFR’s age filter, which predicts a person’s age and gender, to prevent children from registering themselves; schools can turn off access for people below a certain age. But Glaser notes that if other schools want to register students going forward, they can.

Each face logged by SAFR gets a unique, encrypted hash that’s stored on local servers at the school. Today, Glaser says it's technically infeasible to share that data from one site with another, because the hashes wouldn't be compatible with other systems. But that may change going forward, Glaser says. If, for instance, a school system wanted to deploy SAFR to all of its schools, the company may allow data to flow between them.
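One way such cross-site incompatibility can arise: if each school keys its stored face identifiers to a site-specific secret, the same face produces unrelated hashes at different sites. The keyed-hash scheme below is an assumption for illustration, not RealNetworks' documented design:

```python
# Sketch of site-keyed face hashes -- an illustrative assumption,
# not RealNetworks' documented scheme.
import hashlib
import hmac

def site_hash(face_template: bytes, site_key: bytes) -> str:
    # HMAC ties the stored identifier to this site's secret key, so
    # the same face template yields different, non-cross-matchable
    # hashes at different schools.
    return hmac.new(site_key, face_template, hashlib.sha256).hexdigest()

template = b"example-face-embedding"   # stand-in for a real face template
school_a = site_hash(template, b"key-school-a")
school_b = site_hash(template, b"key-school-b")

assert school_a != school_b                              # not linkable across sites
assert school_a == site_hash(template, b"key-school-a")  # stable within one site
```

Under a design like this, letting data "flow between" schools in one district would mean sharing a key (or re-enrolling faces), which is a policy decision as much as a technical one.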

"It's tempting to say there's a technological solution, that we're going to find the dangerous people, and we're going to stop them." Rachel Levinson-Waldman, Brennan Center

For now, RealNetworks doesn’t require schools to adhere to any specific terms about how they use the technology. The brief approval process requires only that they prove to RealNetworks that they are, in fact, a school. After that, the schools can implement the software on their own. There are no guidelines about how long the facial data gets stored, how it’s used, or whether people need to opt in to be tracked.