In the fight for more privacy online, University of Toronto researchers have devised an algorithm to sabotage facial recognition technology.

Called an “adversarial attack,” the algorithm “catastrophically” affects facial recognition detectors by making subtle changes to specific pixels — manipulations so small they can barely be picked up by the human eye, said researcher Joey Bose, a U of T electrical and computer engineering master’s student.

“Machine learning models are very ubiquitous in our world, people use it all the time and it’s very important to understand methods in which they fail because we shouldn’t be so trusting of these algorithms without fully understanding them,” he said.

The impetus for the study was to help people understand the impact of facial detection and how it works, Bose said. He and his supervisor, Parham Aarabi, an associate professor, started their research in January.

“Machine learning models are very black box, so we don’t fully understand the ins-and-outs, which is why research in this field is so vital,” he said.

One example of how facial recognition technology can be used is in advertising — a person’s biometric data can be collected by facial recognition detectors, he said, which can be siphoned and used for targeted advertisements and recommendations.

“We want to empower users to protect their own privacy,” he said. “We are giving agency to people.”

Bose said the prototype, which is currently limited to computers but could be developed into an app, engages in a battle with the detectors.

“Over time, they play a game against each other, where the one fools the other and the other tries to detect the correct face,” he said, adding that the university-created generator can become advanced enough it can dupe the facial-recognition algorithm with almost 100 per cent accuracy.
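The “game” Bose describes can be caricatured as an alternating loop: an attacker perturbs inputs to lower a detector’s score, and the detector adjusts to raise it back. Everything below — the linear stand-in detector, the random “face” features, the step sizes — is invented for illustration and is not the researchers’ actual generator or model:

```python
import numpy as np

# Toy sketch of an alternating adversarial game. An attacker nudges
# "face" feature vectors against the detector's gradient within a small
# budget; the detector then updates its weights to raise scores on the
# perturbed examples. All names and numbers here are illustrative only.
rng = np.random.default_rng(0)
dim = 16
w = rng.normal(size=dim)                     # toy detector weights
faces = rng.normal(loc=1.0, size=(8, dim))   # toy "face" feature vectors
eps = 0.1                                    # attacker's perturbation budget
lr = 0.05                                    # detector's learning rate

for _ in range(20):
    # Attacker step: move each example against the detector's gradient
    # (for a linear scorer, the gradient w.r.t. the input is just w).
    faces_adv = faces - eps * np.sign(w)
    # Detector step: nudge weights to raise the mean score on the
    # perturbed examples (gradient of the mean score w.r.t. w).
    w = w + lr * faces_adv.mean(axis=0)
```

In the study itself a learned generator produces the perturbations rather than a fixed gradient-sign rule, but the alternating attacker-versus-detector structure is the same in spirit.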

“If you think about the locations of the eyes, the nose, and the mouth, those are indicators that there’s a face in an image, so if we perturb the pixels in those regions, chances are it’s easier to fool the detector,” Bose said.
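The region-targeted perturbation Bose describes can be sketched roughly as follows. The landmark coordinates, the stand-in linear scorer and the per-pixel budget are all assumptions made for illustration, not the study’s real detector:

```python
import numpy as np

# Sketch of the idea in the quote: perturb only pixels near the eyes,
# nose and mouth, keeping each change below a tiny budget (eps) so it
# is nearly invisible. The "detector" is a toy linear scorer, not the
# face-detection model from the study.
rng = np.random.default_rng(1)
H = W = 32
image = rng.uniform(0.0, 1.0, size=(H, W))   # toy grayscale image in [0, 1]

# Hypothetical landmark regions (rows, cols) for eyes, nose and mouth.
mask = np.zeros((H, W), dtype=bool)
mask[8:12, 6:12] = True    # left eye
mask[8:12, 20:26] = True   # right eye
mask[14:20, 13:19] = True  # nose
mask[22:26, 10:22] = True  # mouth

detector_w = rng.normal(size=(H, W))         # toy detector weights
score = float((image * detector_w).sum())    # detector's "face" score

eps = 2.0 / 255.0                            # near-imperceptible budget
# Gradient-sign-style step, restricted to the landmark regions only:
perturb = -eps * np.sign(detector_w) * mask
adv = np.clip(image + perturb, 0.0, 1.0)

adv_score = float((adv * detector_w).sum())  # lower than the original score
```

Because every masked pixel moves against the scorer’s gradient, the score can only drop, while pixels outside the eye, nose and mouth regions are left untouched — the intuition behind why perturbing those regions is the cheapest way to fool a detector.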

Mark Hayes, a privacy and technology lawyer in Toronto, said facial recognition detectors tend to be rather inaccurate, even by today’s standards. He said this new research could lead to a “technology arms race.”

“As soon as somebody comes out with something like this, the facial recognition people are going to then tweak their algorithms to try to get around the disabling algorithms. It’s a ping-pong back-and-forth,” he said.

Facial recognition, Hayes said, can be a positive for some institutions. Police are going to start using the technology more frequently, so impeding their ability to solve crimes could jeopardize important investigations, he said.

“Because there is such pervasive surveillance, law enforcement is increasingly trying to get a hold of these videos that are created and then using facial recognition software to try and match it up with known people that they have,” he said. “I think ultimately there’s going to be a lot of push-and-pull here.”

Brenda McPhail, director of privacy, technology and surveillance for the Canadian Civil Liberties Association, said technology used to subvert surveillance “makes sense.”

She said unbridled surveillance, including facial recognition technology, could “chill” protests and other forms of dissent that are constitutionally protected.

“The consequences can be a severe erosion of democratic participation,” McPhail said, noting that facial recognition is already utilized in predictive policing in Canada — in Vancouver, for example, she said there’s a police program that uses it to determine which areas have high levels of crime.

“With cameras and facial recognition your transient movements through space and time become tracked, fixed and recorded, in ways that can be later used for your benefit or against you. I think that line is very thin, and I think as a society we haven’t come to grips as to where that line should be,” she said.


“Basically what the researchers are doing is creating a tool that if you want to put that picture of your kid up and share it with Grandma and, at the same time, protect them from being tracked over time, you can do that,” McPhail said.

The University of Toronto research proves it is possible to “break” facial recognition detectors, Bose said.

“The next step is to make this attack stronger, better and make it work against multiple different types of detectors,” he said. “This is more of a first step.”