Only MasterCard knows I'm Manny the cat

In 2015, MasterCard's pilot program for Selfie Pay took place with Silicon Valley's First Tech Federal Credit Union. So I'm going to guess that opportunities to troubleshoot user skin color were few and far between. I say this because facial recognition technology has a well-documented problem "seeing" black people.

HP got a lot of bad press in 2009 for its cameras' inability to "see" black faces. Horrifyingly, Google's facial recognition software in 2015 tagged two African-Americans as gorillas. Google's Yonatan Zunger reacted appropriately, yet noted in a tweet that "until recently, Google Photos was confusing white faces with dogs and seals. Machine learning is hard."

Machine learning is indeed hard. So is security.

And don't let current headlines fool you -- the whole selfie-security plan wasn't entirely a security-based decision.

"Selfie pay" was aimed at MasterCard's millennial customers when it was announced in July 2015. Ajay Bhalla, MasterCard's president of enterprise security solutions, told the press it would be a way for the company to engage with young people. He added, "The new generation, which is into selfies ... I think they'll find it cool. They'll embrace it."

Reassuringly, college students reacted to Mr. Bhalla's remarks with an appropriate amount of skepticism and mistrust. I just hope everyone in Bhalla's security chain "is into" encryption as much as selfies.

We may share your password with our advertisers

We can yell "encrypt or GTFO" at MasterCard all we want, and it won't change our other big problem with all of this: the breach that comes from within. Meaning, when companies sell our personal data in backroom deals to greedy brokers, or let it get siphoned into government databases behind the scenes.

Did you ever think someone might sell your password to advertisers as marketable information about you? That's the intersection we're approaching.

Welcome to the entirely messed-up, behind-the-scenes free-for-all of facial recognition technology in the private sector. There is nothing preventing private entities (businesses, app developers, data brokers or advertisers) from selling, trading, or otherwise profiting from an individual's biometric information. Distressingly, the U.S. government has only gotten as far as a working group to develop rules around companies using facial recognition. Voluntary rules, that is.

This gets super-worrying when you consider there are companies, hell-bent on using every scrap of user data for profit, that are pouring money into making facial recognition both accurate and ubiquitous. Like Facebook, whose "DeepFace" project will most likely commingle with its billion-user-rich stash of identified photos. Even though its name is a face-palm, DeepFace can reportedly identify a person from a photo alone with up to a remarkable 97% accuracy.
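For the curious: systems in the DeepFace mold generally work by mapping a face image to a numeric "embedding" vector, then comparing embeddings to decide whether two photos show the same person. This is a minimal sketch of that comparison step only -- the vectors and the threshold here are made up for illustration, not DeepFace's actual model output or parameters:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_person(emb_a, emb_b, threshold=0.8):
    """Declare a match when two face embeddings are close enough.
    The 0.8 threshold is a made-up value; real systems tune it to
    trade false accepts against false rejects."""
    return cosine_similarity(emb_a, emb_b) >= threshold

# Toy embeddings standing in for a face model's output.
enrolled = np.array([0.9, 0.1, 0.4])
probe_same = np.array([0.88, 0.12, 0.38])   # near-duplicate of enrolled
probe_other = np.array([-0.2, 0.9, 0.1])    # points a very different way

print(same_person(enrolled, probe_same))    # True
print(same_person(enrolled, probe_other))   # False
```

The privacy problem follows directly from this design: once your embedding is in someone's database, any new photo of you can be turned into a vector and matched against it, with no action on your part.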

Facebook is a prime example of facial recognition and data monetization converging in troubling ways. In fact, Facebook has been using facial recognition to increase the worth of its data since at least 2011 -- when the Electronic Privacy Information Center appealed to the FTC to "specifically prohibit the use of Facebook's biometric image database by any law enforcement agency in the world, absent a showing of adequate legal process, consistent with international human rights norms."

#NoFilter surveillance

EPIC isn't alone in its worries about protecting consumers from facial recognition databases. At a Senate Judiciary subcommittee hearing in 2012, Sen. Al Franken remarked that "Facebook may have created the world's largest privately held database of face prints without the explicit knowledge of its users."

Franken continued, linking the deficits in consumer protections with the FBI's then-new facial-recognition program, Next Generation Identification (NGI), designed to identify people of interest. "The FBI pilot could be abused to not only identify protesters at political events and rallies, but to target them for selective jailing and prosecution, stifling their First Amendment rights," he said. NGI became fully operational in 2014.

MasterCard's Ajay Bhalla probably wasn't thinking about that when he was trying to get down with the kids. He probably also doesn't know that Selfie Pay's face data might pair all too well with commercial surveillance products like TrapWire, which is sold to and implemented by private entities, the U.S. government "and its allies overseas."

TrapWire combines various intel surveillance technologies with tracking and location data, individual profile histories from various sources (data mining and social media), and image analysis (such as facial recognition, TrapWire's video component) to monitor people under the guise of threat detection.

Upon the 2012 release of Wikileaks' Stratfor documents, news about TrapWire and sibling surveillance technologies (like Europe's INDECT) was met with surprise, fear, outrage, and protests. Many opponents of TrapWire and INDECT believe the systems pose direct threats to privacy and civil liberties, and that their implementation could constitute human rights violations.

MasterCard's Selfie Pay will very likely open the door to consumer-level biometric security, and -- if done properly -- that could be a really good thing. I just hope the methods of storing and protecting this data are as shrewd and clever as the people profiting off it by passing it around in the background.