Back in July, we reported on how Google conducted "field research" for its new face unlock feature. It consisted of sending teams out to approach people and offering them $5 gift cards in exchange for scanning their faces.

The tests were meant to help Google collect data to train its algorithms and make the face unlock feature as accurate as possible. They were carried out by temporary workers, known as Google TVCs (temps, vendors, and contractors), who were paid through a third-party staffing company called Randstad.

Now, we're learning about some of the eyebrow-raising techniques those workers used, thanks to interviews published by the New York Daily News (via The Verge). While there are many examples in the original article, here are some of the most alarming excerpts.

"They said to target homeless people because they're the least likely to say anything to the media," the ex-staffer said. "The homeless people didn't know what was going on at all." "Some were told to gather the face data by characterizing the scan as a "selfie game" similar to Snapchat, they said. One said workers were told to say things like, "Just play with the phone for a couple minutes and get a gift card," and, "We have a new app, try it and get $5." "Another former TVC said team members in California were specifically told they could entice cash-strapped subjects by mentioning a state law that says gift cards less than $10 can be exchanged for cash."

As you can see, the techniques ranged from misleading people to flat-out lying about what the research entailed, whether by telling subjects it was only a game or asking them to just play with the phone for a couple of minutes. Some workers even failed to mention that subjects' faces were being recorded, or that they were working for Google.

Furthermore, workers were encouraged to "target" homeless people because they would be less likely to speak to the media. There were also reports that contractors were told specifically to collect data on people with dark skin.

As alarming as that might sound, there is a reason Google wanted to single out people with dark skin. Facial recognition systems have come under scrutiny in the past for working less accurately on people of color. Looking to avoid such problems, it's no surprise Google wanted to collect samples from people with darker skin tones.