Facebook's algorithms may deliver advertisements to users based on gender and race stereotypes, according to a new study published Wednesday night.

The study, produced by researchers at Northeastern University, the University of Southern California and the digital rights nonprofit Upturn, suggests that Facebook targets ads in ways that can be discriminatory even when advertisers do not intend to target or exclude certain groups.


"Our results demonstrate previously unknown mechanisms that can lead to potentially discriminatory ad delivery, even when advertisers set their targeting parameters to be highly inclusive," the researchers wrote.

The study, which has not yet been submitted for peer review, comes a week after the Department of Housing and Urban Development (HUD) charged Facebook with "enabling" and encouraging housing discrimination through its ad-targeting practices. HUD alleges that Facebook's ad-targeting platform allows advertisers to discriminate against minorities, women and other protected classes.

The study, titled "Discrimination through optimization: How Facebook's ad delivery can lead to skewed outcomes," seeks to support some of HUD's claims.

The researchers spent more than $8,500 placing ads over the course of their research. They did not specify demographic groups that they hoped to target, allowing Facebook's algorithms to decide which users would see the ads.

They found that Facebook's algorithms delivered ads for lumber industry jobs mostly to white men, while ads for secretary jobs went mainly to black women.

Ads for jobs with taxi companies went to a 75 percent black audience, while ads for supermarket cashier positions went to an audience of 85 percent women, without any demographic specification from the researchers who placed the ads.

When it came to housing, the researchers found ads promoting houses for sale went to mostly white users, while ads promoting rental homes went to largely black audiences.

"We observe significant ad delivery skew along racial lines in the delivery of our ads, with certain ads delivering to an audience of over 85% white users while others delivering to an audience of as little as 35% white users," the researchers wrote.

During the study, the researchers placed gender-stereotyped images on ads, then altered the images so that users could not see them. They found that, even when the images were invisible to users, Facebook's algorithms delivered ads containing images of wrestlers or the military to men, while ads with invisible roses went to women.

They concluded that "Facebook has an automated image classification mechanism in place that is used to steer different ads towards different subsets of the user population."

The researchers noted that they inferred users' race by cross-referencing ad delivery locations with voter registration data.

"Our findings underscore the need for policymakers and platforms to carefully consider the role of the optimizations run by the platforms themselves—and not just the targeting choices of advertisers—in seeking to prevent discrimination in digital advertising," they wrote.

Federal laws prohibit discrimination in housing and employment advertising, the researchers noted.

A spokesperson for Facebook told The Hill that the company opposes discrimination in any form.

"We’ve announced important changes to our ad targeting tools and know that this is only a first step," the spokesperson added. "We’ve been looking at our ad delivery system and have engaged industry leaders, academics, and civil rights experts on this very topic — and we're exploring more changes."