Facebook allowed advertisers to target a “Jew hater” ad category among its potential audiences for exposure on the world’s largest social media network, a new report from ProPublica finds.

The nonprofit investigative journalism organization says Facebook removed the automated ad category, along with several other auto-generated anti-Semitic categories such as “How to burn jews” and “why Jews ruin the world,” only after being contacted for comment. And although the audiences for these categories were relatively small given Facebook’s 2 billion monthly users, Facebook approved several ad placement campaigns from ProPublica that reached thousands of people.

Facebook’s post-boosting tool is used by publishers and advertisers to pay for targeted exposure to specific audiences. The company’s business portal recently came under fire after it was revealed last week that $100,000 worth of ads linked to “inauthentic” Russian accounts were placed during the 2016 U.S. presidential election.


In a strange twist, Facebook’s automated system flagged the pending advertisement purchases as “Antysemityzm,” the Polish word for anti-Semitism. But after the category was changed, additional ads were approved by Facebook. And in another odd twist, owing to users’ own self-reported profile information, the Nazi SS and the “Nazi Party” were both listed to advertisers as “employers” with audiences in the thousands. Several categories involving “Hitler” were designated as “fields of study.”

The far-right National Democratic Party of Germany drew a far larger potential audience of nearly 200,000 Facebook user accounts.

Facebook’s hands-off approach to the advertising business has led to numerous “promoted posts” or “boosted posts” that are often approved within 15 minutes of submission despite originating from dubious sources.

After ProPublica contacted Facebook regarding the anti-Semitic advertisement audiences, which were generated by an algorithm rather than by people, the company responded by removing the categories and vowing to improve its proactive review.

“There are times where content is surfaced on our platform that violates our standards,” said Rob Leathern, product management director at Facebook, to ProPublica. “In this case, we’ve removed the associated targeting fields in question. We know we have more work to do, so we’re also building new guardrails in our product and review processes to prevent other issues like this from happening in the future.”

Additional ad-targeting searches that ProPublica submitted for other religions, including “Muslim haters,” returned no results.

A Facebook official who asked not to be named in the report said the tiny audience sizes were likely the reason the categories went unnoticed. “We have looked at the use of these audiences and campaigns and it’s not common or widespread,” he said.