Over a month after Facebook announced a ban on a number of white nationalist, white supremacist, and other hate groups, many of those groups are still on the platform and continue to use it for recruitment.
After the deadly attacks in Christchurch, New Zealand, where the gunman went live on Facebook for several minutes before killing 51 people, the platform announced several bans around the world of known extremist groups. The bans were imperfect.

In Canada, many of the groups remained on the platform. In the US, an early announcement of the ban allowed those who were de-platformed to ask their supporters to follow them elsewhere. Now researchers are saying some of the banned groups are still active on Facebook, and attempts to report them have been ignored by the company.

“Facebook likes to make a PR move and say that they’re doing something but they don’t always follow up on that,” Megan Squire, an Elon University computer science professor who researches online extremism, told a joint BuzzFeed News–Toronto Star investigation.

Kevin Chan, one of Facebook’s global policy directors, said that while the company proactively removed some hate groups, it also relies on users, journalists, and other sources to report when banned personalities make it back onto the platform.

Chan said that sometimes it may feel like whack-a-mole, but he considers it more of an arms race — with Facebook trying to get better at keeping listed hate groups off its platform, and those banned users figuring out new ways to find their way back online.

“Every time we are learning. Now, we presume they’re also learning … I think it’s really more of an arms race,” Chan said.

“But the trend line is that it is going to get really hard for people to do this, so hard to the point where … there’s going to be so much friction in the system that they’re probably going to go somewhere else,” he said.

Squire has been researching extremism on Facebook for years and said the ban didn’t capture most of the groups she has been monitoring. Squire provided BuzzFeed News and the Toronto Star with a list of groups that have made a comeback on the platform, including some that participated in the deadly Unite the Right rally in Charlottesville.

Facebook removed all examples of the groups BuzzFeed News and the Toronto Star sent to the company.

“Individuals and organizations who spread hate, attack, or call for the exclusion of others on the basis of who they are have no place on our services,” a Facebook spokesperson said in a statement. “We proactively look for bad actors, and investigate concerns when they are raised.”

One of those groups is the Proud Boys, which was banned last October after its members were filmed violently attacking three protesters in New York. The group used Facebook to organize, promote events, and recruit new members. According to an investigation by TechCrunch, the Proud Boys had over 35 functioning regional chapters.

Squire said the group was able to return to Facebook by slightly altering its name. One of their new pages was called PB Canada and included a link to a Telegram channel used to communicate with supporters.

Another page Squire is tracking, West is the Best II: Electric Boogaloo, directly pointed its fans to the Proud Boys USA website, showing how easy it is to circumvent enforcement of Facebook’s own rules, she said.

“They’re not so great at following up if the groups rebrand,” Squire said of Facebook.

Another issue is that while the groups themselves are banned, individual members remain on Facebook and continue recruiting.

A Canadian blog called Anti-Racist Sudbury documented how members of Soldiers of Odin, a group banned on Facebook, are still active on the platform. One member created a profile with the username “S.O.O.Recruiting.Sudbury.” He invited people to join the group’s activities and cross-posted content to groups such as Yellow Vests Canada and the conservative Canada Proud page.

A group with nearly 7,000 members called Sons of Odin is also still on Facebook, along with copycats that sell Soldiers of Odin swag.

Evan Balgord, executive director of the Canadian Anti-Hate Network, provided screenshots showing he had reported the group to Facebook for hate speech. The company responded that its Community Standards had not been violated.

“If the group slightly rebrands to Patriots of Odin, they seem to be unable to figure [it] out,” Squire said.

League of the South is another banned group whose members remain active, she said: “All of their members are still on Facebook, but promoting their own pages that maybe aren’t branded.”