Why Facebook banned anti-Muslim group Britain First

Jessica Guynn | USA TODAY

Video: Speaking in Jordan, British Prime Minister Theresa May criticized President Trump for retweeting anti-Muslim videos from the far-right group Britain First, describing the president's actions as "the wrong thing to do." (Nov. 30)

SAN FRANCISCO — Under mounting pressure from lawmakers, Facebook says it has booted Britain First and its leaders for inciting "animosity and hatred against minority groups."

The far-right group came to the world's attention when President Trump sparked a diplomatic incident in November by retweeting inflammatory anti-Muslim videos posted by one of the group's leaders.

"We are an open platform for all ideas and political speech goes to the heart of free expression," Facebook said in a statement. "But political views can and should be expressed without hate. People can express robust and controversial opinions without needing to denigrate others on the basis of who they are."

With more than 2 million likes, Britain First's Facebook page helped drive its momentum. A report released earlier this month by the anti-fascist organization HOPE Not Hate found that Britain First had the "second most liked Facebook page in the politics and society category in the UK – after the royal family."

That page, along with the pages of the group's two leaders, Paul Golding and Jayda Fransen, which also had large followings, has been banned from the social network for repeatedly violating Facebook's rules.

"We recently gave the administrators of the pages a written final warning, and they have continued to post content that violates our community standards," Facebook said in a statement.

Among the content that triggered the crackdown: a photo of the group's leaders with the caption "Islamaphobic and Proud," a caption comparing Muslim immigrants to animals and videos posted to incite hateful comments against Muslims.

Last week, Fransen and Golding were jailed after being found guilty of religiously aggravated harassment.

The move comes as scrutiny of Facebook's handling of hate speech intensifies. Facebook was criticized this week by United Nations officials investigating a possible genocide against the Rohingya Muslim minority for the social network's role in spreading hatred and violence in Myanmar. The Sri Lankan government blocked Facebook last week after it said hate speech on the social network contributed to anti-Muslim riots that left three people dead.


Britain First will not be permitted to set up another official Facebook page. Matthew Collins, head of research at HOPE Not Hate, said the decision was "long overdue."

"We are delighted that Facebook has finally faced up to its responsibility as a publishing platform and removed this hate preaching organisation," he said in a statement. "Britain First used Facebook as a means to leverage its position and push out some of the most divisive and vile anti-Muslim hatred you could find online."

London's mayor, Sadiq Khan, also welcomed the decision to remove the "vile and hate-filled" group from Facebook. He called on social media companies to take more aggressive steps to eradicate content that sows hate and division.

"I call on social media companies to show a stronger duty of care so that they can live up to their promise to be places that connect and unify, not divide or polarize," Khan said.