HIDDEN gender-exclusive Facebook groups Blokes Advice and Bad Girls Advice had more than 200,000 members from Australia.

Both groups attracted backlash for publishing sensitive material that went too far.

But Facebook only removed the male page.

According to the social media giant, Blokes Advice crossed a line, publishing a larger quantity of heinous material than its sister site.

This is despite Bad Girls Advice sharing unsolicited naked photos, condoning violence against men, promoting bestiality and posting memes making fun of Tuesday’s Manchester bombing attack.

The latter occurred just hours after the incident, and members who voiced concern came under attack from other members and admins alike.

So what is the threshold of content that needs to be crossed for a group to be shut down, and who decides?

After countless emails and calls, this is what Facebook had to say:

“The content that is shared on Facebook must comply with our Community Standards. If we become aware of content that does not comply, we will remove it,” a Facebook spokeswoman told news.com.au.

“In relation to groups, a group will only be removed where the majority of the posts in the group violate our Community Standards. If only a small number of posts in a group violate our policies, then those specific posts will be removed and not the entire group.”

The Community Standards strictly condemn violence and graphic content, sexual violence and exploitation, and bullying and harassment, but Facebook is willing to let Bad Girls Advice continue because only some of the content is offensive.

Does this mean that a Facebook group could be set up with 200,000 members and approve unsolicited nudes, as long as it doesn’t represent more than 49.9 per cent of the posts?

And if the number is less than 49.9 per cent, who decides?

Posts in these groups also have to be approved by admins, so why is only the person writing the content held accountable?


WHAT CAN WE DO ABOUT THIS?

When it’s possible for a murder to be live streamed there is clearly a problem.

Professor Catharine Lumby, who has researched content regulation in a convergent media environment, believes the internet and social media are out of control with hate speech, trolling and widespread misinformation.

“There are real causes for concern over the fact that media content online is very difficult to regulate,” she told news.com.au.

“We have to ask serious questions as a society, globally, about what the limits to free expression are.”

Facebook, for example, has more than 100 internal training manuals, spreadsheets and flowcharts offering guidance for how its employees moderate issues such as violence, hate speech, terrorism and pornography.

But moderators say they still struggle because, in many instances, they don’t have enough time to make an informed decision.

“Facebook cannot keep control of its content,” an inside source told The Guardian. “It has grown too big, too quickly.”

Prof. Lumby said even though the digital world is evolving at a rapid rate, these multinational companies cannot use that as an excuse to be only reactive to issues.

“We are not going to put the toothpaste back in the tube and the internet is a force for good in many ways,” she said.

“We just have to confront the levels of abusive and hateful speech that go on online and have real effects.”

A closer look at Facebook’s blueprint obtained by The Guardian highlights some troubling flaws in how it approaches graphic content:

• Livestream attempts of self-harm are allowed because Facebook “doesn’t want to censor or punish people in distress”

• While marked as disturbing, videos of violent deaths do not always have to be removed, as they can help create awareness of issues such as mental illness. Some do not have to be deleted or “actioned” at all

• Unless there is a sadistic or celebratory element, there is no need to action or delete photos of non-sexual physical abuse and bullying of children.

Prof. Lumby said we no longer live in an era where content exists only in print, TV and radio, although the laws still reflect that landscape.

“These platforms make a lot of money from people’s data and have to start taking accountability and continue to improve regulations and education on what is appropriate.

“The reform should be a transparent process involving the government, industry bodies and conversations with digital citizens.”

Prof. Lumby said people always talk about an internet filter, but that idea has so far been unworkable other than with China’s firewall — the combination of legislative and technological actions taken by the government to regulate the internet domestically.

“The problem is the liberal western world would never accept these type of restrictions, but you can’t have social media being like the Wild West either,” she said.

“The biggest issue is the sheer volume of content and the fact the sites are hosted offshore, so it becomes difficult to put accountability with these companies.

“A solution could be investing more money into funding a third-party body which could act when these social media platforms fail.”

THE INTERNET IS BROKEN

Social media was supposed to be a tool of liberation, a pathway allowing digital citizens to engage in conversation without gatekeepers.

And while it does offer these benefits, people, as they inevitably do, have abused this freedom.

Rape, murder and suicide are live broadcast on Facebook for the world to see, Twitter is home to endless trolling and abuse from hundreds of thousands of anonymous accounts, and fake news is running rampant.

Things have gotten so out of hand that one of Twitter’s founders, Evan Williams, somewhat regrets the monster he helped nurture from birth.

“I thought once everybody could speak freely and exchange information and ideas, the world is automatically going to be a better place,” he told the New York Times. “I was wrong about that.”

Mr Williams said the online state of affairs had been in steep decline for a number of years and was only getting worse.

“I think the internet is broken,” he said. “And it’s a lot more obvious to a lot of people that it’s broken.”

The 45-year-old tech entrepreneur explained the problem with the internet is that it rewards extremes, likening it to looking at a car crash as you drive past.

He said the internet interprets this behaviour by thinking everyone is looking for “car crashes” and tries to supply them — a problem he is now trying to fix by breaking the pattern.

“If I learn that every time I drive down this road I’m going to see more and more car crashes, I’m going to take a different road,” he said.

Mr Williams said he has doubts he will ever achieve his utopian dream for the internet.

“The problem is that not everyone is going to be cool, because humans are humans,” he said. “There’s a lock on our office door and our homes at night. The internet was started without the expectation that we’d have to do that online.”

What do you think can be done to stop these problems? Continue the conversation in the comments below or with Matthew Dunn on Facebook and Twitter.

matthew.dunn2@news.com.au