The ramifications of Justice Stephen Rothman’s decision in the Dylan Voller case haven’t filtered downstream. Media organisations now are on notice as to their hip-pocket liability for the mean, ignorant, beastly, defamatory muck that is frequently peddled by readers in response to story items posted on their associated Facebook pages. Yet the rest of the world hasn’t given a toss.

The quality and quantity of the comments continue apace. Only last Thursday, when Mike Baird ruled himself out of contention for the top job at the National Australia Bank, the comments about the former NSW premier on the Sydney Morning Herald Facebook feed would have been manna from heaven for a half-competent lawyer in the defamation business.

Bob Brown also might have been owed compensation as a result of Facebook comments on 3 July in connection with a “discussion” between Andrew Bolt and Rowan Dean on Sky News about “activist” crowdfunding to stop Adani.

From now on lawyers will be cherrypicking the Facebook comments associated with mainstream media news stories, looking for nuggets of gold. Quite apart from any defamatory content, they will have to navigate through more than a fair share of illiterate, muddled or irrelevant remarks that characterise the online free-for-all.

It’s not just Facebook. Last week the New South Wales supreme court directed that tech behemoth Google be charged with contempt of court for failing to respond more urgently to orders for the removal of allegedly defamatory comments about a “prominent Sydney businessman”.

The applicant, who is suing Google LLC rather than the author of the remarks, has had his identity suppressed by the court. This may be a case where the platform’s defence of “innocent dissemination” has been jeopardised because it had been put on notice about the complaint.


Dylan Voller is the young Indigenous man in a spit hood, strapped to a chair, who was brought to national attention in 2016 by the ABC’s Four Corners report on conditions at the Don Dale youth detention centre in the Northern Territory.

During the trial of his defamation case it emerged that some of the Facebook comments were to the effect that there should be no sympathy for Voller because he assaulted a Salvation Army officer, leaving him blind in one eye, and that he was a violent rapist. Tom Molomby SC for Voller told the court these allegations were “viciously false”.

Instead of suing those who posted the comments, Voller’s lawyers turned their attention to the news organisations Fairfax, Nationwide News and Australian News Channel, which broadcasts Sky News and The Bolt Report.

Liability has to be sheeted home to someone, and the media companies are the best first option, with bigger pockets than slap-happy readers, some of whom are using fake identities. Suing Facebook itself is interminably slow, with success a distant prospect.

The threshold issue for the court was whether the media companies really are the publishers of these third-party comments. The usual rule had been that liability was mitigated if content was removed once the publisher had been put on notice.

Justice Rothman turned that on its head, finding that even without notice or awareness of the content from the commenting public, the owners and managers of these Facebook pages were the primary publishers of their readers’ handiwork.

The judge even took it a step further, suggesting that by posting on their Facebook pages snippets or pointers to stories on associated news websites they were inferentially on notice that defamatory comments from readers were a prospect.

Getting to that point involved the judge carefully sidestepping other findings which he sought to distinguish on the facts. The outcome also involved accepting evidence that at the very least is controversial and at worst may be quite wrong.

For instance, he pointed to the case of people plastering defamatory comments about a conservative political identity on a bus shelter owned by the Drummoyne municipal council. Because the purpose of the bus shelter was not to facilitate comments from the public the council was found not to have authorised or approved the posters.

Conversely, the media companies in the Voller case had invited and welcomed third-party comments on their posts. Indeed, the judge found that the purpose of these media-connected Facebook pages involved “exciting the interest of Facebook users”, increasing the number of subscribers to the digital media publication or newspaper, and increasing the profile of the papers, TV stations and their websites.

So important is all this outsourced excitement that the public Facebook page of the Australian newspaper generates about 39% of the monthly visitors to its website and 53% of unique visitors on a nominated day.

Justice Rothman found comfort in a decision of the Hong Kong final court of appeal from 2013, Oriental Press Group v Fevaworks Solutions, where someone who had facilitated the speech of others in a hosting forum was found to be the primary publisher of the contentious material.

To further consolidate his conclusion as to liability, the judge found that it was within the power of the managers and owners of Facebook pages to filter and control the remarks from readers. This could be done by using common words such as “and” or “the” as the filter triggers, which would hide the majority of comments and give moderators time and opportunity to assess the content.

Various words such as “paedophile” or “Islamic State” are commonly used by page administrators to block the immediate publication of comments until they have been moderated. Rothman suggested that this could be extended so that all content could be pre-vetted, and he went so far as to claim:

... the extended publication of the comment is wholly in the hands of the media company that owns the public Facebook page.
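As a rough sketch of the blanket pre-moderation trick the judgment contemplates (the word list and function names here are invented for illustration, not Facebook’s actual tooling): because a keyword filter only hides matching comments pending review, loading it with ubiquitous words such as “and” or “the” would hold nearly every comment for a moderator before it appears.

```python
# Hypothetical illustration: a filter list seeded with common words
# catches almost every comment, holding it for human review.
TRIGGER_WORDS = {"a", "an", "and", "the", "to", "of"}

def held_for_moderation(comment: str) -> bool:
    """Return True if the comment contains any trigger word."""
    words = {w.strip(".,!?\"'").lower() for w in comment.split()}
    return not words.isdisjoint(TRIGGER_WORDS)

print(held_for_moderation("He assaulted someone and got away with it"))  # True: held
print(held_for_moderation("Disgraceful!"))  # False: slips straight through
```

Even this blunt instrument is not airtight: a one-word comment containing none of the trigger words sails past it, which is exactly the gap critics of the judgment point to.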

This is a contentious aspect of the judge’s conclusion. Dave Earley, the audience editor of Guardian Australia, with 12 years’ experience in social media, news content and moderation, for one says it is impossible to completely control the flow of comments on to news organisations’ social media streams. A great deal can be hidden and delayed, but not everything.

Of course, moderators can bar obnoxious people from commenting, and frequently do. Likewise, Twitter users can be blocked. It seems that more often than not people are just reading and commenting on a social media story teaser without bothering to click through to the news website and get the full context of the article.

Even if the filtering and moderating process is ramped up, Earley says a fail-safe, defamation-free system is a far-fetched notion. People could post defamatory pictures or gifs, or get around the filters by swapping an asterisk or another symbol into a word, such as Voll*r or s*x or LGBT@Q. A single word that escapes the filters could also be defamatory.
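Earley’s evasion point can be sketched in a few lines (a hypothetical blocklist for illustration only): a literal keyword filter matches exact strings, so a trivially obfuscated variant passes untouched.

```python
# Hypothetical illustration: a literal keyword filter misses
# trivially obfuscated variants of a blocked word.
BLOCKLIST = {"voller"}

def blocked(comment: str) -> bool:
    """Return True if any token in the comment matches the blocklist."""
    tokens = [w.strip(".,!?").lower() for w in comment.split()]
    return any(t in BLOCKLIST for t in tokens)

print(blocked("No sympathy for Voller"))  # True: exact match caught
print(blocked("No sympathy for Voll*r"))  # False: asterisk defeats the filter
```

Closing that gap would require fuzzy matching, image analysis and context-aware review at a scale no newsroom moderation team could guarantee, which is the nub of the objection to the “wholly in the hands of the media company” finding.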

Moderators also have to deal with Twitter and comments posted directly on stories published on the masthead websites. This can amount to thousands and thousands of readers’ opinions each day. The boundaries of the at-home commentariat know no limits.

Facebook does have the tools to let page owners turn off comments; it just chooses not to make them generally available. To do so would fly in the face of the platform’s commercial benefit, which is built on increased engagement.

That Rothman’s judgment may be based on reasoning open to ready contradiction is something that no doubt will be tested at the inevitable appeal.

The idea of the “public square” that was supposed to be social media’s enriching contribution has not withstood the test of time. It is further eroded now because the audience and engagement editors of the news organisations are tending to post softer, more anodyne stories that are less likely to attract snarky, damaging and costly comments.

So much for a robust democracy.