The government has been asked to pass new laws formalising content blocking by Australian internet providers during “online crisis events”.

The proposal is part of a report [pdf] published by the Department of the Prime Minister and Cabinet on Sunday that was prepared by a taskforce comprising “Facebook, YouTube, Amazon, Microsoft, Twitter, Telstra, Vodafone, TPG and Optus.”

Given the report represents a consensus position by the industry, the recommendations are likely to proceed.

The government indicated the only reason not to proceed would be if the actions were deemed insufficient for “ensuring the safety of Australians online.”

Telcos including Telstra, Optus and Vodafone instituted temporary ISP-level blocks on websites hosting footage of the Christchurch terror attack.

These blocks made use of subsection 581(2A) of the Telecommunications Act 1997, a somewhat vague power given to the eSafety Commissioner to issue written directions to service providers.

The taskforce made it clear on Sunday that it saw the clause as a stopgap and that specific laws were ultimately required to direct ISPs and telcos to block content - in some cases, automatically and without the need for human intervention.

For now, the stopgap use of subsection 581(2A) will continue, albeit with some more specific guidance on what is and isn’t permissible.

The taskforce recommended the eSafety Commissioner and Communications Alliance “develop a protocol to govern the interim use of subsection 581(2A) ... in the circumstances of an online crisis event.”

“This protocol would set out the arrangements and process for implementing blocks of websites hosting offending content, including the means of determining which ISPs would be subject to blocking orders, the length of time that the ISPs will be required to implement the blocks, and the process for removing the blocks,” it said.

The protocol would also define “the process to be used to determine whether the terrorist or extreme violent material is sufficiently serious to warrant blocking action, and to identify the domains that are hosting the material; [and] guidance on the circumstances in which it is anticipated that this power may be used by the eSafety Commissioner”.

In addition, it would define a standard “landing page for the blocks and the method of communicating the notice; and to the extent possible, ensure automated notification processes are used to their fullest extent and are as efficient as possible”.

It is then intended that the protocol be enshrined in Australian law.

The taskforce recommended that “the Australian Government pursue legislative amendments to establish a content blocking framework for terrorist and extreme violent material online in crisis events.”

“This new framework should: a) incorporate the matters stipulated in the interim protocol; and b) address additional factors including indemnity, notification processes for content hosts and the automation of blocks,” it said.

Parliament has already criminalised any failure of social media platforms to remove “abhorrent violent material”.

The proposed new site blocking powers, however, would apply to what is being termed “terrorist and extreme violent material” - which is defined separately.

This covers “audio, visual or audio-visual material that depicts an actual terrorist act targeting civilians (as opposed to animated or fictionalised); depicts actual (as opposed to animated or fictionalised) violent crime; or promotes, advocates, encourages or instructs a terrorist, terrorist group or terrorist act, or a person to commit actual (as opposed to animated or fictionalised) violent crime.”

Violent crime is not limited to “murder; attempted murder; torture; rape; and violent kidnapping”.

The definition is intended to net “categories of violent content and material prohibited by the digital platforms as part of their respective community standards and terms of service, such as graphic violence, violent content, or gore.”

Prime Minister Scott Morrison used the taskforce’s report at the G20 summit in Osaka to secure the support of other nations for a similar crackdown on terrorism-related and violent content.

The taskforce report also puts forward other technical solutions from industry, including ways to stop algorithms from promoting violent or terrorism-related content, and better ways to recognise and share URLs to be blocked.

The prospect of ISP-level content blocking of violent material could lead to the resurfacing of some of the same concerns raised when the then-Labor government tried to introduce ISP filtering more than a decade ago.

Labor, however, had been targeting the even more loosely defined “inappropriate content”. After years of opposition, it ultimately shelved the plan.

Other site blocking laws also exist in Australia but are used to block access to websites hosting "illegal" content, such as pirated material.