Worldwide, after decades of evolution, that system is robust. Software called PhotoDNA and an Interpol database rapidly identify illegal images. Takedown notices can be deployed through the INHOPE network — a collaboration of nonprofits and law enforcement agencies in 41 countries, including the United States.

In the last fiscal year, the Cyber Report team requested the removal of 35,000 images and videos through INHOPE, and in most cases, takedowns occurred within 72 hours.

“I think we can learn a lot from that,” said Toby Dagg, 43, a former New South Wales detective who oversees the team.

Experts agree, with caveats. Child exploitation is a consensus target, they note. There is far less agreement about what crosses the line when violence and politics are fused. Critics of the Australian law say it gives internet companies too much power to choose what content should be taken down, without having to disclose their decisions.

They argue that the law creates incentives for platforms and hosting services to pre-emptively censor material because they face steep penalties for all “abhorrent violent material” they host, even if they were unaware of it, and even if they take down the version identified in a complaint but other iterations remain.


Mr. Dagg acknowledged the challenge. He emphasized that the new law criminalizes only violent video or audio that is produced by perpetrators or accomplices.

But there are still tough questions. Does video of a beheading by uniformed officers become illegal when it moves from the YouTube channel of a human-rights activist to a website dedicated to gore?