
LONDON (Reuters Breakingviews) - Tech companies and open-internet campaigners think of the web as a neutral, global platform. That vision may become a relic of the past as countries including Britain and Australia speed through their own standards for acceptable online videos, pictures and articles. As new laws give teeth to local regulators, web groups’ margins may suffer.

The UK government on Monday called for a statutory duty of care on social-media sites, file-hosting services, search engines and messaging apps to protect their users. Under the plans, which are subject to a 12-week public consultation, an independent regulator could fine the likes of Facebook and its senior managers for failing to remove harmful posts.

It’s a belated recognition that Western governments have few tools to police material on social media or search engines. That contrasts with newspapers or broadcasters, which are legally responsible for their content. This “era of self-regulation”, as UK Digital Minister Jeremy Wright calls it, has left children and other vulnerable groups at risk; almost one in 10 young adults has actively sought information about self-harm on the internet, according to the report. The new regime would make web groups responsible for removing harmful content posted by users.

The standard for unacceptable content looks ill-defined: potential harms include “coercive behavior”, “disinformation” and other terms with no clear legal definition. That’s similar to a recently proposed but vaguely worded Australian law which would make it a criminal offence not to “expeditiously” remove “abhorrent violent content”. The risk is that each government produces its own standards for acceptability, yielding a patchwork of different regulations.

Shareholders in companies like Facebook will bear the costs. Mark Zuckerberg’s group has in the past two years tripled its “safety and security workforce”, which includes content moderators, to 30,000 people. Its employee and contractor headcount will rise further to cope with regulations. Assume Facebook adds another 20,000 staff over the next two years, and pays them the average $30,000 a year salary for its U.S. contractors, based on reporting by The Verge. Content moderation could then cost the company $1.5 billion a year for the 50,000-strong workforce, or 6 percent of last year’s operating profit.
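The arithmetic behind that estimate can be reproduced in a few lines. The headcount and salary figures come from the article; the operating-profit figure of roughly $24.9 billion is an assumption consistent with Facebook’s reported 2018 results and with the 6 percent share cited above.

```python
# Back-of-envelope content-moderation cost estimate from the article's figures.
current_staff = 30_000          # existing safety-and-security workforce
added_staff = 20_000            # assumed additions over the next two years
avg_salary = 30_000             # average U.S. contractor salary, per The Verge

annual_cost = (current_staff + added_staff) * avg_salary
print(f"Annual cost: ${annual_cost:,}")           # $1,500,000,000

# Assumption: Facebook's 2018 operating profit, roughly $24.9 billion.
operating_profit = 24_900_000_000
share = annual_cost / operating_profit
print(f"Share of operating profit: {share:.0%}")  # about 6%
```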

The internet’s global reach has enabled tech companies to reach vast markets with relatively modest investment. A balkanised web therefore spells danger for their bottom lines.