Whatever your feelings about it, that regime has grown in sophistication as the internet has become more pervasive in China. Companies used to rely on a combination of simple blacklists of banned words and teams of manual reviewers. Those still play a role, but large companies such as Tencent now have more powerful automation to help identify what to block.
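The simplest layer of such a system, the keyword blacklist, can be sketched in a few lines. This is an illustrative toy, not any company's actual implementation; real systems at this scale rely on far more elaborate matching (text normalization, homophone handling, multi-pattern automata such as Aho-Corasick), and the phrases and function names here are invented.

```python
# Toy keyword-blacklist filter: flag a message if it contains any
# blacklisted phrase. Phrases below are placeholders, not real entries.
BLACKLIST = {"example banned phrase", "another banned term"}

def is_blocked(message: str) -> bool:
    """Return True if the message contains any blacklisted phrase."""
    text = message.lower()
    return any(term in text for term in BLACKLIST)

print(is_blocked("this mentions an Example Banned Phrase"))  # True
print(is_blocked("an innocuous message"))                    # False
```

The weakness of this approach is obvious: users route around it with misspellings, homophones, and images, which is why manual review and image filtering exist alongside it.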

Tencent’s WeChat, which dominates mobile life in China, provides a handy example. Send a friend or group the “Tank Man” photo of a lone protester, holding only plastic shopping bags, facing down a convoy of armored vehicles in Tiananmen Square and it may never arrive. The app’s built-in censorship system can check images against a blacklist of sensitive content, says Jeffrey Knockel, a researcher at the University of Toronto’s Citizen Lab.

Knockel contributed to research last year that found WeChat can also check text inside images—in a meme, for example—for sensitive content. The researchers reverse engineered the WeChat app to uncover that mechanism inside a section of the app called Moments, where users can share posts with a broad circle of friends, somewhat like Facebook. The image filtering also showed some ability to detect banned images that had been modified or resized.
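One common way a filter can match resized or lightly edited copies of a blacklisted image is perceptual hashing. The sketch below implements a basic “average hash”: shrink the image to an 8×8 grid, set one bit per cell depending on whether it is brighter than the mean, and compare hashes by Hamming distance. This is a generic illustration of the technique, not WeChat's actual mechanism (which the researchers did not fully characterize); the images here are plain 2D lists of grayscale values rather than decoded files.

```python
# Average-hash sketch: a resized copy of an image should produce a hash
# close (in Hamming distance) to the original's, while an unrelated
# image should not. All data below is synthetic and illustrative.

def downscale(img, size=8):
    """Shrink an image to size x size by averaging each block of pixels."""
    h, w = len(img), len(img[0])
    out = []
    for r in range(size):
        row = []
        for c in range(size):
            block = [img[y][x]
                     for y in range(r * h // size, (r + 1) * h // size)
                     for x in range(c * w // size, (c + 1) * w // size)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

def average_hash(img, size=8):
    """64-bit hash: one bit per cell, set if the cell exceeds the mean."""
    pixels = [p for row in downscale(img, size) for p in row]
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A 64x64 synthetic "banned" image: dark left half, bright right half.
banned = [[30 if x < 32 else 220 for x in range(64)] for y in range(64)]
# A 32x32 resized copy of the same scene.
resized = [[30 if x < 16 else 220 for x in range(32)] for y in range(32)]
# An unrelated image: bright top half, dark bottom half.
other = [[220 if y < 32 else 30 for x in range(64)] for y in range(64)]

h1, h2, h3 = average_hash(banned), average_hash(resized), average_hash(other)
print(hamming(h1, h2))  # small distance: the resized copy still matches
print(hamming(h1, h3))  # large distance: the unrelated image does not
```

A filter built this way blocks any upload whose hash falls within some distance threshold of a blacklist entry, which is why simple resizing often fails to evade it, while heavier edits (crops, overlays) can slip through.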

“Following local laws is not a good excuse.” Charles Mok, Hong Kong legislator

Tencent’s system has other parallels with Facebook. The US company does not operate in China, despite CEO Mark Zuckerberg courting its leaders. But Facebook has also developed software that reads text inside images, as part of efforts to block hate speech. At both Chinese and US social networks, the automatic filters are far from perfect.

Knockel says China’s censorship system is less an impermeable barrier and more of a “kluge”—a patchwork of scrappy systems built by companies apparently aspiring to do the minimum necessary to meet government expectations. “I think they want to implement it barely enough that they don’t get into trouble,” he says.

All the same, there’s evidence that the kluge works. US teens are warned that the internet never forgets youthful indiscretions. China’s censorship experiment suggests that if you force the internet to forget, a society may develop amnesia.

In December, researchers at Peking and Harvard Universities published results from a controlled experiment in which 900 Chinese students received an unfiltered connection to the outside world. Those given a chance to sidestep censorship displayed little interest in blocked sites and information.

“It’s not the case that people are hungering for banned content,” says Jennifer Pan, a Stanford professor who studies the digital strategies of autocratic regimes. “You could say the system has worked in that people don’t know what they’re missing and don’t demand it.” It helps that China’s vast internet industry provides plenty of government-compliant content and services, she says.

Since coming to power in 2012, China’s President Xi Jinping has tightened the Communist Party’s control of the country’s economy, society, and internet. Restrictions on foreign and domestic media have increased, and the United Nations says that as many as 1 million Uyghur Muslims in China’s northwest are being held in internment camps.

In 2014, a new body called the Cyberspace Administration of China unified internet regulation under the president’s direct control, says Chris Meserole, a fellow at the Brookings Institution. He says Xi’s regime views the internet as a tool for surveillance and suppression as much as communication.

Apple and Microsoft must deputize themselves to that regime to operate in China. Microsoft has offered, and censored, Bing results in China since 2009 and picked up additional censorship obligations in 2016 with the acquisition of LinkedIn.

The service had entered China in 2014, while still an independent company, arguing in a blog post that although the company “fundamentally disagrees with government censorship,” staying out of China would hold back its people from accessing “the economic opportunities, dreams and rights most important to them.”