Never before has a platform grown as fast as TikTok. We have gained exclusive insight into its content moderation and are publishing excerpts from its moderation rules: TikTok operates a sophisticated system to identify, control, suppress and direct content. The platform can throttle videos of protests and demonstrations according to its rules.

No app has been downloaded as often as TikTok in the past year. The video-sharing platform is growing rapidly. In November 2019, TikTok broke the 1-billion-user barrier – faster than any other social network before it. The video app and its culture are currently so popular with children and young people that even the Tagesschau (the major German public TV news programme) now has its own account there.

However, research by netzpolitik.org shows that TikTok is currently able to suppress videos of political protests and demonstrations and additionally determine which content is visible, through a variety of means.

Exclusive insight into content moderation

For this research, netzpolitik.org spoke to a source at TikTok, examined moderation criteria and communications, and experimented with specially created accounts to see how visible videos with China-critical content are on the platform.

TikTok's moderation rules, of which netzpolitik.org was able to see different versions, are remarkably thin and open to wide interpretation – even for the moderators themselves. The strategy, however, is clear: certain content is given the widest possible reach, while other content is systematically suppressed.

The successful platform belongs to the Chinese technology company ByteDance. As early as September, the Guardian reported on leaked documents detailing how TikTok censored political statements on the Tiananmen massacre and the independence of Tibet. The protests in Hong Kong, which are currently attracting worldwide media attention, are virtually invisible on TikTok between selfies and singalongs, even though the app is available in Hong Kong.

Beijing moderates at night

The German-language videos on TikTok are moderated from three locations, reports the source: Berlin, Barcelona and Beijing. At the German location, the leadership is Chinese. Work is carried out in eight-hour shifts, during which around 1,000 tickets have to be processed. At just under half a minute per ticket, this is very little time for video content. Wages differ widely and the pressure on individual moderators is high, with the mood in the team being "toxic", the source reports.

According to the source, moderation takes place in three review stages. The first review takes place in Barcelona, after 50 to 150 video views. Berlin is responsible for the second review, from 8,000 to 15,000 views, and the third review, from about 20,000 views. At night, German-speaking Chinese staff moderate content from Beijing. TikTok confirmed this to netzpolitik.org.

In contrast to Facebook's, the moderation rules are quite skimpy: according to the source, they fit into a table just a few pages long. The instructions netzpolitik.org was able to see are confusingly vague. This leaves a lot of room for interpretation, and the restriction of content can be interpreted very broadly, because the rule set lacks the – in many cases much clearer – character of Facebook's moderation criteria.

Delete, demote, push

The rules divide unwanted content into four categories. This is reported by the source and is evident in the moderation rules that netzpolitik.org was able to see. Videos that completely violate the platform's terms are deleted ("deletion"). Other content is marked as "visible to self". This means that users can still see the video themselves, but it is no longer visible to others.

A marking as "not for feed" or "not recommend" means that the video no longer appears in the algorithmically curated feed that users see when opening the app. The marking can also lead to disadvantages in search results and hashtag visibility, says the source. Strictly speaking, such posts are not deleted – but in effect they no longer have an audience.

Featured: Pushed by the marketing department

General: Limited as a "risk" in certain markets

Not recommend: Not in the For You feed / can be shared via profile

Not for feed: Not in the For You feed / disadvantages in search

Visible to self: Only visible to the user

Deletion: Deletion of content

For non-restricted videos, there are two levels. Most are categorized as "General". However, the "Risks" marking allows content to be blocked or throttled regionally.

Videos whose distribution the marketing department wants to increase can be pushed with the "Featured" mark. TikTok confirmed to netzpolitik.org only the existence of "Deletion", "Visible to self" and "Risks". According to the company, the "Risks" are necessary so that videos do not violate local laws in certain countries.
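The tier system described by the source can be sketched as a simple model. This is purely illustrative – the enum names and the audience mapping are our reading of the reported rules, not TikTok's actual implementation:

```python
from enum import Enum

class Mark(Enum):
    """Moderation marks as described by the source (names illustrative)."""
    FEATURED = "featured"                 # pushed by the marketing department
    GENERAL = "general"                   # default; may carry regional "risk" flags
    NOT_RECOMMEND = "not_recommend"       # excluded from the For You feed, shareable via profile
    NOT_FOR_FEED = "not_for_feed"         # excluded from the feed, demoted in search
    VISIBLE_TO_SELF = "visible_to_self"   # only the uploader still sees the video
    DELETION = "deletion"                 # removed entirely

def audience(mark: Mark) -> str:
    """Rough mapping from a mark to who can still see the video."""
    if mark is Mark.DELETION:
        return "nobody"
    if mark is Mark.VISIBLE_TO_SELF:
        return "uploader only"
    if mark in (Mark.NOT_RECOMMEND, Mark.NOT_FOR_FEED):
        return "profile visitors, not the For You feed"
    return "everyone (feed, search, hashtags)"
```

The point of the sketch is that only "deletion" removes content outright; every other mark quietly shrinks the audience while the video formally stays online.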

The source tells us that not only individual videos can be throttled, but also entire hashtags. In general, TikTok appears to operate a system of promotion and throttling, in which certain content is visible and goes viral, while other content never takes off or gains visibility. Control over what people see on TikTok lies primarily in the hands of the company.

Controlled content policy

Protests are generally not welcome on the platform, says the source. Due to TikTok's orientation, there is less protest content than on other platforms, but often such videos never even make it to marketing, instead being deleted at first review in the moderation process at other locations such as Barcelona. The "deletion teams" do not watch the whole videos, only individual frames, and sound is only listened to on suspicion. TikTok denies moderating content on the basis of political orientation.

If content gets through to the marketing team, that team determines the composition of the curated For You feed, which is algorithmically tailored and delivered to individual users.

During moderation, the moderators not only mark content for deletion or throttling, they also classify what they see. "This is done to help build automated moderation," says the source, who has insight into the moderation process. To netzpolitik.org, TikTok denied that this tagging is used to train artificial intelligence, stating only that algorithmic systems are used "to check when posting content".

Criticism of politics was previously excluded from the feed

According to the source, TikTok changed its moderation rules after the Guardian's September reporting and the subsequent criticism. The source says that the company explicitly referred to the bad press in front of employees. The scale of these changes has been unique to date; smaller modifications are more frequent.

TikTok told the Guardian that these changes were made in May; to netzpolitik.org, however, the company now states that these significant changes were made "much earlier".

Until this major adjustment, the moderation rules had almost completely ruled out criticism of politics and political systems. Those who criticised constitutional monarchy, parliamentary systems, the separation of powers or socialist systems were throttled. Only with the major changes was this "ban on politics" removed from the moderation rules.

An excerpt from TikTok's moderation policy (PDF), documenting several rules before and after the change, is published here. For reasons of source protection, this is not the original document but a copy.

Demonstrations still easy to censor

The throttling of depictions of so-called "controversial events" has also been changed. Until the revision, this category had generally included protests, riots and demonstrations; a list also gave examples such as the Kurdish, Tibetan and Taiwanese independence movements. After the revision, the depiction of demonstrations and protests is no longer restricted per se.

However, under the current rules, demonstrations with a reference to "possible violent conflicts" can still be marked "not for feed" and downgraded. TikTok says it does not remove such content. However, this does not answer the question posed: netzpolitik.org had asked about the throttling classification "not for feed", as the moderation policy versions available to us did not prescribe complete removal in the first place.

Previously, criticism of public political figures, the police and the military had been banned from the feed. TikTok has since abolished these rules in Germany, which the company also confirms.

LGBTQI content is not played out in many countries

In addition, certain content used to be marked as "Islam Offence". Content marked with this keyword – for example, two kissing men – triggered a geoblock for certain regions. LGBTQI content in particular was affected. This rule was abolished after the Guardian report. Or rather: renamed.

Content dealing with sexual orientation is now given the keyword "Risk 3.4" – the consequence being, as with "Islam Offence" before, that this content is throttled in Islamic countries. Marking with a risk (there are many more) leads to geoblocking, says the source. TikTok argues that it has to abide by local laws.

Protests in Hong Kong hardly visible

Since TikTok belongs to a Chinese company, its handling of the democracy protests in Hong Kong in particular is a good yardstick for censorship attempts.

Back in September, we had searched the app and the web version for certain hashtags prominent on other networks, such as #hongkongprotest, #freehongkong or #antielab, and found no or very few results. Instead, under these hashtags, the app showed videos that had nothing to do with the protests. Only after a press inquiry to TikTok in Germany did some videos become visible, including those we had uploaded with a specially created account.

When we did our research in September, the protests in Hong Kong were mostly invisible. Hashtags such as #joshuawong were not displayed (far left); displayed videos had no reference to the hashtags (other three screenshots). Screenshots: TikTok app, all rights reserved.

At the time, the company's spokeswoman told netzpolitik.org:

Users are on TikTok because the app gives them a positive, fun experience where they can exert their creativity. Our users mainly upload and watch short and entertaining videos on TikTok. TikTok’s moderation follows our community guidelines and terms of use and does not remove videos related to the Hong Kong protests.

This overspecific statement can mean: our users do not upload political content, there are hardly any videos about the protests in Hong Kong, and we don't delete them either. However, the statement leaves completely open whether TikTok has systematically disadvantaged videos about Hong Kong via the different levels of visibility and thus made them invisible to the public. The search for the hashtag #JoshuaWong, for example, returned no results in the TikTok app in September. Indeed, the hashtag did not exist at all.

TikTok tells netzpolitik.org that today there are no content restrictions on protests in Hong Kong and Joshua Wong.

In a recent test, the German newspaper Welt am Sonntag found that searches for keywords controversial from the point of view of the Chinese government, such as "falungong", "tiananmenmassacre" and "tiananmensquare", yielded no or only very few matching results. Numerous app searches by netzpolitik.org come to a similar conclusion.

"Tame and Steered"

For Christian Mihr of Reporters Without Borders, netzpolitik.org's research confirms fears that TikTok is under the influence of the Chinese state – even though it is a private company. It is part of Beijing's media strategy to enforce its totalitarian vision of tame, controlled media internationally.

"If the reach of, for example, protest content on TikTok is throttled, this fits the picture perfectly, since protests in Hong Kong or Xinjiang are a taboo topic for media in China, just as they are abroad," says Mihr. He criticizes TikTok's approach as a "sign of a great lack of transparency".

Update November 25:

After publication of this article, TikTok insisted on adding the following statement:

TikTok does not moderate content due to political sensitivities. Our moderation decisions are not influenced by any foreign government, including the Chinese government. TikTok does not remove or demote videos due to the presence of Hong Kong protest content, including activists.

About this research and the sources:

Our knowledge of moderation at TikTok in Germany is based on several hours of conversation between netzpolitik.org and a source who has insight into the moderation structures and the policy. We checked the source's identity and employment contract. We cannot and do not want to describe the source in more detail, for reasons of whistleblower protection.

—

Should you have further information or documents about TikTok, we would be pleased to hear from you. You can also send information directly – encrypted if you wish. Do not use e-mail addresses, telephone numbers, networks or company devices for this purpose.

Thanks to Carolin Moje for this translation.