by Brian Hioe


Photo credit: Brian Hioe



New Bloom interviewed Puma Shen (沈伯洋), assistant professor at National Taipei University’s Graduate School of Criminology and director of DoubleThink Labs, about fake news and disinformation in Taiwan ahead of the election and efforts to combat it.

Brian Hioe: First, could you introduce yourself for readers that might not know you?

Puma Shen: I’m Puma Shen. I’m an assistant professor at National Taipei University. I’m also the director of the DoubleThink Labs.

BH: What do you think are the key means by which fake news disseminates in Taiwan? What is different about the way fake news spreads through these platforms?

PS: According to our research, there are two key distinctions. One is “online”. The other is “offline.”

Photo credit: Pixabay/CC

Online, fake news spreads through Facebook or through content farms. Content farms are often based in Malaysia, with some in Taiwan. Those who operate content farms are usually operating fan pages at the same time. They copy and paste content from content farms onto fan pages and spread fake news this way.

This is how they were doing it last year, anyway. This year is different. Many fan pages were deleted this year, so now they first create content farm articles, then put out advertisements saying that if you want to make money, you can set up a fan page yourself and help them spread their content.

You’ll make money from them. Those doing this are often Malaysian overseas Chinese or Taiwanese. They’re motivated primarily by wanting to make money, so they’ll log in, set up a fan page themselves, and spread these articles for profit.

There are some more pro-China political parties. They will also operate their own content farms in Taiwan. They are quite often in communication with China.

Regarding the offline element of fake news, this is more difficult to tackle. First, there are “rumors.” Rumors depend on village or borough chiefs who are often in communication with China. Or heads of temples. While holding events, they’ll use this as an occasion to spread rumors. This spreads by word of mouth.

These rumors will circulate in Taiwanese society, leading to some fears, or leading people to have a good impression of China.

This has become more advanced now. They’ll put text online, using Line groups to spread it. In the last few months, we have seen that half of the fake news in Line groups originates from China.

This is hard to deal with, since it’s not always text. Half consists of video. These are all YouTube videos. China has many channels on YouTube and many seem to come from Taiwanese people with connections to China, who are running their own YouTube channels.

This is mass-produced on YouTube. These are seen in certain groups only. They’re not on Facebook. So it’s more offline that is the more severe issue at present.

BH: Who is behind content farms?

PS: Sometimes you’ll be able to track down companies that are from China. But sometimes they’re just ethnically Han Chinese. They’re not from China. They started these content farms to make money. This is to make money from China’s United Front funding. If you can obtain this funding, you can make money this way.

If it’s those who have direct links to the CCP, not those who are taking money from outside, they seem to be present in Hong Kong or in China. These are more directly operated. In Malaysia, many seem to be just focused on making money.

Photo credit: Brian Hioe

BH: Some big cases of fake news have come up recently, such as regarding the New Party spokespersons and the platform they tried to run or regarding Slow Yang’s supposed role in the suicide of Su Chi-cheng. What are your views on this?

PS: With the explosion in news information production, when people receive information, they have no way to choose what information they end up consuming. So everyone is fighting for attention.

What kind of news is it that can attract everyone’s attention? It’s hard to say. So you have to create a lot of information. Such as the content farms I mentioned earlier. In one day, one website might produce 500 articles.

Our media industry as a whole only produces so many articles a day. If just one website has that many articles, then with 20 websites, you might have 30,000+ articles. With over 30,000 articles being produced every day, the sheer quantity covers up the actual news.

Political parties in Taiwan, whether blue or green, will also try to find ways to do something similar. But they may not be able to produce as much news as China can, or to do so as quickly.

China aims to blot out the sky and cover the earth, you could say, since they have many companies, many people, and many resources. So after grabbing your attention, all you might see is junk news.

What you see may all be critical of the government, or articles about how great China is. This is known as a “cognitive domain attack” (認知領域攻擊).

BH: What’s different about the fake news strategies adopted for this election, as compared to the past?

PS: This time around, because of the changes in Facebook’s policy, it’s become very difficult to run content farms on Facebook. So I believe that they are now putting more effort into other aspects.

First is Line. Disinformation continues to spread on Line. But I feel that the harm of disinformation on Line is comparatively lower. Because much disinformation on Line comes from Weibo or Weixin.

Weibo and Weixin are not systematically producing disinformation. Many are just “little pinks”. They look at Taiwan and it makes them angry. Or they want to make fake news, so they decide to do so.

This kind of fake news, one person could produce on their own. Such as fake news regarding events in Hong Kong, an example being that the Taiwanese Ministry of Justice was going to block all visitors from Hong Kong from entering Taiwan. This sort of disinformation could just be one person saying this on Weibo and then these rumors spreading to Taiwan.

This is harmful, seeing as this is fake news. But it’s not a systematic attempt to spread this kind of information. So there may be visible irregularities.

There is a lot of this currently, but I think this is not the most severe problem. What’s more of an issue is YouTube. Many videos began appearing on YouTube from October onward, with many from YouTubers connected with China or with United Front organizations linked to China. The number of these has increased five or six times.

What’s troublesome about this is, first, that YouTube allows for the mass-scale absorption of information. It also allows for the mass dissemination of information.

Making videos requires being systematic, because to make a video, you need someone to write a script, someone to edit the video, someone to put in music, and then someone to put in subtitles. It’s usually a team that makes a video. It’s difficult to make a video on your own.

Two examples of job ads recruiting pro-unification streamers. Photo credit: 林雨蒼/Facebook

But some channels can upload three videos in a day. It seems probable to me that a team is working on this. If a team is working on this, this is more sophisticated. Because you need resources in order to run a team like this. It could be an outsourced team that is taking on contracted work, or it could be done directly by the Chinese authorities themselves.

So I feel like the current threat from YouTube regarding fake news is bigger.

Last year, there was a questionnaire distributed to Han Kuo-yu supporters. The results showed that the key sources of information for people who voted for Han Kuo-yu were these articles and YouTube. YouTube does have a large influence. And if they put a large amount of resources into YouTube, the influence on us will be larger.

BH: I remember that there were attempts by Chinese people to buy Taiwanese Facebook pages last year.

PS: Yes. I think that’s less effective now. Because after buying these pages, they want to post content farm articles on them, which is not as effective. That only took place over about one month.

But so many people were selling fan pages then. They were selling fan pages that they weren’t using anymore. This was just a way to make money.

BH: People noticed that these were Chinese people they were being approached by, trying to buy up their Facebook fan pages, because the way that Chinese people write is different than how we do. They use different phrases and vocabulary. Like I would look at it and feel that it looks strange.

Do you think that they’ve gotten better at imitating the way we write or generally mimicking Taiwanese discourse?

PS: I don’t think it’s become that sophisticated for YouTube yet. For YouTube, some channels directly use simplified Chinese in their subtitles and captions.

Some YouTubers also use idiomatic phrases from China when they talk. Or their subtitles will have simplified Chinese mixed in, where the text was converted incorrectly from simplified characters. It’s not so sophisticated.

But if they get Taiwanese YouTubers to do this, since Taiwanese YouTubers are Taiwanese after all, they won’t sound as suspiciously different. Using this kind of agent is probably more common now.

BH: So would you say that they’ve switched to using more Taiwanese agents? As with Taiwanese YouTubers or Taiwanese that are running fan pages.

PS: Yes. Because it’s too easy to tell when it’s Chinese people doing it themselves. Getting Taiwanese to do it is more effective.

But the people who take up this kind of job are primarily interested in the money. They might not be able to write the necessary script for a video themselves. So if you analyze the script, you can maybe find some way to tell whether it’s from China, because there will be some discursive differences between what they and pro-unification advocates in Taiwan are saying.

BH: How do you propose to combat fake news?

PS: Tracking down the sources of fake news is important. Particularly if it is coming from China. And if it’s being created by China. If so, then you can tell everyone that this is a “made in China” message aimed at influencing the election.

What is most important is making people aware of this. It’s just that there’s no way to erase fake news. There’s no way to make it disappear.

Examples of administrators of Facebook pages being approached by suspicious accounts hoping to buy their pages. Photo credit: 林雨蒼/Facebook

Providing clarification about fake news is very important. But the majority of this kind of information operation isn’t fake news. It’s primarily about creating narratives, or stories. These narratives may focus only on the positives of something and not the negatives, or only on the negatives. It’s just that there isn’t any fake information inside.

For example, continually telling you that the American economy is bad. You can, of course, find at least some aspects in which the American economy is doing badly. But with this being reported over and over again every day, after seeing it continuously for a month, your average person really will feel that America’s economy is doing very badly. They’ll wonder, then, whether we should have less close economic relations with America and closer economic relations with China, because China is big and powerful.

These articles about China being big and powerful aren’t fake news either, because there are definitely some aspects in which China is performing quite well. They’ll only report on the positives and none of the negatives. The government has no way to respond to this.

This is the most crucial means of attack in information warfare. When there is a big issue in society, it’s not necessarily fake news that causes chaos.

BH: Who does better in addressing the issue of fake news currently? Sectors of government? NGOs?

PS: Taiwan has many local organizations that attempt to bring the truth to light about disinformation efforts. Such as revealing which platforms are content farms, what disinformation looks like, and what the issues that disinformation efforts are focused on are. Quite a number of NGOs are also working on efforts to educate now.

The government also responds to fake news very quickly. Because we have fact-checking centers now, like the Taiwan FactCheck Center. And Co-Facts. There are a few groups working on this now.

But regarding disinformation efforts aimed at creating political narratives, no one is working on this, because it cannot be combated through fact-checking.

With regard to this, you can only rely on disclosing the source of this information. To allow people to know that this is coming from China. This is what our organization is working on now.

The most important thing that the government can do is set up regulatory laws. But outside of passing laws, there’s less it can do.

BH: Some organizations are collaborating with Facebook, I’m aware.

PS: Facebook proactively decided to do this. Facebook decided to approach civil society and asked if there was any information they could provide to them.

BH: How would you describe the relationship between fake news and traditional media in Taiwan? Because many media organizations in Taiwan are pro-China, so they disseminate fake news in and of themselves.

PS: Yes, fake news spreads through traditional media as well. Traditional media has in itself become an agent for fake news to spread. This makes the issue complicated, since it looks like a source you can trust, but it’s actually propaganda. When you call something out as fake news, you will be accused of being fake news yourself. If you call it out as fake news, they still won’t pay attention to you.

This shows that these disinformation efforts have been successful, because there’s a demographic that is always paying attention to this kind of news. This is something we have to work on gradually.

BH: Is the experience of Taiwan regarding fake news generalizable to other places in the world?

PS: I think so. Like the example of Russia. Russia has engaged in disinformation efforts against a number of countries, such as the Czech Republic. As a result, the Czech Republic can share the experience of dealing with Russian disinformation efforts with other countries. Similarly, Taiwan can share its experience with other countries confronting China.

Taiwan may be highly effective in analyzing Chinese propaganda. We often say that Taiwan is a very good testing ground.

Photo credit: Brian Hioe

Like there was a wave of people on Instagram before, putting their hands on their chests and talking about how they were going to vote. Even a non-native speaker who knows Chinese might not think anything of this. It might look like an organic phenomenon.

But Taiwanese can see which linguistic usages seem off. Taiwanese would see these videos and immediately sense that something is not right, because the hashtag for those videos was “Declaring my voting intention” (#宣告我的投票意志).

Taiwanese don’t talk that way. The cultural terms that we use are not the same. As such, we might have the best sensitivity to analyze their disinformation attempts and share the tools we develop through our analysis with other countries. Our hope is to do this, so that we can help other countries resist disinformation attacks from China and the CCP.