
There was a debate during the afternoon shift about the video the cabinet minister had uploaded to Facebook. A lot of users reported it and I thought it was over the line too, but didn't touch it because it wasn't a clear violation of the community standards. The next morning a new girl from the Hungarian team decided to take it down, and most of the moderators agreed with her decision.

Zoltan, who was working as one of Facebook’s moderators during the Hungarian general election campaign, recalled the events of early March. The video in question was uploaded by Janos Lazar, at the time the right-hand man of Hungarian Prime Minister Viktor Orban. In the 2.5-minute clip, Lazar attempts to show the impact recent migration has had on a certain district of Vienna. The video was initially taken down by Facebook for violating the community standards, then made available again a few hours later.



It’s shocking we can't talk about migration in an honest manner because of Facebook’s censorship.



This is what Lazar said on the day of his video’s removal, in another clip uploaded to Facebook titled “We stand against censorship”. Following his lead, the whole pro-government media machine swung into motion, claiming that the Western liberal elite, aided by Facebook, was trying to interfere with the Hungarian election process. This is the line of thinking PM Viktor Orban followed, and took even further, during his annual summer speech at Băile Tușnad. In late July, a few months after winning a landslide victory in the parliamentary elections, Orban spoke about the end of free speech and democracy in the West. Exhibit no. 1: Western leaders conspiring with Facebook to suppress negative news about migration.



After we took Lazar’s video down, and they reinstated it above our pay grade, we received instructions not to touch any of his posts, or the posts of the pro-government news portal Origo.hu, but to forward all reports concerning these straight to the Irish Facebook headquarters.



- Zoltan recalled.



During the past few months, he spoke to us at length about his time as a Facebook moderator. We traveled to the social media giant’s Central European headquarters in Warsaw and spoke to a number of Facebook employees responsible for election integrity, community standards and fighting misinformation. These discussions served two goals:



To verify what Zoltan told us about his work;



And to better understand what’s allowed on the most important platform for public debate in the 21st century.

It’s called the “Horizon-project”, sort of a codename for Facebook. When you apply you don't really know what the job is, only that content moderators are wanted in Berlin. You only realize it’s Facebook after you sign the papers.

Zoltan and his fellow moderators are not employees of Facebook. They work for Arvato, owned by the German Bertelsmann group. The job offers are placed by Arvato, which is why many applicants don’t even know they will be working for Facebook until after they are hired.

There is a team for every country, or for every “market” as they say at the company. Zoltan told us that a couple of dozen people moderate the Hungarian market. The teams work in a Berlin office, moderators have a monthly base salary of 1400 euros with bonuses for working night shifts and weekends, and they can also get extra money for accuracy. The system basically works like this:



1. A user sees something on Facebook that they don’t like and reports it.

2. An algorithm highlights certain violations using color codes: curse words, nudity, etc.

3. The Arvato moderators in Berlin receive the reported (and possibly highlighted) Facebook posts.

4. If a post violates Facebook’s community standards, they remove it.

5. A number of already-moderated posts are randomly selected and forwarded to Facebook’s EU headquarters in Ireland.

6. There, employees of Facebook check whether the Arvato moderators made the right decisions.

7. If an Arvato employee has a high rate of incorrect decisions, their rulings are monitored more frequently, and they might be retrained if deemed necessary.

8. Very high accuracy results in a bonus.
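As a rough illustration, the workflow described above can be sketched in a few lines of code. All names, thresholds and the sampling rate below are our own assumptions for the sake of the sketch, not details of Facebook’s or Arvato’s actual system:

```python
import random

# Rough sketch of the report -> moderate -> audit loop described above.
# The thresholds and sample rate are illustrative assumptions.

AUDIT_SAMPLE_RATE = 0.1        # fraction of decisions re-checked in Ireland
RETRAIN_THRESHOLD = 0.90       # below this, closer monitoring / retraining
BONUS_THRESHOLD = 0.98         # above this, the moderator earns a bonus

def moderate(post, violates_standards):
    """An Arvato moderator removes a reported post only if it
    violates the community standards."""
    return "removed" if violates_standards else "kept"

def audit(decisions, correct_decisions):
    """Facebook staff in Ireland re-review a random sample of a
    moderator's decisions and act on the measured accuracy."""
    n = max(1, int(len(decisions) * AUDIT_SAMPLE_RATE))
    sample = random.sample(range(len(decisions)), n)
    hits = sum(decisions[i] == correct_decisions[i] for i in sample)
    accuracy = hits / n
    if accuracy < RETRAIN_THRESHOLD:
        return "monitor more frequently, retrain if needed"
    if accuracy >= BONUS_THRESHOLD:
        return "bonus"
    return "ok"
```

The key design feature is that the auditors only see a random sample, so a moderator’s measured accuracy, not every individual call, drives the bonus or retraining decision.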



It was newsworthy

A Facebook manager in charge of the community standards at the Irish HQ told us they reinstated the Hungarian cabinet minister’s “Vienna video” because it was deemed “newsworthy”. When we asked why Hungarian moderators in Berlin were told not to touch posts of the high-profile government politician and the leading pro-government website after the incident, the Facebook manager told us that



no one is above the rules, every post can be reported, and all reported posts are reviewed, but said there are outlets that deserve special attention. Not mentioning the pro-government Hungarian website specifically, the official said that Facebook can't just delete the page of a country's leading newspaper because of a debated post.



After the “Vienna video” was reinstated, Hungarian moderators noticed a sudden shift in the way some Hungarian government officials and pro-government politicians used Facebook, Zoltan said.



As if someone had trained them. One word can make a big difference for moderators. It matters a great deal whether someone writes “All Muslims are criminals”, “In my opinion all Muslims are criminals” or “In my opinion there are a lot of criminals among Muslims”. Moderators felt as if some of the Hungarian politicians suddenly understood Facebook’s community standards a lot better than before.



Facebook started dedicating resources to election integrity after the 2016 US presidential election. The company increased transparency for political ads: ads on “politically charged” topics can now only run on the platform after the advertiser provides proof of identity, and users can easily check who paid for an ad and what target audience the advertiser was trying to reach. Facebook also fosters an active dialogue with national institutions in charge of election integrity, as well as with politicians. A Facebook employee in charge of the company’s election integrity efforts told 444 about these developments. He didn’t talk about “educating” Janos Lazar or any other Hungarian politician or government official, but spoke generally about Facebook’s willingness to help legitimate political actors use the platform properly.

After Facebook took the Vienna video down, Lazar wrote an official letter to the company asking them to reinstate the clip - former staff members of the cabinet minister told 444. They said there was no further contact between their team and Facebook.

Posting a certain logo will get your page deleted

It's not always clear what kind of content Facebook allows on its platform. The company has a set of basic policies for content: users can’t incite violence, post pornography, etc. Detailed community standards only became public in April this year; before that, users had just a few vague paragraphs to go by. Beyond the now-public community standards, there are still internal guidelines that help moderators decide what to remove.

New guidelines arrive almost every day. One week we have to remove all pictures where nipples are visible, the next week pictures of breastfeeding women are allowed. There is a list of hate groups and terrorist organizations, and any page posting their banners or logos would be deleted. These guidelines are not public so that it's harder to circumvent them.

Zoltan told us he sometimes felt like a lawyer navigating rules and guidelines of Facebook, debating with his fellow moderators whether to remove a post or not.

The most important thing I learned while working for Facebook is that words matter a great deal. Not only on Facebook but in real life, too.

During the Hungarian election campaign, one specific word definitely gained special significance.

“Migrant” is not a protected characteristic

Facebook recognizes certain protected characteristics like ethnicity, sexual orientation, religion, etc., and doesn’t allow attacks based on these attributes. In practice, this means any post demeaning or generalizing about Jews, Muslims, Christians, LGBTQ people, etc. is likely to be removed by the moderators if reported.

There are too many Muslims in Berlin, they should piss off

- would probably not be allowed as opposed to

There are too many migrants in Berlin, they should piss off

- which probably wouldn’t be removed by moderators, since being a “migrant” is not a protected trait. Zoltan told us that advocating or wishing for the death of “migrants” or calling them “inferior” is not acceptable on the platform, but basically everything else is.
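The distinction Zoltan describes can be illustrated with a toy rule check. The group list and the “extreme” terms below are simplified assumptions made up for this sketch, not Facebook’s actual internal lists:

```python
# Toy illustration of the rule described above: attacks on protected
# groups are removed, while "migrants" are only protected against the
# most extreme content. All lists here are simplified assumptions.

PROTECTED_GROUPS = {"muslims", "jews", "christians", "lgbtq people"}
EXTREME_TERMS = {"death", "inferior"}   # banned even for unprotected groups

def should_remove(target_group, post_words):
    """Return True if a demeaning post about target_group is removed."""
    words = {w.lower() for w in post_words}
    if target_group.lower() in PROTECTED_GROUPS:
        return True                      # protected characteristic: remove
    return bool(words & EXTREME_TERMS)   # otherwise only extreme calls go

# "...too many Muslims in Berlin..."  -> removed (protected group)
# "...too many migrants in Berlin..." -> kept (not a protected trait)
```

The asymmetry is the whole point: identical wording leads to opposite decisions depending solely on whether the targeted group is on the protected list.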

This approach had a serious impact on the Hungarian election campaign, which was built on the Fidesz government’s anti-immigrant rhetoric. Zoltan talked about how upsetting it was for him to see hate speech on Facebook and be unable to act, because the posts targeted “migrants”, who don’t have special protection in the eyes of the company.

When asked, Facebook officials told us only immutable characteristics, traits that people are born with, warrant special protection. Being a migrant is not such a trait, so freedom of speech takes priority.

Mark Zuckerberg to decide whether God exists and what He thinks of homosexuality

Balancing between fighting hate speech and respecting freedom of expression is not a trivial exercise. In recent years, Facebook became the most important platform of public speech in many parts of the world, and the company is now tasked with answering ever more complex and far-reaching questions.

I'm really curious about the news on God. “God hates gays” has long been a problem for us. Now they take it down.

Zoltan wrote in an email.

Sexual orientation is a protected characteristic and attacks or demeaning posts based on sexuality could be considered hate speech. On the other hand, many religions do consider homosexuality a sin, and it could be within freedom of religion and expression for someone to post on Facebook that, according to their belief, homosexuality is wrong.

An employee in charge of the community standards at the Irish Facebook headquarters spoke about how immensely difficult it is to create policy for such cases. They have working groups with lawyers, philosophers and other freedom of speech experts trying to find the best answers. When it comes to God and homosexuality, it seemed they don't have a general solution, it would depend on the exact wording of a Facebook post whether moderators remove it or not.

A mother worried about vaccination

During the summer of 2018, Facebook organized a conference at its Central European headquarters in Warsaw to talk about its efforts to fight fake news and propaganda. Among other things, participants learned that simply lying on Facebook is not forbidden.

Moderators won't remove posts just on the basis that the information contained within is not correct. When it comes to truth and lies, the company tries to account for the intention of the poster as well as the factuality of posts.

When the user is not trying to mislead anyone, and the post contains no false information, there is obviously nothing for Facebook to do.

When there is false information but the poster is not trying to mislead anyone, that’s just someone on the internet being wrong, and there is no Facebook policy against it.

There are posts that contain partially or even completely true information but are intended to mislead the reader. This is what they call “propaganda” at Facebook: the cherry-picking of facts and data. They say it’s extremely difficult to act in these cases while adequately respecting freedom of speech.

This is why Facebook concentrates most of its efforts on the last category, when someone spreads lies with the explicit intention to mislead.
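The four combinations of factuality and intent can be summed up in a small decision function. The labels are our own shorthand, condensed from the descriptions above:

```python
# Minimal sketch of the four-way classification described above,
# crossing factual accuracy with the poster's intent to mislead.
# The category labels are our own shorthand for the article's cases.

def classify(is_false, intends_to_mislead):
    if is_false and intends_to_mislead:
        return "misinformation"     # main enforcement target
    if intends_to_mislead:
        return "propaganda"         # cherry-picked truths, hard to act on
    if is_false:
        return "honest mistake"     # no policy against being wrong
    return "no action"              # true and honest: nothing to do
```

Note that only one of the four quadrants draws real enforcement; everything else is either tolerated outright or handled cautiously in the name of free speech.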



A Facebook employee in Warsaw said they recently removed 590 million fake accounts that could have been used to spread misinformation on the platform. They are also trying to teach AI to identify bad actors (for example, when someone logs into multiple accounts in quick succession from the same IP address and shares the same piece of misleading content). Facebook is also partnering with fact-checking organizations to help filter misinformation from the newsfeed.

If a fact-checking partner determines that a piece of content is false, Facebook decreases its visibility. An article about flat Earth will reach far fewer people than a piece of content not deemed fake.

When one participant at the Warsaw conference asked about the posts of an anti-vaxxer politician, Facebook employees spoke cautiously, emphasising freedom of speech. The situation seems similar to that of God and homosexuality: the exact wording of a post would determine Facebook’s action. If someone states that there is scientific proof of vaccines causing autism, Facebook may do something. If the claim is more vague, they probably won’t interfere.

If a mother is worried about possible side effects and therefore not going to vaccinate her child, is it really Facebook’s job to prevent her from sharing this with friends and family?

- one Facebook employee asked.



Zoltan is a pseudonym for our source, whose name we changed to protect his identity. Employees of the social media giant, including Arvato staff, sign strict NDAs. Moderators are not allowed to bring mobile phones into the office, and they can only access Facebook’s internal system through their workstations. We spoke to Zoltan a number of times about the possible legal trouble he could get into for talking to a journalist, but he didn’t seem to care. He said it was more important to him that people see how Facebook works and have an honest discussion about it. While we were researching this article, Facebook gave us access to a number of their employees on the condition that we don’t use their names, only their job descriptions.



An extended, Hungarian version of this article is available here.