Facebook’s new effort to flag news deemed to be “fake” began on Friday, as new questions emerged about the limitations of the system the social media giant has put in place to outsource the fact-checking process.

The tech company’s decision to swiftly test a system to identify fraudulent news stories has won plaudits from fact-checking experts, including some involved in the project.

However, its first day of operation raised questions about the mechanics of the process – and Facebook’s apparent unwillingness to pay for a fact-checking effort that relies entirely upon the voluntary action of users and a handful of non-partisan organizations.

“I think it’s tremendously promising. This is exactly the kind of thing that I would hope a company like Facebook would do in taking advantage of the very diligent effort being done by these professional fact-checking organizations,” said Lucas Graves, author of Deciding What’s True: The Rise of Political Fact-Checking in American Journalism.

Facebook’s plan for combating fake news, a response to mounting criticism over the spread of misinformation, particularly during the US presidential election, is theoretically relatively simple. When enough users flag a news article they think is factually inaccurate, Facebook sends the link to a digital clearing house accessible to a handful of fact-checking organizations.

Those organizations can choose which stories they would like to assess and, if their investigation deems the article to be a hoax or containing false information, it will be marked as “disputed” whenever it appears on the social network. It will also be deprioritized in Facebook’s news feed.

Examples from Facebook of measures it is taking to curb the spread of fake news. Photograph: AP

One of the first articles to be sent by Facebook to its team of specially selected fact-checking organizations on Friday was an article by Bipartisan Report that falsely claimed that Florida was holding an election recount over voter fraud. The story was deemed false by ABC News, one of Facebook’s fact-checkers.

However, potential problems with the system are already emerging.

The organizations tasked with policing Facebook’s content are not getting paid for their troubles.

This could create a huge burden for some of the smaller organizations to make a dent on viral misinformation, particularly as each story has to be addressed individually. (Facebook will not apply the “disputed” flag across similar stories, even if they report the same hoax or misinformation.)

FactCheck.org, for example, has a team of five full-time staff and five undergraduate students who each work 15 hours a week. “I’d like to say we could dedicate someone to this, but it depends on our resources. Facebook isn’t providing any funding for this, so we’ll fit it in along with other things we do,” said director Eugene Kiely.

The process involved in establishing the veracity of stories is also complicated. Typically, Kiely’s organization fact-checks statements made by politicians, rather than news articles themselves.

“Here the job is going to be labelling the story as reliable or not reliable, rather than an individual claim, and that’s going to require some adjustment by the fact-checkers,” Graves said.

Better-resourced groups have allocated whole teams to the problem. ABC News said it had a team of about six people dealing with fake news, which means that the Walt Disney Company, which owns ABC News, is effectively subsidizing a solution to a problem faced by Facebook.

This, despite the fact that Facebook is among the most valuable companies in the world, with a market cap of more than $340bn.

Facebook, a company that makes $7 billion in revenue every 3 months, is getting its fake news filter subsidized by Disney pic.twitter.com/kMf25vNE5T — Christopher Mims™ (@mims) December 15, 2016

“I think an ideal model would be one in which companies that really dominate the lion’s share of ad revenue, so companies like Facebook and Google, would direct some support to an independent fund, which would in turn be used to support third-party fact-checking,” Graves said.

“We could use some money from Facebook or anyone else willing to help out,” added Kiely.

The fact-checking organizations helping Facebook identify and flag fake news do, however, stand to benefit from the tool. If a Facebook reader wants to find out why a story has been labeled “disputed”, they can click on a link to an article by the fact-checking organization explaining the decision. This could mean a traffic boost to those sites.

Another issue is that the system is currently built to flag only articles hosted by external sites. That means that videos, memes or photos will be unaffected – even if they contain the very same false information or hoaxes. The Guardian understands that Facebook will, however, reassess this once it has tested the system with news articles.

There is also the possibility that the system could be gamed by disgruntled Facebook users who could work together to flag factually accurate stories as “fake news” in order to clog up the fact-checking clearing house.

This is a particular challenge given that some rightwing groups, which have a track record of corralling online activists, deem the fact-checking organizations to be biased against them. InfoWars, a rightwing website known to peddle conspiracies and fake news, has already declared that the partnership with Snopes “clearly opens the door for the outright censorship of conservative content and opinion”.

Facebook said that it will have some ability to detect when someone is trying to spam the system with false flags and that it would only send news articles, and not personal blogposts, through to its volunteer fact-checkers.

“What Facebook is doing isn’t going to deal with the gray area. It’s looking at stuff that’s completely made up,” Kiely said.

Nevertheless, he accepts that the facts won’t matter to some people.

“President Trump goes around talking about the dishonest media and fact checkers and his supporters will trust what he’s saying and not what we’re saying,” he said.

“If we are saying something counter to their beliefs then they are not going to accept it. We are not writing for those people but the larger chunk in the middle between the two sides [of the political fence].”