In December of 2016, after receiving a firestorm of criticism about online disinformation during the presidential election, Facebook announced its Third Party Fact-Checking project. Independent organizations would debunk false news stories, and Facebook would make the findings obvious to users, down-ranking the relevant post in its News Feed. Now the project includes 50 partner organizations around the world, operating in 42 languages, yet it’s still very much an open question how effective the program is at stopping the spread of misinformation.

Recently, Full Fact, a non-profit partner in Facebook’s project, published an in-depth report on the first six months of its involvement in the program. Overall, the group says, third-party fact-checking is worthwhile, but the report makes a number of criticisms of the way the project works. For example, Full Fact says, the way Facebook rates misinformation needs to change, because the terminology and categories it applies aren’t granular enough to be useful. Facebook has also failed to speed up its flagging of disputed content and its responses to fact checks.

Another concern mentioned in the report is more fundamental: that Facebook simply doesn’t provide enough transparency or clarity on the impact of the fact-checking that its independent checkers do. How many users did the fact-checks reach? How many people clicked on the related links from a false story? Did the project slow or even halt the spread of that misinformation? Facebook doesn’t divulge enough data to even begin to answer those questions. Its only response to the Full Fact report, which contained eleven recommendations for how to do better, was to tell the group that it is “encouraged that many of the recommendations in the report are being actively pursued by our teams as part of continued dialogue with our partners, and we know there’s always room to improve.” There was no response to the criticism about a lack of data.

Full Fact’s critiques are not new. Earlier this year, a number of Facebook’s fact-checking partners told the BBC that they were concerned about having no way to see whether their work was having an effect, and suggested that Facebook didn’t care about the efficacy of the project. “Are we changing minds?” a fact-checker based in Latin America wondered. “Is it having an impact? Is our work being read? I don’t think it is hard to keep track of this. But it’s not a priority for Facebook.” Last year, a number of partners seemed deeply cynical about Facebook’s intentions. “They’re not taking anything seriously,” Brooke Binkowski, former managing editor of fact-checking site Snopes.com, who now works for a similar site called Truth or Fiction, told The Guardian. “They are more interested in making themselves look good and passing the buck.”

It’s a common theme with Facebook: a crucial project is given the minimum enthusiasm necessary for good PR. But users—including entire democracies—are owed an explanation. If the world’s most powerful social network wanted to give the impression that it takes fact-checking seriously, it should open up its vast database and share more information about how the project is working.

Here’s more on Facebook and fact-checking:

Checking in with the checkers: Last year, Mike Ananny wrote for CJR about a report he helped write for Columbia University’s Tow Center for Digital Journalism, which looked at the Facebook fact-check project and criticisms participants had, including why some posts and news stories were chosen for down-ranking but others were not.

What about Instagram? Among the recommendations in the report from Full Fact is that Facebook extend its fact-checking program to Instagram, the photo-sharing network it owns, which is much more popular with younger users than Facebook itself. “The potential to prevent harm is high [on Instagram] and there are known risks of health misinformation on the platform,” the group wrote.

A booming business: Fact-checking groups in Uruguay, Bolivia, Argentina, and Brazil have joined forces to create a regional coalition in order to fight misinformation being spread both on Facebook and through WhatsApp, the encrypted messaging network Facebook owns. The groups are also working with organizations like First Draft, a fact-checking and training network based in the UK that is affiliated with City University in New York.

Fact-checking Boris: The British TV network Channel 4 has done some fact-checking of government statements in the past. Now, in the wake of Boris Johnson’s ascent to the office of Prime Minister of the UK, Channel 4 says it is committed to fact-checking every public statement Johnson makes during his tenure, and has asked viewers to help.

Mathew Ingram is CJR’s chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.