The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

“Fact checkers need to move from ‘publish and pray’ to ‘publish and act.’” “The idea that fact checking can work by correcting the public’s inaccurate beliefs on a mass scale alone doesn’t stack up,” write representatives from Full Fact (U.K.), Africa Check (Africa), and Chequeado (Argentina), in a manifesto of sorts published Thursday to all three sites.

“First-generation fact-checking” — the approach of simply publishing fact-checks, which sites like FactCheck.org take — is a worthy effort, the authors write, but it isn’t enough if you actually want to change people’s minds. “Nobody should be surprised when, despite fact checkers publishing lots of fact checks, people still believe inaccurate things and politicians still spin and distort. Fact checking can work but not if this is all we do.” Full Fact, Africa Check, and Chequeado argue instead for their second-generation approach, which includes not just publishing but also applying pressure and working for system change:

First, we move from just publishing to “publish and act.” We seek corrections on the record, pressure people not to make the same mistake again, complain where possible to a standards body. In other words, we use whatever forms of moral, public, or where appropriate regulatory pressure are available to stop the spread of specific bits of misinformation.

Secondly, we recognize that our fact checking provides a unique evidence base that gives us important insight into where misleading claims come from in public life and how they are spread…

Thirdly, we work for system change. Using the evidence from our fact checks we identify patterns and common causes, points where we can intervene to significantly reduce particular kinds or sources of misinformation. The pattern might be who’s publishing something, where it’s published, a particular subject that there’s a lot of false information about, or something else. The interventions can range from educating children or adults to advocating for policy changes.

Finally, culture. We are trying to create institutions in different societies that can help anchor public debate to reality and to challenge the casual acceptance of deceptive and misleading behavior. This is a long-term task: it involves earning good trusted reputations and not just getting attention. It needs funders to think long-term as well as fact checkers.

And, they acknowledge, they’re striving for a third generation of fact-checking that “will have to address all these issues — but also be able to function at internet scale, be massively collaborative, and work across international borders.”

The sixth annual fact-checking summit, GlobalFact, took place this week in Cape Town. Here are a few highlight tweets:

I appreciate we are mostly small organisations who are just trying to get by, but we have to be striving for more than just checking. Checking alone is not enough to diagnose problems, or to eventually solve those problems at scale. Let's not lose sight of that goal. #GlobalFact6 — Mevan (@MeAndVan) June 19, 2019

Without this we are curing just symptoms but not the disease! #GlobalFact6 — Mevan (@MeAndVan) June 19, 2019

The big theme at #GlobalFact6: the multitude of innovative ways fact-checkers are getting their content to new audiences — from catchy short videos to “radio” programs on WhatsApp. The creativity is really impressive. — Bill Adair (@BillAdairDuke) June 20, 2019

To start the panel, @HannahOjo presented her work as an @ICFJ #TruthBuzz fellow at @AfricaCheck_NG As a fellow, her focus is to make the truth go viral. Understanding that the majority of Nigerian #socialmedia users access via mobile is at the heart of her strategy #GlobalFact6 pic.twitter.com/b3YMwfoz3m — Rosemary: is the R in VAR! (@RMAjayi) June 20, 2019

“If all goes to plan, @AfricaCheck will, later this year, launch a project in Nigeria tackling health misinformation,” @PCunliffeJones at #GlobalFact6. Big up @AfricaCheck_NG ! — Alphonce Shiundu (@Shiundu) June 19, 2019

At @rapplerdotcom, @GemmaBMendoza says team doesn't just fact-check claims, they also investigate the make-up of the distribution network behind a piece of mis- or disinformation.

I co-sign this. #GlobalFact6 — Rosemary: is the R in VAR! (@RMAjayi) June 19, 2019

Hi #GlobalFact6: here you have the transcript-o-matic tool where you can transcribe every YouTube video that has subtitles (automatics or not). Thanks for all the tweets. https://t.co/qcQZy6HvRN #Chequeabot pic.twitter.com/bnw3jaA7ZC — Pablo M. Fernández (@fernandezpm) June 19, 2019

“It’s a sweatshop in America.” The Verge’s Casey Newton published another horrifying story (here’s the first one) on working conditions for Facebook content moderators, this time at the Cognizant site in Tampa that is “Facebook’s worst-performing content moderation site in America.” The content moderators are contract workers, not Facebook employees.

The story outlines the gross working conditions that the content moderators face and shows them horribly affected by the videos they have to moderate, especially those that feature animal abuse. But I kept thinking about this part of the article:

In June 2018, a month into his job, Facebook began seeing a rash of videos that purportedly depicted organs being harvested from children. (It did not.) So many graphic videos were reported that they could not be contained in Speagle’s queue. “I was getting the brunt of it, but it was leaking into everything else,” Speagle said. “It was mass panic. All the SMEs had to rush in there and try to help people. They were freaking out — they couldn’t handle it. People were crying, breaking down, throwing up. It was like one of those horror movies. Nobody’s prepared to see a little girl have her organs taken out while she’s still alive and screaming.” Moderators were told they had to watch at least 15 to 30 seconds of each video.

The debunk of the organ harvesting was an update added to the story after publication: “This article has been updated to reflect the fact that a video that purportedly depicted organ harvesting was determined to be false and misleading.” These videos have, in fact, been debunked multiple times. That makes them no less horrifying to watch, since they actually show children receiving medical care after airstrikes. To the moderators, they felt real. What I don’t understand is why moderators were seeing so much content that had already been debunked, presumably including by Facebook’s own fact-checkers, without, it seems, being told that it was fake, which might have made moderating it less scarring. If that education isn’t happening, why not?

The picture of kids whipping a screaming lizard around by its tail doesn't ring true given their biology. Add that to what other sources in the story claim about an "organ harvesting" video that's almost certainly false and you really start to question their reliability. — Christopher Ingraham (@_cingraham) June 20, 2019

I don’t question the moderators’ reliability — I think this just goes to show, again, that this work is hard and mentally scarring and shouldn’t be foisted off on contract employees without proper training. But that’s the whole point of this article!

At some big tech companies, more than half the workforce is already contractors. NDAs typically prevent all but the worst stories from coming out. My thanks to Shawn Speagle, Michelle Bennetti, and Melynda Johnson for having the courage to share their experiences. — Casey Newton (@CaseyNewton) June 19, 2019

So Facebook has been around 15 years, always with a problem of some users uploading horrifying content. Scale intensifies the issues. But the company talks about these consequences like they’re new. https://t.co/aTwa21bPo2 read @CaseyNewton pic.twitter.com/TM7QseyMhP — Sarah Frier (@sarahfrier) June 19, 2019

The report came out the same day that Facebook announced it’s launching a global cryptocurrency.

Judging by this story on Facebook’s “content moderation”—which adds to the pioneering work by @ubiquity75 and @TarletonG—Facebook’s expansion into financial services should be smooth, since it’s such an easy thing to manage, especially across borders. https://t.co/AGk3QV6R9R — zeynep tufekci (@zeynep) June 19, 2019