For the last several years, Facebook has publicly touted its efforts to prevent interference in elections. In September 2018, for example, CEO Mark Zuckerberg wrote a post touting the "focus and rigor" the company was bringing to election protection. Zuckerberg said Facebook was employing "sophisticated approaches" to ensure the integrity of elections in the United States and around the world. According to Facebook VP Nick Clegg, a key part of this initiative was "recruiting an army of people" to "take down harmful content" related to elections.

But internal Facebook documents obtained by Popular Information and interviews with people involved in election-related content moderation present a starkly different picture.

Training materials produced by Facebook and provided to content moderators in the critical weeks before the 2018 election were riddled with basic errors.

A slide deck distributed to content moderators in September 2018, for example, falsely stated that "U.S. citizens must vote in-person at a polling location." Actually, all 50 states allow absentee voting. In 33 states, "no excuse or justification" is required to vote absentee. Oregon's election is conducted almost entirely by mail.

The materials also state that "General Election Day is November 6th, 2018," without noting that in "39 states and the District of Columbia, any qualified voter may cast a ballot in person during a designated period prior to Election Day."

The deck was produced by Facebook and provided to the staff at Cognizant in September 2018. Cognizant was one of several companies Facebook hired to perform content moderation on its behalf. The presentation came shortly after several Cognizant staff participated in a Facebook-run "elections boot camp" in Austin.

This misinformation was provided to content moderators who were tasked with deciding what kind of voting information to delete from Facebook. The presentation instructed Cognizant content moderators to delete "misrepresentation of the dates, locations, and times for voting."

Email correspondence obtained by Popular Information reveals that the initial version of the Facebook presentation also falsely stated that Louisiana did not vote on November 6. That error was corrected only after it was noted by a former Cognizant staffer who did not attend the Facebook training.

Other problems with Facebook's materials flagged by the same former Cognizant staffer, including the misinformation about in-person voting, were never fixed. Facebook's materials also described two people who had already lost primaries, Cynthia Nixon and Joe Arpaio, as active candidates and misclassified several candidates who were running for House seats as Senate candidates.

Facebook provided this misinformation to content moderators who were already ill-equipped to handle election-related content. A former Cognizant content moderator told Popular Information that "many moderators had very little knowledge of US politics." For example, a member of Cognizant's policy team, a group that reviewed decisions made by others, "did not know what GOP meant." Another former Cognizant staffer said many of the content moderators at Cognizant had never heard of prominent candidates like Beto O'Rourke and Ted Cruz, leaving them unable to apply Facebook's policies to content involving those candidates.

The former Cognizant staffers spoke to Popular Information on the condition of anonymity, fearing that speaking publicly could negatively impact their present or future employment.

"[T]his presentation includes obvious errors and is no longer in use," Facebook spokesman Andy Stone told Popular Information. "It was intended as a high-level political overview, not an enforcement guide. It was updated multiple times in the lead up to Election Day 2018. Review teams are always provided more specific guidance and trainings particularly when events, such as elections, require additional support and clarity."

It makes sense that the guide is no longer in use since it pertains to the 2018 election. And while it's true that materials were "updated" with information about new topics, the errors highlighted above, with the exception of the date of Louisiana's election, were never corrected.

Life as a Facebook content moderator

The misinformation about elections was provided by Facebook to a group of people already struggling to perform a high-stress job for meager pay. In some ways, election-related content was the least of their concerns. A February report by The Verge revealed that Cognizant content moderators were "diagnosed with post-traumatic stress syndrome after being subjected to a daily onslaught of graphic and disturbing images." Workers would "cope by telling dark jokes about committing suicide, then smoke weed during breaks to numb their emotions."

Constant exposure to "conspiracy videos and memes...gradually lead them to embrace fringe views." One former Cognizant employee told The Verge, "he has mapped every escape route out of his house...sleeps with a gun at his side," and no longer believes 9/11 was a terrorist attack. Moderators are allocated "nine minutes per day of 'wellness time,'" for use when they feel "traumatized" and need to take a break.

In return, Cognizant content moderators were paid just $28,800 per year, about 12% of the average salary at Facebook. A random selection of each moderator's decisions was reviewed by another Cognizant staffer. If the reviewer agreed with the moderator's decision and the reasoning behind it, the decision was deemed "accurate." Any moderator with an accuracy rate below 95% was at risk of termination.

In October, after a deluge of negative press, Cognizant announced that it "will exit the content moderation business." Cognizant will continue to perform moderation services for Facebook, however, as it fulfills "its commitments over the course of 2020."

"Cognizant’s content reviewers have played a valuable role in keeping our platforms safe for people all over the world and we thank them for the work they’ve done and continue to do," Facebook said in an October statement.

The PhRMA Playbook

Over the past few weeks, Politico Playbook, one of the most widely read political newsletters, has been featuring an advertisement from PhRMA, the lobbying arm of the pharmaceutical industry.

The ads claim that a prescription drug bill advocated by Speaker Pelosi "would siphon $1 trillion or more from biopharmaceutical innovators over the next 10 years."

The ad, however, was ruled "mostly false" by Politifact and Kaiser Health News. Pelosi's bill would result in somewhat lower profits for the pharmaceutical industry, between $500 billion and $1 trillion over 10 years. But PhRMA doesn't "offer much evidence" about why "the lost revenue could discourage drugmakers from researching new treatments for diseases such as Alzheimer’s, lung cancer and sickle cell disease."

“The lower prices envisioned by [Pelosi’s] bill would barely slow new drug discovery at all,” argued Dr. Peter Bach, who directs the Drug Pricing Lab at Memorial Sloan Kettering Cancer Center.

Moreover, a substantial portion of the most innovative research is "conducted in government-funded labs," with drug companies engaging in the process much later. This makes it even less likely that reduced profits "would meaningfully discourage breakthrough drug innovation."

While it is possible that reduced profits might prevent some new drugs from making it to market, it's "unclear that the forgone drugs would have major clinical value ― little evidence suggests they necessarily would."

A letter from the group Accountable.us, obtained by Popular Information, requests "these false advertisements be removed from the Playbook website and that further advertisements from PhRMA using assertions that have been labeled false be rejected."

An inquiry to Politico about how it will respond was not immediately returned.

Thanks for reading!