Facebook had a good election, but it can’t let up on vigilance

Photo: Mark Zuckerberg, the Facebook chief executive, on Capitol Hill in Washington, April 11, 2018. A Kremlin-backed group of internet trolls that meddled in the 2016 presidential election appeared to be trying to influence American voters using Facebook days ahead of the midterm elections, the social network said Tuesday night. (Tom Brenner/The New York Times)

After an Election Day largely free of viral social media misinformation, and with little trace of the kind of Russian troll stampede that hit its platform in 2016, executives at Facebook may be tempted to take a victory lap.

That would be a mistake.

It’s true that Facebook and other social media companies have made strides toward cleaning up their services in the past two years. The relative calm we saw on social media Tuesday is evidence that, at least for one day, in one country, the forces of chaos on these services can be contained.

But more than anything, this year’s midterm election cycle has exposed just how fragile Facebook remains.

Want a disaster-free Election Day in the social media age? You can have one, but it turns out that it takes constant vigilance from law enforcement agencies, academic researchers and digital security experts for months on end.

It takes an ad hoc “war room” at Facebook headquarters with dozens of staff members working round-the-clock shifts. It takes hordes of journalists and fact checkers willing to police the service for false news stories and hoaxes so that they can be contained before spreading to millions. And even if you avoid major problems from bad actors domestically, you might still need to disclose, as Facebook did late Tuesday night, that you kicked off yet another group of what appeared to be Kremlin-linked trolls.

I’ve experienced Facebook’s fragility firsthand. Every day for the past several months, as I’ve covered the midterms through the lens of social media, I’ve started my day by looking for viral misinformation on the service. (I’ve paid attention to Twitter, YouTube and other social networks, too, but Facebook is the 800-pound gorilla of internet garbage, so it got most of my focus.)

Most days, digging up large-scale misinformation on Facebook was as easy as finding baby photos or birthday greetings. There were doctored photos used to stoke fear about the caravan of Latin American migrants headed toward the United States border. There were easily disprovable lies about the women who accused Justice Brett Kavanaugh of sexual assault, cooked up by partisans with bad-faith agendas. Every time major political events dominated the news cycle, Facebook was overrun by hoaxers and conspiracy theorists, who used it to sow discord, spin falsehoods and stir up tribal anger.

Facebook was generally responsive to these problems after they were publicly called out. But its scale means that even people who work there are often in the dark. Some days, while calling the company for comment on a new viral hoax I had found, I felt like a college RA telling the dean of students about shocking misbehavior inside a dorm he’d never visited. (“The freshmen are drinking what?”)

Other days, combing through Facebook falsehoods has felt like watching a nation poison itself in slow motion. A recent study by the Oxford Internet Institute, a department at the University of Oxford, found that 25 percent of all election-related content shared on Facebook and Twitter during the midterm election season could be classified as “junk news.” Other studies have hinted at progress in stemming the tide of misinformation, but the process is far from complete.

A Facebook spokesman, Tom Reynolds, said that the company had improved since 2016 but that there was “still more work to do.”

“Over the last two years, we’ve worked hard to prevent misuse of Facebook during elections,” Reynolds said. “Our teams worked round the clock during the midterms to reduce the spread of misinformation, thwart efforts to discourage people from voting and deal with issues of hate on our services.”

Facebook has framed its struggle as an “arms race” between itself and the bad actors trying to exploit its services. But that mischaracterizes the nature of the problem. This is not two sovereign countries locked in battle, or an intelligence agency trying to stop a nefarious foreign plot. This is a rich and successful corporation that built a giant machine to convert attention into advertising revenue, made billions of dollars by letting that machine run with limited oversight, and is now frantically trying to clean up the mess that has resulted.

As the votes were being tallied Tuesday, I talked to experts who have paid close attention to Facebook’s troubles over the past several years. Most agreed that Election Day itself had been a success, but the company still had plenty to worry about.

“I give them better marks for being on the case,” said Michael Posner, a professor of ethics and finance at New York University’s Stern School of Business. “But it’s yet to be seen how effective it’s going to be. There’s an awful lot of disinformation still out there.”

“On the surface, for Facebook in particular, it’s better because some of the worst content is getting taken down,” said Jonathan Albright, research director at the Tow Center for Digital Journalism at Columbia University. Albright, who has found networks of Russian trolls operating on Facebook in the past, has written in recent days that some of the company’s features — in particular, Facebook groups that are used to spread misinformation — are still prone to exploitation.

“For blatantly false news, they’re not even close to getting ahead of it,” Albright said. “They’re barely keeping up.”

Jennifer Grygiel, an assistant professor at Syracuse University who studies social media, said that Facebook’s pattern of relying on outside researchers and journalists to dig up misinformation and abuse is worrying.

“It’s a bad sign that the war rooms, especially Facebook’s war room, didn’t have this information first,” Grygiel said.

It’s worth asking, over the long term, why a single American company is in the position of protecting free and fair elections all over the world. But that is the case now, and we now know that Facebook’s action or inaction can spell the difference between elections going smoothly and democracies straining under a siege of misinformation and propaganda.

To Facebook’s credit, it has become more responsive in recent months, including cracking down on domestic disinformation networks, banning particularly bad actors such as Alex Jones of Infowars, and hiring more people to deal with emerging threats.

But Facebook would not have done this on its own. It took sustained pressure from lawmakers, regulators, researchers, journalists, employees, investors and users to force the company to pay more attention to misinformation and threats of election interference.

Facebook has shown, time and again, that it behaves responsibly only when placed under a well-lit microscope. So as our collective attention fades from the midterms, it seems certain that outsiders will need to continue to hold the company accountable, and push it to do more to safeguard its users — in every country, during every election season — from a flood of lies and manipulation.

Kevin Roose is a New York Times writer.