A fake video of House Speaker Nancy Pelosi, slowed down to make her appear drunk, went viral on Facebook this week. It was viewed more than 2.5 million times on the platform, with tens of thousands of shares and thousands of comments.

But as far as Facebook is concerned, the video does not violate any community guidelines, a position it has defended even as backlash to the video grew.

Initially, the social media giant viewed the video as a form of “self-expression” and therefore allowed it to remain on the platform. A Facebook spokesperson said on Friday that it had downgraded the video to limit the number of users who viewed it and said it would “attach a link to a third-party fact-checking site pointing out that the clip is misleading.”

“Just because something is allowed to be on Facebook doesn’t mean it should get distribution,” the company said. “In other words, we allow people to post it as a form of expression, but we’re not going to show it at the top of News Feed.” On Saturday a spokesperson added that the video had indeed been marked as misleading, which meant that anyone who saw or shared the video would be alerted that it was false.


But the fact that Facebook could simultaneously acknowledge the video was fake news while also allowing it to remain on the platform severely irked some Democratic lawmakers, especially since YouTube had deleted the video and its multiple duplicates. Monika Bickert, a Facebook vice president for product policy and counterterrorism, attempted to explain the decision to Anderson Cooper on Friday night.

“We think it’s important for people to make their own informed choice for what to believe,” she said. “Our job is to make sure we are getting them accurate information.”

But when Cooper asked how Facebook could simultaneously wash its hands of editorial decisions like this one while making money off sharing news content, Bickert fell back on a tried and trusted Facebook defense. “We aren’t in the news business,” she said. “We’re in the social media business.”

Bickert’s response to the Pelosi video controversy echoes CEO and founder Mark Zuckerberg’s long and infuriating insistence that Facebook isn’t a media company and therefore isn’t responsible for the content on its site — despite more than forty percent of Americans getting their news from Facebook.

“I consider us to be a technology company,” Zuckerberg said when testifying on Capitol Hill last April. “The primary thing that we do is have engineers who write code and build products and services for other people.”


The social media giant has made some sweeping changes in reaction to a series of scandals over the past few years, notably the revelation that Russia used the site to sow misinformation during the 2016 U.S. presidential election. Following the Christchurch mass shootings in March, during which a far-right extremist used Facebook to livestream his attacks on two mosques, Facebook instituted more changes, banning content related to white nationalism and white separatism while also tightening live-streaming rules.

But when content is less immediately harmful but still malicious, as is the case with the Pelosi video, Facebook’s willingness to moderate its platform or enforce its community standards diminishes significantly.

The examples here are numerous. Facebook allowed far-right radio host Alex Jones’ content to fester for years before it finally banned the conspiracy theorist last August, following the lead of Spotify and Apple. Facebook’s fact-checking program has also proven problematic, with conservative fact-checkers labeling even factually accurate articles from left-leaning outlets as “false.” (Full disclosure: ThinkProgress was among the outlets subjected to conservative “fact-checkers” last fall.)

According to the U.N., Facebook was also painfully slow to respond to the Rohingya Muslim crisis in Myanmar, allowing hate speech against the minority to go unchecked in the run-up to the genocide against them carried out by the Myanmar military.

This story has been updated with additional information.