The first study came from three researchers at New York University and Stanford who looked at 570 sites known for spreading false stories between January 2015 and July 2018. They found that interaction with fake news sites rose on both Facebook and Twitter from 2015 until a few months after the 2016 presidential election. After the election, the data shows, interactions with fake stories on Facebook declined by more than half. On Twitter, however, interactions continued to rise.

That doesn't mean Facebook is off the hook yet. Study authors Hunt Allcott, Matthew Gentzkow and Chuan Yu write that "interaction with misinformation remains high, and that Facebook continues to play a particularly important role in its diffusion." Even with the drop, fake news interactions on Facebook still average about 70 million per month.

The University of Michigan's School of Information Center for Social Media Responsibility took a slightly different approach. It created what it calls the "iffy quotient," a metric that measures how much content from "iffy" sites is amplified on Facebook. The team compiled a set of websites that published misinformation and labeled them as iffy, in part to show that defining fake news is in and of itself a difficult challenge, like distinguishing satire or opinion from actual false narratives. Back in 2016, engagement on stories shared by iffy sites was twice as high on Facebook as on Twitter. Now it's 50 percent higher on Twitter.

Les Décodeurs, French newspaper Le Monde's fact-checking arm, surveyed 630 exclusively French-language sites and analyzed their activity between January 2015 and September 2018. It showed that in France, engagement with "unreliable or dubious sites" has dropped 50 percent.

While all three studies give Facebook some credit for tackling fake news in English and French, none delved into the problems facing countries like Myanmar, India and Sri Lanka, which are being flooded by false narratives, some of which have led to real-world violence. Facebook was notably slow to address fake news in Myanmar in particular, where racial tensions were amplified by false stories that pushed public opinion so sharply against the Rohingya Muslim minority that ethnic cleansing occurred and is still occurring. Facebook had only a handful of Burmese-language content reviewers until 2017, though this year it has upped that number to 60. Still, because Facebook is such an integral part of Burmese internet culture, moderating it will be a monumental task.

While there's still more for Facebook to do on this front around the world, the studies do show that concerted efforts have made a difference, at least in Western countries. The next course of action will be extending this model to all of Facebook's 2.23 billion users worldwide.