In April 2018, Facebook said it would share detailed data with researchers and academics that could be studied to investigate and curb the spread of disinformation on its platform.

The idea was to see what went wrong in the 2016 election and prevent election interference ahead of the 2020 campaign.

Eighteen months in, these researchers say they haven't been given nearly as much data as they were promised, and the shortfall is preventing them from doing their research, according to a new report from The New York Times.

For Facebook, the issue is sharing the necessary information while maintaining users' privacy.


Facebook's promise to share detailed data with researchers and academics, so they could study and flag disinformation on the site ahead of the 2020 campaign, appears to have fallen short, according to a new report from The New York Times.

In April 2018, the social networking giant outlined its plans to provide data to academics to "help people better understand the broader impact of social media on democracy — as well as improve our work to protect the integrity of elections."

But 18 months in, these researchers say they haven't been given nearly as much data as they were promised, and it's preventing them from doing their research.

According to The Times, seven nonprofit groups that are funding the research efforts have threatened to end their involvement because of the lack of data.

For Facebook, the issue is sharing the necessary information while maintaining users' privacy. "We're continuing to make additional data available while making sure to do so in a way that safeguards people's privacy," a spokesperson for Facebook told Business Insider. "This data has already begun to allow researchers to answer important questions about the role that social media plays in democracy and elections."

Facebook partnered with an independent research commission, known as Social Science One, to determine what information could be sent to researchers. Privacy experts that were brought in by Social Science One raised concerns about disclosing too much personal information, according to The Times. The result was that Facebook started to apply "differential privacy," which means that researchers can learn a lot about a group from data, but virtually nothing about an individual, The Times wrote.
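Differential privacy is typically achieved by adding calibrated random noise to aggregate query results, so that group-level statistics stay accurate while any single person's presence in the data is masked. The sketch below is a minimal, illustrative Laplace-mechanism example of that idea; the function names and data are hypothetical, and nothing here reflects Facebook's actual (non-public) implementation.

```python
import math
import random

def laplace_noise(scale, rng):
    # Draw one sample from a Laplace(0, scale) distribution
    # via inverse-CDF sampling on a uniform value in (-0.5, 0.5).
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon, rng=None):
    """Return a differentially private count of records matching predicate.

    A counting query has sensitivity 1 (adding or removing one person
    changes the true answer by at most 1), so Laplace noise with scale
    1/epsilon yields epsilon-differential privacy for this query.
    """
    rng = rng or random.Random()
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon, rng)
```

With a reasonably large group, the noisy count stays close to the true count, so researchers can still study population-level patterns; but the answer for any query never reveals with confidence whether one specific individual is in the data, which is the trade-off The Times describes.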

But some of the researchers say that Facebook's privacy concerns here have been overblown, and it's limiting their work.

Dipayan Ghosh, a former privacy and public policy adviser at Facebook and a fellow at the Shorenstein Center at Harvard, told The Times: "Silicon Valley has a moral obligation to do all it can to protect the American political process.

"We need researchers to have access to study what went wrong," he said, referring to disinformation campaigns that spread during the 2016 election campaign.

"At one level, it's difficult as there's a large amount of data and Facebook has concerns around privacy," Tom Glaisyer, chairman of the group of seven nonprofits supporting the research efforts, told The Times. "But frankly, our digital public square doesn't appear to be serving our democracy," he added.


In October 2017, Facebook admitted that 126 million Americans had likely seen Russian misinformation over a two-year period ending in August 2017.

Disinformation remains rife on the platform and is continuing to grow. Last week, research from the University of Oxford found that Facebook was the top global platform used by political parties and governments to spread fake news.