In recent years digital platforms have made independent scientific research into potentially consequential phenomena such as online disinformation, polarization, and echo chambers virtually impossible by restricting scholars’ access to the platforms’ application programming interfaces (APIs). The Social Science One [https://socialscience.one] initiative, specifically designed to provide scholars with access to privacy-protected data, has made important progress over the last 18 months, but Facebook has still not provided academics with anything approaching adequate data access.

In particular, Social Science One’s work with Facebook has been continuously delayed; more than a year and a half into the venture, Facebook has yet to release the full set of URLs data it promised. Instead, scholars have received a ‘URL light data set’ of extremely limited scientific value. Some researchers have also been provided access to CrowdTangle, a platform for investigating posts from public pages, which was previously available primarily to journalists. This new access is appreciated. However, CrowdTangle also has significant limitations. What is more, the anticipated full URL dataset was intended to be just one of many datasets to come. Under the current circumstances, there is good reason to doubt whether other useful data will be forthcoming.

Because Facebook has not been able to provide even the initial, aggregated dataset of URLs shared on the platform, Social Science One’s philanthropic funders have begun to withdraw.

As members of the European Advisory Committee of Social Science One we – along with the co-chairs – are frustrated. On the one hand, we were genuinely interested in helping to build a model to support academic research, and we appreciate the efforts that the specific data science teams within Facebook have made to this end. On the other hand, the endless delays and barriers from both within and beyond the company lead us to doubt whether substantial progress can be made, at least under the current model.

The current situation is untenable. Heated public and political discussions are being waged over the role and responsibilities of platforms in today’s societies, and yet researchers cannot make fully informed contributions to these discussions. We are largely left in the dark, lacking appropriate data to assess potential risks and benefits. This is not an acceptable situation for scientific knowledge. It is not an acceptable situation for our societies.

As the European Advisory Committee and co-chairs of Social Science One, we call upon platforms and public authorities to take the following actions:

Facebook should commit at the highest level of the organization, and with all necessary means, to make accurate and representative data available for scientific research into the most pressing issues of public concern. This should include the originally promised URLs dataset, but it should also include access to Facebook’s APIs. Facebook reviews and approves business access to its APIs. It should make them available for scientific inquiry as well. Delays in these matters can no longer be tolerated.

The same level of commitment should be required of other digital platforms. Facebook has taken a large part of the blame for the current situation, but deserves a great deal of credit for being the first company to explore a significant platform-academic partnership model.

The major digital platforms—Facebook, Google, and Twitter—should offer formal, written analyses of any legal barriers they claim prevent them from providing access for academic research, including with regard to the European Union’s General Data Protection Regulation (GDPR).

European and member state-level authorities should provide official and actionable guidance on what data can and cannot be shared for research, especially with regard to the GDPR.

We recognize the responsibility of researchers to receive and analyze any data ethically. All appropriate steps should be taken to preserve platform users’ privacy and other digital rights. With this in mind, we call for support from both the platforms and public officials to create so-called research safe harbors, i.e., spaces within which scholars could directly access and analyze sensitive personally identifiable data. Modeled on certain research using public administration, health, and medical data, such safe harbors would place clear and robust limits on the type and amount of data researchers could access, as well as the means of analysis researchers could undertake.

Public authorities should assist in developing independent verification of platform data. Because researchers are beholden to the platforms for virtually all data, their work risks appearing compromised or otherwise suspect to the public. Even if the researchers and their analyses are considered credible, all findings rest on trust that the platforms have provided complete, accurate data. Data verification by independent, third-party auditors is essential to generating confidence in research into digital platforms’ impacts across Europe.

We call upon platforms and public authorities to act quickly, completing each of the above actions within the first half of 2020. The consequences of failure—again, for both scientific knowledge specifically, and our democratic societies more generally—are too dire.

December 11, 2019

The European Advisory Committee of Social Science One

Claes de Vreese, U Amsterdam (Chair)

Marco Bastos, UCL

Frank Esser, U Zurich

Fabio Giglietto, U Urbino

Sophie Lecheler, U Vienna

Barbara Pfetsch, FU Berlin

Cornelius Puschmann, U Bremen

Rebekah Tromble, George Washington U

&

Social Science One co-chairs

Gary King, Harvard U

Nathaniel Persily, Stanford U