Facebook is keeping a close eye on misinformation in the lead-up to 2018's elections. Which elections, exactly? All of them, according to the team working within the company to combat fake news. That means Turkey in June, Mexico in July, Rwanda in September, Brazil in October, and the US in November, to list just a few. It's a lot to keep track of, even—or perhaps especially—for a company as large, influential, and scrutinized as Facebook.

Which is why the company wants help. Last month, Facebook, together with the non-profit Social Science Research Council, announced an initiative that will connect independent researchers with Facebook's vast and, until now, largely inaccessible troves of data on human behavior. The goal: investigate social media's impact on elections and democracy.

The initiative is significant for many reasons, but here's the big one: It will, for the first time, enable researchers to not only access Facebook's data, but publish findings from that data without pre-approval from Facebook. That means if scientists uncover something in the social network's data that makes it look bad, Facebook won't be able to prevent them from making that information public.

At the time it was announced, Facebook and the SSRC provided few details about the then-unnamed initiative. Nearly two months on, the endeavor still has no official name, but some details are beginning to emerge—including how the initiative will protect Facebook users' data from the kind of misuse that landed Mark Zuckerberg in congressional hearings last month.

"Facebook is going to provide encrypted laptops," says political scientist Gary King, director of the Institute for Quantitative Social Science at Harvard University. He calls the laptops virtual clean rooms. They'll provide researchers remote access to Facebook's infrastructure while recording every click and keystroke. "It's not a laptop you'd ever use to send personal messages. That's not its purpose. Its purpose is to provide a level of security similar to what you'd find in a locked room in Menlo Park. Its purpose is to avoid another Cambridge Analytica."

How will the auditing work? The details are TBD, says King, who, together with Stanford legal scholar Nathaniel Persily, developed the industry-academic partnership model that Facebook will use to share its data. But some analyses will happen in real time, via automated scripts. Others will be conducted on a post-hoc basis by experts trained to decipher log files—the record of activity on each laptop, including what information was requested, who requested it, and what they did with it.

That last bit is crucial: King says researchers will only be allowed to access data relevant to their proposed research; the hypotheses they are testing and the data they'll need to test them will be agreed upon ahead of time. Unlike the data Cambridge Analytica used, the data made available through the initiative will protect the privacy of individuals. No data will be stored on the laptop, and researchers will need permission before removing any data (say, for publication purposes) from the device. If a team of scientists wants to investigate a different question or access other data, they'll need to submit a separate proposal for consideration.

While Facebook will provide the laptops, King says the company will not monitor how researchers use the hardware. Instead, auditing will be overseen by a commission of independent experts, recruited not by Facebook but by King and Persily. "Facebook may provide some systems operators, but their purpose won't be to spy on us," King says. "It will be to help us ensure everything's running properly."