Amid growing pressure to remove bad actors from Facebook, CEO Mark Zuckerberg said on Wednesday that the company would likely release more information about problematic content posted to the service during elections. But to ensure the accuracy of the data, Zuckerberg said, the reports will likely come after the elections are over. The move could help government officials, academic researchers, and concerned citizens understand whether Facebook’s increased attention to abuse is working — but the timing could make it harder to grasp what’s happening when it arguably matters most.

During a conference call with reporters on Wednesday, Zuckerberg took questions on a range of subjects surrounding the Cambridge Analytica data privacy scandal and its aftermath. He pointed to recent steps Facebook has taken to protect the integrity of upcoming elections, including in the United States and Mexico. The Verge asked Zuckerberg how Facebook would evaluate the effectiveness of those changes, and how the company would communicate whether they were working, both in the run-up to the elections and in their aftermath.

“One of the big things we’re working on now is an effort to be able to share the prevalence of different types of bad content.”

“One of the big things we’re working on now is an effort to be able to share the prevalence of different types of bad content,” Zuckerberg said. Currently, he said, people only know when bad content is removed if they personally report it or if journalists write about it. In the future, Zuckerberg said, Facebook should share “the prevalence” of different kinds of bad posts: fake news, hate speech, bullying, and terrorism-related content.

But Facebook likely would not do that in real time, he said. “The most important thing there is to make sure the numbers we put out are accurate,” Zuckerberg said. “We wouldn’t be doing anyone a favor by putting out numbers and coming back a quarter later and saying, ‘Hey, we messed this up.’” Done right, such reports would “inform public debate and build trust.”

Arguably, public debate would be enhanced even further if voters had a sense of how fake news, hate speech, and other bad posts were shaping the narrative during the campaign. But Zuckerberg seemed to resist the idea of real-time reporting. “The calculation internally is it’s much better to take a little longer and make sure we’re accurate,” he told The Verge. “I think that’s going to be the way we end up being held accountable.”

He added that Facebook’s reports could become a standard followed by other social platforms. “My hope over time is the playbooks and scorecards we put out are also followed by other internet platforms,” he said. “That way, there could be a standard measure across the industry about how to measure the important issues.”