For years, Facebook's company mantra was "Move fast and break things," although few realized that the thing they were trying to break was democracy itself. The past few years have shown that Facebook is a petri dish for cultural disease, the Typhoid Mary of memes.

First there was the election hacking, in which foreign actors sought to foster division in communities by exploiting wedge issues. Then came the fake news, through which hucksters threw any old lie into the cultural milieu to profit on the ad sales, wearing down people's skepticism and convincing them that some pretty insane stuff was happening when it clearly wasn't.

Now, thanks to the tireless, years-long work of journalists like Carole Cadwalladr, Facebook has finally admitted that its user data has been weaponized. Specifically, a trove of profiles was sold to Cambridge Analytica (CA), a branch of the SCL Group and a private "strategic communications" firm, which is a euphemistic way to describe a private propaganda agency backed by a cadre of right-wing billionaires.

Facebook's role in this was to provide the platform and, wittingly or not, allow these organizations to run riot. It didn't sanction the sale of the data to CA, but it has known about the sale since the end of 2015. In his NYT interview, Zuckerberg said that Facebook demanded that CA "legally certify that they didn't have the data, and weren't using it." Except that, as he added shortly after, the "formal and legal certification" CA provided was apparently false.

Zuckerberg also said that he didn't expect to become the world's arbiter of free speech, and he wants the policies to be shaped by the community. Unfortunately, there is no one-size-fits-all model that will apply globally, because every culture is different. Take blasphemy: While in the US you would be unlikely to face punishment for taking the Lord's name in vain, it's a different situation in other countries.

That is, perhaps, the problem with building a platform that you want to be universal and then forgetting that other people don't necessarily share your values. Zuckerberg says that the Facebook project of "building a community for people all over the world" to "connect across boundaries" is unprecedented. The fallout from those connections, or even just from sharing the information globally, can be problematic in the extreme. In Myanmar, for instance, the company stands accused of condoning the massacre of Rohingya Muslims because it censored posts from a resistance group.

Zuckerberg's response was almost offensively anodyne, saying that he and his team are seeing "new challenges that [he] didn't think anyone had anticipated before." He added that he didn't believe it was possible to "know every issue that you're going to face down the road." And that the company has a "real responsibility to take all these issues seriously as they come up" and "make sure we solve them."

Sadly, the time for such naïveté has long since passed, especially as Facebook can now see where its action, or inaction, has led. It is not wholly responsible, sure, but it has been at least partially complicit in the current turmoil in the US, the fracturing of the European Union and plenty of violence. Not to mention that the problems Zuckerberg says were impossible to predict were ... fairly easy to predict.

It's hard not to feel a little sympathy for Zuckerberg as the techno-utopia he created begins to twist and crack under its own weight. If his plan was to build an agora for the modern world, a town square where the millennial versions of Plato and Socrates could teach and debate, then he must be disappointed. After all, in their place, Facebook has instead helped spread the gospel of modern-day intellectual flyweights.

Not to mention all of the people raking him over the coals for not responding sooner, or the senators demanding that he show up and testify before Congress. It is a little unfair to demand that the CEO of a company that employs 25,000 people be aware of the minutiae of every aspect of its business. On the other hand, if Zuckerberg isn't spending every waking hour of his day trying to solve this crisis, then something is very wrong.

In his CNN interview, Zuckerberg said that he welcomes regulation, so long as it's the "right kind" of regulation. He declined to explain precisely what that would entail, but said he'd love to see tighter rules around online "ad transparency." Online ads are, after all, a wilderness compared with the worlds of print and TV, where there are obligations to disclose where the money has come from.

It's just a shame, you know? That Facebook, a company that has never sold ads and doesn't have a dominant position in the online media world, is powerless in this context. If it were a massive ad platform that had a huge chunk of the ad market, then it could have tightened its own standards years ago, before regulators stepped in. In fact, that would demonstrate a level of leadership that, in turn, would demonstrate a real commitment to change. Wouldn't that be cool?

Perhaps it was unfair to castigate Zuckerberg for not speaking out on this issue sooner, as if rushing to judgment is somehow an admirable quality. It is right, and something that should be lauded, that people don't simply open their yaps and begin speaking as soon as they're asked a question. People should never be bullied for wanting to know something before sharing their opinion on it.

The problem for Zuckerberg, of course, is that journalists first informed Facebook about the Cambridge Analytica leak at the tail end of 2015. It's tough to argue that he didn't have enough time to prepare a response.