Facebook's reputation is under pressure from, among other things, a data leak affecting up to 87 million users, calculated foreign manipulation of its advertising platform, and reports out of Myanmar and Sri Lanka that its services were weaponized to fuel ethnic cleansing and genocide.

To address these scandals, Facebook has crafted a careful message: "We were too slow."

It's a smart message. Admit fault and apologize, but preserve the premise that you can be better and faster. Make sure everyone knows you have the tools and wherewithal to block manipulative foreign ads, take down hate speech and keep users' private information private; you just didn't get there quickly enough.

It's the company's cut-and-paste explanation in written answers to Congress, in media interviews and in official statements from CEO Mark Zuckerberg and Chief Operating Officer Sheryl Sandberg. In Facebook's recently released 748-page response to House members, the phrase "we were too slow" appears 47 times in repeated, identical passages. (Facebook declined to comment on its messaging.)

It's a particularly ironic message given that one of Facebook's internal mottos used to be "move fast and break things." And it's also potentially a premature conclusion.

It's possible that the world's leading social media company was too naive to recognize abuse of its services. Zuckerberg admitted in prepared testimony before Congress in mid-April, "Facebook is an idealistic and optimistic company. For most of our existence, we focused on all the good that connecting people can bring."

It was sometimes naive to a fault, Sandberg conceded to NPR, adding, "We did not think enough about the abuse cases."

Or, Facebook might have caught glimpses of wrongdoing but ignored it — the company only recently admitted it bears responsibility for the content that exists on its platform. Zuckerberg said in April that the company is "clearly" responsible for what's shared on its site, despite maintaining Facebook is not a media company.

"We do pay to help produce content," Zuckerberg said then. "We build enterprise software. We build planes to help connect people, but I don't consider ourselves to be an aerospace company."

Or, what might be the worst option: Facebook may not have had the technology or people to police its platform the way it increasingly needs to be policed, given its 2 billion members and growing influence.

It could also be a combination of the three. The jury's still out.

Several government agencies in the US and abroad have launched investigations to determine how much Facebook knew about its privacy leaks and when. The Washington Post reported Monday that the FBI and SEC had joined the Justice Department in an inquiry into Facebook, in addition to an earlier confirmed probe by the FTC.

The results of those inquiries will likely shed light on whether Facebook was naive or negligent. As for whether Facebook was or is equipped to police its platform, the company itself is starting to hint to the contrary.

Facebook announced Tuesday a talent deal for the artificial intelligence engineers of London-based Bloomsbury AI, as earlier reported by TechCrunch. It's notable that after months of touting its internal teams and expanding its fact-checking programs, Facebook is looking outside the company.

Notable and commendable. It's hard to fault Facebook for pursuing any strategy that might make its platform better. But it's hard to believe that acting too slowly was Facebook's only problem. And the company's senior executives should stop regurgitating messaging that suggests it was.