What did Facebook’s management know, and when did they know it?

The answer may come in response to a shareholder demand, filed by my organization, Free Speech For People, to inspect the company’s books and records related to the scandal in which Cambridge Analytica improperly received the data of 50 million Facebook users while it was working as a vendor to support the Trump campaign in the 2016 presidential election. The demand was submitted under a provision of Delaware corporate law that authorizes any shareholder to seek to inspect corporate records.


Even if Facebook’s management refuses to admit it, Facebook’s shareholders need to come to terms quickly with the existential threat that the Cambridge Analytica crisis poses to the company’s business model. Facebook’s credibility has already sustained major blows over its shifting narrative and slow response to the news that Russians used Facebook to try to interfere in the 2016 presidential election. Now, the revelations that user data was improperly shared with third parties, including Cambridge Analytica, and that Facebook failed to notify users when it discovered the breach, have raised concerns about systemic mismanagement.

One powerful tool investors have for understanding what management knew, when management knew it, and whether management’s responses have been as egregiously inadequate as they appear, is the shareholder right to inspect books and records to investigate mismanagement and wrongdoing.

Under Delaware law, shareholders may seek to inspect corporate records to investigate matters “reasonably related to their interest as stockholders.” In this case, with loss in market value at $90 billion and counting, and potential liability from legal actions, Facebook's reputational harm is likely to be far more damaging than the Equifax breach, the Wells Fargo scandals, or even the fallout from Volkswagen’s emissions cheating. Facebook’s reluctance to respond proactively leaves investors no choice but to conduct their own diligence to understand what went wrong and how to prevent it from damaging the company any further.

Cultivating the trust of users is critical to Facebook’s business model. Despite the company’s attempt to assuage fears by reiterating that “protecting people’s information is at the heart of everything we do,” the crisis continues to spiral out of control. Facebook discovered the improper sharing of user data as early as 2015, yet its management appears to have done little to address the issue. Worse, it seems to have focused on hiding or minimizing the scope of the problem, rather than notifying users or shareholders and taking action to protect them.

Though CEO Mark Zuckerberg finally broke his deafening silence on the issue, his statements raise more questions about the effectiveness of the corporate governance systems in place at Facebook and whether they are functioning.


First, the fact that Facebook took steps to prevent such wide-ranging data collection by third-party apps by 2015 prompts questions about what triggered that change and whether the reasons for it should have been disclosed to investors.

Second, the fact that it took Zuckerberg and Facebook over a week to come up with a public plan of action, when they have known this could pose major risks to the company for at least three years and have been subject to a consent decree to prevent such risks for even longer, should shake investor confidence in the company’s internal processes for managing risk. Indeed, at least two sets of investors have filed resolutions to be considered at Facebook’s annual meeting this year that relate to better risk management practices and the misuse of the platform.

Third, this is exactly the kind of risk that Facebook has highlighted and described in detail in its financial disclosures, and yet, it appears Facebook’s audit committee either wasn’t notified of the Cambridge Analytica issue when the company found out in 2015, or the committee somehow concluded that this data mining did not rise to the level of a “material risk.”

Share prices will continue to fall as governments announce plans for investigations and new regulations and as users and investors begin to file lawsuits against the company. When a company’s senior officers and board demonstrate an inability to manage a crisis that strikes at the core of the business, investors can and must call the board to account.

The future of Facebook hangs in the balance, and shareholders may have to force management to face its day of reckoning.

Shanna Cleveland is senior counsel at Free Speech For People. In March, Cleveland filed a books and records request on behalf of Facebook shareholders.