Facebook’s annus horribilis continues to exceed the expectations of even its harshest critics, with a withering New York Times exposé of management blunders just the latest installment in Facebook’s troubles. As overheated speculation continues as to whether this is a premature decline for the company or simply temporary growing pains, there remains one inescapable fact: Facebook is faced with a problem of not just optics, but of innovation.

Anyone who worked in newspaper publishing over the past twenty years will at some point have found themselves metaphorically beaten about the head with the 1997 book The Innovator’s Dilemma: When New Technologies Cause Great Firms To Fail, by Harvard Business School professor Clayton Christensen. The book’s premise, wrapped up in plenty of talk about value creation and S curves, is that companies with big businesses cannot change the basic, successful core functions of those businesses quickly enough to innovate against their coming obsolescence. Newspapers, too, were becoming obsolete, and the grand publishing houses that produced them would not be able to adapt at the speed demanded by new technologies such as the internet and social media.


Now, the innovations that upended creaking legacy media businesses are facing the same dilemma of how to set aside growth and re-engineer for long-term survival. To do that, they’ll need to insert safety and civic values into the core of a culturally insensitive product. For ten years the only innovations Facebook was prepared to tolerate were innovations that directly contributed to the company’s success, combining scale and speed with commercial targeting. Its looming challenges lurked beneath diminishing phrases like users and content; calling supposedly neutral content propaganda, threats, or abuse would force the company to organize in a totally different way. If they had paid more attention to how their own developing role as a publisher would reshape their business—as well as the newspaper business—rather than stubbornly insisting they bore no responsibility for cultural and political disruption, they would not be in this catastrophic mess. To use Silicon Valley parlance, you could say that they lacked vision about what they were becoming, and now do not have adequate skills to innovate their way into that space.

Nowhere was Facebook’s lack of vision more evident than in the startling details of The New York Times’s investigation into the company last week. The article laid out the levels of denial and obfuscation around the nature of the civic and political problems which beset the heart of the company’s products. Russian propaganda masquerading as American election chit-chat flowed freely through the unbelievably simple self-serve advertising platform, yet there were no guard rails against this stunningly obvious scenario. Safety and security experts warned the board about the attendant risks, but the high-powered executives at the company could not anticipate what others could plainly see.

Just as print companies in the 1990s failed to prioritize software development or creative technology, Facebook and others have failed to prioritize the forces of change that threaten the social economy. Experts in the political environment and publishing risk were absent from the upper echelons of the company, and critical functions, like moderation, were outsourced to low-paid offshore units.


As we heard from the Times investigation, Facebook hired a lobbying firm in the past year to stave off criticism from all directions, although there appeared to be no oversight of its activities by the most senior executives in the company. In cultural and political organizations, optics aren’t adjacent to the main business; they are the business. But because Facebook still sees itself as an engineering company, its internal organization is not attuned to the daily risks of publishing. The crisis at Facebook resides in its backward-looking attitude to technology: seeing it as a neutral vector for commerce rather than a potent cultural intervention.

Although a decade is a short period in an industrial cycle, it is a generation for cultural changes, so it is not surprising that Facebook did not see the necessity for transformation until it was too late. The question is whether this lumbering company, beset by out-of-date business school thinking, can transform itself—whether it has the ability to shed its linear thinking, tear itself down, and rebuild to become more appropriate for the moment.


In two reports released by Facebook in the same week as the devastating assessment by the Times, there are signs that the company at least understands the imperative to change, even if it doesn’t have the necessary skills to do so. The first, a self-commissioned report, examined the disastrous episode of ethnic cleansing of Rohingya Muslims in Myanmar, for which Facebook’s platforms appear at least in part to be culpable. Its recommendations for remedy were, in retrospect, basics for any publishing company: Hire people who speak the language and understand the culture, don’t allow just anything to be published into a volatile political environment, think about journalism and media literacy as an important part of what you do. No other publisher would enter a market without thinking through these propositions.

The second was a post from Facebook CEO Mark Zuckerberg himself on the issue of how the company would make decisions about what not to publish in the future, or “moderation,” as Facebook prefers to call it. “A Blueprint for Content Governance and Enforcement” is in some ways the most important update the company has published in recent memory. It shows Zuckerberg acknowledging how much and how quickly the company will have to change to meet its aspirations in this area. Marginal concerns will have to become central, the management of expression as important as product or engineering. For an old-fashioned software company, this is a radical step.

Zuckerberg and the board of Facebook always knew that they would be disrupted just as they were once disruptors. But they probably imagined it would come from graduates at the engineering department of Stanford, rather than from the heart of political and civic movements, which are urging a different kind of disruption altogether.



Emily Bell is the director of the Tow Center for Digital Journalism at Columbia Journalism School.