What is the price of trust? Or, to put the question another way, what is the cost of becoming known for doing things that don’t match your words?

To put a number on that, Tony Simons, a management professor at Cornell, took an unusually detailed look inside the hotel industry. In 1999, Simons and a fellow-researcher, Judi McLean Parks, interviewed employees at seventy-six Holiday Inn hotels in the United States and Canada. They surveyed more than seven thousand staff members, in English, Spanish, Chinese, Creole French, and Vietnamese—including five hundred illiterate employees, who were surveyed orally. They asked workers to score, on a scale of one to five, statements such as “My manager practices what he preaches.” Those results allowed them to score each hotel on what Simons called “behavioral integrity.”

Then Simons and his team compared those results with the financial records of each Holiday Inn. “It turned out that trust had a huge impact on performance,” Simons told me this week. Hotels with higher behavioral integrity scores were substantially more profitable than those with lower scores: an advantage of an eighth of a point correlated to a 2.5 per cent advantage in revenues. Simons, who chronicled the hotel study and others in his 2008 book, “The Integrity Dividend,” told me, “This is the single most powerful performance driver ever. It’s more important than employee commitment and worker satisfaction.”
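The arithmetic behind that finding can be made explicit. A minimal sketch, assuming the relationship is linear (an assumption the study's summary here does not state; the constant and function name are illustrative, not from Simons's data):

```python
# An eighth of a point (0.125) on the five-point behavioral-integrity scale
# corresponded to a 2.5% revenue advantage. Under a linearity assumption,
# that implies roughly a 20% advantage per full point.
REVENUE_ADVANTAGE_PER_POINT = 0.025 / 0.125  # = 0.20

def implied_revenue_advantage(score_gap: float) -> float:
    """Fractional revenue advantage implied by a given gap in
    behavioral-integrity score, assuming linearity (hypothetical)."""
    return score_gap * REVENUE_ADVANTAGE_PER_POINT

print(f"{implied_revenue_advantage(0.125):.1%}")  # the gap reported in the study: 2.5%
```

The extrapolation beyond the eighth-of-a-point gap actually measured is only a sketch; correlations like this need not hold linearly across the whole scale.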

I called Simons on Wednesday, as Facebook, a company whose fortune rests, more than most, on maintaining its users’ trust, encountered its latest round of difficulties. That morning, the attorney general for the District of Columbia filed a lawsuit against Facebook for allowing Cambridge Analytica, the British political-consulting company with ties to President Trump, to gain access to the personal data of tens of millions of users without their permission. It was the first major step by American regulators to punish the company for the Cambridge Analytica case, and other suits, fines, and punishments are likely to follow.

For Facebook, it had already been a long week. On Monday, the Senate Intelligence Committee released two reports—one produced by New Knowledge, a private security company, and the other by Oxford’s Computational Propaganda Project—that offered the most extensive look yet at Russia’s efforts to divide Americans, suppress votes, and boost Donald Trump during the 2016 Presidential election. The reports found that one of the most extensive campaigns was directed at discouraging African-Americans from voting for Hillary Clinton, by stoking distrust in her and in the political system. In a damaging detail, the New Knowledge report asserted that when Congress asked Facebook for information about that campaign, in October, the company “dissembled” in its responses. (The company was asked, “Does Facebook believe that any of the content created by the Russian Internet Research Agency was designed to discourage anyone from voting?” In its response, Facebook said, “We believe this is an assessment that can be made only by investigators with access to classified intelligence and information from all relevant companies and industries.”)

After that news broke, the N.A.A.C.P. called for a one-day boycott of Facebook. In a statement, Derrick Johnson, the president and chief executive officer of the N.A.A.C.P., said, “Facebook needs to acknowledge how it has undervalued people of color over the past two years.”

In Facebook’s telling, it grew so fast that it failed to protect its customers, but now that it is a behemoth, with a greater awareness of its responsibility, the company is doing all it can to mend those holes. Last spring, after Facebook was found to have allowed its users’ private data to get into the hands of Cambridge Analytica, the Facebook chairman and C.E.O., Mark Zuckerberg, visited Congress and vowed to restore control to ordinary users. “This is the most important principle for Facebook,” he said. “Every piece of content that you share on Facebook you own, and you have complete control over who sees it, and how you share it, and you can remove it at any time.” When Sheryl Sandberg, the chief operating officer, testified before Congress, in September, she said the company was now doing all that it could to prevent foreign governments from interfering in American elections: “We were too slow to spot this and too slow to act. That’s on us.”

The clarity of those declarations becomes a problem when Facebook’s actions tell another story. On Tuesday, an investigation by the Times revealed that Facebook gave other big tech companies “more intrusive access to users’ personal data than it has disclosed.” For example, Facebook gave Netflix and Spotify the power to read Facebook users’ private messages. The goal was to let Facebook users share music and entertainment recommendations; there is no evidence that Spotify and Netflix were reading people’s mail. (Both companies said that they were unaware of the access levels that Facebook had granted them.)

But the case reflects a fundamental problem: Facebook was so determined to grow, and to cement the commercial partnerships that would help it grow, that it didn’t pause to build tools that could parcel out narrow slices of information. In other instances, it opened the spigot on private information and forgot to close it. As a result, it once again betrayed its users’ confidence. On Wednesday, a former Facebook staffer who is familiar with those cases told me, “It was chaos. There was no set of principles that says this is how we do things and how we do not do things. They would forget to shut things down, literally for years.”

In a response to the Times story, Konstantinos Papamiltiadis, Facebook’s director of developer platforms and programs, conceded that “we’ve needed tighter management” of data sharing but stood by the company’s claim that “none of these partnerships or features gave companies access to information without people’s permission.” After two years of declining public confidence, that’s an astonishingly obtuse thing to say. Users did have to check a box to integrate Facebook and Spotify. But does Facebook really believe that users understood that it would give Spotify the right to read private messages? On Twitter, Senator Ed Markey, a Massachusetts Democrat, wrote, “Opening someone else’s mail is a federal crime. Why is @Facebook allowed to let Netflix and Spotify open your private messages? Mark Zuckerberg might think of this as just ‘data,’ but this is people’s private lives. We need a law to protect Americans’ sensitive information.”

Which brings us back to Holiday Inn and the cost of losing trust. How many violations does it take for people to think twice before they decide to do business with someone? Simons told me, “Trust is the willingness to accept vulnerability. In a personal relationship, it is the willingness to self-disclose and be honest. For Facebook, it is the very willingness of the informed to participate in their platform.”

The trust calculus can be subtle, but, over time, it shapes behavior. In September, the Pew Research Center reported that forty-two per cent of Americans who use the Web site said that they had at some point taken a break of at least several weeks from Facebook; twenty-six per cent said that they had deleted it from their phones. Since then, Facebook has suffered the largest data breach in its history, which affected an estimated fifty million users. Last month, the Times revealed that Facebook asked a political opposition-research firm to investigate and undercut its critics, in particular George Soros, a frequent target of anti-Semitic attacks.

The company’s reach remains incomparably vast, with at least 2.2 billion users around the world, and most of its recent growth has been in foreign countries, where people aren’t following every tidbit of scandal in the D.C. courts or the Times. But the damage to its reputation among influential users is unmistakable. This week, Walter Mossberg, a pioneering tech journalist, announced that he was giving up Facebook because, as he wrote (on Facebook), “my own values and the policies and actions of Facebook have diverged to the point where I’m no longer comfortable here.” On Tuesday, it was Cher’s turn, in all caps on her favored platform, Twitter: “GETTING RID OF FACEBOOK ACCOUNT I DIDNT KNOW I HAD.”

“Trust is slow to build and quick to be broken,” Simons told me. “A long-built reputation for honesty can be broken by a single lie, and a single betrayal can dash decades’ worth of trust. It will take a lot for Facebook to regain broken trust.”