The Mark Zuckerberg who arrived at the Hart Senate Office Building on Tuesday afternoon very much resembled an adult. His hair was cut short. His dark-blue suit fit well. His presentation was polished, without glibness or arrogance. He said, “Yes, Senator” and “No, Senator,” with the slightly bemused deference of a curator giving a museum tour. At thirty-three, his android-smooth forehead still appeared unmarked by age or adversity, but there were no flip-flops or pajamas this time; a small loosening of Zuckerberg’s Facebook-blue tie knot was his only variation from the Capitol’s accepted forms.

Despite this evident preparation, Zuckerberg did not have much in the way of answers. Two Senate committees wanted to know how Zuckerberg had allowed the data of tens of millions of Americans to be misused, a lapse that went unaddressed for years and that may have tilted the outcome of a Presidential election. They wanted to know how Zuckerberg squared his claim that Facebook’s user data belongs to the user with the methods his company uses to extract forty billion dollars each year from that trove. Zuckerberg exercises an unprecedented degree of control over the personal information of Facebook’s two billion users, and senators wanted to know whether those users were in good hands.

“Why should we let you self-regulate?” Senator Lindsey Graham demanded. A few moments before, he’d asked, “You don’t think you have a monopoly?” Zuckerberg parried, “It certainly doesn’t feel like that to me.” Not for the first time, he said that he was personally sorry. He repeatedly referred to Facebook’s beginnings in his Harvard dorm room, as though the company’s modest origins could justify anything that might have gone catastrophically wrong in the meantime.

Asked repeatedly about the eighty-seven million people affected by the Cambridge Analytica data grab, Zuckerberg said that he himself did not know where those users were located. He said that Facebook could not yet verify the identity of firms running political ads. “We’re working on that now,” he said. He dismissed the notion that Facebook accesses users’ phone audio as “a conspiracy theory” before clarifying that it was not being used “for ads.” He refused to say whether Facebook tracks individual devices without users’ consent. He wouldn’t say whether all users, or even children, should have to actively consent, or “opt in,” before their data is shared with outside firms. Nor would he say whether Facebook tracks users’ online activity after they log off.

Some of the sharpest questioning came from Senator Richard Blumenthal, of Connecticut, regarding a consent decree Facebook reached with the Federal Trade Commission for deceiving users who believed their information would be kept private. Blumenthal said that the Cambridge Analytica debacle violated the 2011 agreement. By some calculations, that would make Facebook liable for trillions of dollars in penalties. Zuckerberg claimed that he had complied with the F.T.C. agreement. “It certainly appears that we should have been aware of that app developer,” he allowed, referring to Global Science Research, or GSR, the subcontractor that harvested the private friends-only data and turned it over to Cambridge Analytica. GSR’s “app” was a quiz that paid unwitting users for access to their friends’ data. One of GSR’s owners, Joseph Chancellor, now works at Facebook. Senator Blumenthal asked if Zuckerberg had disciplined anyone at Facebook for approving GSR’s access to Facebook’s user base. Zuckerberg said he had not.

Zuckerberg hedged his answers to many yes-or-no questions with “generally” or “my team will get back to you” or concerns about working out “the details.” He repeatedly fell back on Facebook’s utopian rhetoric of “community” and “connecting.” He said that the problem of social media being used to help fuel attacks on the Rohingya people in Myanmar would be solved by thousands of new multilingual moderators, assisted by artificial intelligence. He was “optimistic” that A.I. could be doing this work within five or ten years.

Zuckerberg’s failure to master the details of the Cambridge Analytica scandal became fully apparent only for a moment, when Senator Patrick Leahy, of Vermont, asked why he hadn’t banned the company from Facebook in 2015, immediately after the first reports of the data misuse. Zuckerberg said that he had done so, only to return after a recess and correct his testimony. In fact, Cambridge Analytica had continued to work closely with the Trump campaign through the election. So had employees from Facebook’s own election-sales team. “Twenty-seven months,” Senator Kamala Harris, of California, said, emphasizing how long it took for Facebook to notify its own users.

Again and again, Zuckerberg referred to “a broader responsibility”—one that Facebook had failed to recognize, but one that it was now ready to assume. The company would do more than build “tools” for users; it would make sure those tools were used for “good.” Facebook’s responsibility is indeed broad. Zuckerberg controls sixty per cent of Facebook’s voting shares; his monopoly over the daily flows of human consciousness now rivals John D. Rockefeller’s control of the oil markets. The scope of this responsibility is especially striking when compared to Facebook’s admitted failure to protect personal data. Facebook has had plenty of practice at briskly apologizing, even as it explains why it is entitled to second, third, and fourth chances. Congress may give Zuckerberg yet another pass. Or, if the senators finally move beyond posturing, they may decide it is time to write some new laws that protect users.