“In a lot of ways Facebook is more like a government than a traditional company,” Facebook CEO Mark Zuckerberg has said.

He elaborated on this claim in a recent interview with Vox’s Ezra Klein. After noting that the Facebook community consists of more than 2 billion people around the world, he wondered if executives “sitting in an office here in California” were the right people to be making decisions for a constituency of that size. He asked, “How can you set up a more democratic or community-oriented process that reflects the values of people around the world?”

It’s good to see Zuckerberg starting to grapple with Facebook’s political responsibilities. But Zuckerberg and Facebook face far bigger challenges than he acknowledges in the interview.

As the technology writer Zeynep Tufekci has argued, it may be nearly impossible for Facebook to reform itself, given its underlying business model. And even if Facebook can reform itself, it faces some extraordinary challenges in building trust with its users and regulators.

The best way to understand this is to start from Zuckerberg’s comparison of Facebook to a government. Facebook is so powerful in its own domain that it is, indeed, like a sovereign state. It can upend the business models of companies that depend on it, or completely change the ways its individual users relate to each other — without them even realizing what has happened.

As Larry Lessig observed nearly two decades ago, computer code is law. And today, Facebook’s code establishes critical rules by which more than 2 billion of the world’s people and millions of businesses interact online.

This means that Facebook is a powerful sovereign and Mark Zuckerberg is the key lawgiver. In some ways, of course, the comparison is inexact. Facebook doesn’t have the power to tax, and it certainly doesn’t have what Louis XIV called “the final argument of kings” — the ability to use physical violence to force people to comply with its demands.

And Facebook must answer to the regulators of other powerful sovereign governments, the United States among them. But this is less of a check on power than it first appears because the impact of social media on society and the economy is still poorly understood, and regulation takes time to catch up.

Still, once you begin to think about Facebook as a government, a great deal falls into place. Zuckerberg’s dilemmas are not the dilemmas of a corporation but the dilemmas of a king. Thanks to decades of research on political economy, we have some idea of what those dilemmas are and how difficult they are to solve.

A king is all-powerful. That makes his declarations about wanting to give up authority hard to take seriously.

It is difficult for a king’s subjects to trust him because he has absolute power over them. Yet it is also difficult for a king to give up that control.

One of the key problems of political economy, therefore, is “tying the king’s hands” — allowing the sovereign enough power to do his job, but constraining him so that he does not abuse his trust. Facebook’s version of this problem is doubly hard, in that the company faces mixed incentives.

On the one hand, it has to reassure its users, but on the other it has to answer to the advertisers who drive its business model and allow it to make profits — and to the stock-market investors who create such extraordinary wealth for its owners and employees by speculating on the future growth of those profits.

Despite Zuckerberg’s vision of democracy and community control, Facebook more closely resembles an autocratic than a democratic regime. Zuckerberg is the chair, chief executive officer, and controlling shareholder of Facebook.

Standard arguments in political economy suggest that this combination will make it difficult for Facebook to generate trust. Zuckerberg’s promises about new governance arrangements, privileging transparency, and accountability may be sincere. Even so, as game theorists like the late Thomas Schelling have noted, commitments are credible only when they are costly to break, when there is some means to enforce them when the time comes to deliver. Facebook’s corporate structure means that such promises are hard to turn into credible commitments unless some external incentive makes them enforceable.

Zuckerberg has defended the way power at Facebook is concentrated in his own person. And it may be true that, as he argues, this unusual corporate governance arrangement allows Facebook to hold off shareholders obsessed with short-term profits and think instead about the long-term interests of its users. But that argument bears an uncanny resemblance to ones made on behalf of “rational” political autocrats. The historian Edward Gibbon, describing the Roman emperor Septimius Severus, argued:

The true interest of an absolute monarch generally coincides with that of his people. Their numbers, their wealth, their order, and their security, are the best and only foundations of his real greatness; and, were he totally devoid of virtue, prudence might supply its place, and would dictate the same course of conduct.

Modern political economists are more likely to emphasize how potential clashes of interest between autocrats and their subjects can generate terrible outcomes and mutual distrust. For example, a classic article by Nobel Prize-winning economist Douglass North and Barry Weingast, on constitutions and commitment problems, highlights how absolute rulers cannot credibly commit to keeping their promises.

English kings successfully tied their own hands

North and Weingast point out that under the absolutist Stuart kings, who came to the English throne in 1603, the crown found it enormously hard to persuade bankers to lend it money. The bankers knew that the king could simply decide later to repudiate any loan agreement and keep whatever he had “borrowed.” The result was that the crown was perpetually short of funds and had to resort to forced “loans” that made merchants all the more determined to keep their money hidden from the government, hurting economic growth.

This changed only after the Glorious Revolution of 1688, when a constitutional settlement established parliamentary supremacy, significantly reducing the king’s role and making the monarch subject to the law. Parliament took on taxing, budgetary, and lending authority. These political limits meant that lenders could trust the crown to repay its debts, making it much easier for the British government to raise money cheaply and efficiently.

Margaret Levi and David Stasavage show that there are other benefits when executive authority is truly accountable to a legislature representing citizens. Citizens become more willing to meet government demands for taxes and military service and to obey the law. When governments are beholden to their constituencies, citizens can respond to broken promises by withholding compliance and, at the extreme, by removing officials from office. When citizens have power over government, the government has a strong reason to behave in a trustworthy way, and so it generates trust.

This is why tying the king’s hands is a necessary precondition to trust and peaceful economic exchange. It is also a fundamental basis of democracy, which uses institutions to constrain governments to act accountably towards their citizens.

Facebook can’t chop off people’s heads, and it has to obey the laws of the jurisdictions where it operates. Even so, it faces a version of the problem that plagued the Stuarts: Facebook can unilaterally and arbitrarily change many of the laws of its realm (its terms of service and market-making algorithms among them).

Even if you feel manipulated, there may not be much you can do about it. Network effects — which Facebook has gone to great lengths to reinforce — lock both businesses and users into Facebook, even if they all individually might prefer a different system of rule. As with citizens of traditional states, users of Facebook can exit if they wish, but doing so is extremely difficult.

The result is that even if Zuckerberg and his lieutenants have good intentions, they will still be too powerful for people to trust easily. It’s possible for powerful actors to break out of these “trust traps,” but it is very hard. North and Weingast discuss two ways in which kings’ hands can be tied: voluntarily or by compulsion.

There are different approaches to tying the king’s hands — some more credible than others

First, the king can try to adhere to an informal norm of behaving responsibly. However, this is very hard to maintain over the long term, as new demands and changing circumstances challenge even the most sincere commitments. Facebook’s competitor Google used to have a simple motto: “Don’t be evil.”

Over time, the company succumbed to the temptation to accept practices that motto would once have forbidden, becoming just a little bit evil, and then just a little bit more, and more, until much of its original idealism seems to have become submerged.

In 2010, Google’s then-CEO Eric Schmidt said: “Google policy is to get right up to the creepy line and not cross it,” indicating a much more flexible approach to pushing its users’ boundaries.

Second, the king can agree to a constitutional settlement in which he gives up power to become more trustworthy. Zuckerberg has proposed setting up independent bodies that would hear appeals from users regarding Facebook’s decisions. This might be the corporate version of a constitutional compromise.

However, the devil will be in the details. History tells us that it is very hard for kings to curb their power in this way. Even when they recognize that tying their hands is in their long-term interests — because they risk rebellion if their legitimacy comes to be doubted — they are often overwhelmed by the desire to cheat.

For example, even when nation-states seem to give up power to international arbitrators and judges, they create sneaky backdoor incentives to rig the odds in their favor. Rulers can also, more simply, retake power or arbitrarily change the course of policy, either for their own benefit or because they believe they know better what is in the best interest of their community.

If and when Facebook starts to build its own systems of trust, it will face similar temptations. If, for example, Facebook has the power to appoint and reappoint its own “independent” appeal judges, those judges will have excellent financial reasons to keep Facebook happy. Unlike federal judges in the United States, they will not have an externally binding constitution to anchor their judgments and guarantee true independence.

Some past efforts by Facebook to introduce democracy and limitations have failed

Furthermore, democratic processes are only legitimate when they have some real likelihood of binding the monarch to decisions that he doesn’t like. When, in 2009, Facebook faced an earlier furor over changes to the ways it shared its users’ data, it responded by proposing a “site governance” system under which its users would have some collective control over their data. In Zuckerberg’s words back then:

The past week reminded us that users feel a real sense of ownership over Facebook itself, not just the information they share. Companies like ours need to develop new models of governance. Rather than simply reissue a new Terms of Use, the changes we’re announcing today are designed to open up Facebook so that users can participate meaningfully in our policies and our future.

However, Facebook’s attempted model of governance involved mass referendums, which were confusing and hard to organize around, and had very low participation. The final referendum was held in 2012, and involved setting the terms under which Facebook could share user data with other organizations. Facebook announced that it would only consider the referendum binding in the (extremely unlikely) case that 30 percent of its global users voted. Since only 668,000 users voted, Facebook felt free to ignore the result, and it has not held any referendums since.

Now, Zuckerberg is arguing in his interview with Klein for establishing a far more independent process: “You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world.”

It’s heartening to see Zuckerberg recognizing that Facebook needs real democratic legitimacy and accountability, and that he is considering more binding procedures. However, this is only the first step on an incredibly difficult journey. We don’t have many good models showing how corporate accountability and democratic accountability can be combined.

If Facebook finds a way to deliver on this promise by giving others the power to appoint those who decide what behavior must be regulated and how, and by making data sufficiently available for a healthy oversight and decision process, it will be a significant move toward tying the king’s hands. However, this system will have to involve an irreversible ceding of power if it is to be credible, and to create the trust that Facebook is now so sorely lacking.

There are those who will say that the market will eventually rein in any company that doesn’t properly serve its customers. Here, economic history serves up the same lesson as political history: When companies become too powerful, they are often ruthlessly successful in extending and preserving that power. At some point, when customers and competitive markets seem helpless and the harms become overwhelmingly obvious, governments typically step in.

They may go so far as to break up monopolies, but more often they tie powerful companies’ hands by limiting the behaviors that have caused the most obvious harm. European regulations on Facebook’s privacy violations are a good case in point. The risk of course is that technology typically moves much faster than government, and today’s regulation may prove insufficient for tomorrow’s challenges.

Facebook has an opportunity to get ahead of this situation by taking proactive steps to tie its own hands before they are tied for it. Its voluntary adoption of the provisions of the Honest Ads Act, whether or not the bill passes in Congress, is a good example. (The act, introduced by Sens. Amy Klobuchar (D-MN), Mark Warner (D-VA), and John McCain (R-AZ), would make public who is making sizable political-ad buys on major online platforms, among other requirements.) But if the law does not pass, Facebook will be back in its old situation of being free to change the rules when convenient.

What we need is a partnership between government and platforms in which platforms voluntarily agree to limits on their behavior and establish independent bodies capable of true oversight, and the government provides a backstop of known consequences for the failure to observe those limits. We are hoping that in his appearance before Congress this week, Mark Zuckerberg provides detailed proposals for reestablishing trust, not just vague promises to do better.

The governments and regulatory regimes that we do have are the product of imperfect processes of trial and error over long periods of time. Because we know it will take a long time to create a new cooperative approach to regulating social media, we cannot begin it too soon.

Update, 4/9: On Monday, Facebook announced it is taking a significant, if small, step in this direction by giving access to some of its data to scholars for research on the impact of social media on democracy. The research enterprise will be funded by a consortium of foundations and without any Facebook money; the nonpartisan and independent Social Science Research Council will oversee it. According to the council’s president, Alondra Nelson, “SSRC-appointed review committees will actively engage with technologists, advocates, and ethicists to develop 21st-century academic standards for anonymized digital data use.”

Facebook pledges not to interfere with the research. The hope is to implement a model that other platforms will also adopt to better serve democracy. To be sure, this does not rise to a full binding of hands, but Facebook is ceding power and agreeing to independent oversight.

Henry Farrell is a professor of political science and international affairs at George Washington University. Find him on Twitter @henryfarrell. Margaret Levi is the Sara Miller McCune director of the Center for Advanced Study in the Behavioral Sciences at Stanford University and a professor of political science at Stanford. Find her on Twitter @margaretlevi. Tim O’Reilly is the founder and CEO of O’Reilly Media and the author of WTF? What’s the Future and Why It’s Up to Us. Find him on Twitter @timoreilly.

The Big Idea is Vox’s home for smart discussion of the most important issues and ideas in politics, science, and culture — typically by outside contributors. If you have an idea for a piece, pitch us at thebigidea@vox.com.