After two days of bruising testimony before Congress, there’s never been more interest in regulating Facebook. In question after question this week, lawmakers seemed to take it as a given that new rules were needed to rein in the company, with proposals like the Markey-Blumenthal CONSENT Act (which would require opt-in consent for all data-sharing) taking center stage. At the same time, Congress doesn’t seem likely to act soon; most bills come from the Democratic minority, and both chambers are already settling into gridlock.

In both hearings, Zuckerberg insisted he wasn’t opposed to new legal restrictions on the platform, although he demurred when asked to support specific measures. But there was one moment when he showed more interest than usual: when Sen. Brian Schatz (D-HI) mentioned Yale Law professor Jack Balkin’s concept of an information fiduciary, Zuckerberg seemed to perk up.

“I think it’s certainly an interesting idea,” Zuckerberg said, “and Jack is very thoughtful in this space, so I do think it deserves consideration.”


Balkin’s idea is simple: we’re trusting services like Facebook with our data, and that trust should come with concrete legal responsibilities. To make that happen, Balkin proposes designating cloud providers as “information fiduciaries,” binding them to an industry-wide code of conduct modeled after similar designations in law, medicine, and finance. In the abstract, the rule would require Facebook and other companies not to act against users’ interests, leaving courts to decide the penalties when they do. Crucially, Balkin’s fiduciary rule could be put in place by any number of bodies, including state legislatures, letting privacy advocates sidestep Congress entirely.

Balkin says it’s a much-needed course correction for the industry. “I don’t want these companies to crash and burn,” Balkin told The Verge. “For me, the most important balance that needs to be struck is ensuring that you can provide viable social media, without allowing particular business practices that are manipulative.”

Creating an “information fiduciary” designation could also be more effective than the other options currently facing Congress. Both the Markey-Blumenthal bill and the EU’s GDPR focus on the importance of user consent and making sure it’s as clear and informed as possible. In practical terms, that means telling users what data is being collected, and then getting them to click a box that says “I Accept.” Balkin says that approach is too easy for platforms to game. “It’s very easy to get consent from end users,” Balkin says. “They’ll just click and go. So consent-based reforms often look really great on paper but don’t have any practical effect.” Even if we add mandatory opt-ins for data collection (as in the Markey Bill) or clearer descriptions of how data is used (as mandated by the GDPR), there’s a good chance users will simply click through the warnings without reading them.


Balkin’s fiduciary approach would attack the problem from a different angle. Instead of counting on users to understand the data they’re sharing, it establishes up front that services are in a privileged position and bear the blame if things go wrong. In some ways, this is already how Facebook talks about its relationship with users. Over and over again this week, Zuckerberg talked about earning users’ trust, and how the platform only works when users trust Facebook with their data. Balkin’s fiduciary rule would put that trust in legal terms: establishing that Facebook users have no choice but to share data with Facebook, and as a result, requiring that the company be careful with that data and not employ it against users’ interests. If Facebook failed to uphold those duties, it could be taken to court, although the nature of the proceeding and the potential penalties would depend on how the rule is written.

That might sound unusual, but it’s a surprisingly common concept in the law. Doctors already have a fiduciary duty to their patients, binding practitioners to recommend treatment based only on genuine medical needs. Lawyers have a similar fiduciary duty to their clients, restraining them from misleading a client for their own advantage. In each case, the point of the designation is to recognize an imbalance of power. Doctors know so much about medicine that patients can’t hope to keep up; the only option is to trust your doctor and put in place extra penalties for anyone who violates that trust. According to Balkin, we’re at the same disadvantage when we hand data over to Facebook and other tech companies.


Facebook doesn’t have any fiduciary duty to its users right now, but there are a number of ways to establish one. Congress could pass a federal law establishing Facebook’s fiduciary responsibilities, or a federal agency like the Department of Labor could use existing authorities to make the designation. In one recent example, the Department of Labor designated investment advisers as fiduciaries, a regulation that was subsequently rolled back by President Trump.

That might seem like a long shot with Congress in disarray and Trump more interested in rolling back regulations than adding new ones — but it could also happen at the state level. Privacy-focused states like New York and California could pass their own laws, which could be just as binding without conflicting with federal law. Like any regulation, the new rule would likely face a serious legal challenge — most likely on federal preemption or First Amendment grounds — but Balkin has written extensively on how to navigate those challenges.

A fiduciary rule wouldn’t solve all the problems with Facebook. It wouldn’t address concerns about monopoly power, and it wouldn’t address deeper concerns about Facebook warping society at large, subverting democracy, or radicalizing groups through filter bubble effects. Those are genuinely hard issues, and there are few clear ideas about how to address them in law — even if Congress wanted to. Even if Balkin’s rule goes through, there will still be a lot of work to be done. But for the specific question of how tech companies handle our data, the idea of an “information fiduciary” could be the easiest way to add some legal weight to floating ideas about trust and privacy on platforms. As more and more personal information finds its way into the public sphere, Balkin’s model is one of the few ideas we have for protecting ourselves.

“There’s a sense in which this Cambridge Analytica scandal is just the tip of the iceberg,” Balkin says. “But it’s the kind of scandal that wakes people up. That’s why Mark Zuckerberg is spending a couple of days on Capitol Hill testifying. If there wasn’t a scandal, no one would have paid attention to the problem.”