In the world of tech, privacy has become a selling point. Apple promises that “What happens on your iPhone, stays on your iPhone.” Google claims it’s making privacy more than a “luxury good.” Facebook CEO Mark Zuckerberg claims “the future is private.” But consumers are often still at the mercy of companies. We have to trust that platforms aren’t secretly tracking us or using our data to train facial recognition algorithms, and that their privacy policies, which can be vague or impenetrable, offer real protection.

Johnny Lin and Rahul Dewan, the co-creators of a new standard called Openly Operated, want to change this. Openly Operated is a set of guidelines for auditing how apps and web services deal with user data, like a combination of a report card and a seal of approval. But it’s also a bid to change the terms of the privacy debate — as Lin puts it, to get past the sense that when ordinary users think about privacy, they figure “I’m screwed anyway, so why should I care?”

An OO-certified app or site must meet three criteria. First, it needs to demonstrate “a basic level of transparency” by making its code and infrastructure, among other things, public and fully documented. Second, it needs to lay out its policy in the form of “claims with proof,” establishing what user data is collected, who can access it, and how it’s being protected. Third, those claims must be evaluated by an OO-certified auditor, who then makes the audit results public.

The site OpenlyOperated.org, for example, is OO-certified. (It’s one of two OO-certified services right now, alongside Lin and Dewan’s Confirmed VPN.) Its audit report lists several easily readable and footnoted claims about the site, including the claim that your email address is kept totally private — even from the site’s operators. It then includes details about the encryption system that makes this possible, plus statements from cybersecurity consultants who corroborate the claims. While companies can already run privacy audits, Openly Operated’s branding is supposed to promise a certain level of depth, in addition to guaranteeing transparency.
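A claim like “email addresses are private even from the operators” is typically backed by never storing the plaintext at all. The sketch below is a hypothetical illustration of one common approach, a keyed hash stored in place of the address; the names and the secret-handling here are assumptions, and OpenlyOperated.org’s actual scheme may differ.

```python
import hashlib
import hmac
import os

# Hypothetical sketch: the service stores only a keyed hash (HMAC) of each
# email address, so the plaintext never reaches the database. Operators who
# inspect the database see only opaque tokens. The actual audited scheme on
# OpenlyOperated.org may differ from this.

PEPPER = os.urandom(32)  # in practice, a secret kept outside the database

def email_token(email: str) -> str:
    """Derive a fixed, non-reversible token for an email address."""
    normalized = email.strip().lower()
    return hmac.new(PEPPER, normalized.encode(), hashlib.sha256).hexdigest()

# The database maps tokens to account records; no plaintext address is kept.
users = {email_token("alice@example.com"): {"plan": "pro"}}

def find_user(email: str):
    """Look up an account by recomputing the token at login time."""
    return users.get(email_token(email))
```

The point an auditor can then verify is structural: no code path writes the plaintext address to storage, so the privacy claim does not rest on the operators’ good behavior alone.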

“We’ve kind of created this system today where the privacy policy is totally an afterthought.”

Lin (a former iCloud engineer) and Dewan (creator of the iOS second-screen app Duet Display) wrote a popular study of shady iOS apps in 2017. Now, they want Openly Operated to nudge consumers toward apps and sites that value privacy and transparency, while giving companies an incentive to behave well. “We’ve kind of created this system today where the privacy policy is totally an afterthought for smaller companies. And for bigger companies, it becomes unmanageable, because you started out as a smaller company,” says Lin. “People before were taught to move fast and break things. Our solution is no, slow down — because when you do that, user privacy goes in the back seat.”

“Openly Operated” evokes the term “open source,” and it requires open source software. The project’s goal is to evaluate specific promises about companies’ behavior, though, not just their code. It shares broad goals with legal frameworks like Europe’s GDPR laws, but it’s voluntary and more focused on transparency than specific practices. And it’s not necessarily devoted to pulling behemoths like Facebook on board; it seems more like a way to lay the groundwork for the next generation of popular services.

OO certification, notably, doesn’t specify a particular level of privacy. Companies could still do things like sell targeted ads or deal with data brokers. They’d just need to spell this out in the terms of service. “We don’t actually stop companies from dealing with third parties if that’s what they want to be open and honest about and if that’s the agreement they have with users,” says Lin.

Similarly, the system seems best equipped for platforms that take a hands-off approach to data. A site like OpenlyOperated.org might promise an encryption scheme that prevents anybody from seeing email addresses, but it’s less clear how to make meaningful promises about sharing publicly accessible information — or information that’s shared with non-OO-certified companies. Flickr owner Yahoo created a research data set from public user photos, for example, but drew controversy when one specific company accessed the data for facial recognition training.

Openly Operated isn’t trying to solve every problem itself

Openly Operated seemingly isn’t attempting to solve all these problems itself. Instead, Lin hopes that it will give companies reason to change their practices, whether that means sharing less data or building encryption infrastructure that lets them prove their claims. “I don’t believe there’s a limit to clever ideas that people can come up with,” he says. “I think often people just need to think a little bit harder and take an extra few minutes to design something that may be kind of counterintuitive to the developer but better for the user.”

All this relies on Openly Operated gaining a user base. The program technically launched in April, but it’s only coming out of stealth mode this month, and Lin says that “our priority is creating a transparency standard that puts the user first,” so sales and partnerships “haven’t been a strong focus” so far. Lin and Dewan will be fighting against the fact that people really do have a sense of learned helplessness around privacy. Even if users think it’s important, they’re very accustomed to giving it up.

The Openly Operated site lists a number of potential benefits for companies, including the fact that other companies might be more comfortable dealing with a partner that has clearly spelled out its security practices. Unlike ordinary users, however, companies may be better equipped to navigate existing auditing programs — and have less need for a clearly readable standard.

But amid discussions of how to regulate data protection, Openly Operated lays out an attractive paradigm for privacy. It imagines a world where apps’ terms of service get evaluated in the same way as their interfaces or feature sets, and where the onus is on companies to earn users’ trust, not dazzle them with big claims or grudgingly submit to evaluations. “Our vision is that in the future, every [app and site] review comes with a section that also says: ‘And are they audited? Do we have any major privacy concerns that we can definitively point at and look at? Are they transparent?’” says Lin. “We’re trying to raise the bar.”