The stories turn up daily: Social media is being used to undermine democracy. Someone has run off with millions of Social Security numbers stored by a major financial institution. Internet service providers are selling our browsing history to marketers. There are machine-learning algorithms that literally don’t see black people. Facebook has apologized (again) for something horrible it has facilitated (again).

This stream of bad news showcases the far-reaching impact of how our personal data is used — and misused. I’d say we’re in a crisis, but in the interest of positivity and solutions-oriented thinking, I’ll call it “an opportunity.” This moment of deep distrust in big tech gives us an opportunity to rewrite the rules, formal and informal, governing how the data we generate is collected, used, and valued. In doing that, we can write a different future for ourselves.

Right now, a few pioneering companies — big platforms like Facebook, Google, and Amazon — are extracting most of the value from the data that’s being collected whenever we power up our laptops, write an email, go anywhere with our phone in our pocket, take a photograph, talk to Alexa. In exchange, these companies offer us photo storage or messaging or upgraded mapping. But there’s a lot more happening behind the scenes.

We could each try to make a difference on our own. We could engage in the latest version of #Delete[Name of Popular Service], install virtual private networks, and live inside homemade Faraday cages where we exchange information exclusively via encrypted USB sticks transported by armed carrier pigeons. But individual acts alone won’t move the needle. They rarely do. (See: climate change.)

Since companies value us collectively, we must restore balance with a collective response that is based on the view that we’re in this together — that our rights and responsibilities are shared. It’s one of the reasons I’m an advisor at Data & Society, a research institute in New York focused on the challenges wrought by data-centric technological development. (The other reason is free Wi-Fi.)

Here is my first draft proposal for restoring some balance and trust between the tech companies that are shaping the future and we the people.

1. Offer Real Transparency Around Data Collection and Usage

Real transparency means we should be able to see how our data is being used while we interact with a platform as easily as we can find out that someone “liked” our post. We should understand, from a data-extraction perspective, what is inside the tech products we use. And we deserve to know clearly and upfront what companies are doing with our data, including how they are monetizing it—even if they’re not selling the raw data itself.

Matt Reynolds, a writer for Wired U.K., calls Facebook a “dual-headed beast” that has for years been perceived by advertisers as a sophisticated tool for targeting customers, while users think it’s a convenient way to keep in touch with friends. Real transparency means that the user is fully informed about both sides of the business without having to read novel-length legal documents. (Real transparency also means that if you’re a massive tech company that, say, exposed the data of 87 million users, you wouldn’t threaten to sue the journalists who brought it to the world’s attention, and you would let those users know in a timely manner.)

If these companies want to earn our trust, I propose they take a cue from the food industry. We don’t individually drag chemistry sets to the grocery store in order to measure the ingredients of our food. Instead, companies are required by the federal government to include standard nutrition labels on their products, and many now go much further to increase transparency and brand trust with their customers about how that food is brought to market.

Imagine something like a “data usage label” or scorecard that demystifies the terms of service and lets users see whether a service collects information about their friends, tracks their location, encrypts their records, or wipes their data at regular intervals.

Companies could then compete for our attention based on these data scores — on who protects our data best rather than who exploits it most.
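To make the idea concrete, here is a minimal sketch of what a machine-readable label and score might look like. The field names and the one-point-per-practice scoring are purely illustrative assumptions, not an existing standard:

```python
from dataclasses import dataclass, fields

@dataclass
class DataUsageLabel:
    """Hypothetical 'nutrition label' for a service's data practices.
    True means the privacy-friendlier behavior."""
    limits_collection_to_whats_needed: bool
    does_not_track_location: bool
    does_not_collect_friends_data: bool
    encrypts_records_at_rest: bool
    wipes_data_on_schedule: bool

    def score(self) -> int:
        """Naive score: one point per privacy-friendly practice (0-5)."""
        return sum(getattr(self, f.name) for f in fields(self))

# Example: a fictional messaging app's label
label = DataUsageLabel(
    limits_collection_to_whats_needed=True,
    does_not_track_location=False,
    does_not_collect_friends_data=False,
    encrypts_records_at_rest=True,
    wipes_data_on_schedule=True,
)
print(label.score())  # prints 3: three of five practices followed
```

A standardized structure like this — however the real fields end up being defined — is what would let app stores, browsers, or reviewers compare services at a glance, the way shoppers compare nutrition labels.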

2. Change Data Defaults from Open to Closed

Defaults matter. I’m going to guess that 90 percent of users don’t change the default settings of a technology product within the first six months of buying or using it. I admit that number is an educated guess — such things are not widely studied — but I also suspect it’s close to true. We sign up for a service and trust that the people who made it aren’t trying to rob us (and who has time to flip through all those settings, anyway?). But they are, metaphorically, out to rob us.

Most tech products grab as much data from as many users as possible regardless of whether that data is currently useful to them. They lay claim to something they assume will be valuable in the future, and they assume we won’t challenge them on it. Mostly, we don’t.

But in most cases, companies don’t need all that data to provide their services. So what if they flipped the defaults? What if the data extraction defaults were as constrained as possible, taking a more “data conservationist” approach? Mozilla offers a simple starting point through what it calls lean data practices. The policy is a win-win: It protects users and limits companies’ liability, because the less data they store, the less someone can steal from them.
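A “closed by default” posture is easy to express in code. Here is a minimal sketch, assuming a hypothetical settings object whose collection toggles all default to off, so every form of data collection is opt-in:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Hypothetical service settings with 'closed' defaults:
    all data collection is off unless the user turns it on."""
    share_location: bool = False
    personalized_ads: bool = False
    upload_contacts: bool = False
    retain_history: bool = False

# A new user who never opens the settings page
settings = PrivacySettings()
assert not any(vars(settings).values())  # nothing is collected by default
```

The design choice is the whole point: a user who never touches the settings page gets the most protective configuration, and the company must earn each permission explicitly.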

Bottom line: Tech companies should treat our data like added sugar or reality TV, and consume as little of it as possible.

3. Respect Our Right to Our Own Data

I’m going to make a light legal proposal that we extend property rights to cover our data—both the data we generate (such as photos or messages) and data derived from our activities (such as our purchase history, location, or our interactions within a service, including swipes, taps, clicks, and more).

Without our data, these services wouldn’t have anything to monetize. Without our data, the artificial-intelligence systems powering machine vision, speech recognition, and many other technologies of the future would be very, very dumb.

Once we understand that it is our generated and derived data that powers the foundations of future innovation and wealth, we become more than users. We become partners with rights to determine how our contributions are used and how the value created from them gets allocated. When we consider our true worth, “free” photo storage and communications suddenly don’t seem like a fair trade.

An analogy to land rights may help. Let’s say someone offers to buy your home for $200,000, and they throw in free shipping to remove all your belongings to make the move easier. But they haven’t told you that the land your home is on contains precious vibranium. So you sell—because: “Free shipping!”—and you give away the real value. Our data is like vibranium, and we’re offloading it unknowingly, and at criminally low prices.

I know implementing this thinking is complicated, but our current economy deals all the time in complex calculations that were previously unimaginable. Look at algorithmic stock trades or the digital-copyright-claims process or the fact that, despite its best efforts, Netflix doesn’t break the internet every single day.

Thanks to the surveillance economy we’ve built, we have most of what we need to account for the value of our data. All that remains is to recognize that it’s ours, and throw some big brains and big computers at the problem. I believe in you, Silicon Valley! With your help, we can do this!

4. Diversify Who’s at the Table

The power of technology to shape the future of literally everything means that the people in the driver’s seat — the entrepreneurs, engineers, and investors — wield incredible power. But being a good software engineer does not qualify you to engineer society, politics, economics, and beyond. Not alone.

Technology is created by people, and people have blind spots and biases. That’s why tech companies need more diversity at the table — people who think differently about ethics, privacy, and tech’s ability to facilitate abuse. (Project Include has a 14-item list of recommendations to get you going.)

Even the most inclusive, multi-perspective team can’t anticipate every outcome of its service before launch. The systems are too complex to see it all. That’s why we also need more researchers with controlled access to how these complex systems work, not fewer.

5. Implement New Laws and New Rules

Leaders in the tech space should encourage regulation. Regulation would provide clear lines within which companies should operate, which would prevent embarrassing public spectacles and level the field among competitors. Of course, rules only work if they’re enforced.

Back in 2011, the Federal Trade Commission (FTC) reached a consent decree with Facebook over its practice of sharing user data with third-party apps. Google was similarly called out for misusing user data.

Marc Rotenberg, head of the Electronic Privacy Information Center, who brought the FTC complaints, recently wrote about those decrees:

[B]oth Google and Facebook are now subject to 20-year oversight by the Commission and annual reporting requirements. At the time, we were elated. Although the United States, unlike many countries, does not have a data protection agency, we believed that the FTC could safeguard online privacy even as the tech industry was growing and innovation was proceeding. We celebrated too soon. Almost immediately after the settlements, both Facebook and Google began to test the FTC’s willingness to stand behind its judgements. Dramatic changes in the two companies’ advertising models led to more invasive tracking of Internet users.

I don’t share these examples as a sign that we can’t get things right. I share them to point out that we almost had a different history, and if we do things differently now, we can have a better future.

I am encouraged by what’s going on in Europe, where sweeping changes have been enacted with the General Data Protection Regulation. But stateside, there is a worrying lack of understanding of technology at the highest levels of the U.S. government. For that reason, I think part of the onus is on tech companies to encourage regulation, but it’s also on us to demand more from our government.

The FCC’s decision to end net neutrality was partially reversed thanks to continual pressure on Congress. I think we should keep that pressure up when it comes to allowing internet service providers to sell our browsing history data. New York City is exploring a way to hold algorithms accountable, since they are increasingly implicated in policing, finance, and other resource-allocation decisions. We need more of that. And we need to upgrade the knowledge of our elected officials, either by educating them or replacing them with people better equipped to face our future challenges. Beware the politicians who want to make Americans feel safe by separating immigrant children from their parents—but who refuse to secure their own smartphones.

Right now, a handful of companies are creating and controlling massive amounts of wealth. The U.S. government would be wise to catch up with the economy of tomorrow instead of fighting over jobs that may not exist in 25 years. A more modern take on data and regulation around its use would help us get there.

6. Enable Us to Collect and Analyze Our Own Data

We can tip the balance of power between users and big tech companies with increased transparency, a new framework on data rights, and stronger regulation — but we won’t achieve true balance until we shift what we do with the data itself. So far, mostly what we’ve done is take the smartest people and most powerful machines in the history of the world and use them to distribute ads. We turned “we the people” into “we the product.” That’s quite an underwhelming use of a superpower. We cannot let the story stop there.

It must continue with tech companies empowering users to collect and run analyses of our own data. We’ve seen hints of what’s possible from now-shuttered services like Knodes and ThinkUp, which allowed people to analyze their own social media data and find hidden connections in their networks.
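Even without those services, anyone with an exported archive can start analyzing their own data today. Here is a minimal sketch, assuming a hypothetical JSON export of our own posts — the file format and field names are illustrative, since every platform exports differently:

```python
from collections import Counter

# Assume an exported archive of our own posts, e.g. a list parsed from
# posts.json with entries like {"timestamp": "...", "text": "..."}.
posts = [
    {"timestamp": "2018-04-03T09:15:00", "text": "hello"},
    {"timestamp": "2018-04-21T18:30:00", "text": "world"},
    {"timestamp": "2018-05-02T08:00:00", "text": "again"},
]

# Count posts per month by slicing the "YYYY-MM" prefix of each timestamp
by_month = Counter(p["timestamp"][:7] for p in posts)
print(by_month.most_common())  # prints [('2018-04', 2), ('2018-05', 1)]
```

A few lines like these are trivial, but they illustrate the shift: the same data that powers ad targeting can just as easily answer our own questions about our own lives.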

We’ve also seen justice pursued and powered by data through projects like the Equal Justice Initiative’s Lynching in America installation, and the Center for Policing Equity’s National Justice Database, which uses data science to understand and reduce discriminatory police behavior. We need more of all this.

The promise of the internet isn’t that a few centralized powers will do everything for us. That’s the Old World, and we shouldn’t try to recreate it. The promise of an inter-networked world is that we can do more ourselves under new models of collaboration, whether in the fields of science or art or justice.

Imagine if we used our collective data to help us be better neighbors, partners, artists, citizens, and humans, rather than just better products to be auctioned off to the highest bidder. Imagine, too, if we could hold technology companies accountable by demanding that they share power more equitably with the people who use and enable their products and services.

Imagine it. Now let’s go build it.

Do you want to expand on this draft manifesto? Contribute to this open source Google Doc with additional principles/demands, resources, and examples of progress being made. (Yes, I’m aware of the irony of asking you to give Google more data in service of reclaiming our data from companies like Google. What can I say? I’m an artist.)

Also, if you want to take matters into your own hands, read the companion piece called “Find Out What Google and Facebook Know About You: How to Do a Data Detox, in a Zillion Easy Steps.”