I could not care less whether Democrats are avoiding Republicans over Thanksgiving or vice versa, as the Washington Post reported earlier this week, drawing on data collected from 10 million cellphones. People voice their politics in a lot of ways, and if one of those ways is over the jellied cranberries, that’s their choice. What I do care about is how SafeGraph (which collected the data) got its hands on this information in the first place. The company was effectively able to analyze 10 million individual mobile phone users and track every one of their locations over the course of Thanksgiving Day 2016. How on Earth did it get 10 million smartphone users to consent to a data-collection firm tracking their every move?

The truth is, they didn’t have to. As The Outline points out, SafeGraph partners with a wide variety of app makers—though there’s no central list to see who the partners are, alas—and leaves the thorny issues of data collection notification and user privacy to the third-party partners. The Outline continues:

According to SafeGraph’s privacy policy: “We obtain information from trusted third-party data partners such as mobile application developers, through APIs and other delivery methods. The data collection and use is governed by the privacy policy and legal terms of the data collector and the website using the data; it is not governed by SafeGraph.” In other words, SafeGraph is partnering with apps—they could be weather apps, games, wallpapers, anything—and leaving the disclosure up to the app maker.

Do app makers disclose what they’re tracking? Only if they feel like it. In 2013, the U.S. Commerce Department rolled out a purely voluntary, legally unenforceable “privacy code of conduct” for app makers. Among its recommendations was a “short notice” in which users would review the following items prior to purchasing or downloading an app:

- What types of data the app would collect from you and transmit off your device
- Where to find a long-form privacy policy
- Which third parties the app shares your data with
- The actual company responsible for the app

Think back to the last time you saw a short notice before downloading an app. How’s that voluntary and unenforceable thing working out?

So what? Your favorite app may be collecting and sharing information about you that has nothing to do with what the app needs to function. What’s more, you may not know that. And nothing in U.S. law compels any app maker to tell you. So whenever you use any app you’ve downloaded from the store of your choice, the collection of your data—and the reselling of that data, and the security of that data—is entirely at the whim of a company that may or may not see any reason to let you know it’s tracking your every move.

Who cares? We all should. As of right now, we are (in the aggregate) accepting as the status quo the idea that any app maker can track us without our knowledge and consent. We are also accepting the idea that businesses can take our personal data whenever they want and resell it when it suits them—again, without ever telling us.

Deleting the apps on your phone isn’t going to keep you out of any of these databases. That genie’s out of the bottle. At this point, what we can do is start looking at consequences.

Europe is, unsurprisingly, way ahead of the United States on this. The General Data Protection Regulation (GDPR), adopted by the European Union in April 2016 and implemented over a two-year transitional period, was developed to strengthen individual citizens’ rights to data privacy and to simplify the regulation of data exports outside the European Union.

Among the tenets of the GDPR:

Citizens have the right to question and contest significant decisions that affect them and were made on a solely algorithmic basis. (Legal scholars already argue that a “right of algorithmic explanation” will not have any legal standing in an inevitable court case.)

Citizens have the right to consent. The way the regulation is framed, citizens must explicitly consent both to data collection and the purpose for which the data is being collected.

Citizens have the right to request erasure of their data when keeping it is found to privilege the data collector’s business interests over the data subject’s fundamental rights and freedoms. (In other words, Google can’t keep serving up search results about an EU citizen that could endanger them.)

Citizens have the right to data portability, meaning they can transfer their personal data from one electronic processing system to another without being prevented from doing so by the originating data collector/controller. What this means in practical terms: people can move their text files, photographs, and videos from one social network, e-mail provider, or cloud storage service to another without proprietary-format headaches or caveats saying the assets are still technically in use by the former service.

Know which citizens enjoy none of these rights? U.S. citizens. Although there are some highly industry-specific data privacy laws (HIPAA for health care, FERPA for education), there is nothing at the federal level designed to protect citizens’ digital privacy.

The Obama administration unveiled its proposed Consumer Privacy Bill of Rights in 2012, positioning citizens’ privacy as a fundamental right and not something subject to the whims of app makers, but after four years of beavering away on it (and getting nothing from Congress, despite Vermont Senator Patrick Leahy’s efforts), the effort left the Oval Office when the Obama staffers did.

So here we are: App makers track us without having to tell us, and businesses collect our data and suffer no real repercussions when they invariably have a data breach and we all have to monitor our credit for fraud again.

Data collection is a fact of life if you live in the U.S.—it’s everywhere from your grocery store reward program to your cloud-based services. Having companies be transparent about collecting your data is a first step. But a meaningful second step would be holding companies accountable for using or abusing the resource you’ve provided them. If there is data collection, then let there be consequences.

Lisa Schmeiser has been reporting on and writing about tech, business and culture since the dot-com days. Find her on Twitter at @lschmeiser or subscribe to So What, Who Cares.