Even as websites, wearable computers and, increasingly, every piece of technology we touch gathers and analyzes our data, there’s still hope that privacy will survive. Making that case, however, might mean working from a different definition of privacy than we’re used to.

One cold, hard fact about data privacy is that the data-collection ship sailed long ago, never to return. With limited exceptions, consumers can’t really stop tech companies from collecting data about them. When we log into web services, make phone calls, play our favorite apps or buy the latest in connected jewelry, we’re giving those companies the right to collect just about whatever information they please about who we are and how we use their products.

The situation isn’t wholly good or bad — data analysis is behind lots of user experience improvements as well as targeted ads, for example — but understanding it is critical to understanding what the future of data privacy might look like. There’s not much point in debating what companies can or should collect (because collecting is too easy and regulating it is too hard), but there is an opportunity to put some limits on what companies do with data once they have it.

This is why the White House, as part of its new consumer privacy push unveiled on Monday morning, is talking about how student data is used and smart grid data is secured rather than what’s collected. It’s why Federal Trade Commission chairperson Edith Ramirez, speaking about the internet of things at last week’s Consumer Electronics Show, spoke about how long companies should store user data and not whether they should collect it.

The internet of things, in fact, is a prime example of why we’ll probably never be able to put a lid on data collection: because many people actually crave it. The whole point of connected devices is that they collect our data and do something with it, presumably something that users view as beneficial. If I love my fitness tracker or my smart thermostat, I can’t really be upset that it’s sucking up my data.

What I can be upset about, however, is when the company does something unethical or negligent with my data, or something I didn’t agree to (at least constructively) in the privacy policy. It seems this is where a lot of regulatory energy is now being spent, and that’s probably a good thing. (We’ll also delve into this topic at our Structure Data conference in March, with FTC Commissioner Julie Brill.)

Even if it’s forced on them, companies selling connected devices need a framework for thinking of user data not just as a valuable resource, but also as something over which they’re the stewards. Collect the data, analyze it, make your money — the whole industry is predicated on these things. But know there will be penalties in place if you do something bad, or even just stupid.

Of course, the devil here will be in the details. What constitutes an acceptable use, security protocol or retention period could vary widely based on industry, company, product, cost or any of a number of other variables. A connected car is not a fitness tracker. A smart door lock is not a connected toothbrush.

But hopefully, the attention the internet of things is getting early on means lawmakers and regulators will be able to come up with some workable, flexible and relatively future-proof rules sooner rather than later. The last thing we want — especially when dealing with data about our physical-world activity — is a repeat of the web, where it’s 25 years later and we still haven’t figured out what privacy means.