Second, a new privacy law must require greater transparency around what consumers are signing up for and how their data will be used. (We have found that around half of consumers are likely to opt in when given a clear idea of what they are agreeing to.) Permission to collect or share data should be spelled out in plain English, not buried within terms and conditions. These permission pop-ups must explain concisely how data will be used and whether it will be shared with partners. Lawmakers must also require apps that collect data to make it easy for users to opt out at any time (with the understanding that they won't receive personalized content as a result). Consumers, not companies, should control the process.

Third, a privacy law must establish a duty for those collecting location data (even with consent) to "do no harm." It must require companies to apply privacy-protecting measures to all data uses. The industry needs to earn consumer trust and speak to what makes consumers anxious.

A Hippocratic oath for data science, akin to the fiduciary duty the Securities and Exchange Commission applies to investment advisers, might be the right model. Companies should have to respect restricted, sensitive locations like Planned Parenthood clinics or cancer centers. They should also be required to ensure that the data they collect is not used to discriminate based on religion, medical conditions, sexual orientation or political beliefs. This can't be voluntary: A federal regulator would receive complaints and investigate alleged abuses — and be authorized to administer meaningful punishments and fines that are large enough to change behavior, as in the European Union's privacy law, the General Data Protection Regulation.

A Hippocratic standard would make it illegal for any location or movement data, even with proper permissions, to be used to gouge consumers or deny them access to loans, health care, insurance, employment or educational opportunities, or to infringe on their civil rights. This would not prohibit a person from installing a car tracker to earn a "good driver" discount. But family tracking apps should not secretly be used to hike your teenager's auto insurance above normal rates by monitoring driving speeds.

Moreover, all location companies should be required to protect consumer data with appropriate security steps, and to blur or minimize data sharing in ways that enhance privacy. For instance, Foursquare does not want to see a future where people's full "location trails" (the exact names of places visited and the exact times a person moves through them, reconstructed to form a daily journey) are widely accessible to every advertiser or government agency, without any safeguards. That's too intrusive and completely unnecessary for us all to benefit from the value of location.

Not all location data is the same. When we work with partners who want to understand if marketing is leading more people to visit their stores, or to identify consumers who frequent rival chains to try to win them over, we use only pseudonymized identifiers and blur out precise data. That means we don’t share the specific time of a visit to a specific location. Nor do we offer details on all visits for any particular user over the course of a day.

The idea is that whatever we share, the data is sufficiently general so that no individual can be picked out and "watched," or it's entirely abstracted into interest-based audiences (such as "moms in the United States who shop at Whole Foods and prefer yoga studios" or "college students who frequent fast food"). There's no reason that every company in our industry couldn't adopt the same practices, and regulation should force them to.
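For readers curious what "pseudonymized identifiers" and "blurred" data can look like in practice, here is a minimal sketch. It is illustrative only — the function names, the salt, and the choice of week-level time coarsening are assumptions for this example, not a description of any company's actual pipeline.

```python
import hashlib
from datetime import datetime

# Hypothetical salt for pseudonymization; in practice such a value
# would be secret and rotated, not hard-coded.
SALT = "rotate-this-salt-regularly"

def pseudonymize(user_id: str) -> str:
    """Replace a raw identifier with a one-way pseudonym (salted hash)."""
    return hashlib.sha256((SALT + user_id).encode()).hexdigest()[:16]

def blur_visit(user_id: str, venue_chain: str, ts: datetime) -> dict:
    """Share only the chain name and the week of a visit — never the
    exact venue address, the exact timestamp, or the raw identifier."""
    return {
        "pseudo_id": pseudonymize(user_id),
        "chain": venue_chain,            # chain, not a street address
        "week": ts.strftime("%G-W%V"),   # ISO week, not a precise time
    }

record = blur_visit("user-123", "Whole Foods", datetime(2019, 10, 16, 14, 37))
```

The shared record answers a marketer's question ("did someone visit this chain that week?") while withholding exactly the details — who, where precisely, and when precisely — that would let an individual be picked out and watched.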