Google's Quiet, Confusing Privacy Policy Change Is Why We Need More Transparency & Control

from the don't-hide-this-shit dept

The practical result of the change is that the DoubleClick ads that follow people around on the web may now be customized to them based on the keywords they used in their Gmail. It also means that Google could now, if it wished to, build a complete portrait of a user by name, based on everything they write in email, every website they visit and the searches they conduct.



The move is a sea change for Google and a further blow to the online ad industry’s longstanding contention that web tracking is mostly anonymous. In recent years, Facebook, offline data brokers and others have increasingly sought to combine their troves of web tracking data with people’s real names. But until this summer, Google held the line.

Google spokeswoman Andrea Faville emailed a statement describing Google’s change in privacy policy as an update to adjust to the “smartphone revolution”: “We updated our ads system, and the associated user controls, to match the way people use Google today: across many different devices,” Faville wrote. She added that the change “is 100% optional — if users do not opt-in to these changes, their Google experience will remain unchanged.” (Read Google’s entire statement.) Existing Google users were prompted to opt in to the new tracking this summer through a request with titles such as “Some new features for your Google account.” The “new features” received little scrutiny at the time. Wired wrote that it “gives you more granular control over how ads work across devices.” In a personal tech column, the New York Times also described the change as “new controls for the types of advertisements you see around the web.”

To opt out of Google’s identified tracking, visit the Activity controls on Google’s My Account page, and uncheck the box next to “Include Chrome browsing history and activity from websites and apps that use Google services.” You can also delete past activity from your account.


Last week, I wrote about how privacy is about tradeoffs, and despite what some people claim, there's no such thing as "absolute privacy," nor would you actually want something approximating what people think they mean by it. The real issue is the tradeoff. People are quite willing to trade certain information in exchange for value. But the trade has to be clear and worth it. That's where the real problems come in. When we don't know what's happening with our data, or it's used in a sneaky way, that's when people feel abused. Give people a clear understanding of what they're giving and what they're getting, and you eliminate most of the problem. Then give end users greater control over all of this, and you eliminate even more of the problem.

This was our thinking in designing a Privacy Bill of Rights for companies to abide by in designing their services (along with EFF and Namecheap). It appears that Google would fail to meet the standards of that bill of rights. Last week, ProPublica wrote about how Google quietly changed the privacy policy covering how it connects DoubleClick advertising to the other data it has about you, allowing the company to link your name and other identifying information to you as you surf around the web. And, on top of that, it apparently includes tying what you type in Gmail to the ads you might see.

Here's the thing: a lot of privacy advocates I know will likely say that this move is de facto "bad," and that any linkage between identity and ads is bad. But I'd argue that the real problem here is Google's unwillingness to be clear and transparent. It slipped this change in and then made up some PR-speak about why it was doing it, in a way that wasn't at all clear to basically anyone.

Blech. If this change is really important, and really provides more value, don't give a bullshit explanation that confuses reporters.
If Google is afraid to be upfront and honest about these changes, then it feels like the company recognizes that it's not providing enough value to consumers with these moves. To paraphrase the old saying about it not being the crime but the coverup that gets people: in this case, it's not the privacy policy change itself that's the clear problem, but the fact that Google tried to hide it and mislead people about it.

Thankfully, Google does provide the other prong of our test: giving users control. But it would have been a lot better if the company had just been upfront and honest about it. This is why transparency and clarity about intentions are so important. If companies don't provide them, then people will (rightly) assume that the moves are designed to be anti-consumer. If Google truly believes it's providing a better product with such changes, it should explain why and how, and let users decide for themselves.

Filed Under: ads, control, privacy, privacy policies, transparency

Companies: google