For years, researchers have warned that the “anonymization” various companies claim to perform on the data they gather is weak and easily reversed. Over the last few years, we’ve seen multiple companies respond to these findings by giving up on anonymizing data at all. Verizon kicked things off, but Vizio has gone down this route as well, and now we know Google has too, or at the very least has reserved the right to do so.
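
Why is this anonymization so easy to reverse? The usual culprit is the linkage attack: joining an “anonymized” dataset against a public one on quasi-identifiers that survive the scrubbing, such as ZIP code, birth date, and sex. The sketch below uses invented data and field names purely for illustration; the technique itself is the classic re-identification researchers have repeatedly demonstrated against scrubbed records:

```python
# Sketch of a linkage attack: re-identify rows in an "anonymized"
# dataset by joining on quasi-identifiers. All data here is invented.

# "Anonymized" records: names stripped, but quasi-identifiers remain.
anonymized_records = [
    {"zip": "02138", "birth": "1945-07-22", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "90210", "birth": "1980-01-05", "sex": "M", "diagnosis": "asthma"},
]

# A public dataset (think voter rolls) listing names alongside
# the very same quasi-identifiers.
public_roll = [
    {"name": "Jane Doe", "zip": "02138", "birth": "1945-07-22", "sex": "F"},
    {"name": "John Roe", "zip": "90210", "birth": "1980-01-05", "sex": "M"},
]

def reidentify(records, roll):
    """Join the two datasets on (zip, birth, sex) to recover names."""
    index = {(p["zip"], p["birth"], p["sex"]): p["name"] for p in roll}
    matches = []
    for r in records:
        key = (r["zip"], r["birth"], r["sex"])
        if key in index:
            # The "anonymous" record now has a name attached again.
            matches.append({"name": index[key], **r})
    return matches

for row in reidentify(anonymized_records, public_roll):
    print(row["name"], "->", row["diagnosis"])
```

A single unique combination of quasi-identifiers is enough; no cryptography needs to be broken, which is why adding more data sources (as the policy change below allows) makes re-identification easier, not harder.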

According to an investigation by ProPublica, Google quietly changed its privacy policy at some point over the last few months. When Google bought the advertising firm DoubleClick a few years back, it promised to keep all data gathered by DoubleClick sandboxed and isolated from the data it gathered elsewhere and used for other Google services. The company has since changed the wording of its privacy policy, as shown below:

Google has stated it doesn’t use the information gleaned from Gmail scanning to target ads at specific people, but it’s not clear what this means for its other services. Google tracks a great deal of information, and its email keyword scanning is just one business area. Previously, Google’s privacy policy drew a hard line between what it would and would not do. Google has replaced that flat guarantee with a weasel-worded “depending on your settings” statement that hides behind the word “may.”

Speaking of those settings, Google does offer a “Privacy Checkup” tool you can use to keep certain data from being tracked or gathered. It’s generally well designed, save for one major exception, shown below. Play a game with yourself if you like: see if you can spot the problem before you read further:

This is a perfect example of what’s known as a dark pattern: an interface designed to trick you into choosing the “right” option, where “right” means whatever the company wants you to pick rather than what you actually want. In this case, boxes are checked by default, and you uncheck them to hide information. But if you uncheck the box labeled “Don’t feature my publicly shared Google+ photos as background images on Google products & services,” you’re actually giving Google permission to use your name and profile to advertise products. Google flipped the meaning of the checkbox, making it more likely that someone not reading carefully would click the wrong option.
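
The inversion is easy to state in code. This is a hypothetical sketch of the mapping, not Google’s actual settings logic; the function name and parameters are invented for illustration:

```python
# Illustrative sketch of the checkbox inversion described above.
# Names and labels are hypothetical, not Google's actual settings code.

def consent_from_checkbox(checked: bool, label_is_negated: bool) -> bool:
    """Translate a checkbox's on-screen state into the stored consent value.

    With a plainly worded label ("Feature my photos..."), checked means
    consent. With a negated label ("DON'T feature my photos..."), the
    mapping flips: UNchecking the box is what grants consent.
    """
    return checked if not label_is_negated else not checked

# Plain label: unchecking opts out, exactly as a user expects.
assert consent_from_checkbox(checked=False, label_is_negated=False) is False

# Negated label: the same physical action (unchecking) now opts IN.
assert consent_from_checkbox(checked=False, label_is_negated=True) is True
```

A user sweeping down the page unchecking boxes to “turn everything off” will, on the negated item alone, do the opposite of what they intend, which is precisely what makes the pattern dark.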

But what’s really interesting to me is that the word “Don’t” is bolded. You bold something you want to draw attention to, which is pretty much the opposite of how a dark pattern works. Huge organizations are much less monolithic than they appear from the outside, and I suspect that what we see here is a tale of two opinions, played out in a single checkbox. By reversing what checking the option does, Google made it more likely that you would give it permission to use your personal likeness and data for advertising. By bolding the word “Don’t,” Google made it more likely that you’d realize what the box did and adjust the setting accordingly.

In any case, Google’s decision to stop anonymizing data is a serious development, but there’s not much chance people will treat it that way. To date, people have largely been uninterested in the ramifications of giving corporations and governments round-the-clock permission to monitor every aspect of their lives, even when it intrudes into private homes or risks chilling freedom of speech.