“The future is private,” Mark Zuckerberg declared at Facebook’s F8 conference keynote yesterday. He went on to discuss the importance of building “private” online “living rooms,” an analogy for direct messages and Facebook Groups, in contrast to the “public square” of the News Feed.

Zuckerberg described a number of new initiatives in this “future is private” push, including encrypted, and even ephemeral, Facebook messaging features, as well as an ephemeral “status” feature (similar to Instagram or Facebook Stories) for WhatsApp. WhatsApp messages have always been end-to-end encrypted, and Zuckerberg noted they would stay that way. He emphasized repeatedly that Facebook will not be able to see the content of this material, saying of several features that they would be private “even from us,” and leaning on the words “safety” and “secure.”

But what his presentation elided is that Facebook does not need to see the content of what people are saying in order to advertise to them. The metadata — who, or what (as in a business), you’re talking to, where you are, and what time the conversation takes place, combined with other available information — provides more than enough to make a very educated guess about what you’re interested in, to the point that knowing specifically what you said adds almost nothing.

The value of metadata, not just in advertising but in building an understanding of a person, has been well studied for years; Facebook neither invented the practice nor is just beginning to use it. It’s easy to forget that while Facebook builds all of these “private” features into its own products, it still has not only an immense body of information that we gave it freely in its earlier days, but also an extremely robust tracking apparatus across the entire Internet.

Facebook can see, for instance, that you are WhatsApp-chatting with your mother (a fact it knows from cross-referencing both your phone numbers with your Facebook profiles, as well as from the wealth of other information available about both of you within its own stores and across the various data brokers from which Facebook can buy your information). It can also see in real time that you’re pulling up the page for thyroid cancer, or searching for the best thyroid cancer treatment centers. Does it need to know the actual content of your conversation to make an educated guess about what’s happening, what you’re talking about, or what your needs are?
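The scenario above can be sketched as a toy script. Everything in it is hypothetical (the event records, the keyword-to-interest map, and the inference function are illustrative inventions, not Facebook’s actual systems), but it shows how contact and resource metadata alone, with no message content whatsoever, is enough to tag a user with an interest category:

```python
from collections import Counter
from datetime import datetime

# Hypothetical metadata events. Note there is no message content here at all:
# each record is just (user, counterpart_or_resource, event_type, timestamp).
events = [
    ("alice", "mother", "whatsapp_chat", datetime(2019, 5, 1, 9, 15)),
    ("alice", "example.org/conditions/thyroid-cancer", "page_view", datetime(2019, 5, 1, 9, 20)),
    ("alice", "search:best thyroid cancer treatment centers", "search", datetime(2019, 5, 1, 9, 25)),
]

# Toy keyword-to-interest map of the kind an ad platform might maintain
# (purely illustrative).
INTEREST_KEYWORDS = {"thyroid": "oncology", "treatment": "healthcare"}

def infer_interests(events):
    """Tally interest categories from resource names and contacts alone;
    the content of any conversation is never consulted."""
    tally = Counter()
    for _user, resource, _event_type, _timestamp in events:
        for keyword, category in INTEREST_KEYWORDS.items():
            if keyword in resource:
                tally[category] += 1
    return tally

print(infer_interests(events))  # -> Counter({'oncology': 2, 'healthcare': 1})
```

Even this crude keyword tally, fed nothing but who and what a user touched and when, surfaces a health-related interest category an advertiser could target.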

Knowing who you talk to and roughly what you think about is plenty to sustain an advertising business. Just the fact of our interactions, and not necessarily the content of them, tells on us plenty to a company like Facebook, such that offering encryption only on the content of messages is practically a free PR win for the company, with virtually no downside.

It has long been the case that deleting anything you put on Facebook is a world away from deleting data from Facebook’s servers. Europe’s GDPR has gone some way toward changing this, but “deleting” has always effectively meant that the user is cutting off their own and other users’ access to information that Facebook gives itself the right to hold onto forever. Even if a user managed to scrape every last bit of their presence off of Facebook, the loophole is that the service is built on what happens between them and other people: their deletion doesn’t touch the halo of interactions they’ve had with every person they’ve ever known. If you delete your conversation with your mother, the conversation (or, under Facebook’s new approach, the metadata of that conversation) still exists in her Facebook files and can, theoretically, be used for advertising.

Even the famously encrypted WhatsApp has ads coming its way within the Status pane, the same place Facebook will soon be pushing users to post ephemeral, Instagram-Story-like updates.

“Privacy” has always been a subpar term for what we need from tech companies; it’s vague, and it suggests that as long as we are protected from something, as opposed to most things or everything, the promise of the word is being fulfilled. The lack of privacy from Facebook is what allowed the entire Cambridge Analytica fiasco to happen. Privacy from each other’s prying eyes, whether framed as “ephemerality” or “security” or “safety,” will never mean privacy from Facebook, or from any of the companies it allows to market to us.