Listen to the PrivateID Podcast. Download this episode here (right click & save as…)

Last week I saw Tim Cook’s keynote speech to the European parliament. It was nice, very nice. He made good points and nailed them. However, I don’t buy it.

At this point, given the history of public companies, I think we should just judge actions rather than words. If they truly do what Cook promises, my hat’s off. But he doesn’t even talk about the challenge we face with biometric data.

When I listen to this speech I get the feeling of "damn, he must mean it." But I don't know, guys. I don't buy it.

Show me how you look and I’ll tell you who you are

Too often with, let's say, politics, our judgment is based on what the person looks like. In the end, if the person running for president is taller, she has a better chance of being elected.

Appearances matter.

And it's not just looks: body language, the way you communicate an idea, how you emphasize something. There's a whole bunch of things that change your mind, or just reinforce an idea about something.

Or consider this article. It’s structured in a specific way to prove a point. If I’m lucky, it’ll change your mind, or maybe it’ll reinforce what you already believe. But the point is, like any other communication process, it’s designed to generate a certain impact or reaction.

We communicate with more than just words, though. We communicate with everything. In fact, words don’t matter that much. It’s how you present those words, the way you emphasize them and, of course, the way you use body language to prove your point.

That means, when you judge the appearance of a politician, you're failing (we all do, by the way) to judge properly. Because you're leaving a lot of critical information off the table.

When it comes to Tim Cook’s words, yes, we judge his brilliant presentation. We believe him. We believe the authority he projects through his body language. The environment he’s in. The words he uses and how he uses them. But is that what we should judge?

When it comes down to something as important as the privacy problem, something that shouldn't be in the hands of companies profiting from this situation, we should be wary of how we judge and how this non-verbal communication affects our judgment.

The reason some people don't wanna sit with me on an airplane

Imagine this. You're on an airplane and you ask the person next to you: what do you do for a living? She tells you, then asks the same question back. And you just make some small talk. Pretty normal.

Well, that doesn't happen with me. When someone asks me what I do for a living and I say, "I'm a marketer," the person next to me asks one of the crew members to change her seat and avoids any eye contact with me.

This isn’t fair, but it’s true. Most marketers are not trustworthy. So it’s obvious that people have that reaction.

When I studied marketing, I was surprised that people still praised tricky, clever messages that got people to buy stuff they neither need nor want. Somehow it was like a badge of honor if you could come up with a creative way of communicating some BS.

I'm not saying every marketer is like that. I've got lots of friends in the marketing industry who I believe do things the right way. But let's be honest, they're the minority.

Regardless of the impact, most marketers will use any trick they can to get you to buy or do something you might not want or need in the first place. If they had more tricks, they'd use them all. That, as they say, is that.

Privacy invasion is old hat. Back in the 1950s, major advertising firms hired anthropologists to "observe" consumer behavior. They spied on consumers and measured their reactions to sell more stuff, which led them to include babies in their ads. And if babies weren't effective, doctors worked flawlessly. The truth is we've been tracked for decades. Phone providers have been invading our privacy for a really long time. Banks have been collecting our data for decades and track us every time we use our credit cards.

All that data, which might seem isolated and without context in many cases, gets scarier when conglomerates like Acxiom collect all those little dots of data and put them in context.

The reality is that with the exponential growth of technology, their techniques and tools have not only gotten cheaper, but way, way more effective at invading privacy and recording everyone's daily life.

If it were up to them, they'd run subliminal ads all day long. But they can't, so they push things to the edge, close enough to the heat without getting burnt.

So it's not that good guys turn into bad guys, or bad guys turn into good guys. Bad guys are always bad guys. Maybe they mask it pretty well, but they just are that way.

My point here is that there are always bad guys. They'll use whatever they can to profit from any given situation. And we just believe their masks and judge their words without looking at their actions. That can be dangerous.

Right now, by default, we're skeptical of marketers and advertisers. Which is great! I don't really consider myself a (bad) marketer, so no hard feelings. But we need to extend that skepticism to the places where it's needed most. It's not just about Facebook. It's also about Apple, cable companies and many, many other players. We just need to learn to see.

I'm going to pick on Apple here. The easier target would be Facebook, but I prefer to focus on Apple for a few reasons: (1) they're getting very powerful in biodata, (2) their history of locking people in crosses the line, and (3) I have the feeling that, because they're taking the pro-privacy lead, people might trust them blindly.

I could pick other obvious companies like Google or Amazon, and the following analysis applies to them just the same. But stick with me here.

The Data-Industrial Complex Analysis

Tim Cook's speech is about the importance of privacy and how other players are profiting from your data, violating your trust. However, I believe Apple is shooting itself in the foot.

Apple’s CEO said that a US privacy law should prioritize four things:

1. Data minimization
2. Transparency
3. The right to access
4. The right to security

It sounds great, but notice that nowhere in the speech does he say anything about real ownership. The right to access is a far cry from being the owner. Big difference.

In Steve Jobs’ words, which Tim Cook quotes: “privacy means people know what they’re signing up for, in plain language and repeatedly”.

I believe semantics are really important, and we’re confusing things here. What’s not being talked about is the real importance of privacy, which I’ve stressed in my other articles. So this isn’t a solid definition.

Privacy isn't just about having 256-bit encryption or using your data to "create a better experience". As I said in Three Laws of Privacy, privacy is the fundamental right to own your own value, your own data, and to be aware of the real purpose behind its usage. Few dare to communicate that.

But before we go off the rails here, I wanna dig into the reason Apple is a trillion-dollar company: the lock-in effect.

The Lock-in Effect

Lock-in means it's too hard for me to switch. For example, if you're into photography, it's likely that if you started with Canon, you'll always buy Canon, because your gear won't work with other brands. The cost of switching is pretty high.

The same goes for Apple. Everything they do, everything they've done for years, is to lock you in. Remember those days when iTunes only worked on the Mac? Or if you're a designer, you know there's a bunch of software that only works on the Mac. They've also made proprietary connectors, like Lightning, to lock you in. Or consider the Apple Watch: you can't sync it with an Android phone. Period.

What most people don't know is that the lock-in effect is almost always emotional. Okay, there are people who are immune and switch all the time, but most people are vulnerable. Because switching might mean you were wrong and you have to admit to your friends that they were right. Or it might mean you lose status.

I've been told (because I've never had an iPhone) that one of the things iPhone users would miss the most is iMessage. You can text everybody through that app, but those who have an iPhone show up with blue bubbles, while those without an iPhone show up green. That's status at work. Believe it or not, that's an emotional lock-in.

I could write an entire book about Apple and their marketing techniques with the lock-in effect, but let's get into the nitty-gritty.

Apple is a trillion-dollar company thanks to their ability to lock people in. If they take over the healthcare industry, there's a high likelihood they'll use the lock-in effect, as they've always done.

Well, one of the ways Apple can put the lock-in effect to work is through biodata. It might not be through the data itself; maybe it'll be through a specific platform or market that only works with the Apple Watch. It's going to be something along those lines, because I seriously doubt Apple is gonna kill their golden goose. They're too big and have too many investors to do such a thing.

I know this can lead to confusion: How are they going to lock people in if they are, apparently, defending privacy? Simple, by tweaking the definition of data ownership.

Privacy and data go hand in hand, and violating the right to own that data is the same as violating your privacy. Since some people might not see how privacy could be violated here, let's try to make this clear.

Picture this. Someone has a photo of you in which you show up naked. They put that photo in a folder and say: you may use this photo (for whatever purpose), but I'm the one who's gonna store it. You can't take it anywhere else. You can only access it.

Is the act of holding that piece of data (the picture) a violation of your privacy? You betcha.

Even if that picture is stored only on your device, not having control over it is a violation of your rights, especially if it finds its way out to the cloud. Hey, in the end you can't tell whether they have access to it or not, or whether they store it on their servers. You'd have to trust their word, and words mean nothing in this environment.

Now we need to connect the dots. If we really want to understand what's going on here, we need to see how Apple is disrupting the healthcare industry and what that has to do with privacy.

The Apple Watch: The Key to the Future

Steve Blank wrote a terrific article about the Apple Watch and the tipping point for healthcare, where he analyzes Apple's strategy and how they're getting ahead of their competitors. That's where FDA clearance enters the scene.

“Sooner than people think,” Blank says, “virtually all home and outpatient diagnostics will be performed by consumer devices such as the Apple Watch, mobile phones, fitness trackers, etc. that have either become FDA cleared as medical devices or have apps that have received FDA clearance.”

This is a big idea. What tech companies have discovered is a multi-trillion-dollar market that is about to suffer a big disruption. Every company is trying to get a piece of this change: Google is investing heavily in healthcare, Amazon is investing in pharmacy distribution and, of course, Apple is turning the Apple Watch into a health screening and diagnostic device.

One of the interesting changes we're seeing is that Apple already sells around 15 million watches a year, which allows them to test interesting features at mass scale. But what's most interesting, and the reason Apple might take over the healthcare industry, is that they're one of the few companies that know how to get through the FDA clearance process.

This is huge.

The FDA (Food and Drug Administration) is the US federal agency that determines whether the Apple Watch and its apps can be cleared as medical devices. If that happens, or rather when that happens, the watch will be used as a diagnostic tool. Which means insurance companies will mandate the use of an Apple Watch. As a matter of fact, they're already doing it with activity trackers like the Apple Watch.

Let’s leave aside the benefits. I get it, this will be revolutionary and it will save millions of lives — there’s no doubt about that. But what will this mean?

This starts the process of what I call privacy inequality. If people refuse to use the device, insurance companies will charge more, or just refuse to offer their services. And if someone in bad health uses it, and the insurance company figures that out, they might charge more, or just deny the service.

This reminds me of 9/11 and how they used terrorism as an excuse to violate people’s privacy. They might say the same shit and make us believe this is for our own good, while on the side they just mine our data.

Steve Blank, in that same article, went through Apple’s patents and created a list of eight key features:

1. Sleep Tracking and Sleep Apnea Detection
2. Pulse Oximetry
3. Respiration Rate
4. Blood Pressure
5. Sunburn/UV Detector
6. Parkinson's Disease Diagnosis and Monitoring
7. Glucose Monitoring
8. Sensor and Data Challenges

It's worth noting that the Apple Watch already has sensors capable of detecting everything on this list. So, as long as this isn't another Theranos case, this is getting serious.

Now, imagine a company with all this information on you. Would you trust that company with just the words of the CEO?

Where does everything fall into place?

It makes me wonder: a public, trillion-dollar company decides to "defend" privacy, ends up making billions of dollars along the way by tweaking what we understand as data ownership, and avoids conflict by redirecting attention to other tech companies.

This technology adoption can dramatically improve our health. The finish line looks awesome; however, what's the cost of getting there?

I wanna be crystal clear here: I'm not saying we shouldn't get there. I'm saying there are consequences along the way. And maybe there's too much power in a single company. Even if that power is distributed within the tech oligopoly, it's still too much power in just a few hands.

Everybody will wear one of these in the future. And I get it. A device like that will provide so much valuable information that not wearing one might be considered reckless or even dangerous. Insurance companies might even deny your coverage if you don't wear one.

This is where it gets tricky…

Will insurance companies have access to this data?

How is Apple supposed to keep this data safe?

How would we know whether they can actually access it?

Are we really gonna be the real owners of our biodata?

Should we blindly trust the word of a company?

The finish line is clear, but the road to get there? Not so much.

Beyond Apple

I can't stress this enough. This isn't just about Apple. This is about every company, organization or government whose words don't match their actions, and yet we blindly believe them.

Trust is something they have to earn.

I really hope Apple changes things for the better. They're actually in the best position to do so, but I'm skeptical. And I don't think they're gonna do it. It's a trillion-dollar company! A public one.

The only reason they're doing this is because it's convenient! You can't go to investors in a public company and say, "hey guys, I think we should do this because it's the right thing to do."

They'd laugh at you. Wall Street doesn't care about ethics. It never has, and today won't be the day it starts.

But if you go to them and say “hey, look at what’s happening with Facebook. There’s a shitstorm coming. So we better take the pro-privacy lead here. Otherwise we’re gonna get behind in the biodata war and lose a ton of money.”

There's too much at stake here to draw unnecessary attention to Apple. They need this to look clean and nice to have leeway.

Now after everything I’ve said to you, what do you believe to be true?

Never trust words. Don’t trust me, don’t trust anyone until you see their actions.

Especially, never, ever judge a public company by its words. If they end up doing something good, great. You doubted them, but they did the right thing. But more often than not, their actions don't match their words.

Actions speak louder than words.

Don’t let them fool you. Don’t judge their words, focus on what they do.