How To Get Away With Murder — The Facebook Way

The Social Media Giant Walks Away Scot-Free After Killing Privacy

Original cartoon by Daniel Murphy. Modified by Sarvesh Mathi.

When Facebook announced its robust first-quarter results for 2019, it also disclosed that the company was setting aside $3 billion in anticipation of a fine for violating the Federal Trade Commission’s 2011 privacy consent decree. For any other company, one would expect a multi-billion-dollar fine to disappoint shareholders. Instead, Facebook’s share price rose by 8% and its market capitalisation increased by $40 billion. The markets were cheering because Facebook had been let off with a mere slap on the wrist.

How did Facebook escape unscathed after killing privacy?

Step 1: Choose a clever weapon

Facebook’s entire business model is based on getting more users and their personal data. The company earns 98% of its revenue from ads that rely on this data. To get it, Facebook had to invade user privacy. And its weapon of choice: the user. Buried in the lengthy terms and conditions that users accept when creating an account is a slate of permissions that allow the company to use personal data in all sorts of ways. This, along with the lack of user control over what information is shared, weak safeguards against data breaches, loose regulation of third-party apps and sketchy partnerships with other companies, led to the death of privacy.

Original cartoon by Guy Body

Step 2: Kill and get caught

Let’s start with the bombshell — the Cambridge Analytica scandal. In March 2018, it emerged that the data firm had acquired huge troves of private Facebook data, which it used to build psychological profiles of 87 million users. The firm got the data through a quiz app that exploited a pre-2015 feature of the Facebook API, allowing it to collect data not only on the quiz-takers but on their friends as well. The relative ease with which any app on the platform could access sensitive user data and use it for purposes undisclosed to the user shed light on Facebook’s weak privacy safeguards. But there was more to come.

In June 2018, NYT found that Facebook had given over 60 companies (including Apple, Amazon, Microsoft and Samsung) access to extensive user data for over a decade. Facebook considered these companies an extension of itself rather than third parties, so user data was shared without consent, even when sharing with third parties had been explicitly denied. Then in September, Facebook revealed that a flaw in its code had allowed hackers to access and control the accounts of 30 million users. Later in December, British parliamentarians accused the company of giving advertisers special access to user data without user permission. In the same month, Facebook disclosed that a bug in the platform’s photo API had exposed the photos of 6.8 million users to third-party developers. Just when you thought December couldn’t get any worse, another NYT investigation found that the platform had continued to share personal user data, including in some cases the ability to read private messages, with more than 150 companies after stating that such data-sharing agreements had been terminated.

2019 has been no different. In January, TechCrunch reported that Facebook had paid people to install a VPN app that monitored their phone and internet activity. Earlier this month, NBC News found further evidence that Facebook gave companies it favoured special access to data while denying access to rival apps. In a separate finding, researchers discovered that 540 million Facebook user records had been left exposed on Amazon’s cloud servers. Last month, Facebook revealed that the passwords of millions of Facebook users and thousands of Instagram users had been stored in plain text, leaving them readable by some people within the company. Last week, Facebook deviously updated the blog post to state that “millions of Instagram users” were exposed, not merely “tens of thousands” as previously stated.
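Storing passwords in plain text is precisely the failure described above: anyone with access to the logs or database can read them. The standard safeguard is to store only a salted, deliberately slow hash. A minimal sketch using Python’s standard library — illustrative only, and not a description of Facebook’s actual systems:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a slow, salted hash. Only (salt, digest) is ever stored,
    never the plaintext password itself."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the hash from the candidate password and compare
    in constant time, so the plaintext never needs to be stored."""
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, digest)
```

With this scheme, a leak of the stored values exposes only salted hashes, which an attacker must brute-force one guess at a time — a far smaller exposure than a plaintext log.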

Cartoon by Dave Granlund

Step 3: Apologise, apologise, apologise

Facebook killed privacy and got caught. But CEO Mark Zuckerberg knew what to do next. Zuckerberg has been getting away with saying sorry since his Harvard days. But even with his rich history of apologies, 2018 was a challenge. Soon after the Cambridge Analytica fiasco, Zuckerberg took out full-page ads in top US and UK newspapers asking forgiveness for the “breach of trust.” The following month, he visited Washington, DC, and met lawmakers on an apology tour. He then appeared before Congress, admitting failure and, once again, apologising.

14 years of Mark Zuckerberg apologising by The Washington Post

Step 4: Make some changes. Or at least pretend to.

To its credit, Facebook has tried to make amends over the last two years. The company changed its data-sharing agreements, gave users more control over what is shared and overhauled its privacy policies last year, but this year’s scandals meant more had to be done.

In early March 2019, Zuckerberg published a letter describing a privacy-focused vision for social networking. He published another letter in late March calling for more regulations. He pitched the idea of having regulations like the GDPR, the stringent EU data protection law, in the US.

In his latest — and most promising — attempt to resurrect privacy, Zuckerberg revealed a new, overhauled version of Facebook at the annual F8 developer conference on April 30. “I believe the future is private,” said Zuckerberg at the beginning of the presentation. This new version of the platform moves its focus from the public news feed to private groups and introduces end-to-end encryption for messaging, as promised in Zuckerberg’s March letter. It also moves away from the platform’s signature blue look towards a clean white one — an attempt to present a different company altogether. The company is also trying to rely less on ad revenue: it is testing payments through WhatsApp in India, increasing its focus on shopping through Instagram and making purchases on Marketplace more seamless.

Facebook’s desperation for renewed public trust was evident during the presentation. On the live stream of the keynote, the company asked the audience to take part in a poll answering questions like “How does your opinion of the statement ‘Facebook is good for the world’ change as a result of watching this video?”

The impact of the revamped Facebook will only be evident once it rolls out. Until then, as WIRED journalist Issie Lapowsky points out, we can only wonder “whether Facebook’s ‘major shifts’ will ever amount to much more than a fresh coat of paint on a building with rot in its foundation.”

Cartoon by Chris Slane for Slane Cartoons

Step 5: Pay your parking ticket. After setting its price.

After appeasing the government with apologies, calls for more regulation and promises of drastic changes to the platform, Facebook preempted an official FTC announcement and said it expects a fine of $3 billion to $5 billion, adding a caveat: “there can be no assurance as to the timing or the terms of any final outcome.” The unusual announcement is a clever negotiation tactic, noted Ashkan Soltani, a former chief technologist at the FTC. It attempts to leverage anchoring bias, the common tendency to give too much weight to the first number put forth in a discussion.

This would be the largest fine the FTC has ever levied. The previous record was the $22.5 million fine imposed on Google in 2012. But even this exponentially larger penalty is too small for a repeat offender. As tech journalist Kara Swisher put it: “It’s a parking ticket. Not a speeding ticket. Not a DUI — or a DUI(P), data under the influence of Putin. A parking ticket.”

Facebook’s revenue last year was $56 billion and its profit was $22 billion. It had $45 billion of cash in hand. Against these numbers, a $3 billion fine amounts to about three weeks of revenue, or roughly 7% of the cash in hand. The purpose of a fine is to deter the subject from repeating the unlawful behaviour. Herein lies the problem. To Facebook, $3 billion is merely the cost of doing business — a cost it is willing to pay. Even worse, an insignificant fine can increase the very behaviour it was meant to prevent.
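The scale comparison above is simple arithmetic. A quick sanity check using only the figures cited in this paragraph (all amounts in billions of dollars):

```python
# Back-of-the-envelope check of the fine's scale, using the figures
# cited in the article: 2018 revenue $56B, cash in hand $45B,
# and the low end of the expected fine, $3B. Purely illustrative.

REVENUE_B = 56.0   # Facebook's 2018 revenue
CASH_B = 45.0      # cash in hand
FINE_B = 3.0       # low end of the expected FTC fine

# Express the fine in weeks of revenue and as a share of cash in hand.
weeks_of_revenue = FINE_B / (REVENUE_B / 52)
share_of_cash = FINE_B / CASH_B

print(f"{weeks_of_revenue:.1f} weeks of revenue")  # about 2.8 weeks
print(f"{share_of_cash:.1%} of cash in hand")      # about 6.7%
```

Roughly three weeks of revenue, gone in a line item — which is why the markets read the disclosure as good news rather than a punishment.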