Facebook was quick to point out that it has already implemented fixes that would prevent another company from repeating Cambridge Analytica's technique. For example, in 2014 the company started reviewing apps that ask for user data before they launch, and it has given users more ways to control how developers access their data. But those tools weren't exactly easy to use, even for tech-savvy users.

Moving forward, Facebook says it's going to fully audit apps that had access to large amounts of data before its 2014 platform revamp, and ban any offenders. Given that Facebook knew about Cambridge Analytica's data mining strategy in 2015, responded with a mere wrist slap (the company asked for the collected data to be deleted, but never verified that it actually was) and didn't ban the firm until last week, it has to be extra vigilant to prove it's taking privacy issues seriously.

Additionally, Facebook says it'll alert anyone affected by malicious apps, something the company should have been doing from the start. One of the more damning aspects of the Cambridge Analytica story was that Facebook didn't warn the 50 million users that their data was potentially being misused. Perhaps more than just saying it'll keep people informed, Facebook has to prove to its 2.2 billion monthly users that it can actually be trusted.

The company will also turn off data access for apps you haven't used in three months. While that might lead to headaches for some, especially people who don't often log into Facebook, it squashes one of the platform's big vulnerabilities. Just go look at your Facebook privacy settings -- there's a good chance you've given random apps access to your data and have completely forgotten about it. In a similar vein, the company says it's also going to make it easier for people to control how apps use their information.

In a future update to its Login tool, Facebook will also restrict the data that unapproved apps can see to your name, photo and email address. Beyond that, developers will have to be approved by Facebook. The company is also expanding its bug bounty program to reward people who discover apps that maliciously gather data.

While all of these updates should vastly improve data privacy on Facebook, it's inexcusable that it took a major PR disaster for the company to implement them. As social media scholar Zeynep Tufekci wrote in the New York Times, "If Facebook failed to understand that this data could be used in dangerous ways, that it shouldn't have let anyone harvest data in this manner and that a third-party ticking a box on a form wouldn't free the company from responsibility, it had no business collecting anyone's data in the first place."