The Cambridge Analytica scandal that erupted over the weekend has snowballed into the biggest threat to CEO and founder Mark Zuckerberg's rule since the company's 2012 IPO.

But, as we noted earlier, the manner in which Cambridge Analytica allegedly leveraged the data it purportedly "stole" from Facebook (or rather, refused to delete after receiving it from an intermediary who had himself improperly accessed it, according to the company) isn't all that unusual. Case in point: Carol Davidsen, Obama's director of integration and media analytics during his 2012 campaign, revealed that Facebook knowingly helped the Obama campaign collect as much user data as possible - even from the friends of users who may not have explicitly consented to the data collection.

When Facebook found out about the data mining for political purposes - the same thing it just banned Cambridge Analytica for doing - it "didn't stop us," the Obama staffer said. Representatives from Facebook even traveled to Obama campaign headquarters and candidly told campaign workers, including Davidsen, that they were allowing the Obama campaign to do things they wouldn't have allowed other developers to do.

Fast forward nearly six years, and Facebook Security Chief Alex Stamos is planning to leave the company in August after clashing with executives who refused to prioritize policing how user data is accessed and manipulated over ever-expanding advertising profits.

And now, another former Facebook employee has come forward to reveal that, before the company started tightening its data security practices after its IPO, the type of "unauthorized" access that Facebook suspended Cambridge Analytica for was routinely carried out by app developers. The reason? Facebook's advertising business can increase profits by offering more data to advertisers and developers. And the more successful games like FarmVille and Candy Crush become, the more money Facebook - which takes a piece of developers' profits - stands to make.

Combined, these factors created a powerful incentive to look the other way.

Asked what kind of control Facebook had over the data given to outside developers, the former employee replied: "Zero. Absolutely none. Once the data left Facebook servers there was not any control, and there was no insight into what was going on."

The employee, Sandy Parakilas, first accused Facebook of prioritizing data mining over consumer safety in a New York Times op-ed published in November, when the scandal surrounding a "Russian troll farm's" alleged purchases of Facebook ads and promoted posts was still in full swing.

While the company insists that it has strengthened its oversight in the years since Parakilas's departure, the degree of negligence described by Parakilas is staggering nonetheless. If he had to guess, Parakilas would say that, in reality, the majority of Facebook users have probably had their data improperly sold or shared.

Parakilas said he "always assumed there was something of a black market" for Facebook data that had been passed to external developers. However, he said that when he told other executives the company should proactively "audit developers directly and see what’s going on with the data" he was discouraged from the approach. He said one Facebook executive advised him against looking too deeply at how the data was being used, warning him: "Do you really want to see what you’ll find?" Parakilas said he interpreted the comment to mean that "Facebook was in a stronger legal position if it didn’t know about the abuse that was happening." He added: "They felt that it was better not to know. I found that utterly shocking and horrifying."

The developer feature that allowed a UK-based psychology professor to access data from 50 million Facebook users is called "Friends Permission".

That feature was a boon to outside software developers who, from 2007 onwards, were allowed to build quizzes and games that were hosted on the platform. Parakilas says the company could've easily disabled this feature - which helped developers sneakily hoover up the data of friends of assenting users - but it chose not to.

Why? It was making too much money.

The apps proliferated on Facebook in the years leading up to the company's 2012 initial public offering, an era when most users were still accessing the platform via laptops and desktop computers rather than smartphones. Facebook took a 30% cut of payments made through apps, but in return gave their creators access to Facebook user data. Parakilas does not know how many companies sought Friends Permission data before such access was terminated around mid-2014, but he believes tens or maybe even hundreds of thousands of developers may have done so. He estimates that "a majority of Facebook users" could have had their data harvested by app developers without their knowledge. The company now has stricter protocols around the degree of access third parties have to data.

* * *

Lawmakers and regulators in both the US and UK are demanding investigations and hearings into the lapse, which they are squarely blaming on Facebook. The company said it would meet with the House Judiciary Committee to brief lawmakers on what happened.

However, any attempts at reconciliation or reform might be too little, too late: Lawmakers are demanding a scalp - somebody to blame for President Trump's improbable electoral triumph over the "eminently qualified" Hillary Clinton.

...And Zuckerberg would be a suitable pariah.