In 2009, Bernhard Rieder created an app similar to the one that allegedly helped Cambridge Analytica collect data on 50 million people.

But he deleted the data.

“I should have stored all the data, get that Lamborghini,” said Rieder, an associate professor of new media and digital culture at the University of Amsterdam, in the Netherlands.

Rieder’s work sheds light on just how many different parties could have accessed the data of Facebook users by exploiting the same loophole that allowed Cambridge Analytica to build its data set.

When asked how many companies might have been able to collect this kind of data, Rieder responded: “Hundreds of thousands?”

Rieder built a tool he called Netvizz, a Facebook data extractor for academics studying social networks. The app, which Rieder said has been used by more than 100,000 researchers, still works today, though it was weakened in 2015, when Facebook changed its programming interface to limit the kinds of information apps could pull, in an effort to protect users’ privacy.

Alexander Nix, CEO of Cambridge Analytica, arrives at the offices of Cambridge Analytica in central London on March 20, 2018. Henry Nicholls / Reuters

Rieder said that getting user data was the easy part.

“Before 2015, you could get troves of data, especially as an app,” Rieder said.

“I could have easily created a trove comparable to [Cambridge Analytica's],” he said. “So a lot of people could have. And a lot have.”

Rieder’s work is suddenly very relevant because of the allegation that Cambridge Analytica received Facebook user data acquired through the same means — an app that users connected to through Facebook.

Rieder joked about a Lamborghini because he knows the data that these companies harvested is valuable. Cambridge Analytica brought in nearly $6 million in payments from the Trump campaign and millions more from other Republicans and PACs in the run-up to the 2016 election, with the promise that its insight into people’s identities, friends and “likes” could help sway the election.

Rieder’s Netvizz never stored the data it collected, and the files it generated were deleted at regular intervals. This is what academic researchers were supposed to do. Facebook’s terms of service (TOS) dictated that any data collected should be deleted within 24 hours, but Rieder said the company never once made sure he had done so.

“Facebook never asked for [deletion] other than the general demands in the TOS, i.e. deleting after 24 hours,” Rieder said.

Rieder said he was speculating, but added that academics tended to follow ethical rules, while the numerous business and marketing apps had no such compunction. Even for researchers, he said, apart from the ethical commitment, there was no real incentive to delete Facebook data — and plenty of incentive not to delete it.

“If you don't have clear ethical pressure, why not keep it for further use?” he said. “Get to know your customers better, improve your service, check your demographics, etc.”

Rieder isn’t alone in his thinking.

“I think this was a disaster waiting to happen,” said Bart Preneel, a professor at the Catholic University of Leuven, in Belgium, who co-wrote a 2017 paper, “Collateral Damage of Facebook Apps.”

Published by the International Association for Cryptologic Research, the paper offered a window into how a political data broker like Cambridge Analytica could have turned the survey answers of 270,000 willing participants into a political weapon armed with the personal information of 50 million Americans.

“This is the kind of app that allowed the ‘researcher’ to collect Facebook data. These are everywhere and shadowy marketing research firms have been using them since the dawn of the Web to collect data. Remember those ‘download accelerators’ back in the late 90's? https://t.co/pJdwmKw53r” — John Robb (@johnrobb), March 20, 2018

Preneel’s paper highlighted how apps could siphon off identities and information including location, groups, interests, “likes,” photos and videos, relationship details, religion, and even politics from people who never opted in to certain apps, but were friends of others who had.

“I think that part of the business model of Facebook was to grow very quickly and I think this was a side effect, of which researchers have warned for a long time,” Preneel said.

Rieder echoed a chorus of researchers and privacy advocates this week, expressing surprise that a story about Facebook data had captured the public’s attention.

Far more user data is currently for sale than the data that Cambridge Analytica collected, he said.

“This is quite specific since it involves academics, but this is nothing compared to what data brokers have for sale,” Rieder said. “[The story] seems very media compatible, like a shady data set, exchanged by some guy in a suit, used to fool the American public. But it's pretty much business as usual in advertising, etc. This is just a small view into a huge business.”

Still, the media firestorm may create an opportunity for users to take control of the data they allow companies to harvest in the future, said Anja Bechmann, research director of Datalab and a fellow at the Aarhus Institute of Advanced Studies at Aarhus University in Denmark.

“The data is already out there but you have the right as the user to claim it,” she said. “You can close down your apps by going into your preferences.”

But Bechmann said the real change will come only when consumers on Facebook are willing to change the way they use the platform, perhaps refusing to play the games, take the quizzes and approve the apps that currently drive a typical user’s experience — and suck up their personal data in the process.

And if this moment of data awareness doesn’t translate to a change in user behavior?

“If users are going to act as they always do, we need to have other methods, create other actions,” Bechmann said. “Because it’s a societal issue.”