A new Netflix original documentary, "The Great Hack," provides an inside look into the Cambridge Analytica scandal as it unfolds, with exclusive interviews from whistleblowers and journalists.

The film also follows David Carroll, a Parsons professor who fights to get his own data back from Cambridge Analytica and has become a leading advocate for data rights in the United States.

Business Insider spoke with Carroll about the fight for data rights and what he thinks those in the U.S. should do to improve data-protection laws ahead of the 2020 election.

It's been well over a year since it was revealed that Cambridge Analytica improperly accessed the data of 87 million Facebook users to target advertising for President Donald Trump's 2016 campaign.

But David Carroll still hasn't been able to get his data back.

Carroll, an associate professor at Parsons School of Design, filed a legal claim against the company after the scandal, demanding to see what information was in his profile.

British laws allow users to request their own data if it has been processed in the UK. Even though the U.S. does not afford such data rights, Cambridge Analytica had processed the data in the UK, and Carroll believed he was entitled to it.

However, after Cambridge Analytica filed for bankruptcy and was liquidated, a British court did not grant Carroll access to his data. But he hasn't given up — today, he is still engaged in the fight for his data, and remains optimistic that he may gain access through the British Information Commissioner's Office in the fall.

His pursuit is depicted in "The Great Hack," a new Netflix original documentary that provides an inside look into the Cambridge Analytica scandal and follows Carroll as it all unfolds.

The film also features interviews with Cambridge Analytica whistleblowers Brittany Kaiser and Christopher Wylie, as well as Carole Cadwalladr, the Guardian journalist who first broke the story.

Business Insider spoke with Carroll about the fight for his own data, what the U.S. can do to improve data privacy ahead of the 2020 election, and the broader struggle for data rights.

This interview has been lightly edited and condensed for clarity and length.

Your attempt to retrieve the data that Cambridge Analytica collected on you is a leading storyline in "The Great Hack." What can people do to protect their individual data and what have you learned from your struggle?

My pursuit is a highly individualized narrative, which obscures the reality that it's a story about all of us. Quitting your Facebook account doesn't do anything. You can try to do the work of going through all your settings and being really hygienic about your data, but that will only reduce the scope of data leaking all over the place. It's certainly not going to have the total effect people might want.

Climate change is a better metaphor for the problem because individuals cannot solve it through their own behavior, nor are they really responsible for doing so. It's not feasible. We can't stop climate change as individuals; it requires a collective response.

Data protection is a structural problem. We don't have effective ways to hold companies accountable and to enforce when they commit data crimes because we don't even have a way to define, let alone prosecute, these data crimes. We can see that the existing tools we have are not succeeding at what they're supposed to do. They're not fit for purpose.

There have been growing calls for new legal frameworks to address how technology has changed our society. The EU passed the General Data Protection Regulation (GDPR) last year. Why doesn't the U.S. have something similar?

Europeans have data rights. Americans don't have the same rights. The irony is that if Cambridge Analytica had not exported the data to England and kept it in the U.S., I would have had no recourse at all. I could have asked for my data and they could have denied the request with no obligation to respond.

Some of that is a product of history; the EU is a relatively new political construction. In the EU's Charter of Fundamental Rights, data protection is listed as a fundamental right on par with freedom of speech, freedom to marry, and all these other basic human rights. That's why Europe has a 20-year head start on creating the infrastructure for businesses to provide for these rights.

We're really behind. The law I used predates the GDPR: it's the UK Data Protection Act of 1998. It's fascinating that before the dawn of the commercial internet, countries in Europe were ahead of the game, creating data rights and establishing the enforcers well before we even realized we would need them.

How optimistic are you about a federal privacy law? Some states are beginning to pass measures, but it doesn't seem that we'll have national regulation any time soon.

I don't feel optimistic, either. But what's happening at the state level is extremely interesting to me. The California law, the New York privacy bill that's even stronger, and the Illinois biometric-privacy lawsuit against Facebook over facial recognition all show that states can exert some appropriate pushback, especially when there is no federal law.

If more state bills pass, especially in big states like New York and California that have economic power in the market, and where the tech companies are located, that's really significant. The industry lobby will then also pressure Congress to pass something, because they want a federal law that preempts the states. The tech industry does not want a patchwork of harsh state laws to comply with — that terrifies them.

What has changed since the Cambridge Analytica scandal and the 2016 election? Do you think we are better prepared for 2020, or are we going to be dealing with the same problems?

Facebook's ads library now tries to create some transparency around political ads. It gives you the ability to at least see what is being run and whom it's being targeted at. The requirement to register and publish political ads for the sake of transparency is a good step forward, and the pressure on Facebook to do that is critical.

The recent story about finding the word 'invasion' in thousands of Trump campaign ads shows that negative, hyper-targeted campaigning is still in full swing. But at least we can see it this time, whereas last time, it was all super targeted — you wouldn't know it was even there.

The problem is there's still a lot of unattributable activity whose source we can't identify. Cambridge Analytica boasted that it produced media that would never be recognized as political ads, through fake blogs and unattributed videos.

Ultimately, it's a problem of privacy asymmetry, and it has to be rebalanced. An advertiser can mount a covert influence campaign on one of these platforms while shrouded in privacy. Facebook protects advertisers' privacy more than its users' because the advertisers are the real customers. That inverted relationship is really at the core of the problem. Advertisers deserve less privacy, and everyone else deserves more.

As daily users of these platforms, what can people do? Where should people's focus be in the fight for data rights?

Researchers have shown that even when people read privacy policies, they do not understand them. We need requirements to make terms and conditions readable, as if they were a nutrition label, so you can figure out at a glance what the advantages and risks of a choice are.

You wouldn't go to the grocery store and buy food products that didn't have the ingredients listed. This is something we expect in other parts of our lives. Why don't we expect it for the tech services we increasingly depend on?

Right now, we can't even begin the challenge of becoming an educated consumer. It's a step-by-step process. We first need to have basic fundamental rights, like the right of access. We should be able to demand from any organization or company, 'Give me the data you have on me because you're legally obliged to do it.' All the other rights stand upon that foundation.