Privacy is a hot topic for legislators all over the world.

Democratic presidential candidates have privacy laws and regulations in their campaign platforms. Amy Klobuchar discussed a tax on companies that share user data. Elizabeth Warren has introduced legislation that considers the idea of jail time for CEOs over privacy failures. Before he dropped out of the race, John Delaney proposed the U.S. adopt a law similar to the California Consumer Privacy Act, which gives consumers greater agency to limit companies' collection of their data.

Voters are demanding action. A recent poll from Morning Consult found 79 percent of registered voters said Congress should pursue a bill to better protect the online data of consumers, while 65 percent called data privacy one of the biggest issues facing society.

The European Union, down to 27 member states with the departure of the UK, enacted the General Data Protection Regulation (GDPR), enshrining the idea that people have control over their personal data. California recently enacted its own privacy law, the California Consumer Privacy Act (CCPA), which goes into effect January 1. The law empowers California consumers to know when private companies collect, share or sell their data and to stop that sale if necessary. It applies to companies with annual gross revenue of more than $25 million or that possess information on 50,000 or more consumers.

But laws can have unintended consequences. Sometimes the very laws meant to protect privacy can result in companies giving up personal data. GDPR, for instance, opens up a way for crooks to impersonate people and obtain their data from companies.

A year after GDPR went into effect, researchers in the EU showed how easy it is to access personal data from companies.

“This isn’t a problem with the law itself, but instead with the companies and organizations implementing it,” Mariano Di Martino, one of the researchers and a PhD student at Hasselt University in Belgium, told CoinDesk in an interview. “This may be because of budgetary constraints or maybe it’s because they don’t understand the risks of this data.”

One group used publicly available information, such as names, emails and phone numbers, along with more elaborate methods, to request their research partners’ information from 55 companies under GDPR. One of these methods involved editing the name, birth date and photo on an image of an ID to match the person whose information the researchers wanted. Of those 55 companies, 15 gave up sensitive personal information to the researchers. Four never responded to the data requests at all, in clear violation of GDPR.


The information they gathered was extensive. Financial companies gave up details such as ID card numbers, lists of timestamped financial transactions, customer IDs, telephone numbers and places of birth, while transportation and logistics companies released locations people had visited in the past as well as routes they had saved.

Another team of researchers in the EU found similar issues when one of them requested information on his research partner and the partner’s wife using a spoofed email account built on a variation of the wife’s name. About a quarter of the 150 companies and organizations they contacted gave up sensitive personal information without verifying the identity of the requester. The information he received included everything from her social security number to her high school grades and various account passwords.

As the CCPA goes into effect, we may see similar issues in the U.S. The GDPR research illustrates that privacy laws may only be as good as the companies subject to them. That’s a scary thought, because these leaks have real-world implications.