A British researcher has uncovered an ironic security hole in the EU’s General Data Protection Regulation (GDPR) – right of access requests.

Right of access, also called subject access, is the part of the GDPR that allows individuals to ask organisations for a copy of any data held on them.

This makes sense because, as with any user privacy system, there must be a legally enforceable mechanism that lets people check the accuracy and quantity of personal data held about them.

Unfortunately, in what can charitably be described as a massive GDPR teething problem, Oxford University PhD student James Pavur has discovered that too many companies are handing out personal data when asked, without checking who’s asking for it.

In his session entitled GDPArrrrr: Using Privacy Laws to Steal Identities at this week’s Black Hat show, Pavur documents how he decided to see how easy it would be to use right of access requests to ‘steal’ the personal data of his fiancée (with her permission).

After he contacted 150 UK and US organisations posing as her, the answer turned out to be: not hard at all.

According to accounts by journalists who attended the session, he contacted the first 75 by letter, impersonating her using only information he was able to find online – full name, email address, phone numbers – and some companies responded by supplying her home address.

Armed with this extra information, he then contacted a further 75 by email, which satisfied some to the extent that they sent back his fiancée’s social security number, previous home addresses, hotel logs, school grades, whether she’d used online dating, and even her credit card numbers.

Pavur didn’t even need to fake identity documents or forge signatures to back up his requests, nor did he spoof her real email address to make them seem more genuine.

Lateral thinking

Pavur hasn’t revealed which companies failed to authenticate his bogus right of access requests, but named three – Tesco, Bed Bath and Beyond, and American Airlines – which performed well because they challenged his requests after spotting missing authentication data.

Nevertheless, a quarter handed over his fiancée’s data without any identity verification, 16% asked for an easily forged form of ID that he decided not to provide, while 39% asked for a strong form of ID.

Curiously, 13% ignored his requests entirely, which at least meant they weren’t handing over data willy-nilly.

The potential for identity theft doesn’t need spelling out here, notes Pavur’s session blurb:

While far too often no proof of identity is required at all, even in the best cases the GDPR permits someone capable of stealing or forging a driving license nearly complete access to your digital life.

The danger is that criminals might already have been exploiting this without anybody noticing.

As Pavur points out, automating bogus standardised access requests wouldn’t be hard to do at scale by using the sort of basic name and email address data that many people make public on social media.

Whose fault?

If Pavur’s research shows a failing, it’s that too many organisations still don’t understand GDPR.

It isn’t enough to secure data in a technical sense if you don’t also secure access to it. If someone phones up requesting to know what data is held on them, failing to authenticate that request becomes a bypass that ends up endangering privacy rather than protecting it.

While it’s true that this could have been happening long before GDPR existed, giving citizens the legal right to request data has handed people with bad intentions a standardised mechanism to exploit.

But there are deeper failures here too – if organisations try to verify someone’s identity, what should they ask for? GDPR or not, there is still no universal and reliable identity verification system to check that someone is who they say they are.