The wealth of information many dating apps request may help to home in on the perfect match. But users should be able to find out what is known about them and how that sensitive information is being used.

In the real world, asking someone out is sometimes fraught with awkwardness and tinged with the prospect of humiliation. One of the successes of the dating app Tinder was that it revealed that a user liked another only when the feeling was mutual. The wealth of information that such software requests may help to home in on the perfect match, but it also raises important questions about how the data is stored and what rights users have over it. Very little, if the case we reported earlier this week is anything to go by. In that story it took months for a journalist to extract her Tinder data. More than 800 pages of her most intimate life details – volunteered while using the dating app – tumbled out.

In a world where personal data is increasingly shared and manipulated, there is a growing risk that it might leak. The European Union, to its credit, understands that individuals might object to the way a tech company deals with their personal data. The EU’s general data protection regulation (GDPR) sensibly attempts to strengthen a user’s control over their data: in certain instances, companies are required to obtain explicit consent for how they use it. Fines for violations will be steep: up to €20m (£17.5m). The regulation also grants users access to their personal data. This is important: as our story points out, Tinder shares personal information with other tech giants, which then use it to build a picture of users.

Such personal data affects who you see first on Tinder, but it can also be used to decide which job offers you have access to, or how much you will pay for car insurance. We found that Tinder disclosed only what it wanted to. Crucially, the company did not say how “attractive” the reporter had been rated, nor did it explain how this metric was calculated. The Silicon Valley firm argued that, since the data was held in the US, it had no legal duty to give her anything. This is wrong morally, if not in law.

In effect, US big tech is engaging in jurisdictional arbitrage – taking advantage of the discrepancies between competing regulatory systems – to shelter what it does with personal data from oversight. US firms avoid processing data in Europe for fear of falling under stricter EU regulations – something that became obvious after Google found itself liable for data-handling in Spain, a case that led to the landmark “right to be forgotten” ruling. The GDPR is supposed to level up protections; but one of Donald Trump’s first acts was to declare that non-US citizens would receive less data protection than US ones, despite previous assurances from Washington. The result is the growth of a US-centred tech economy where sensitive data is shared with little accountability.

In the case that the Guardian highlighted, it took a lawyer, a technologist and a reporter to pierce the corporate veil and work out where personal data was processed. It is a global issue: Russia has told Facebook it will block access to the social network unless it stores the personal data of Russian citizens on its soil. Britain’s new data bill, supposed to incorporate the GDPR, looks badly drafted. A concern is that UK regulators will simply take tech giants at their word. We are heading towards a society where personal data is ever more influential, yet its collection, storage and aggregation remain opaque. Shielding these processes from scrutiny impinges on people’s right to make decisions for themselves and ultimately restricts their individualism.