Q: Are our data profiles inaccurate?

A: Who knows? Without transparency and access to the full profiles that tech companies generate about us, we cannot really tell. I am sure users themselves would be the best auditors of these datasets, because they have real (often economic) incentives not to be judged on the basis of incorrect or incomplete information. But they are not given the chance to do so.

I came up with this layered metaphor to explain the complexity (and dangers) of how online data profiles work after hearing for the hundredth time: ‘What’s the problem if we choose to share and publish our data ourselves?’ The thing is that we do not make these choices ourselves. We are lured into sharing more data than we would otherwise accept, and we are observed and classified by machines in ways we can hardly imagine. Not surprisingly, these systems detect sensitive characteristics we may prefer to keep private.

Q: Why should we want to see our data?

A: The only way to regain full control over our profiles is to convince the companies that do the profiling to change their approach. Instead of hiding our data from us, they should become more transparent. We need to open these opaque systems to the scrutiny of users.

Better still – instead of guessing our location, relationships, or hidden desires behind our backs, companies could simply start asking us questions, and respecting our answers. I even see this as a real opportunity for marketing companies to build trust and make targeted ads more relevant and fair.

In the European Union, we have a legal framework that facilitates greater openness and access. The General Data Protection Regulation (GDPR) now gives Europeans the right to verify data held about them by individual companies, including marketing and advertising profiles. Companies can still protect their code and algorithms as business secrets, but in theory they can no longer hide the personal data they generate about their users. I say in theory – because in practice companies do not reveal the full picture when confronted with this legal obligation. In particular, they hide behavioural observation data and data generated with proprietary algorithms. This must change, and I am sure it will, once the first legal complaints begin to result in fines.

Q: How could we make radical transparency a reality?

A: Well, no doubt we have to be prepared for a long march. We need to work together as a movement and test different approaches. Some of us will continue to test legal tools and fight opponents in courts or before Data Protection Authorities. Others will advocate for (still) better legal safeguards, for example in the upcoming European ePrivacy Regulation. Others will build or crowdfund alternative services, or push big tech to test new business models, and so on. It will be a long road, but as a movement we are at least heading in the right direction. The main challenge for us now is to convince or compel commercial actors to come along.