“Someone who knows things about us has some measure of control over us, and someone who knows everything about us has a lot of control over us. Surveillance facilitates control.”

– Bruce Schneier, cryptographer and security expert

When the European Union talks about modernising EU rules on data protection in the digital age, the most important challenge is unquestionably “big data”, and the most important challenge of big data is profiling.

Big data is not “more data” – big data is the massive merging of data to generate more data, more assumptions and more knowledge about you and me. If you are this age, went to this website and bought that product, big data will predict that you can be offered higher prices, or you shouldn’t be offered insurance, or you might vote in a particular way. Innocuous morsels of personal data interact and become pregnant, producing offspring that could be anything but harmless. The “I’ve nothing to hide” argument never made much sense, but it makes no sense at all in a world where you have no idea what guesses are being made on the basis of the data that you know about.

Our devices are feeding information into large databases 24/7: Our mobile devices gather and send information about our movements while we walk. Many of the apps installed on our phones demand unnecessary access to our contact lists. Our home smart meters will know when we get home after work and whether we have guests. Our search engines keep records of our interests and fears. Facebook has successfully experimented with its power to make people happier or sadder, and even to make them more (or less) likely to vote. Professionals called “data brokers” collect and aggregate personal data from a wide range of sources to create detailed profiles of individuals, which are then sold to third parties.

So, how do the proposed new EU rules (the General Data Protection Regulation, GDPR) address this huge new challenge? Not very well. The article dealing with profiling (Article 20) was weak in the European Commission’s initial proposal, was diluted by the European Parliament and eviscerated by the Council of the European Union. The current Council text says that data subjects, the individuals to whom the collected personal data relates, cannot object to the profiling itself, only to “decisions based solely on automated processing, including profiling”. Therefore, if there is a profiling activity but no formal “decision” has been made, or if the automated processing and profiling is only part of the process and not the sole basis for the decision, there would be no specific right to object under EU data protection law.

Flanking protections, which could normally be relied upon even if profiling and decision-making rules were weak, have also been diluted: Data minimisation becomes “not excessive” data processing. Access and rectification become problematic when profilers can hide behind their algorithms as “trade secrets” or behind pseudonymisation. “Purpose limitation”, the principle that data must be collected for specified, explicit and legitimate purposes only, is undermined by watery compromises on what “compatible use” might be, while the need for the user’s consent can be bypassed by the open-ended “legitimate interest” loophole.

As if this had not watered down the safeguards enough, profiling has been re-inserted into the list of exceptions for which Member States may restrict rights and obligations for purposes related to “national security”, “defence”, “public security” and, for fear that these provisions were not vague enough, “other important objectives of general public interests of the Union or of a Member State”. In practice, this allows national governments to circumvent EU data protection law and to permit profiling whenever the goal is allegedly linked to any of these ill-defined objectives.

A harmonised, modernised legal instrument for the EU is more necessary than ever. The GDPR needs to be future-proof and needs to have strong safeguards without loopholes. The current negotiating text of the GDPR looks set to fail its biggest test. If the ongoing negotiations between the European Parliament and the Council of the EU do not resolve these and other problems, we could be facing the loss of a fundamental right, as well as the loss of trust in, and take-up of, technologies based on big data. This should not worry EU citizens only: The GDPR is crucial for global norm-setting in the field of data protection and privacy. We have one opportunity – we must do better than this.

Surveillance-based manipulation: How Facebook or Google could tilt elections (26.12.2015)

http://arstechnica.com/security/2015/02/surveillance-based-manipulation-how-facebook-or-google-could-tilt-elections/

Facebook reveals news feed experiment to control emotions (30.06.2014)

http://www.theguardian.com/technology/2014/jun/29/facebook-users-emotions-news-feeds

General Data Protection Regulation: Document pool (25.06.2015)

https://edri.org/gdpr-document-pool/

Obfuscation: how leaving a trail of confusion can beat online surveillance (24.10.2015)

http://www.theguardian.com/technology/2015/oct/24/obfuscation-users-guide-for-privacy-and-protest-online-surveillance

Our obsession with explaining past atrocities could destroy our free speech (22.10.2015)

http://www.telegraph.co.uk/news/uknews/law-and-order/11947492/Our-obsession-with-explaining-past-atrocities-could-destroy-our-free-speech.html

(Contribution by Diego Naranjo and Joe McNamee, EDRi)