Privacy, data protection and democracy in the era of algorithms

You probably care about your privacy, right? With scandals such as Cambridge Analytica, data leaks becoming more frequent, and new data protection laws such as the European GDPR being passed, I’m sure you’ve started to care even more about this subject.

And that’s great, of course. However, I need to ask: do you know what privacy is, this precious thing that we all now agree must be protected?

In order for the debate around data protection to be fruitful, we first need to know what exactly we are talking about when we talk about privacy. In other words, without a proper definition of what it is we should protect, any initiative that aims to solve the “problem of privacy in the digital world” already starts out deficient.

Privacy: an ever-changing concept

From the outset, it is very important to note that the notion of privacy is historically very recent and has had very different meanings over time. Because privacy is such an abstract value, it only becomes clear when threatened or lost, and the way in which that occurs depends on the context.

The first legal cases about privacy dealt only with the material aspects of unauthorized entry into someone else’s property. At that time, in fact, there was not even talk of a right to privacy, but rather of a right to be left alone, and it applied only to a restricted group of individuals: free white men who owned land.

As technology evolved, especially from the late nineteenth century onward, the concept of privacy came to include the protection of individuals’ private communications (letters, postcards, photographs, phone calls), both against the government and against other private actors. Urbanization also contributed to the broadening of the concept. When people lived in the countryside, on properties far from each other, and interacted mainly with members of their own family, the possibility of interference from outside agents was small. As they moved into cities and began to live in ever-smaller spaces among more and more strangers, the opportunities to be harassed also increased. In 1965, the Supreme Court of the United States finally recognized for the first time that citizens had a specific right to privacy (Griswold v. Connecticut).

But it was only with the age of the internet and the growth of the online world that we began to properly speak about the need to protect data as a matter of privacy. Note that the term “data” here is broad: it covers personal data and metadata, as well as any content we produce, even content voluntarily shared on a social network. In this way, the concept of privacy went from referring to something concrete to something abstract and general, tied to each of us and to every one of our clicks and likes.

Privacy, what for?

Now that you know that privacy is not an absolute concept, but one that depends on its historical context, the second thing I want to ask you is: why do you even need your data to be protected? If you are just a citizen leading the good, old, honest life, why do you need data protection?

Even if you have nothing to hide, your data can be, and is, used as the raw material for an entire industry that specializes in predicting the future through algorithms. It is through them that platforms like Netflix can suggest which series you should watch next, or that Amazon can tell you that “if you liked product x, you might be interested in products y and z.”
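As an illustration, the “if you liked x, you might like y” logic can be caricatured with simple co-occurrence counting over purchase histories. This is only a toy sketch, not Amazon’s actual algorithm; the data and function names are invented:

```python
from collections import Counter

def recommend(purchases, liked_item, top_n=2):
    """Suggest the items that most often appear in the same basket
    as liked_item, across all users' purchase histories."""
    co_counts = Counter()
    for basket in purchases:
        if liked_item in basket:
            for item in basket:
                if item != liked_item:
                    co_counts[item] += 1
    return [item for item, _ in co_counts.most_common(top_n)]

# Toy purchase histories: each inner list is one user's basket.
purchases = [
    ["x", "y", "z"],
    ["x", "y"],
    ["x", "z"],
    ["y", "z"],
]

print(recommend(purchases, "x"))  # items most often bought alongside "x"
```

The point is not the ten lines of code, but the input: the quality of the prediction depends entirely on how much of your behavior the company has recorded.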

These services are convenient, but algorithms are used in many other areas of our social life, with extremely serious consequences. Privacy matters because unprotected data feeds algorithms that threaten our autonomy as individuals and the democratic system itself. A single piece of information being public is not, by itself, particularly harmful – for example, the information from your fitness tracker that you slept less than six hours last night. But a company that combines this with other data – say, that the text messages you exchanged with your colleagues were more aggressive than usual – knows that you are tired and stressed, and that this may be the right time to show you a fast-food ad and take advantage of the fact that your self-control is lower.

Does this seem harmless to you? Then imagine that a company knows, by crossing your data, that you searched the internet for how to get a fast loan, that you started buying your groceries at a cheaper market, and that yesterday you had to pay a very high bill for your child’s medicine. The algorithm can infer that you are in a bad financial situation, especially now that your child is sick, and use your moment of desperation to offer you a loan at an abusive interest rate that you would otherwise refuse.

Now add a layer of manipulation to that and move to the political sphere. Neither your Facebook feed nor your Google search results are automatic or neutral. They are controlled by algorithms that try to show you what is most relevant to you (although 62% of Americans who use Facebook don’t know that, and 73% of them think that Google searches always lead to the same results). As a result, what you see online can easily be manipulated, as seen in the presidential elections in both the United States and Brazil.

If you populate the feed of a working-class father from a small city, who happened to lose his job recently, with news of crimes committed by immigrants, he is much more likely to vote for nationalist, borderline xenophobic parties. Similarly, if you emphasize other issues, such as the urgency of an environmental catastrophe or the threat to women’s reproductive rights, the targeted individuals are likely to line up further to the left of the political spectrum. Working from the profile of each user, a political campaign can deliver messages that are increasingly “personalized” and convince them to vote for a particular candidate.

Conclusion

With the evolution of technology and the ever-increasing integration of smart devices into our lives, the amount of information available about each of us will only grow. So will the possibilities of us being manipulated on relevant issues, both as consumers and as citizens. Given what is at stake, it is absolutely essential that initiatives and laws designed to protect our privacy be placed at the center of the political debate.

Organizing data is not just about increased market efficiency and profit-seeking. The way our personal information is (or isn’t!) used is a choice, ultimately a moral one, that must be made by all of us. Otherwise, we will not be able to guarantee that democracy will survive and thrive in the digital age.

Do you want to find out what blockchain has to do with privacy? Then check out our posts The privacy paradox of blockchain technology and Blockchain Identity II: On the Internet, Nobody Knows You’re a Dog.

You can also book a call with our sales team right now and discuss the possibilities OriginalMy has to offer.