Isabelle Falque-Pierrotin has a wake-up call for the world’s digital citizens: Beware of the tech giants lurking behind your screens and keyboards. Falque-Pierrotin—current head of France’s CNIL (National Commission on Informatics and Liberty) and the “Article 29 Working Party,” a group of European Union data-protection advocates—believes we are sleepily handing over personal data in droves without truly understanding the consequences. Comprehensive privacy protection should be an enforced requirement, she argues, not just an “opt-in” afterthought.

Google, not surprisingly, remains in her crosshairs. Last year, CNIL fined the company for violating French privacy protocols. This month, CNIL threatened again to fine Google for failing to comply fully with a year-old EU order to grant Internet users the “right to be forgotten”—a controversial idea that some influencers, including Tim Berners-Lee, oppose. The 55-year-old privacy crusader also believes that U.S. companies are not holding up their end of the EU’s “Safe Harbor” agreement that allows U.S. companies to store data on European citizens provided they protect it.

Shortly after CNIL gave Google 15 days to comply with the “right to be forgotten” order, we spoke with Falque-Pierrotin about the need to create and implement greater privacy protections globally, and to restore the balance of power between the Web giants and their billions of users.

What excites you most about where tech is leading humankind?

I am thrilled because technology has never been so powerful and so available. I believe that it will be able to replace, improve—outsource even—most of our actions or thoughts.

What worries you the most?

So, in part, we could be replaced by machines. But no machine can replace humans as a unique combination of body and spirit, a combination wherein lies the soul.

CNIL has recently charged Google with failing to comply with the EU order to allow delisting. If they don’t comply, will your sanctions have teeth?

Well, I’m afraid they will have short teeth, because financially until now we are able to [fine] only €150,000, [which] is what we did previously with Google over its privacy policy. That is very small. We also have the possibility to make the decision public, which [is] probably the worst for them. But the problem behind this right to be forgotten is not only the question of how high the fines will be; it’s a question of principle: the scope of the delisting, and whether it applies worldwide.

So how can you and others enforce this?

Even if in France we have rather low levels of sanctions, others in Europe have higher ones. Spain, the Netherlands, and Italy, for instance, have imposed much higher fines on Google over its privacy policy.

Every time we push a button on our keyboard we’re releasing valuable information. Yet we all seem hooked.

Oh yes, absolutely. I think we all have to be aware that when we criticize surveillance by public authorities and by whomever, in fact we are part of this apparatus. It’s difficult, because we don’t really want to drastically change our habits of life, which provide us with huge services. It’s very convenient. So the question is actually: How do you keep that standard and comfort while realizing that you’re not just an object in this digital infrastructure?

What is the most important issue today in the realm of digital privacy and security?

First, I would say making security a genuine priority for all of the stakeholders; I’m not sure that’s the case right now. Second, convincing people that data protection is not against innovation and growth; on the contrary, data protection contributes to confidence. It is a key factor in the digital environment.

Why is delisting so important?

It’s probably one of the first very symbolic examples of the rebalancing of the relationship between the data subjects on one side and the industry representatives and data controllers on the other. It gives each of us the possibility not to alter the past, but to control a little bit what we have done in the past and our digital appearance. In terms of the balance between the rights of individuals and the rights of data controllers, there is a kind of shift that is wanted by individuals.

On the other hand, it’s not an absolute right. We should not fall into a kind of digital revisionism, where you could say that something did not even happen. It has to be balanced between the right of the individual and the right of the public to have access to the information.

Are you surprised at how freely younger generations surrender their personal data?

I don’t really agree with that. From the surveys we have, we see that young people are quite clever about the way they share their information. Obviously there is a shift in what they consider private or public. That is true. But it doesn’t mean they want to share everything. And they worry much more than older generations do about privacy settings. Look at the success of Snapchat compared to Facebook.

On the whole as a society, though, do you think we hand over personal data too freely?

I think most people don’t realize what they’re doing. And when they are informed, they are much more cautious about what they are doing. So I think the first priority is really to develop digital education, which, at least in Europe, we don’t have enough of. People are using the services, but they don’t know how they work behind the scenes. When you go and visit three or four websites, you have 40 or 50 cookies placed on your computer without even knowing their names. It’s Amazon, Cloud, Google, DoubleClick, or all types of B2B actors behind it, exchanging information about you. We want to encourage the individual to regain power over the data.

So how can we get to this state of play where privacy is the default?

I think we probably have to work more than we have on the general conditions of use: try to find something that sums up, in a very simple manner, what is going on on our websites, because sometimes all these general conditions are written in a legal tone and nobody understands the words. So how can we explain, in everyday language, what is done with your data, who is using it, and for what purpose? With Google, one of the infringements was lack of information, and we said they must give clear and simple information to users about what they do with the data. And until now, they haven’t provided us with these types of simple things.

Companies should really think about new kinds of data-transparency tools: simple data dashboards with efficient parameters and transparent explanations of data uses, APIs that would help users automatically send their privacy preferences to the platform, data portability … There are no limits to what truly innovative and willing companies could implement in this realm. Those tools can all be steps toward privacy by default, in the sense that a service does only what is understood and accepted by its users.

Are there any companies that have actually made privacy inherent in their services?

Not as many as I would like, and nothing I can call a bullseye. Yet some projects or companies are definitely heading toward making privacy a priority or a part of their DNA. Mozilla’s Firefox browser has always been an example of “putting the user first.” In the search engine world, we’re very interested to see DuckDuckGo or Qwant drawing a larger audience because of their attention to privacy. As far as I can tell, many French companies, especially start-ups, are trying to explore paths in the personal cloud ecosystem (CozyCloud is an example of that), in the quantified-self and other connected-objects markets, or in the intelligent personal predictive assistant space, where French start-up SNIPS recently said privacy by design was the necessary addition to context-awareness.

In the Internet of Things, anything that can be digitized will be. How do we stand a chance of maintaining privacy as everything comes online?

I can’t answer your question because I’m not a technician. What I know is that of course there is a new degree of complexity, and as far as privacy protection is concerned, we believe it’s going to be very difficult, especially because a lot of these devices are health and welfare devices through which we are exchanging very sensitive information, and this data is very interesting to insurers and others. Of course, what is at stake is whether all of this will stay under the control of the person. So I agree with you. IoT is a new milestone of the digital environment, and it’s a key one, because it brings us really into a sort of ambient Internet. The Internet is the environment now. It’s not a service; it’s not a network; it’s the environment in which we are interacting with each other. And so the security issues are therefore increased, but I don’t have, I’m afraid, answers and solutions yet.

I like your phrase ‘ambient Internet.’ With everybody on their gadgets now I’d also say that ‘nobody’s here anymore, everybody’s there.’ That’s not so much a privacy observation as it is about human behavior.

Yes. And I think that in this ambient Internet the issue is: Where is the human being? And how do we make sure that the whole system is still organized around the human being, and not around the technical devices that are interacting with each other?

Do you think privacy should be the default? Share your ideas in the comments section below. #maketechhuman