The facial recognition technology used by Facebook could be used to train an artificial intelligence algorithm on age-related characteristics and progression. That's according to author Kate O'Neill, who wrote in Wired that the trending "10-Year Challenge" on Facebook helps the company build a large dataset of paired photos of users from 10 years ago and today.

Radio Sputnik discussed whether users should be more aware of the possible dangers of sharing personal data with Paul Levy, author of the book Digital Inferno.

Paul Levy: Kate O'Neill's blog was a bit mischievous, but in a good way; it reminded us how quickly and easily, almost without thinking, we share our data publicly. We might be doing it innocently, for fun, and a lot of famous people have been sharing these memes on Twitter and Facebook; some for very good reason, like comparing themselves to 10 years ago when they were going through difficult times.

Some had struggled with their mental health and are showing how they've changed, trying to be educational; other people have been doing it for fun. But what she was pointing at is what the data could be used for: if you post two images side by side, very conveniently, you're doing the sourcing of those images yourself, and they can be fed to facial recognition algorithms.

It doesn't mean they are being used that way, but with trust in social media networks already low after stories like Cambridge Analytica, she's suggesting we should be a little bit savvier and more mindful before we rush to share our data online.

So I think it's been a good wake-up call, but I'm not quite sure how many people will wake up.


Sputnik: It's funny, yes, we are basically doing their work for them. I've seen so many people desperately looking for 10-year-old photos of themselves, and if they can't find them on Facebook they turn to other social media; they're actually putting in a lot of effort to get those photos out there. But one wonders how an algorithm of this sort could actually be used?

Paul Levy: I think that's what's being revealed through this. Some of the algorithms are not yet able to do what some of these organisations would like them to do.

It's not that easy to work with a hundred, a thousand or even a few thousand photos you've posted; there may be timestamps, but you may have uploaded on one day a picture of yourself taken five years earlier. The challenge very simply gets you to organise that data yourself, and organised data could be used in lots of ways.

It was suggested that in facial recognition it could be used to show how you've changed, or whether you're posting truthful images of yourself; it reveals a lot about you that algorithms can't easily pick up from the chaos of thousands of unsorted images. So at times they're getting us to do the work, when the algorithms they'd like aren't quite there and efficient enough, and that worries me.


Sputnik: Surely it is just a matter of time before algorithms of this kind will be out and will be used; it's kind of inevitable, isn't it?

Paul Levy: It is, but some of it is not necessarily bad. We're going through a phase of evolution in this field, a social evolution in some ways, where maybe all the scandals will bring us to a better place, to better governance. There's a pushback going on here, and there are court cases.

The question is whether this is inevitably heading towards complete openness and transparency whether we like it or not, or whether these crises are in some ways necessary to put proper governance in place, if that is even possible. We're still in a process of change and turmoil, and we don't know where it's going to go.

My own view, as you've probably heard before, is a bit pessimistic, simply because, as an academic, I think pessimism is never a bad thing; a bit of caution enables us to stay awake and scrutinise things. At the moment corporations are doing the innovation, and our ability to work out the impact is running slower than the speed of change; that's damaging and dangerous if we don't stay awake.


Sputnik: Is it necessary to obtain consent before using private or public photos of users on social networks for any sort of research?

Paul Levy: I think the problem is the whole nature of the free model, which depends on this happening. If people were surveyed and asked whether they like these platforms, many would still want them to exist, even though the platforms couldn't operate so easily if we put all those rules in place.

The assumption is that when you signed up you read the terms and conditions, but most people don't, and that's why I think we need to be teaching this at school, at younger and younger ages; it's a new skill set. If you decide to go for transparency, with sharing as the default, you should know what you're doing: where your privacy settings are, when to switch things off and how to switch them off. And people are fairly clueless.

There's nothing really wrong with it per se if we are empowered and savvy, but we simply are not, and because of that you might think the government and organisations have a duty of care, as in any other industry, to look after us.


Sputnik: Of course, every time we talk about social networks we ask the same question: how can users protect the data they post? Is there a way to do this? Some say it's simply safer to avoid posting personal information, but I don't think that's going to happen; it has become a part of people's lives, hasn't it?

Paul Levy: The assumption is that you're sharing, the assumption is transparency, and then you have certain switches to turn things off. But the Internet has a long memory, and it finds it much harder to forget than to keep remembering, so people are shocked when they google themselves and find out what's there.

Getting some of your material removed is problematic, as we've seen with Google. You're granting a lot of permission from the moment you press the submit button; I always joke that "submit" means surrender.


Sputnik: Do you think anything is going to change in the next few years, or will we just carry on with what's happening on social networks right now, with people being pretty oblivious? Or might something happen that makes people more aware?

Paul Levy: Certainly real steps have been taken, and some messaging apps do this, properly encrypting your data. There is also the possibility, put forward by some of the founders of the Internet, that privacy will need to become premium: for those who want real privacy, it becomes a product they will have to pay for.

The ability to shut your curtains and your front door and not have people peek in involves making certain choices. The start-ups that have offered this have often been bought up and compromised, but it could still be that innovation will soon mean you pay a little for privacy online, just as you do in the physical world, where you have to buy things to be behind closed doors.

A closed-doors Internet for the average person isn't really being offered properly yet, but it may well come with innovation.

Views and opinions expressed in the article are those of Paul Levy and do not necessarily reflect those of Sputnik.