Welcome to Small Humans, an ongoing series at Mashable that looks at how to take care of – and deal with – the kids in your life. Because Dr. Spock is nice and all, but it’s 2018 and we have the entire internet to contend with.

You're at a barbecue, or maybe a birthday party in the park. Someone, let's say a friend of yours, takes out a phone and snaps a pic of your kids all playing together. The picture is adorable, and it's posted to Facebook before you even realize what happened.

This doesn't bother you, of course. You, and hundreds of millions like you, have already uploaded scores of photos of your children to the online platform — all the way from the delivery room to the moment they uttered their first word. As social media continues to work its way progressively deeper into our everyday lives, this near-constant documentation of our kids has become normalized.

But perhaps it shouldn't be. Instead, perhaps it's past time we reevaluate our relationships with the platforms working to monetize an indelible record of those we love.

The Big Questions

Concerns over Facebook's, and by extension Facebook-owned Instagram's, handling of personal information are long past the point of being academic. Massive slip-ups, followed by scandals, followed by breaches serve to constantly remind us that the Mark Zuckerberg-helmed behemoth is unable to protect the reams of personal data it has collected on its users, even when it professes a desire to do so.

And then, of course, there are all the cases where what's good for Facebook explicitly doesn't line up with what's in your or your child's best interest.

What is happening with those photos once they're uploaded?

But this is not a story about shady third-party apps scraping Facebook for your personal data, or even Facebook intentionally manipulating users' News Feeds to see if it would make them sad. Rather, this is a look at what it means when untold amounts of personal information about your child are posted to the service — with or without your consent.

What does it mean for your kid's present and future well-being when thousands of photos documenting every stage of their development have been handed over to a company that shrugs off acknowledgments that its service may directly contribute to literal deaths — all before those same kids are even old enough to have a Facebook account?

And how does Facebook protect those children from being categorized, monetized, and exploited by the same algorithms designed to track and profile its users — the company's entire raison d'être? Is the company that, despite all the evidence to the contrary, very publicly denies knowledge of the shadow profiles it actively creates really the right corporate entity to entrust with safeguarding your child's life story?

These are just a few of the questions we posed to privacy advocates, childhood development experts, and Facebook itself. What we found wasn't exactly reassuring.

First, the unsurprising part: Over the course of reporting this article we reached out to multiple specific individuals at Facebook with likely knowledge of the company's policies on safeguarding minors' data, as well as Facebook PR and the lead spokesperson on issues relating to facial recognition tech.

Despite an initial offer to discuss the general concerns raised by our reporting, Facebook stopped responding to our emails after we posed specific questions. None of the Facebook employees we reached out to directly returned our requests for comment.

Photographs and Privacy

According to Facebook's stated policy, children under the age of 13 are not allowed to have an account with the service. The most obvious reason for this is the Children's Online Privacy Protection Act, commonly referred to as COPPA. Passed in 1998, the act created strict requirements for "operators of websites or online services" that collect information from children under the age of 13.

Simply put, it's meant to protect kids. Notably, COPPA applies to information collected from said children, not about them. That's one of the reasons why it's perfectly fine for Facebook's servers to be full of kids' pictures uploaded by their parents.

But what is happening with those photos once they're uploaded? Unfortunately, it's not 100 percent clear, and Facebook isn't inclined to say. However, we do know at least one thing Facebook does with photos: train its AI.

Facebook offers a facial recognition service that can determine who is in a picture and suggest corresponding name tags. So, if you upload a photo from your high school reunion, Facebook is able to prompt you to tag your old pals. Users can opt out of this feature, but that doesn't change the fact that the company has the tech to analyze photos and recognize individuals by their faces.

"We think most parents would object to their children’s information being used to make profits for Facebook."

In order to perfect this tool, Facebook required a large set of photos on which to train and test its algorithms. And we do mean large. A 2017 study from the University of Maryland specifically notes the number of images we're talking about — "about 500 million images over 10 million identities."

Where oh where could Facebook get its hands on 500 million images? Oh, right.

A 2015 paper published by Facebook researchers notes that, in addition to using a public dataset containing 13,233 photos of 5,749 celebrities to test and refine its facial recognition tech, they "also validate [their] findings on an internal dataset, probing 100K faces among 10K subjects with a probe-gallery identification protocol."

Adam Harvey, a privacy researcher and the creator of the anti-facial-recognition fashion project CV Dazzle, told Mashable that it's "possible and likely that [Facebook's] training dataset of 10M identities contains at least one person under 13."

He was quick to note that, at present, he can't prove this. But still, it stands to reason that if Facebook has 500 million photos — likely culled from its users — in the service of testing or training its facial recognition tech, then at least some children are going to be found in there. That's because, as Sarita Yardi Schoenebeck, an associate professor at the University of Michigan's School of Information, told Mashable, facial recognition tech currently doesn't do a great job of identifying individuals' ages — hence, it would require a herculean effort to manually weed out photos of children in that dataset.

"So far," noted Professor Schoenebeck over email, "I haven't seen any evidence that Facebook is doing something different with children's photos (i.e., trying to protect them more than adult photos), outside of sensitive contexts like nudity."

James Graves, of the Georgetown University Law Center, echoed Professor Schoenebeck.

"According to Facebook, its algorithm 'learns for itself what distinguishes different faces and then improves itself based on its successes and failures, using unknown criteria that have yielded successful outputs in the past,'" wrote Graves — quoting from a court decision in a class action suit regarding Facebook's use of facial recognition technology. "Based on that description, I would guess that Facebook's algorithm analyzes children alongside adults, does not attempt to classify children as children, and does the same thing with non-user children as it does with non-user adults (whatever that might be)."

In other words, it's possible that your precious child's face is already being studied and analyzed by Facebook in its efforts to build better facial recognition algorithms. What those algorithms could be used for in the future is anyone's guess, as even Facebook won't confirm that they'll be limited to suggesting photo tags.

“Can I say that we will never use facial recognition technology for any other purposes?" mused Facebook chief privacy officer Erin Egan in a 2013 interview with Reuters. "Absolutely not.”

Whether or not this bothers you depends, in many ways, on whether you trust Facebook to be a good steward of your child's privacy. History suggests caution.

Messenger Kids

As your children age, but before they are old enough to have a Facebook account of their own, Facebook provides them with some brand loyalty imprinting in the form of Messenger Kids. The app, designed for kids aged six to 12, differs from Facebook at large in that it functions essentially as a bare-bones version of the company's Messenger app.

There's no News Feed, no public groups, and hopefully no Russian influence campaigns.

But should you create an account for your kid?

"When you sign your child up for Messenger Kids, it doesn’t mean you are signing them up for a Facebook or Messenger account," the company tells parents. "Messenger Kids is a separate, standalone app just for kids."

This, of course, hasn't assuaged the scores of childhood development and privacy experts who have come out in opposition to the app.

"Adults can weigh those risks against the benefits of using Facebook and Facebook Messenger. Children cannot."

In January of this year, the Campaign for a Commercial-Free Childhood publicly opposed Messenger Kids — publishing a letter signed by 118 public health advocates demanding the app be shut down.

"Younger children are simply not ready to have social media accounts," read the letter in part. "They also do not have a fully developed understanding of privacy, including what’s appropriate to share with others and who has access to their conversations, pictures, and videos."

In the approximately nine months since the letter's publication, the CCFC has only become more strident in its warnings about the potential harm caused by signing your children up for Messenger Kids.

On October 3, the group submitted a letter to the Federal Trade Commission stating that Messenger Kids does not comply with COPPA and requesting an investigation as a result.

"Facebook is encouraging young children to use Messenger Kids as the channel for relationships with friends and family," David Monahan of CCFC explained to Mashable over email. "So Facebook is clearly collecting sensitive information — the contents of kids’ conversations. Their privacy policy is vague, but it appears to give Facebook wide latitude to use the information they collect for broad business purposes."

And Monahan wasn't done. "We think most parents would object to their children’s information being used to make profits for Facebook."

Future Risk

These concerns are not limited to uploading photos of your young children, or even Messenger Kids. Facebook itself, of course, is a minefield for 13-year-olds — and posting your child's early life on Facebook, only to have them later join Messenger Kids and then Facebook, primes them for a unique form of trouble.

Common Sense, a nonprofit "dedicated to helping kids thrive in a world of media and technology," has advocated extensively on behalf of children's privacy and healthy development. Ariel Fox Johnson, Common Sense's senior counsel for policy and privacy, sees Facebook as fundamentally at odds with those core values.

"Minors joining Facebook, or any social media company, should be aware that nothing they do on a site is truly private," she told Mashable in an emailed statement. "You can set certain limits on who can see a post or a photo — and you should — but the company is still going to see all of it. And how they use that information is often up to them."

"Minors joining Facebook, or any social media company, should be aware that nothing they do on a site is truly private."

And the concern is not just what Facebook is going to do with kids' data today, but what it's going to do with it tomorrow. As anyone who's spent any time online since Facebook's launch can attest, the company has repeatedly changed its privacy settings and core features after the fact — often leaving users to deal with the fallout on their own.

What's more, the company's services and use of data have aggressively expanded over its 14 years. What was once a way to connect with like-minded college students is today a global network facilitating age discrimination, political influence campaigns, and possibly even ethnic cleansing.

Who knows what it will be a year from now.

Adam Harvey, the aforementioned privacy advocate, notes that keeping your kids — photos and all — off Facebook is, in light of this future uncertainty, a reasonable precaution.

"From a long-term security perspective," noted Harvey, "by using social networks you can provide potential adversaries with information that could eventually be 'weaponized' against you either for reputation damage or personal information verification."

And as the recent hack of approximately 50 million Facebook accounts painfully reminds us, it's not just Facebook's potential use or misuse of your data that you have to worry about. Whatever information you hand over to the company could one day be used against your kids by anyone who finds a way, legal or not, to access it.

"[Even] if Facebook promises not to make use of all the data it is collecting," emphasized Graves, "and even if whatever third parties to whom it discloses that information also refrain from using the information for building profiles on children and tracking them, there is always a risk of involuntary disclosure of that information. Adults can weigh those risks against the benefits of using Facebook and Facebook Messenger. Children cannot."

So What Now?

The desire to take and share photos of our kids' cute, and possibly not so cute, moments isn't going anywhere. So, in a world where the main service we rely on to share those pictures is fundamentally problematic, where does that leave us?

It's unrealistic to just tell people to stop. Despite its numerous flaws, Facebook does offer the real ability to easily broadcast important life moments to friends and family. This, obviously, has resonated with the billions of people who rely on the service.

Extended family and friend networks also create a problem, as Facebook considers images of you or your children taken and uploaded by someone else to be the uploader's content, governed by that person's privacy settings. Even if you opt out of image tagging and lock down your profile, you have no control over what other people upload. In the real world, do you bow out of that big family photo because you know it's going to end up on Facebook? Are you willing to be the squeaky wheel that asks for it not to be posted for the faraway relatives to see? Would the rest of your family abide by that request even if you did make it? There aren't easy answers.

In the end, there needs to be a sea change in the public's understanding of the privacy risk services like Facebook create. This is not out of the realm of possibility. Calls to break up the social media giant are growing, and following the Cambridge Analytica scandal, Facebook is on its heels doing everything it can to convince people that it takes privacy seriously.

It's the job of parents, as well as elected officials, to hold Mark Zuckerberg to account.

In the meantime, consider the advice that Zuckerberg gave his own newborn daughter: "The world can be a serious place," he wrote in a 2017 Facebook post. "That's why it's important to make time to go outside and play."

And the next time, while you and your family are outside playing, you feel inclined to take a picture, consider sharing it via group text.
