michael barbaro

From The New York Times, I’m Michael Barbaro. This is “The Daily.” Yesterday, my colleagues Michael Keller and Gabriel Dance described the government’s failure to crack down on the explosive growth of child sexual abuse imagery online. Today: The role of the nation’s biggest tech companies and why — despite pleas from victims — those illicit images can still be found online. It’s Thursday, February 20.

michael keller Would it be possible — I don’t want it to get too sticky, but for the sound, the — lawyer I can turn it down, we’ll just get warm. michael keller If it’s too warm —

michael keller

Last summer, Gabriel and I got on a plane and flew out to the West Coast to meet in a lawyer’s office to speak with the family that she’d been representing.

gabriel dance As Mike said, like, once we started looking into it, there’s so many facets.

gabriel dance

So we explained to them a little bit about our reporting and why it was so important that we speak with them.

gabriel dance Yeah, we’re here to answer your questions, too, I mean, as best we can.

michael barbaro

And who is this family?

michael keller

All we can say is that this is a mom and a stepdad who live on the West Coast. And that’s because they only felt comfortable speaking with us if we could protect their privacy.

stepfather I mean, we started this not knowing anything about it. And I might get emotional here but — you know, as parents, we’re trying to figure out what’s the best way for our kid to deal with this.

michael keller

And they started to tell us a story about what happened to their daughter.

stepfather It was August 21, 2013. I was at work. She was shopping with our two middle children and —

gabriel dance

So one day, six years ago, the mom is out with her kids, doing some back to school shopping, and she gets a call from the police. And they tell her she has to come in immediately.

mother Just odd. An odd call — stepfather It was very weird to get a call. mother — to get a phone call from a detective saying, you need to come down to the station right now. We need to talk to you. We feel your kids are in danger. And I’m like, what? stepfather She called me panicked, going they want to talk to us. Go talk to them. We don’t have anything — we’re not criminals, so. mother It was just odd.

gabriel dance

So she goes into the sheriff’s office.

mother And there’s this F.B.I. agent, introduces himself. He said, you know, we think your children are in danger from their father, particularly the youngest one. And I was just shocked and had no idea what they were talking about. No idea. Mind you, I had my two other kids who I was shopping with, were in the room next door, playing or doing something, I don’t know what they do. And he just went on to say that we’re going to start investigating him, or we’ve been investigating him. stepfather And there’ll be an agent at our house Friday to tell us more. mother Mhm. stepfather And she showed up Friday morning, August 23, 9:00 in the morning. Can I talk to you outside? And we talk outside. And she drops this bomb on us.

gabriel dance

And what the agent tells her is that her children’s biological father had been sexually abusing her youngest child, starting at the age of four years old and going on for four years. And not only that, he had been documenting this abuse with photos and videos and had been posting them online.

mother So I remember that moment. I mean, I think I kind of doubled over or something. I was just shocked. I asked, when did this start? How long has this been going on? How did I not know? I’m mom. So all of those questions and all those feelings, and I mean everything just came crashing down.

michael barbaro

What does this couple tell their children at this point?

michael keller

The F.B.I. agent said, actually, it’s better if I’m the one to tell your kids.

mother In her experience, it would be best for the news to come from her, as opposed to me telling the kids directly that their dad was arrested, because kids might blame me and point the finger. I said, yeah, whatever you think. stepfather She passed around her F.B.I. badge and showed the kids and built a rapport with them, and then — mother She said, you know, your dad has been arrested. I don’t think we even talked about what it was for or why he was arrested, just that he was arrested and they’re still investigating. stepfather Right. mother You know, even to this day, I think about — again, I was married to this guy. How did I miss that? Still. And this is six years later. You know, how did I miss that? So there is a piece of that guilt that’s always going to be there, I think. [SIGHS] stepfather Yeah. But it isn’t your fault. mother And just — stepfather It’s not your fault. mother (CRYING) And how could somebody do that to their own child? I still — I don’t think I’ll ever understand a person like that.

[music]

mother She’s just now developmentally dealing with the effects of it. She’s angry, and she’s acting out and —

michael keller

Her daughter is now a young teenager —

mother She’s in counseling now, you know, so —

michael keller

— and has a counselor, and is not only dealing with all of the normal things that young teenagers have to deal with, but the trauma of being sexually abused at such a young age.

mother So that’s something she’s just now having to learn how to say no, how to inform others that she’s not comfortable with something. And she’s only 13.

michael keller

And meanwhile, even though the physical abuse ended six years ago, the images continue to circulate to this day. And the parents know that because they get notified whenever someone is arrested having these images in their possession.

michael barbaro

And why would they be notified? What’s the point? It feels like it would just be a horrendous reminder every time that this is still happening.

michael keller

One of the abilities that victims of this crime have is to seek restitution from the offenders. So when they’re notified, they and their lawyers can petition the court to get a certain sum of money, which is a really good thing, and helps to rectify some of the damage that was done.

gabriel dance

But Mike, you’re right.

mother Oh my gosh, this person in Kansas and New York. Somebody in Wisconsin saw my kid, and oh my god.

gabriel dance

It is a brutal, double-edged sword, these notifications.

mother Oh my god. There’s people out there who know about this, and they can watch it over and over.

gabriel dance

And for this young woman, her parents and their lawyer received more than 350 notices in just the last four years.

michael barbaro

Wow. So something on the order of 100 times a year, somebody is convicted of having looked at these photos of their daughter?

gabriel dance

That’s right.

[music]

michael keller What do you think should be kind of in the front of our minds? What do you think is really important for us to understand? mother The internet technology companies need to really do something about this, because — I don’t know. It’s just — stepfather I don’t know enough about technology, but I just — where is this crap, you know?

gabriel dance

Certainly for the stepfather, his question, which is a legitimate question, is —

stepfather How do we get it off?

gabriel dance

— why can’t they just take it down?

stepfather How do we, how do we make it go away?

michael keller

And that’s the same question we had. Why, six years later, are these images and videos still showing up on some of the largest technology platforms in the world?

stepfather Figure it out. We’ve got this technology. We can go to the moon and Mars and stuff. We can get this crap off the web.

michael keller

And are the companies doing enough to prevent it?

[music]

michael barbaro

We’ll be right back. Gabriel, Michael, how do you begin to answer these parents’ very reasonable questions about why these images of their child’s sexual abuse keep showing up online?

gabriel dance

So before we can answer the question of why these images keep showing up, we needed to understand where the images were online and what companies were responsible for people sharing them. But the National Center, which you’ll remember is the designated clearinghouse for this information, wouldn’t tell us.

michael barbaro

But they know, right? So why wouldn’t they tell you? Why wouldn’t they give you that information?

gabriel dance

Part of the reason why they don’t divulge these numbers is because the companies are not required by law to look for these images.

michael barbaro

It’s voluntary.

gabriel dance

It’s voluntary to look for them.

michael barbaro

Right. Without the help of these companies, they have no idea where these images are, where they’re coming from, how many of them there are.

gabriel dance

That’s right. And the National Center is concerned that if they disclose these numbers, that they might damage those relationships which they depend on.

michael barbaro

Mhm.

gabriel dance

But then we start to hear anecdotally that there is one company responsible for the majority of the reports. And that was Facebook. So we are doing everything we can to run down that number. But very few people know it. However, we ultimately do find somebody who has documentation that reveals that number. So in 2018, the National Center received over 18.4 million reports of illegal imagery. And the number this person provides us shows that of those 18.4 million, nearly 12 million came from Facebook Messenger.

michael barbaro

Wow. So the vast majority of them.

gabriel dance

Almost two out of every three reports came from Facebook Messenger.

michael barbaro

So just that one service of Facebook.

gabriel dance

That’s right, just the chat application. This doesn’t include groups or wall posts or any of the other public information that you might post or share. This is specifically from the chat application.

michael keller

But then, after we reported that number, the D.O.J. actually comes out and says that Facebook in total — so Messenger plus the other parts of the platform — is responsible for nearly 17 million of the 18.4 million reports that year.

michael barbaro

Wow. This is a Facebook problem.

michael keller

That’s what the numbers at first glance would suggest. But we realized we needed to talk to people that really understood this to know what conclusions to come to from these numbers.

phone ringing alex stamos Hello.

gabriel dance

So we called up somebody who would know better than almost anybody.

gabriel dance Alex.

gabriel dance

Alex Stamos.

alex stamos Hey. gabriel dance Gabe Dance. Mike Keller here. michael keller Hey, how’s it going? alex stamos I’m doing OK. I’m getting back to my office.

gabriel dance

Alex is the former chief security officer at Facebook, a role he held for several years. He’s now a professor at Stanford University.

michael barbaro

So he’s somebody who very much would have seen this happening, would’ve understood what was going on inside Facebook when it comes to child sexual abuse.

gabriel dance

Absolutely.

michael keller You’ve obviously worked on this area for years. Facebook is, Facebook Messenger is responsible for about 12 million of the 18.4 million reports last year to the National Center. That seems like a lot. Can you help us understand what’s happening on that platform? alex stamos Yeah, so we’ve been discussing one very important number — 18.4 million, which is that the number of —

gabriel dance

So Stamos tells us something that actually is a little counterintuitive. That this huge number of reports coming from the company —

alex stamos That’s not because Facebook has the majority of abuse. It’s because Facebook does the most detection.

gabriel dance

It’s them working the hardest to find this type of content and report it.

alex stamos I expect that pretty much any platform that allows the sharing of files is going to be absolutely infested with child sexual abuse. If everybody was doing the same level of detection, we’d probably be in the hundreds of millions of reports.

michael barbaro

What is he telling you? That of the 18 million, the reason why Facebook has so many is because other companies are not reporting this at all?

gabriel dance

Essentially, yes. What he’s saying is that Facebook is reporting such a high number of images and videos because they’re looking for them. And that a lot of other companies aren’t even doing that.

alex stamos Facebook checks effectively every image that transfers across the platform in an unencrypted manner to see whether or not it is known child sexual abuse material.

gabriel dance

Every single time somebody uploads a photo or a video to Facebook, it’s scanned against a database of previously identified child sexual abuse imagery. And in doing so, Facebook is finding far, far more of this content than any other company.
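The scanning Gabriel describes can be sketched roughly as a fingerprint lookup: each upload is reduced to a fingerprint and checked against a database of fingerprints from previously identified imagery. This is an illustrative sketch only — the database contents and function names here are hypothetical, and production systems like Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas the cryptographic hash below only matches byte-identical files.

```python
import hashlib

# Hypothetical set of fingerprints of previously identified images.
# (The entry below is simply the SHA-256 digest of the bytes b"example",
# stored so the demo has something to match against.)
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"example").hexdigest(),
}


def fingerprint(data: bytes) -> str:
    """Reduce an uploaded file's bytes to a hex-digest fingerprint."""
    return hashlib.sha256(data).hexdigest()


def is_known_image(data: bytes) -> bool:
    """Check one upload against the database of known fingerprints."""
    return fingerprint(data) in KNOWN_FINGERPRINTS


# Every upload is checked before the platform stores or delivers it.
print(is_known_image(b"example"))      # matches the known set
print(is_known_image(b"new upload"))   # unseen file, no match
```

The design point, as Stamos notes, is that this only works on content the platform can read: the check compares each unencrypted upload against fingerprints of material that has already been identified, which is why end-to-end encryption would blind it.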

michael barbaro

So he’s saying don’t shoot the messenger here.

gabriel dance

That’s what he’s saying. So as of now, this is the best method these companies have to identify and remove the imagery.

michael barbaro

Mhm. What Facebook is doing?

gabriel dance

That’s right.

[music]

michael barbaro

So why doesn’t every technology company do this? It seems pretty straightforward.

gabriel dance

Well, the short answer is that it’s not baked into the business model.

michael barbaro

Mhm.

gabriel dance

That this is not something that helps them grow their base of users. It’s not something that provides any source of revenue. And it’s in fact something that works against both of those things, if done correctly.

michael barbaro

What do you mean?

gabriel dance

Well, every time Facebook detects one of these images, they shut down that account. But most —

michael barbaro

You’re deleting your own users.

gabriel dance

That’s right. You’re deleting your own users. And it costs money to delete your own users.

michael barbaro

Right. You have to spend money to hire people to find accounts that are doing something wrong that you’re then going to lose as a customer.

gabriel dance

You got it. So both of those things fly in the face of almost all of these companies’ business models. And Stamos actually told us something else interesting.

alex stamos The truth is that the tech industry is still pretty immature at the highest levels about the interaction between executives.

gabriel dance

And that’s that these companies aren’t really working together to solve this problem.

alex stamos You know, if you look at, say, the banking industry, these big investment banks, the C.E.O.s hate each other. But they understand that their boats all rise and fall together, and so they are able to work together on what kind of regulation they’d like to see. But a lot of the top executives at the tech companies really kind of personally despise one another. And it is very difficult to get them to agree to anything from a policy perspective.

michael keller

And in our reporting, we found some pretty egregious disparities in how different companies police this on their platforms. Amazon, which has a massive cloud storage business, for example — they handle millions of uploads and downloads a day — they don’t scan whatsoever. Apple has an encrypted messaging platform, so that doesn’t get scanned. They also choose not to scan their photos in iCloud. Snapchat, Yahoo, they don’t scan for videos at all, even though everyone knows video is a big part of the problem. And it’s not because the solutions don’t exist — they’ve just chosen not to implement them. And now Facebook, the company looking for this content most aggressively, is starting to rethink that policy. Over the last few years, a lot of tech companies have realized that privacy is an important feature that a lot of their customers are expecting. And citing privacy concerns, Facebook announced that it will soon encrypt its entire Messenger platform, which would effectively blind them to any type of automated scanning within chats — which, again, were responsible for nearly 12 million of those 18.4 million reports.

michael barbaro

So they would stop searching for this child sexual abuse material?

michael keller

Right. They would limit their own ability to be aware of it.

michael barbaro

And privacy is important enough that they would handicap their ability to find this criminal conduct and these horrible photos?

michael keller

Based on the criticism that a lot of tech companies have received over the last few years, moving towards encryption is a really attractive option for them. Because it lets them say, we really care about the privacy of your conversations, and we’re going to make that more secure.

michael barbaro

Gabriel, it occurs to me that the debate over privacy is enormous and loud. We’ve done episodes of “The Daily” about it, many episodes of “The Daily” about it. But the child sexual abuse subject is not as well-known. It’s not as widely debated. It’s the first time we’re talking about it. Is that reflected in this decision, the attention that these two subjects get?

gabriel dance

It is. And in fact, it’s one of the main reasons we chose this line of reporting. The issue of child sexual abuse imagery really brings the privacy issue to a head, because we’re faced with these very, very stark decisions that we’re discussing here right now. Which is that, is it more important to encrypt the communications on a platform where it’s well known that children interact with adults? Is it worth it to encrypt those communications to protect people’s privacy when we know what the ramifications for children are?

michael keller

But the child protection question is also a privacy question. And when you are Facebook and you’re saying, we’re going to encrypt conversations for the privacy of our users, the child advocates will say, well, but you’re doing nothing to protect the privacy of the child in the image.

michael barbaro

In fact, you’re making it harder for their life to ever be private.

michael keller

Exactly. And this means that next year, or whenever Facebook moves ahead with its plan to encrypt, they won’t be sending nearly 17 million reports to the National Center. They’ll be sending far fewer.

michael barbaro

Right.

michael keller

And that means for the family that we spoke with on the West Coast, who is receiving about 100 notifications a year that someone has been caught with images and videos of their daughter, they’ll likely be getting far fewer notifications. But not because people aren’t looking at these images. They still will be — the family just won’t know about it.

mother That’s my big thing. I want people to care about this, because there is a human factor to this obviously. We have to force people to look at it, you know, the tech companies. They have to do something about it.

michael barbaro

So where does that leave this family now?

gabriel dance

They feel abandoned and angry with the tech companies.

mother They have to do something about it, because knowing that that’s out there, I don’t know, it’s just being traumatized all over again.

gabriel dance

They keep getting these notifications that their daughter’s imagery has been found.

stepfather It’s ongoing. It’s lifelong. There’s nothing they can do about being a victim for the rest of their life.

gabriel dance

And the way they describe it is it’s like getting hit by a truck.

stepfather They can’t stop being in a car wreck every day.

gabriel dance

Only to get back up and get hit by a truck time after time after time.

stepfather There’s no, there’s no other way to say it. She’s — that will be there forever until it’s removed, until somebody comes up with a way to take that off.

gabriel dance

And so, what this family has decided is that —

stepfather She doesn’t know.

gabriel dance

— they’re not telling their daughter. They’re not going to tell her that her images are online.

mother There’s no good it would do. There’s no benefit, at least from our perspective, to tell her. I mean, she needs to worry about soccer and — stepfather Making the team. She worries about sleepovers and wrestling, grades. mother She doesn’t need to be worrying about the worst part of her life available on the internet. stepfather I don’t want her to be fearful of what other people — mother Might see. stepfather — might be seeing of her — mother Be recognized. stepfather — when they do a Google search for a bullet, up pops her image. That’s horrible.

gabriel dance

But there is a clock ticking.

stepfather I just found out when she turns 18, it isn’t a choice. The F.B.I. will get ahold of her, because she’s an adult victim now.

gabriel dance

When she turns 18, the federal government is going to start sending her the notifications. So what they’re hoping is that in the four years until that happens, the tech companies are going to solve this problem.

michael barbaro

Which is to say, get these images offline for good.

gabriel dance

That’s what they hope.

stepfather My motivation for this is we have to explain to our daughter, this is on the internet. And she has to live with that. But being able to tell her — you know, if you can tell me in five years, this will be something that was, not is, that’d be great.

gabriel dance

The dad asked us the question, can you tell me that when I talk to my daughter about this when she turns 18, that I can tell her that these horrific images used to be online but no longer are?

michael barbaro

And what did you say?

gabriel dance

What I wished I could tell them was yes.

gabriel dance Well, I think that’s why we’re doing this story — mother Yeah. gabriel dance — to be honest with you, is —

gabriel dance

What I did tell him was that that was the point of us doing this reporting. Was the hopes that something would change, and that in five years, those images would no longer be there.

gabriel dance But once we publish this article, and we see how they respond, and we see how, not only tech companies —

gabriel dance

But from everything we’ve learned, it’s only getting worse.

stepfather What if it was your daughter? mother Yeah, you know, put yourself in that kid’s shoes. stepfather What if you were the person they’re looking at the rest of your life? If we can tell that this happened, instead of this is happening, the world would be a lot better off. She’ll be a lot better off.

[music]

michael barbaro

Michael, Gabriel, thank you.

michael keller

Thank you.

gabriel dance

Thank you.

michael barbaro

A few weeks ago, the National Center for Missing and Exploited Children released new data for 2019. It said that the number of photos, videos and other materials related to online child sexual abuse grew by more than 50 percent to 70 million.

[music]

michael barbaro

We’ll be right back. For highlights and analysis of Wednesday night’s Democratic debate in Las Vegas, the first to include former New York City Mayor Michael Bloomberg, listen to this morning’s episode of “The Latest.” You can find it on “The Daily” feed or by searching for “The Latest,” wherever you listen.

[music]