Why A New Zealand Official Insists 'Facebook Can't Be Trusted'

Rachel Martin talks to New Zealand's Privacy Commissioner John Edwards, who criticized Facebook after last month's attacks on two mosques in Christchurch were live-streamed on Facebook.

RACHEL MARTIN, HOST:

How much are social media companies to blame for the hate that spreads on their platforms? Last month's attack on two mosques in Christchurch, New Zealand, was livestreamed on Facebook. Facebook CEO Mark Zuckerberg told ABC News last week his company needs to act faster to take down videos with violent, hateful content. In the case of New Zealand, he said Facebook took down more than a million copies of the video, but other versions kept cropping up. Here's Zuckerberg on ABC after the New Zealand shooting.

(SOUNDBITE OF ARCHIVED RECORDING)

MARK ZUCKERBERG: One of the things that this flagged for me overall was the extent to which bad actors are going to try to get around our systems.

MARTIN: Our next guest says it's the systems that are the problem, not just the bad actors. John Edwards is New Zealand's privacy commissioner, and he lambasted Facebook in a series of tweets, writing Facebook cannot be trusted. He joins us on the line this morning. Thank you so much for being with us.

JOHN EDWARDS: Good morning, Rachel.

MARTIN: You also wrote that Facebook is a group of, quote, "morally bankrupt pathological liars." Why?

EDWARDS: Yeah. Well, that is more extravagant language than I would normally use as an official, and lifted out of its context promoting another interview that was a reaction to Mr. Zuckerberg's ABC interview. It certainly has got some cut-through. And so, you know, I mean, I don't regret using that language. We have a platform that has displayed a shocking lack of responsibility and accountability for the tools that it has enabled. And I mentioned in the tweet, you know, there's the genocide in Myanmar, there's the corruption of the elections and undermining of democratic institutions elsewhere. It is really time for social media platforms to start taking more responsibility for the effects of their platforms.

MARTIN: So how would you like them to do it? Because we heard Mark Zuckerberg point out the lengths to which people would go to get around any tighter restrictions.

EDWARDS: Yeah, that was really interesting because he said - you know, he kind of conflated that with the livestreaming of the atrocity in New Zealand. But that person, you know, didn't go to any lengths. There were no systems. I mean, if you are going to offer a service that is capable of such deep and profound harm, then it is incumbent upon you to ensure that that's safe. In the U.S., you have product liability. If a manufacturer makes something - a product which causes harm, they are liable for that. You know, I think it's time we started looking to the social media companies for that. You know, to put this event in context for you, Rachel, as a proportion of our population that was affected, this is actually like to us what 9/11 was to the U.S.

MARTIN: So let's talk more about the change you'd like to see. Australia recently passed legislation where social media companies could face huge fines. Executives could get jail time if they don't pull down this type of material fast enough. Is that something you're looking at for New Zealand?

EDWARDS: Well, I don't get to make those decisions, but that will certainly make the companies sit up and take notice. I'm not sure about the practicalities of that. Certainly, in the U.K., we've also seen a legislative proposal introduced. We've seen the same in Singapore. But what prompted my tweets was the lack of responsibility that the company is taking. They should be acting now. I mean, if they can't assure us that that streaming service is safe, then it should be taken down - or a delay put in or something. I was quite disappointed when I heard Mr. Zuckerberg equate the atrocity in Christchurch with children's birthday parties. You know, he said if you put a delay in the system, it might have prevented the uploading of that video, but that would have broken the experience for the people who use it for children's parties.

I mean, I don't understand the mathematics there. You know, how many children's parties, Mr. Zuckerberg, equals one murder, one livestreamed suicide, one sexual assault livestreamed? You know, it's really incumbent, I think, on the platform to take some responsibility to make the product safe. And until they can, to take it down - in the same way, for example, that we've seen the Boeing 737 Max grounded because of a software fault; you can't fly them anywhere in the world. Here we have livestreaming as a software fault capable of causing great harm, and they've done nothing to change it since the 15th of March.

MARTIN: Do you believe they can find a balance? Or would you prefer that they just take the service down altogether?

EDWARDS: Well, I think that they have billions of dollars to invest in launching these products. It's incumbent on them to find the solution to launch it in a way that it can be safe. It's not incumbent on me to devise the solution for them. And, you know, they need to divert some of that investment. And maybe they need a cross-industry agreement for all the other potential hosts to say, well, let's pause until we can do this safely.

MARTIN: Sharing the video - the act of taking it online and clicking and sharing it - is illegal in New Zealand because of censorship laws there. I understand you've asked Facebook to tell you who has shared the video. Have they responded to you?

EDWARDS: In the days following the shooting, I did make that comment. The scale, I think, is - has meant that that's impractical. But we do have at least one person facing charges in New Zealand for distributing that. And that's before the courts now.

MARTIN: Other social media companies like YouTube also had a role in spreading the video of the mosque killings. Are you holding them accountable, too?

EDWARDS: Well, I'm not really in a position to hold anyone to account because of the...

MARTIN: At least rhetorically.

EDWARDS: ...Attitude - yeah. Well, certainly. I mean, the thrust of my comments has been about the failure to address the immediate problem of that livestreaming service. Certainly, the subsequent distribution and the whack-a-mole and endless replication that makes that a practical impossibility is really difficult. But, you know, they also have to design solutions to that. But I think that they need to act immediately to assure us that a product that they're offering can't cause such harm again.

MARTIN: New Zealand Privacy Commissioner John Edwards, thank you so much for your time this morning. We appreciate it.

EDWARDS: Thank you.

(SOUNDBITE OF MARLEY CARROLL'S "SEVEN CROWS")

Copyright © 2019 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.