WHAT NEXT?

In the minutes and hours following the shooting of nearly 100 Muslim worshippers at two Christchurch mosques on March 15 this year, Matt Blomfield’s 13-year-old daughter had live footage of the carnage shared to her Instagram account by four separate people. She watched the whole thing, filmed by the gunman on a GoPro attached to his helmet. She saw terror and panic; she saw real people ripped apart by real bullets. She saw the blood. She didn’t tell her parents.

Many of the other kids at her school also saw the video, as did many thousands of others around New Zealand and the world. Instagram, Facebook, YouTube… it was shared more than 1.5 million times. It just popped up on people’s – children’s – social media feeds, unasked for.

It was only some months later that his daughter told Matt what she’d seen. It’s a parent’s nightmare, he says. He felt keenly that his ability to raise his daughters the way he wanted to – that is, appropriately protected, with some control over the rate at which they are exposed to the complexities of the world – had been usurped by the giant corporations whose platforms bring horrible material straight to his kids’ devices.

It felt very wrong. Something needs to be done, he said to himself.

In May this year, the book Whale Oil was published, telling the story of Matt’s eight-year legal battle against the blogger Cameron Slater whose online lies had almost destroyed Matt’s life. Ever since, he’s received messages from people who have read the book, all expressing great sympathy for what happened. Many shared their own experiences of being bullied online, some at the hands of the same blogger. Most asked: what’s next for Matt? A few, whose expertise lay one way or another in the online world, asked: How can I help?

In fact, Matt had already begun work on “next”. After years of putting energy into fighting negative court battles with Slater, Matt wanted to work on projects that contribute positively. During his years of struggle he thought long and hard about the wider issues inherent in his personal battle: the immensely complex matter of balancing democratic access to the internet and freedom of expression on it, against controls to prevent it becoming a weapon of harm; the inability of our justice and enforcement systems to effectively respond to breaches of the law when they happen on social media; the sheer, global scale of the platforms that dominate the internet, and the difficulty for individual jurisdictions in controlling content.

In November 2016, he drafted a Universal Declaration of Rights Pertaining to the Internet. He managed to get some interest from the Privacy Foundation, with a little more interest expressed by organisations in the aftermath of the Christchurch shootings. He’d hoped it might get championed at government level, but so far that hasn’t happened. He’s still proud of it – its thoroughness, thoughtfulness and logic. Article Eight would have been very relevant to Prime Minister Jacinda Ardern’s “Christchurch Call”:

“All member states shall do everything in their power to prevent the internet being used to facilitate and distribute material and images that portray the abuse, torture, humiliation or degradation of any person or persons where they have not consented, or are unable to consent to that treatment.”

He watched with considerable interest as Ardern headed overseas in the wake of the Christchurch shootings to try to win multilateral cooperation to better control the spread of harmful material. He noted the increasing public concern and debate about social media platforms but, along with that, the powerless handwringing that usually accompanies such conversations. Many people, and certainly many parents, not only worry about the material that children are watching, but are also deeply conflicted about both their ability and their right to do anything about it.

Matt has no such dilemma.

“As parents, we have a responsibility for our children not to watch mass shootings at age 13, or porn at age 10,” he says. “Let’s stop and take a look at what the problem is, the elephant in the room, which is what’s happening right here on our own shores. Our kids, here in New Zealand, are watching stuff that no parent would want them watching.”

“We’re sitting here worrying about youth suicide statistics, youth mental health, young kids who feel shit about their own bodies and their own lives, kids who are getting their sex and relationship education through free porn sites controlled by massive corporates. And we're sitting here going, this needs to change. And we're waiting for the government to do it. Waiting for Facebook to do it. Waiting for Instagram to do it. Waiting for who?”

“Jacinda’s efforts are good, but only partially deal with the problem. Up until now, the corporates have decided what happens to us online, and now they’re deciding what steps they're going to take to help us. We can’t leave it up to them. Let's take the steps ourselves and get back some control.”

It takes a village to raise a child, the saying goes. Matt believes it will take a community effort to save our children from the harmful effects of exposure to damaging and illegal material on the internet. Our own community, saving our own children.

“Who are we counting on to sort this out for us? And the answer is, it's not one person's fix. This is not just a corporate or government issue. It’s a collective issue. We need a combination of commercial businesses, academia and government to work together on this with a common goal of saving our kids.”

He talked to people he knows in the technology sector, and it became apparent to him that the technology already exists that could put the power back into the hands of parents. What doesn’t exist, however, is a system around the technology to ensure that it’s easy to use, flexible enough to provide for individual choice and control, and expertly tailored to acknowledge important steps of a child’s developing maturity. In other words, this concept needed a comprehensive vision and, crucially, a plan.

That is what Matt’s doing next.

“People are always talking about the problems in this space, but no-one ever talks about what we can actually do,” he says.

He’s begun putting together an informal working group, comprising technology experts in big data, AI and software development, child development specialists, media academics, and ISP and handset providers – as well as smart business minds, branding and sales experts. He’s casting his net wide, hoping other people with expertise and ideas in this broad area will get in touch.

He envisages a carefully designed and flexible package that parents would sign up for when they purchase their phone and internet plan – a package that they pick and choose themselves, according to the level of protection they want to provide for their child. Information will be provided about child development, and the levels of understanding inherent in each stage of a child’s developing intellectual and emotional maturity.

“Parents should have the flexibility of how much control they want to put in place. There is no compulsion in this proposed system, at any level. If you don’t want any controls, that’s fine. Leave it up to Facebook and Instagram – they were more than happy to send your child a video of the mosque shooting. But for me, as a parent, I don’t want those companies out there deciding what kind of human being my child becomes.”

“We're taking that responsibility of how we raise our kids seriously and we're going to do it ourselves, so we're going to be the ones that are deciding what our kids see.”

There is increasing evidence of the extent to which young people are routinely seeing horrible material on their social media feeds. The Youth and Porn study, commissioned by the Office of Film and Literature Classification and released late in 2018, surveyed 2000 New Zealand teenagers aged between 14 and 17: three-quarters of the boys and more than half the girls had seen online porn, including depictions of sexual violence and non-consensual sex. One in four had seen it before the age of 12. Most had not been looking for it, but came across it anyway. Most had not talked about this with their parent or caregiver.

Such facts can leave parents feeling disempowered and helpless. “People are daunted by the scale of the internet,” Matt acknowledges. “We know that China simply banned Facebook – they can do that because they are an authoritarian society. Of course, we don’t want to do that anyway, but it points to the difficulty of creating safeguards in a society like ours where we’re concerned about censorship and the fair balance of opinions. So, let’s give the power back to the people and let the people decide.

“Big corporations want your data. They use it to learn a lot about you, to push advertising and sell you more. On the other hand, they do not enable you to have access to that data, and there is no AI looking out for people in this equation. There is no balance of data, no fair exchange of value. As an example, Google is starting to get its hands on individuals’ health data (Stuff: ‘Google wants to get its hand on your health data’, 17-11-19) without people’s consent; its objective is to grow its revenues.

“My plan is about taking that control away from the corporates, and taking the responsibility away from them in some sense because we don't trust them with that responsibility. We'll give parents the choice to decide what their kids can and can't see.”

New Zealand is the perfect place to trial such a system, he believes. We’ve already flown the flag with the Christchurch Call, and we love our identity as a world leader in forward-thinking ideas. Votes for women, nuclear-free, zero-carbon… safe internet.

If you are interested in discussing this with Matt, send an email to:

Watch this space.