Earlier this month, we published a guide to figuring out what’s real on the internet — whether that means spotting a malicious “fake news” site or an innocent post with bad information. We also explained that individuals can only do so much. Schools, governments, web platform owners, and other institutions need to tackle bigger issues at a structural level. But that’s a much tougher task, and it might require a new way of thinking about the internet.

Whitney Phillips is an expert on the dark side of internet culture. She’s the author of trolling ethnography This Is Why We Can’t Have Nice Things as well as the Data & Society report The Oxygen of Amplification, which laid out a toxic feedback loop between well-meaning journalism and harmful propaganda. Now, drawing on social media scholar Claire Wardle’s term “information pollution,” she and College of Charleston scholar Ryan Milner are writing a book that reframes the digital world as an ecosystem in crisis.

In an interview with The Verge, Phillips explains how individual solutions aren’t going to fix the web — but why, right now, they’re the best option we have.

This interview has been condensed and lightly edited for clarity.


What’s the book you’re currently working on about, and what are you hoping people will get out of it?

The name of this book is You Are Here: A Field Guide for Navigating Polluted Information, and that’s the idea — that we need to have a better sense of where we individually sit within all these structures and systems, and until that becomes a basic part of media literacy, we’re going to be responding to symptoms as opposed to the underlying causes.

We’re arguing that we need to think ecologically and frame the problem more in the terms of the climate crisis. That feeds into some specific strategies that people can take, but it’s more about mapping how we got here and how our networks push us into this very complicated space and what we can do moving forward.

One of the more overarching themes throughout the book is using the term “polluted information” as opposed to either disinformation or misinformation. When you’re talking about offline pollution, the people who are actively trying to pollute obviously contribute to the problem. But so do people who aren’t intending to — just washing your hair or using Drano is going to contribute to the overall problem, even if you totally love the Earth.

We’re also using the metaphors of redwood root systems, and that focuses on the means by which polluted information travels. Another metaphor is land cultivation. Everyday actions, whether they’re deliberately positive or deliberately negative, are affecting your network. The third metaphor is hurricanes. Nobody would point to a single gust of wind and say, “That’s the hurricane.” The real hurricane is a confluence of so many factors. It’s the same thing online. You can’t just look at a presidential tweet and comment on that; there’s a whole storm system to consider, including why you’re seeing the tweet in the first place.

How well do you think we understand the underlying causes of these problems?

Not well. I think part of the issue is that traditional media literacy efforts — like fact-checking or checking sources — are super-duper critical. But that only takes you so far if you’re not thinking about how you fit into all the stuff everybody else is doing. Just saying that a conspiracy theory is false, for example, still keeps the story moving through the information ecosystem.

Where should we be targeting solutions for the problem?

I think that making people more aware of amplification issues is really critical. We have so many incentivizing structures that allow for and, in some cases, encourage information pollution. And these issues are not a new problem born of the internet. The internet exacerbates and lays bare a lot of issues we’ve already been dealing with. I’m starting an additional phase of the project, working with K–12 teachers, focused not just on online spaces but on how you get people thinking more holistically and ecologically about the world.

And in many cases, this approach asks us to sort of rethink assumptions about the marketplace of ideas and other issues of free speech. We’ve got to start thinking “holy shit, what have we normalized, what have we internalized, what assumptions do we make about our own speech and about our systems, about responsibilities to ourselves and other people?” And until we do that, it’s going to be a lot of Band-Aids on broken arms.

How much are the solutions about individual people modifying their behavior versus big, structural changes?

Both things need to happen. It’s sort of like the plastic straws argument when it comes to the climate crisis. It definitely doesn’t help when individual people use single-use plastics, but that doesn’t address the structural corporate incentives, the lack of regulation, all the big things. The same is true in digital spaces.

“I don’t think that we should wait to see what Facebook does.”

But there are so many variables up in the air related to what happens in 2020 and who’s going to be responsible for the decision-making process. We don’t have that answer yet, so the best we can do is to limit the single-use plastic and limit the kind of things we are putting into the ecosystem. Especially because so much pollution that gets dumped isn’t intentional!

Typically, when people talk about misinformation and disinformation, they’re focused on what we describe as the apex predators in the book: the people who are actively, deliberately sowing all this chaos. What doesn’t get discussed is what everyday people are doing without intending to, and I think that’s an area where some interventions can be made. Because everyday people, in the aggregate, actually have a lot of influence over what is even possible for the apex predators. We’ve got to start thinking small so that we can begin thinking big.

Are there things companies could do to fix the problems?

These companies could base corporate decision-making on public health concerns. They could stop worshipping at the altar of free speech. But it has not happened yet, and there have been warning signs and warning signs and warning signs. They play an enormous role in this, but I don’t think we should wait to see what Facebook does before we start thinking about what we can each do individually and what kind of cumulative effect that can have.

So is it fair to say that changing individual behaviors isn’t the solution, but it’s the best solution we have right now?

It is 100 percent the best solution we have right now. It can’t be the only solution — like we’re not going to solve the climate crisis if people just stop drinking water out of water bottles. But we need to start minimizing the amount of pollution that’s even put into the landscape. It’s a place to start; it’s not the place to end.

You’ve mentioned work by researcher danah boyd, saying a lot of people know they’re sharing misinformation but just don’t care. How big do you think that group is?

Speaking from my research over the last 10 years, I think there’s a smaller percentage of the population that is actively trying to pollute — and even the people who are polluting maybe don’t know that’s what they’re doing.

“Some people are spreading disinformation because they don’t understand the consequences.”

One place we do draw a line is between citizens of good faith and citizens of bad faith. It may be that some people are spreading disinformation because they don’t understand the consequences; they just think it’s funny, and it doesn’t come from a place of willful maliciousness. Or it may be that those people are both myopic and malicious.

I’m sure there’s a percentage of the population for whom there’s absolutely nothing you could ever say that would convince them. But I have no reason to believe that’s a majority of the population. For everyone else, it’s a function of how you communicate the significance of what they do in a way they have maybe never thought about before.

If you’re an ordinary person trying to convince somebody to stop spreading disinformation, are there good strategies for that?

I think that the metaphors are helpful in trying to open up that conversation because a lot of this is really abstract. Many people don’t know how the infrastructure of digital networking works, nor do they have much reason to care. But you can have a different kind of conversation when you reframe it.

We take a lot of inspiration from a brilliant botanist named Robin Wall Kimmerer, and she wrote this book called Braiding Sweetgrass that emphasized the importance of shifting how we talk about the climate crisis. Online, we have to shift essentially from a culture of rights to a culture of responsibilities. When you just frame the issue as what you should be allowed to see or when others should have to hear what you say, that’s very different from “What is my responsibility to the people who share my space with me?”

How do you deal with the fact that a lot of this is based on people really, really not trusting news outlets?

That speaks to another enormous structural problem. Particularly on the right, there is a cultural norm of distrusting mainstream organizations that goes back to the 1950s, when evangelical Christians started to lay the infrastructure for their own media ecology. That media ecology ran parallel to mainstream media for many, many decades. It was just that the twain typically didn’t meet. And what happened in the social networking era is that those disparate modes of being are all crisscrossed, so you’re kind of seeing what actually had been happening for a very, very long time in this country but wasn’t visible.

The solutions are decades-long, and it’s about the ground game. It’s about talking with other people. And those are things that just don’t work very well in digital spaces. I wish that I had a clear answer to what we do next, but I think that the most immediate step is starting to articulate what the issues are and how we got here, and how we can begin thinking differently.

How much of a solution is the “log off, burn it all down” mentality where everyone leaves social media and we go back to what we had before?

What we had before was fundamentally broken, and what social media did was lay bare the way that it already was. The same kinds of misinformation and disinformation campaigns were found essentially after the Civil War — especially if you look at the secessionist movement, which was predicated on these same ideas. It’s just that our networks were limited in their reach and connectedness. So what we have is a foundation that is rotten, and what we need to do is build a new foundation.

How you convince enough people that a radical structural solution needs to happen is the open question. But we can’t build on top of the foundation that we have because the foundation that we have is the one that brought us to this moment.

This is all really depressing because a lot of the current climate debate is about how individual solutions aren’t going to solve the problem at all.

It is. I’m not convinced that we will recover from this. But not doing anything and not trying to rethink these big problems and not situating it in these big ideological terms — we’re certainly not going to recover if we don’t do that.