Tristan Harris, former design ethicist at Google and co-founder of the Center for Humane Technology, speaks to The Associated Press during a round-table discussion on Tuesday, July 30, 2019, in New York. Harris says he's concerned about people's addiction to technology, thanks to tools that major technology companies employ to persuade people to spend more time on their services. (AP Photo/Jenny Kane)

Tristan Harris wants to reverse the harmful effects he believes technology has had on all of us.

Harris, a former Google design ethicist, first rose to national awareness after a presentation he gave within Google in 2013 spread throughout the industry. In it, he argued that many tech products were designed to be addictive, causing people to spend too much time on them and distracting them from living their lives. He urged designers to alter their approach.

Harris spent more than two years pushing change within Google, but says he couldn’t get traction. So he quit and started a movement called Time Well Spent, which eventually pushed companies such as Apple and Google to build screen time usage metrics and tools into their phones.

He has since widened his focus, having decided that many issues facing society today are actually connected and can be traced, at least partly, to the design of technologies we use every day.

The goal of his organization, the Center for Humane Technology, is to reverse human “downgrading,” or the idea that technology is shortening our attention spans, pushing people toward more extreme views and making it harder to find common ground. In short: technology has caused humanity to worsen, and Harris wants to help fix it.

Harris recently spoke to the Associated Press about his work, the tech industry’s progress so far, and why all hope is not lost. This interview has been condensed and edited for clarity.

Q: Could you tell us about the important ideas behind your work?

A: This isn’t about addiction, it’s not about time. It’s about what we call “human downgrading.” It’s a phrase that we came up with to describe something we don’t think people are acknowledging as a connected system.

Technology is causing a set of seemingly disconnected things: shortening of attention spans, polarization, outrage-ification of culture, mass narcissism, election engineering, addiction to technology. These seem like separate problems, and we’re actually saying that these are all predictable consequences of a race between technology companies to figure out how to scoop attention out of your brain.

Q: Where is the central place to fight this multifaceted problem that you’ve outlined?

A: Much like you say, “How do you solve climate change?” Do you just get people to turn off their light bulbs? No. Do you pass some policy? Yes. But is that enough? No. Do you have to work collaboratively with the oil companies to change what they’re doing? Yes. Do you have to pass laws and mandates and bans?

You have to do all these things. You have to have a mass cultural awareness. You have to have everybody wake up.

This is like the social climate change of culture. So part of it is internal advocacy: having people on the inside of tech companies feel, frankly, guilty, and ask, “What is my legacy in this thing that’s happening to society?”

We work on the internal advocacy. We work on public pressure and policy.

Q: How do you work with companies, and how are they taking to your vision?

A: Doing it from the inside didn’t do anything when the cultural catch-up wasn’t there. But now in a world post-Cambridge Analytica, post the success of Time Well Spent, post more whistleblowers coming out and talking about the problem, we do have conversations with people on the inside who I think begrudgingly accept or respect this perspective.

I think that there might be some frustration from some of the people who are at the YouTubes and Facebooks of the world whose business models are completely against the things we’re advocating for. But we’ve also gotten Facebook, Instagram, YouTube, Apple and Android to launch Time Well Spent features through some kind of advocacy with them.

Q: Is there a path that you try to help map out for these companies?

A: They’re not going to do it voluntarily. But with lots of outside pressure, shareholder activism, a public that realizes they’ve been lied to by the companies, that all starts to change.

There are multiple business models — subscription is one.

Would you pay $8 a month to a Facebook that didn’t have any interest in manipulating your brain, basically making you as vulnerable as possible to advertisers, who are their true customers? I think people might pay for that.

So our policy agenda is to make the current business model more expensive and to make the alternatives less expensive.

Q: Washington is now in a huge debate about privacy and data and misinformation. Will that process deal with the causes that you care about by default?

A: I actually worry that we’re so mindlessly following the herd on privacy and data being the principal concerns, when the actual things affecting the felt sense of your life are where your time goes, where your attention goes, where democracy goes, where teen mental health goes, where outrage goes. Those things are so much more consequential to the outcomes of elections and to what culture looks like.

Those issues connected together have to be named as an impact area of technology. There has to be regulation that addresses that.

My concern about how the policy debate is going is everyone is just angry at Big Tech. And that’s not actually productive, because it’s not just the bigness that is the problem. We have to name that the business model is the problem.

Q: Don’t people have individual agency? Are we really in the thrall of tech companies and their software?

A: There’s this view that we should have more self-control or that people are responsible for whatever they see.

That hides an asymmetry of power. Like when you think, “I’m going to go to Facebook just to look at this one post from a friend,” and then you find yourself scrolling for two hours.

In that moment, Facebook wakes up a voodoo doll-like version of you in a supercomputer. The voodoo doll of you is based on all the clicks you’ve ever made, all the likes you’ve ever done, all the things you’ve ever watched. The idea is that as this becomes a better and more accurate model of you, I know you better than you know yourself.

We always borrow this from E. O. Wilson, the sociobiologist: the problem of humans is that we have Paleolithic brains, medieval institutions and godlike technology. Our medieval institutions can only stay in control of what’s happening at a slow clock rate of every four years. Our primitive brains are getting hijacked and are super primitive compared to godlike tech.

Q: Do you feel there’s awareness (within tech companies) that you wouldn’t have thought existed two years ago?

A: There has been a sea change. For four years, I was watching how no one was really accepting or working on or addressing any of these issues. And then suddenly in the last two years — because of the Cambridge Analytica scandal, because of “60 Minutes,” because of Roger McNamee’s book “Zucked.” I would have never suspected that Chris Hughes, the co-founder of Facebook, would be saying it’s time to break up Facebook.

I’ve seen an enormous amount of change in the last three years and I can only bank on the fact that the clip at which things are starting to change is accelerating. I just want to give you hope that I would have never expected so much to start changing that is now changing. And we just need that pressure to continue.