Tech Platforms Love Moving Fast — Except When Their Users Are in Trouble

The CEOs of YouTube and Twitter are being far more cautious in addressing the problems of their platforms than they were in building them

Jack Dorsey and Susan Wojcicki seem to have little in common, aside from their jobs—running two of the biggest platforms on the internet.

Wojcicki, CEO of YouTube, comes across as an accomplished, even-keeled professional who blends in and shuns the spotlight. A Harvard graduate from a distinguished Silicon Valley family, she took over YouTube after successful stints at Intel and Google. A rare profile of her in the New York Times called her “the most measured person in tech.”

Dorsey, Twitter’s co-founder and CEO, is an unkempt idealist who throws himself into wellness trends, mindfulness retreats, and facial grooming experiments. An NYU dropout who once aspired to be a fashion designer, he helped start Twitter after he was rejected for a job at a shoe store. His onstage interview at the annual TED conference was only the latest in his ongoing tour of the media circuit.

Although the CEOs are quite different, their recent media appearances revealed they are grappling with the same kinds of problems on their respective platforms — and taking a strikingly similar approach to addressing them. This approach could be charitably described as deliberate, even philosophical, and that deliberation stands in stark contrast to the breakneck speed at which the platforms were built, grew, and continue to operate.

Both Twitter and YouTube are rife with abuse, harassment, misinformation, manipulation, and every manner of extremism. One platform is stalked by armies of human trolls and bots lobbing death threats at scale. The other sucks users through wormholes to parallel realities where Earth is flat, vaccines cause autism, and radical ideologies are the cure for society’s ills. Both struggled to contain the viral circulation of livestreams of the New Zealand massacre.

Neither CEO is blind to this — not anymore, at least. The Times found Wojcicki reflecting on a staff meeting she called to address the appearance on YouTube of bestiality images alongside children’s content. At TED, Dorsey sounded more like one of Twitter’s many critics than the company’s CEO: “It’s a pretty terrible situation when you’re coming to a service where ideally you want to learn something about the world, and you spend the majority of your time reporting abuse, receiving abuse, receiving harassment,” he said.

Both executives acknowledged that they had failed to anticipate the severity of the problems their platforms would create as they grew. This explanation has started to feel more like an excuse in the tech world. No doubt it’s true that the people who built the platforms that now dominate the global flow of information underestimated how their creations could be misused. But at this point, intentions are irrelevant. What matters is how they respond. And to judge by their recent words and their actions until now, Wojcicki and Dorsey are responding very slowly.

It’s true that both companies have begun hiring more human reviewers and using machine learning to flag potential policy violations, with YouTube focusing more on problematic content and Twitter on patterns of user behavior. But neither has undertaken fundamental changes to its algorithms or the structure of its platform, and both have used freedom of speech as an excuse for inaction.

This timid approach seems to come from the top. A Bloomberg feature in April painted Wojcicki and her team as obsessed with growth and engagement metrics, even as YouTube’s problems mounted. When her employees pressed her to address problematic content, Wojcicki reportedly demurred, saying it wasn’t her job to decide what users could say or see. In the Times profile, Wojcicki took issue with that characterization, pointing to the sheer complexity of content moderation. “It’s not like there is one lever we can pull and say, ‘Hey, let’s make all these changes,’ and everything would be solved,” she said. “That’s not how it works.”

The Times’ Daisuke Wakabayashi sat in on a meeting in which Wojcicki and her deputies pondered — in painstaking depth — the proper response to a potentially dangerous video called the “Condom Challenge.” They ultimately decided not to remove it. While more sympathetic than Bloomberg’s portrayal, the Times profile concluded that Wojcicki’s “deliberate style may be at odds with the pace and scale of horrors and just plain stupidity that relentlessly arises on YouTube.”

That will sound awfully familiar to anyone who has followed Dorsey’s handling of Twitter’s ongoing troubles with bots, neo-Nazis, and widespread harassment of women and people of color. While Dorsey has been admirably forthcoming about Twitter’s shortcomings, he rarely matches his frank words with decisive actions. An official update on Twitter’s progress toward its goal of healthier interactions showed mostly incremental improvements, such as a 16 percent decline in reports of abuse by users whom the victim doesn’t follow. The company is justifiably proud that it now proactively flags 38 percent of the abusive content on which it takes action, but the bigger surprise is that it was doing none of this work as recently as a year ago. Until 2018, more than a decade after the company was founded and years after the scale of the abuse it enabled had become clear, Twitter placed the entire burden of reporting on its users.

A year ago, I praised Dorsey for “rethinking everything” and admitting he didn’t have all the answers at a time when Facebook was acting as though it did. That’s the first step toward reshaping a platform around the understanding that algorithms will always be gamed, anonymity can shield the hateful, and not all engagement is good. I still believe Dorsey wants to build a better Twitter. To that end, the company has begun looking for healthier metrics to prioritize, and has rolled out a beta-test app called Twttr to test new ideas. So far, it has tested features such as Reddit-style reply threads, hiding replies that may be abusive or spammy, and making like counts and retweet counts less readily visible. At TED, Dorsey suggested that the entire concept of likes and follower counts may be incentivizing the wrong sorts of interactions.

The Times profile finds Wojcicki waxing similarly philosophical, albeit perhaps partly to counter her mercenary image, which began to take shape in Bloomberg’s exposé. “One way I think about some of the decisions is putting myself in the future and thinking: in five or 10 years, what will they say?” Wojcicki said. “If someone were to look back on the decisions that we’re making, would they feel we were on the right side of history? Would I feel proud? Will my children feel like I made good decisions?”

These are the kinds of questions a reflective political leader might ask, and they underscore the sheer power these CEOs wield—not always willingly. Taking action will help only if it’s the right action, and caution is warranted when making changes to platforms this important — and not just to shareholders.

But posing the right questions now does not excuse the failure to ask them years ago, and it does not excuse these companies’ ongoing inability to devise solutions proportional to the problems they’ve created. It’s hard to shake the sense that these companies and their leaders were far less risk-averse regarding features that might help them grow, make money, and demolish competitors than they are when it comes to making sure their creations don’t ruin people’s lives.

TED’s Chris Anderson kept pounding this point in his interview with Dorsey. Calling the CEO the “captain of the Twittanic,” Anderson told Dorsey, “There are people on board in steerage” who are saying, “‘We’re worried about the iceberg ahead!’ And you go, ‘That is a powerful point’ and ‘Our ship frankly hasn’t been built properly for steering as well as it might.’ And we say ‘Please do something!’ And you go to the bridge, and we’re waiting, and we look in, and you’re showing this extraordinary calm — but we’re all standing outside saying, ‘Jack, turn the fucking wheel!’”

One explanation for why Dorsey and his counterparts have been reluctant to turn the wheel is that it’s not obvious what alternate course they could chart, given the vessels they have built. Hate, bullying, violence, and extremism are endemic in social media, as they are in the real world, and it’s probably impossible to eradicate them on a platform of YouTube or Twitter’s scale. The hard question these companies face is how to contain them, and avoid exacerbating them, without destroying a business model that’s premised on automation. The answer almost certainly involves forgoing some growth and profits and paying more humans to moderate content, which can be a tough sell to investors.

Perhaps the deeper reason these leaders have been reluctant to make those sacrifices is that they don’t actually see real-world harm as an iceberg. Facebook has livestreamed murders, suicides, and now a terrorist massacre, yet Mark Zuckerberg is still CEO and the company is posting record profits. The unintended consequences of these platforms’ flaws may be fatal to others, but not to the companies or their leaders.

For tech companies and their CEOs, the real iceberg is stagnant user growth, declining revenues, and shareholder panic. Confronted with Anderson’s metaphor, Dorsey reminded the audience that when he took over the company, his first job was to steer it away from financial ruin and irrelevance. He succeeded because he didn’t dither. He acted.

If Facebook’s former unofficial motto of “move fast and break things” summed up the blitzscaling mentality that gave rise to modern social media, with all its power and problems, it might make sense that the antidote would be to “move slowly and fix things.” The problem is, the platforms are still breaking as rapidly as ever, and only the fixing is moving slowly.