Ask almost anyone: Our brains are a mess and so is our democracy.

In the last several years, we’ve seen increased focus on the crimes of the “attention economy,” the capitalist system in which you make money for enormous tech companies by giving them your personal life and your eyes and your mind, which they can sell ads against and target ads with. The deleterious effects of, say, the wildfire sharing of misinformation in the Facebook News Feed, on things of importance at the level of, say, the sanity of the American electorate, have been up for debate only insofar as we can bicker about how to correct them or whether that’s even possible.

And as such, we’ve seen wave after wave of tech’s top designers, developers, managers, and investors coming forward to express regret for what they made. “These aren’t apologies as much as they’re confessions,” education writer Audrey Watters wrote when she described one wave of the trend in early 2018. “These aren’t confessions as much as they’re declarations — that despite being wrong, we should trust them now that they say they’re right.”


She termed this phenomenon “The Regret Industry,” a nod to the fact that many of these apologists manage to, while apologizing, find ways to book lucrative speaking gigs and secure book deals. At The Goods, we’ve decided to name the hallmark of the more recent iteration — the big, bold, written apology, which can be either in the first person or as a third-person profile — the “regreditorial.” (We also considered “oops-eds.”)

To be clear: Not every tech apology is the same, and any former tech star is allowed to apologize for mistakes, as well as to provide helpful information to those trying to undo said mistakes.

But there’s a spectrum ranging from useful proposals and frank factual statements to regret that centers the self and ignores the fact that much of the damage the speaker is copping to has already been widely reported and discussed by the people it affects more directly. That type of regret can telegraph superior intellect even while admitting to something destructive, and can have unmistakable whiffs of opportunism.

Here, a brief chronological timeline of recent tech regret:

July 2019, former Twitter developer Chris Wetherell

The headline for this regreditorial was, “The Man Who Built the Retweet: ‘We Handed a Loaded Weapon to 4-Year-Olds.’” This BuzzFeed profile of developer Chris Wetherell, who was hired at Twitter in 2009, describes the retweet as “the button that ruined the internet.” It considers the role the retweet played in amplifying Gamergate in the middle of the decade — the event that revealed to many people for the first time just how violent the internet could be and made it almost impossible to have a rational conversation online.

Wetherell tells the reporter, “I remember specifically one day thinking of that phrase: We put power in the hands of people. But now, what if you just say it slightly differently: Oh no, we put power in the hands of people.” His suggestion is not that Twitter get rid of the retweet button but that it consider a system for disabling retweets from and of “audiences” that regularly misbehave.

May 2019, Facebook co-founder Chris Hughes

This regreditorial appeared in the opinion section of the New York Times. It begins with Facebook co-founder Chris Hughes describing the last time he saw Mark Zuckerberg, sometime in the summer of 2017: “When the shadows grew long, I had to head out. I hugged his wife, Priscilla, and said goodbye to Mark.”

In it, Hughes, who left Facebook in 2007 to volunteer for Barack Obama’s presidential campaign and later briefly owned the New Republic, argues that Facebook should be broken up by the Federal Trade Commission under antitrust law. This suggestion has also been put forth by presidential candidate Elizabeth Warren and a slew of academics. (The FTC announced an investigation into Facebook’s arguable monopoly power this week.)

But Hughes’s op-ed caused a stir because of his extremely high-up former position at Facebook, and because of the nature of his personal relationship with Mark Zuckerberg. While Hughes calls his estranged friend a “good, kind person,” he also says, quite chillingly, “Mark’s influence is staggering, far beyond that of anyone else in the private sector or in government,” and gives this as a major reason for the FTC to intervene.

April 2019, former Google engineer Tristan Harris

Arguably the highest-profile of tech’s regretters, Tristan Harris has widely been called the “conscience” of Silicon Valley, and at times even an “unlikely revolutionary.” When Harris worked at Google, he distributed a manifesto urging his employer to stop stealing users’ attention with irritating push notifications and other urgency-implying design that invades their brains and ruins their concentration.

I wrote about his rise as a major figure in the public battle against the ills of the attention economy in April, when he cohosted a summit about what he and his Center for Humane Technology co-founder Aza Raskin call “human downgrading.” (That’s their term for the way addictive technology has manipulated our emotions to the point where we are less functional as human beings.)

Alongside this summit, he was the subject of a laudatory profile in Wired, which described him as “Part Don Draper, part Carrie Mathison, and part John Nash as portrayed by Russell Crowe” — in other words, the lonely intellectual piecing together a world-saving theory in a race against the clock. When I asked Harris about it, he called it “an example of the attention economy at work” and said that people respond better to hero narratives.

November 2018, former Google engineer Guillaume Chaslot

Guillaume Chaslot helped build YouTube’s recommendation algorithm. Although he has not penned a buzzy op-ed or participated in a big profile, he has been in and out of the spotlight because of his website “Algo Transparency” and multiple viral Twitter threads decrying the way the YouTube recommendation algorithm has accelerated the spread of conspiracy theories and other misinformation. In November 2018, he tweeted about the role YouTube played in the resurgence of the flat earth conspiracy theory, arguing that Google itself does not even truly understand the impact of the artificial intelligence it uses to recommend videos to users.

“The real danger of AI is not that it becomes ‘self-conscious.’ It’s that we trust it too much and slowly become imbeciles. … Flat-earthers are the canaries in the coal mine,” Chaslot tweeted on November 20, 2018.

“With AI in charge of our information, we’re facing a brand new, existential problem that concerns all of us,” he wrote. “From the algorithm’s point of view, flat earth is a gold mine.”

December 2017, former Facebook VP of user growth Chamath Palihapitiya

A former VP of user growth at Facebook, Chamath Palihapitiya came out against the company in December 2017 to say that it was “ripping apart the social fabric of how society works.” He was giving a talk at the Stanford Graduate School of Business and expressed “tremendous guilt” about the “short-term, dopamine-driven feedback loops” he had helped create.

He also said that Facebook “overwhelmingly does good in the world,” even after strongly implying that the company had eroded civil discourse and amplified misinformation and “mistruth,” calling it a “global problem.” He argued that venture capitalists are largely to blame because of their tendency to fund companies that can grow like cancers — “shitty, useless, idiotic companies” — and their disinterest in unsexy companies that address actual human problems. He now runs a VC firm called Social Capital, which has funded companies like Slack, SurveyMonkey, and Bustle.

November 2017, former Facebook operations manager Sandy Parakilas

Sandy Parakilas was a platform operations manager at Facebook in 2011 and 2012 who says he repeatedly warned higher-ups at Facebook about the site’s vulnerabilities to data theft — warnings that, had they been heeded, he believes may have prevented the Cambridge Analytica scandal. He wrote an op-ed in the New York Times in November 2017 that begins, “I led Facebook’s efforts to fix privacy problems on its developer platform in advance of its 2012 initial public offering. What I saw from the inside was a company that prioritized data collection from its users over protecting them from abuse.”

He was later profiled in the Guardian and has regularly argued for government intervention to regulate Facebook’s data collection and privacy policies, saying, “Nothing less than our democracy is at stake.” (Facebook was fined $5 billion by the Federal Trade Commission this month and is now required to submit privacy reviews of any new products or services to third-party assessors.)

“[Facebook has] a business model that is going to push them continuously down a road of deceiving people,” he told NPR shortly after. “It’s a surveillance advertising business model.” He now works on privacy at Apple.

November 2017, Napster founder and early Facebook investor Sean Parker

Sean Parker, the Napster founder notoriously played by Justin Timberlake in the 2010 David Fincher movie The Social Network, was an early investor in Facebook who made $2 billion when the company went public in 2012. He spoke at an Axios event in Philadelphia in November 2017 and put forth the opinion that not only did Facebook design an addictive product, it did so on purpose:

“It’s exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology. The inventors, creators — it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people — understood this consciously. And we did it anyway.”

He did not totally take responsibility for this and called himself a “conscientious objector” to social media. He still uses social media, though, so it wasn’t entirely clear what this meant.

October 2017, a whole lot of people

In a feature called, “The tech insiders who fear a smartphone dystopia,” the Guardian followed the careers and remorse of former Google, Twitter, and Facebook employees who had pivoted to become “refuseniks alarmed by the race for human attention.”

This included Justin Rosenstein, who created Facebook’s “Like” button and disavowed its “bright dings of pseudo-pleasure.” His coworker Leah Pearlman, who helped him build the “Like” button, said she had started using a browser plugin to disable her Facebook News Feed.

Loren Brichter, who invented the “pull-to-refresh” functionality for the Twitter feed, said, “I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all.” The venture capitalist Roger McNamee — an early investor in Google and Facebook — turned on his investments, saying, “Facebook and Google assert with merit that they are giving users what they want. The same can be said about tobacco companies and drug dealers.” Ex-Google advertising strategist James Williams referred to the commercial social web as, “The largest, most standardized and most centralized form of attentional control in human history.”

Chris Marcellino, one of two engineers credited with inventing push notifications for Apple in 2009, said he had quit Silicon Valley entirely and was studying to be a neurosurgeon. From studying brains, he said, he’d learned that he was exploiting “the same circuits that make people seek out food, comfort, heat, sex,” but ultimately didn’t think it was a crime. “It is not inherently evil to bring people back to your product. It’s capitalism,” he told the publication.

June 2016, former Facebook product manager Antonio Garcia Martinez

Antonio Garcia Martinez left Facebook’s ad-targeting team in 2013 and became a best-selling author and sought-after pundit when he published Chaos Monkeys: Inside the Silicon Valley Money Machine in 2016. That summer, he published a piece in Wired titled, “I Helped Create Facebook’s Ad Machine. Here’s How I’d Fix It.”

First, he describes his paternal protectiveness of Facebook, writing that negative press coverage from “the usual tech journalist peanut gallery” made him feel like “a father watching his son get bullied in a playground for the first time,” and excerpting his internal monologue: “How can this perfect, innocent creature get assailed by such ugliness?”

Martinez says that, as he was writing the piece, Mark Zuckerberg addressed the problems with Facebook’s advertising platform and promised to fix them, and he agreed with all of the solutions Zuckerberg proposed. The only quibble he had was with Facebook’s reluctance to acknowledge itself as a global power. “The company needs to put on its big-boy pants and assume its place on the world stage,” he concluded.

August 2014, former ad-tech designer Ethan Zuckerman

The author of one of the first regreditorials, former ad-tech designer Ethan Zuckerman wrote a long piece for the Atlantic in August 2014 that is largely thoughtful and frank. In it, he refers to advertising as “the original sin of the web” and takes credit for helping invent the pop-up ad in the late 1990s:

The fiasco I want to talk about is the World Wide Web, specifically, the advertising-supported, “free as in beer” constellation of social networks, services, and content that represents so much of the present-day web industry. … Once we’ve assumed that advertising is the default model to support the Internet, the next step is obvious: We need more data so we can make our targeted ads appear to be more effective.

Zuckerman is now director of the MIT Center for Civic Media and is widely known as a privacy activist.

Some people are better at taking responsibility than others. I would argue that the platonic ideal of the addictive-tech creator apology came just before Valentine’s Day in 2014, when developer Dong Nguyen voluntarily pulled his extremely lucrative game Flappy Bird from the App Store. He said the game was designed to be relaxing, but he had accidentally made it too frustrating, and he felt bad about the average $50,000 per day he was making from hooked users. “I think it has become a problem,” he said. “My life has not been as comfortable as I was before. I couldn’t sleep.”

Of course, Mark Zuckerberg rolling out of bed after a poor night’s rest and easing his anxiety by deleting Facebook would cause total chaos and probably global economic disaster, which makes it a woefully insufficient fantasy.
