YouTube does not regularly make the mechanics of its recommendation system available to the public, and it frequently changes its algorithmic recipes. Until 2016, the system optimized for “watch time.” Now, according to a company spokesperson, it privileges attributes like “information quality” and “user satisfaction,” neither of which the spokesperson was willing to define in any detail. The lack of transparency makes independent analysis virtually impossible. Perhaps as a result, YouTube’s algorithms have earned a sinister place in the public imagination, right alongside Facebook’s News Feed.

If the recommendation system is truly a horrifyingly competent engine of radicalization, then it’s difficult to imagine Bonnell or anyone else making much of an impact. Yet the familiar narrative may be incomplete. Last fall, Kevin Munger and Joseph Phillips, a pair of political scientists at Penn State, published a corrective study of radicalization on YouTube. Using the platform’s API, which is publicly available, they examined metadata from nearly a million videos, drawn from 54 different channels. They sorted the channels into five segments: liberals (including Bonnell), skeptics, conservatives, alt-lite, and alt-right. (These last two categories distinguished between carnival barkers like Milo Yiannopoulos and white supremacists like Richard Spencer.) Munger and Phillips found that while overall viewership in all five categories has boomed in the past decade, viewership of alt-lite and alt-right channels has actually declined since mid-2017. The highest growth, by far, occurred in the conservative category, which includes mainstream commentators like Ben Shapiro.

In seeking to explain their results, Munger and Phillips eschew the “radicalization by algorithm” hypothesis. Instead, they propose a “supply and demand” framework. YouTube, they point out, has an unprecedented ability to match “radical alternative political canons” with the communities that are prone to be persuaded by them. It allows these underserved audiences to begin “consuming media more consistent” with their true beliefs and sentiments. So while the platform may well facilitate the spread of radical ideas, it does not implant them into the minds of unsuspecting viewers. What it does do, Munger and Phillips write, is afford radicalized viewers a sense of community and shared purpose that they struggle to find in their ordinary lives.

That the far right has been able, however artificially, to fulfill these needs for thousands of people—mostly white men—is what makes this phenomenon genuinely dangerous. Scott Atran, a widely respected anthropologist who studies terrorism, religion, and international conflict, has written about the similarities between the far right in America and violent extremists in the Muslim world. In both groups, Atran says, the ability to divert or deradicalize someone “depends on where along the path to radicalization” they are. Earlier on in the process, he says, various forms of persuasion—an income, a prison sentence, a supportive community—“might do the trick.” But if the person has bought into the radical group’s “sacred values,” the beliefs they will not compromise for anything (like, say, ethnic purity or racial supremacy), then it becomes vastly more difficult to deter them.

Bonnell shares none of the far right’s values, sacred or otherwise, but he is uniquely positioned to intervene. In 2018 the Data & Society Research Institute published a report charting the relationships between some of YouTube’s popular political voices. There, in a visualization on the report’s 11th page, lodged above the men’s rights activist Stefan Molyneux, pinched between the anti-immigrant pundit Lauren Southern and the self-proclaimed “disaffected liberal” Tim Pool, is Destiny. Bonnell has entangled himself, like a gadfly, in a web of contrarianism and derangement.

To the extent that Bonnell manages to deter radicalization and make people think critically, then, it is not because he has hijacked YouTube’s recommendation algorithms but because he knows the cultural norms that the far right trades in. If you’re someone who has succumbed to reactionary politics online, you’ll see in Bonnell a kindred spirit—a college dropout from Nebraska who scoffs at political civility, revels in seamy, self-referential humor, and will talk openly about literally anything. And, perhaps most important, you’ll see someone who has spent years cultivating a community that is more likely to forgive your past indiscretions than to shame you for them.


It’s more or less impossible for Bonnell to measure how effective he has been, so he grudgingly relies on intuition. After any debate, he spends a great deal of time scouring the various forums of the internet—Reddit, 4chan, comment threads on YouTube or Facebook—in search of minds perturbed. In doing so, he has noticed a common formulation of doubt among viewers, which he generalizes as, “You know, I normally really like Figure X and I think Destiny is a fucking idiot, but I don’t think Figure X responded well to what he said.”