It is hard to know how this might play out in a pandemic situation. But it’s worth noting that secular societies struggle with providing mechanisms to resolve feelings of guilt. The contemporary radical right has been particularly successful in encouraging its followers to redirect any societal guilt they might feel about past historical wrongs or current states of injustice into rage at those groups who would make them feel guilty: women, people of color, Jews.

In a pandemic situation we may, for instance, see a mass phenomenon of survivor’s guilt at the end of this. What could happen as a result is our being bombarded with tempting offers to rechannel our guilt into anger at those who were most affected, who serve as a reminder of our relative good luck: undocumented immigrants, the elderly, the poor, the disabled, even the dead. These ideas could even be promoted by those in power, who will no doubt be grateful for the transference of accountability. In some places, we can already see these forces mobilizing — see, for instance, arguments on the far right that discussions of Chinese culpability for the virus are being suppressed in the name of “political correctness,” or that there are groups “intentionally” spreading the virus, who must be punished.

Which brings me to my third point: To understand how such a spiral of anger and guilt might work, we desperately need to update our understanding of how internet subcultures function.

We think of radicalization in ways that are hopelessly old-fashioned: We frequently ignore or downplay mainstream or institutional complicity, and, as Becca Lewis, a researcher of technology and politics, writes, we often think of radicalization as something the radicalized passively fell into and were swept up by.

In fact, the internet — for good and for ill — is a collaborative and imaginative space, rather than somewhere one group of people talks and another listens. We can be both influencer and influenced. As Ms. Lewis put it, “audiences often demand” increasingly radical content from their preferred creators. Then, as far-right content continues to draw enormous engagement, we see the numbers, and our sense that this content is beyond the pale naturally erodes.

Internet users are an active audience, but they are also constantly in an unprecedented state of surveillance of one another. So before we have even made the decision to watch a video or read an article, our perception of it has already been altered almost imperceptibly by the various tiny signals surrounding it. Whatever social media platform you use to engage with the world, your timeline is almost certainly the greatest source of unchecked and frequently subconscious influence.

The seemingly anarchic, democratic state of affairs on the internet can be one of its chief joys — but it is also frequently an illusion. Advertisers, after all, are keenly aware of how to utilize it to their benefit. So too, increasingly, are politicians and governments. In this age of isolation, we need to be aware of how far-right actors will attempt to exploit this unprecedented situation — and we need to be prepared for the fact that it may very well work.

Annie Kelly is a Ph.D. student at the University of East Anglia in England, researching the impact of digital cultures on anti-feminism and the far right.