We already know how dangerous this cycle of radicalization can be, because similar mechanisms have fed Islamist terrorism in recent years. Anwar al-Awlaki, the cleric who communicated with the 2009 Fort Hood shooter and coached a young man to try to blow up an airliner over Detroit, left a digital footprint that survived on YouTube for years after his assassination by an American drone strike in Yemen. Videos of his sermons, even anodyne history lectures or self-help coaching, were always popular, thanks to his pleasant voice and serious demeanor. Now they also have a martyr’s allure.

If a viewer clicked on the cleric’s earlier, gentler talks, YouTube’s algorithms would point the viewer to one of his later sermons, like one describing why it’s a Muslim’s duty to kill Americans. Dzhokhar Tsarnaev, one of the two Boston Marathon bombers, tweeted approvingly about Mr. Awlaki’s lectures. Chérif Kouachi, one of the shooters who killed 12 people at the Paris offices of the magazine Charlie Hebdo in 2015, name-dropped Mr. Awlaki in a phone interview with a reporter before being shot by the police. In death, as in life, Anwar al-Awlaki’s words inspired lonely, disturbed, or disaffected young men to kill.

By 2017, YouTube began to rethink its policies, and now all of Mr. Awlaki’s material — unless presented as news commentary or in a critical context — is banned from the platform. Facebook has long banned all of Mr. Awlaki’s videos. Both avow a commitment to combat hate speech, extremism and misinformation.

But platforms have been more tentative in dealing with the kind of right-wing extremism that focuses on white supremacy. Although organizations like the Anti-Defamation League and the Center for Strategic and International Studies provide information about these groups, official government sources are still crucial if there is to be an effective crackdown. Vast federal resources, for example, went into identifying the networks around Mr. Awlaki, who has been on a designated terrorist list since 2010.

But the government does not officially designate domestic terrorist organizations. The Trump administration has reduced or eliminated modest programs begun under President Barack Obama to counter violent extremism and deter recruitment, including among white supremacists. Mr. Trump has focused on Islamic extremism to the exclusion of other threats. Federal agencies do not even have common definitions of “domestic terrorist” and “domestic terrorism.”

Tech companies often draw on government lists to police their platforms for violent extremism. YouTube, for example, has long prohibited designated terrorists from having their own channels. For years, Facebook has banned the praise or support of organizations deemed dangerous or violent — a list at least partly informed by governments. (Facebook claims that it does not heavily rely on government lists.) Both platforms, along with Twitter and other technology companies, use a shared database of terrorist content — coordinated through the nonprofit Global Internet Forum to Counter Terrorism — to help take down extremist content faster. But what the forum can identify depends on what information official organizations have gathered about extremism.