In 2014, Elliot Rodger went on a shooting and stabbing spree, killing six and injuring 14 at the University of California, Santa Barbara. Rodger was a self-proclaimed “incel” (short for involuntary celibate)—a group of young men who feel furious at their perceived rejection by women and meet online to discuss and spread their ideology. Their toxic misogyny fuels a hatred for women that has led to several recent incidents of mass violence, with many incels citing Rodger’s own disturbing manifesto as an inspiration.

The authorities are taking note. Last month, the Texas Department of Public Safety released a report finding that incels “are an emerging domestic terrorism threat as current adherents demonstrate marked acts or threats of violence in furtherance of their social grievance.”

Now a group of computer scientists has painted the most complete picture yet of the misogynistic groups that fuel the incel movement online.

The “manosphere,” as it is known, is divided into four broad groups. “Men’s rights activists” (MRAs) claim that family law and social institutions discriminate against men. “Men going their own way” (MGTOW) take this feeling of grievance further, arguing that society can’t be “amended”; they often avoid women, blaming them for their problems. “Pick-up artists” (PUAs), meanwhile, date and harass women; they believe society is “feminizing” men.

And then there are the incels, potentially the most violent of these groups. Incels abide by the “black pill,” a belief that women use their sexual power to dominate men socially. For that, incels want revenge.

The team’s analysis found that the manosphere is evolving—and fast. Over the past 10 years, the population of men identifying as men’s rights activists and MGTOW—traditionally older and less violent—has been falling, while the younger, more toxic PUA and incel communities have seen a spike.

Worryingly, it seems that there has been a significant migration from men’s rights groups to incel groups. Every year since 2015, around 8% of MRA or MGTOW members appear to have become more radicalized and joined incel groups online.

“The older [groups] are dying off,” says coauthor Jeremy Blackburn, an assistant professor at Binghamton University.

Indeed, it seems that not only are the older, less violent groups dying off, but membership in the more violent groups is becoming more toxic. To measure the level of hate being espoused by these groups, the team used a machine-learning tool developed by Google, called Perspective, which scores text for abusive language. It produces a “toxicity score” that gives an idea of how much hate speech is being used in the forums.
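To give a sense of how such scoring works in practice, here is a minimal Python sketch of the request and response shapes used by Google's public Perspective API (the `comments:analyze` endpoint). This is an illustration of the general technique, not the study authors' actual pipeline; the sample text and the canned response below are hypothetical stand-ins, since a live call would require an API key.

```python
import json

# Public Perspective API endpoint (a real HTTP POST would also need
# an API key appended as a query parameter, e.g. ?key=YOUR_KEY).
API_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"

def build_toxicity_request(text: str) -> dict:
    """Build the JSON body Perspective expects for a TOXICITY score."""
    return {
        "comment": {"text": text},
        "requestedAttributes": {"TOXICITY": {}},
    }

def extract_score(response: dict) -> float:
    """Pull the summary toxicity probability (0.0-1.0) out of a response."""
    return response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

# A canned response in the documented shape, standing in for a real call.
fake_response = {
    "attributeScores": {
        "TOXICITY": {"summaryScore": {"value": 0.92, "type": "PROBABILITY"}}
    }
}

payload = build_toxicity_request("example forum post")
print(json.dumps(payload))
print(extract_score(fake_response))
```

Averaging these per-post probabilities over a forum's history is one straightforward way to track whether a community's language is growing more or less toxic over time.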