In the immediate aftermath of the Parkland shooting, the breakneck spread of conspiracy theories made one thing abundantly clear: tech companies are no longer the masters of their own algorithms. Misinformation flourished even as YouTube, Facebook, and other companies scrambled to play catch up—a phenomenon that appears to have been spurred by more than outraged, ignorant users. As the Daily Beast reports, members of at least one far-right group crafted a strategy to game YouTube’s Trending tab into promoting false videos, seizing on a soft target with a boldness that in the past has been restricted to the darkest corners of the Internet.

While the far right has been active in its own echo chamber for decades—think obscure subReddits, 4chan, and 8chan—its push into mainstream platforms is relatively new. “It was only when they started to insert themselves into other online spaces that the far right started enjoying more success,” George Hawley, a political-science professor at the University of Alabama and author of Making Sense of the Alt-Right, told me. At first, these efforts mostly targeted the comment sections of news stories, which “allowed them to be seen by a large number of people.” When outlets started to crack down on comments, however, these groups migrated to places like Twitter and YouTube. “Twitter was even more valuable for the far right, as it allowed anonymous users to directly interact with public figures and spontaneously launch semi-coordinated trolling campaigns,” Hawley said. YouTube, too, is important, allowing users who may not be seeking out right-wing content to stumble upon it organically.

More pernicious than the far right’s occupation of YouTube, however, is its means of manipulating the platform to spread its messages. In the case of the Parkland shooting, the far-right group Reconquista Germanica used networks of fake accounts to manipulate YouTube’s algorithms, strategically upvoting and downvoting videos in an attempt to push those it favored up in the platform’s search results, while burying videos it disagreed with so that they took longer to find or were ignored altogether. “We can push our own videos through likes and comments, through the organization we’ve created, so that they are rated more relevant by YouTube’s search algorithm,” one Reconquista Germanica member said in German, according to screenshots tweeted by the anti-far-right group Alt Right Leaks. Reconquista Germanica’s chats took place on Discord—a messaging platform originally intended for gaming, which the far right has embraced. (Discord, for its part, is reportedly shutting down some of its far-right servers.)

“Most of this is less new than people think it is,” extremism expert J.M. Berger told me, explaining how the far right has learned to emulate tactics used for years by spammers, Russian hackers, and even the Islamic State. At the same time, it is hard to deny that Donald Trump gave them a boost. “The right-wing resurgence we’re seeing now is not just astroturfing, but the result of several years of work by far-right activists, culminating in the rise of a candidate who was willing to overtly pander to white nationalists and other right-wing extremists. The election of President Trump has done more to mainstream white nationalism than anything in the last 40 years, but it’s a symbiotic arrangement. He elevates their issues, and they organize social-media campaigns to protect and elevate him.”

The other piece of the puzzle, of course, is tech companies’ inconsistent responses to far-right activity. What came out of the group’s occupation of certain shady corners of the Internet, said Ryan Lenz, a senior writer for the Southern Poverty Law Center, was “not just ideas and ideologies, but a pattern of behavior and an understand[ing] of how to use the online space. You cannot deny that serious players in the alt-right emerged from what was a troll culture,” he added, a fact that makes the group’s tactics alarmingly effective in a market engineered to reward shock value.

With their platforms under siege, tech companies are struggling to reimagine the algorithms on which they’re based; Facebook recently announced changes to its News Feed that will de-emphasize posts from media outlets, while a source familiar with YouTube’s Trending tab told me last week that YouTube is “working to improve the applications of [its] policies to ensure that videos containing hoaxes both in the video and in the title and description do not appear in the Trending tab again.” But a simpler solution, said Lenz, would be for mainstream tech platforms to send a clear signal to malignant operators. Lenz pointed to Medium, which recently banned far-right figures like Mike Cernovich, Jack Posobiec, and Laura Loomer. “We do not allow calls for intolerance, exclusion, or segregation based on protected characteristics, nor do we allow the glorification of groups which do any of the above,” the company’s new rules specify. “Amid this rise of right-wing ideologies,” said Lenz, “tech [companies] are saying, ‘Our platforms are . . . not a place to propagate racist messaging.’”