YouTube, Facebook and Twitter were caught flat-footed when conspiracy theories about survivors of the Parkland, Fla., school shooting were unintentionally elevated on their sites through algorithms that promote trending topics and popular content.

Why it matters: Pressure is building on social media companies to better manage the spread of misinformation during breaking news. But they're struggling to contain the problem without compromising openness, which can also help get facts out quickly.

Many of the conspiracy theories alleged that the shooting survivors who are speaking out for gun control are paid "crisis actors." A handful of right-wing blogs and Russian troll accounts have been identified as spreading the claims.

Fringe-right site The Gateway Pundit posted a story and circulated it on social media that claimed student survivor David Hogg had been coached on anti-Trump lines.

Data from the German Marshall Fund and Alliance for Securing Democracy's "Hamilton 68" dashboard, which tracks Russian propagandists on Twitter, showed Russian accounts using hashtags like "#parklandstudentsspeak" a week after the shooting.

How they responded: The tech companies took steps to remove the false attacks:

Facebook Head of Content Policy Mary deBree says the company is removing "abhorrent" images that attack the victims. Posts from conspiracy theorists received thousands of likes and shares on the platform before they were removed.

A YouTube spokesperson says conspiracy videos should never have appeared in its Trending videos tab, but "because the video contained footage from an authoritative news source, our system misclassified it." YouTube has since removed that content for violating its policies.

Twitter announced Thursday it would limit automated tweets to curb bots. It also locked thousands of accounts pending verification via phone number, causing some conservative Twitter users to lose thousands of followers, Gizmodo reports. Tweets about Hogg received thousands of retweets before being removed.

Breaking news is a prime opportunity for misinformation campaigns. All three platforms featured content from outlets known to publish misleading information as top news sources about the Las Vegas shooting in October.

Uncertainty around developing stories allows bad actors to game algorithms and avoid editorial scrutiny.

They often capitalize on emotionally driven, politically divisive stories, like mass shootings. Because sources are harder to verify in real time, bad actors can exploit users' biases about a subject, making them more susceptible to misinformation.

They also take advantage of the ability to create fake or bot accounts that can quickly amplify false stories through engagement: retweets, shares and likes.

Catch-22: As these companies get better at disseminating news, they become even bigger targets for bad actors, perpetuating the cycle of misinformation.

Experts recommend that platforms use phone numbers to verify users (like Twitter), or scrap "trending" news sections altogether.

Simon Thorpe, director of product and account security at Twilio, a communications and account-verification platform, tells Axios, "Phone numbers as usernames provide fantastic security because each number is tied to identifying information about the owner."

"Maybe it's time to talk about ending 'trending'?" says David Carroll, Associate Professor of Media Design at Parsons School of Design. Platforms were engineered to optimize engagement, he said, "but without humans making editorial decisions, trending algorithms create a dangerous feedback loop and self-fulfilling prophecy of 'popularity' based on signals of consumption not reliability."
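The feedback loop Carroll describes can be sketched in a toy simulation. This is purely illustrative, not any platform's actual ranking code: story names, engagement counts, the bot boost, and the 20% promotion effect are all invented assumptions. The point is that a ranker scoring only consumption signals, with no reliability input, lets bot-inflated content climb and then self-reinforce.

```python
# Toy model of a "trending" ranker that scores stories purely by
# engagement counts (a consumption signal), with no reliability signal.
# All numbers below are illustrative assumptions, not real platform data.
stories = {"verified report": 1000, "conspiracy post": 200}
BOT_BOOST = 500  # engagement injected by fake/bot accounts each round

for _ in range(5):
    # Bots inflate the conspiracy post's engagement count.
    stories["conspiracy post"] += BOT_BOOST
    # The ranker promotes whatever has the most engagement...
    top = max(stories, key=stories.get)
    # ...and promotion drives more organic engagement for the top item,
    # closing the feedback loop: "popularity" begets popularity.
    stories[top] += int(stories[top] * 0.2)

print(max(stories, key=stories.get))  # prints "conspiracy post"
```

After a few rounds the bot-boosted story overtakes the verified one and the promotion bonus compounds in its favor, which is the "self-fulfilling prophecy" in Carroll's framing; a human editor or a reliability signal in the score would break the loop.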

What's next? Platforms have been investing in local news to help steer users toward reliable, on-the-ground sources of information during breaking news events.