In late July, Facebook announced it had removed nearly three dozen pages that had been involved in “coordinated inauthentic behavior” intended to “mislead” users ahead of the 2018 midterm elections. The pages—which appeared across both Facebook and the company’s photo-sharing platform, Instagram—shared content that amplified positions espoused by American liberals, such as the “Abolish ICE” platform. They also advertised fake events intended to attract progressive activists, like a “Trump Nightmare Must End” rally in Times Square. (You can see samples of the content Facebook removed here.) While Facebook hasn’t yet completed its investigation into who orchestrated the effort, the company told federal lawmakers that it suspects a Russian group is behind it.

Our team at Mother Jones is keeping a close eye on the ways in which social-media platforms and lawmakers are trying to stop these sorts of massive coordinated efforts, but we know not all attempts to disseminate misinformation originate with sophisticated foreign agents. On Monday, for example, Facebook removed pages belonging to Alex Jones, the right-wing conspiracy theorist who—through his website Infowars—has been a chief propagator of false information that’s reached millions of social-media users. And while Jones may be off the platform, his imitators are here to stay. Facebook said Jones’ removal had been based on his use of “hate speech that attacks or dehumanizes others” rather than Infowars’ involvement in spreading misinformation. That reasoning builds on Facebook CEO Mark Zuckerberg’s statement last month that he wasn’t planning to ban conspiracy theorists from his company’s sites.

What Facebook’s recent actions illustrate is that misinformation comes in all shapes and sizes. Sometimes it looks like an advertisement advocating a particular position. Other examples, such as Infowars posts, look just like news articles—but they tell a story that can’t be factually verified. One of the most common characteristics of misinformation is that it’s intended to manipulate your emotions and elicit a reaction. (You can read our FAQ on spotting misinformation here.)

A few months ago, we asked you to join our new effort to track and fight disinformation ahead of the midterm elections. Many of you raised your hands, and we’re eager to start working with those of you who volunteered. Right now, however, we have a specific ask: Have you seen any social-media posts—on Facebook, Instagram, or Twitter—that seem to be sharing misleading information? If you point these out to us, we’ll investigate who’s behind them and what their interest in disseminating misleading information might be.

Share links and describe what you’re seeing in the form below, or email us screenshots at talk@motherjones.com.