Posts that are deemed false won't be removed, but they won't appear in the Explore tab or in the hashtag result pages, either. "Our approach to misinformation is the same as Facebook's — when we find misinfo, rather than remove it, we'll reduce its distribution," a spokesperson told Poynter.

Since the US midterm elections, Instagram has reportedly been working with Facebook's News Feed Integrity team. When images containing false information are found on Facebook, the company's image recognition technology can search for them on Instagram as well. Of course, plenty of false posts appear on Instagram but not on Facebook, and vice versa. To catch those, questionable Instagram posts will now be flagged and sent directly to Facebook's fact-checkers.

According to Poynter, Instagram is also considering adding pop-ups that appear when people search for misinformation, such as anti-vaccine content. Still, some say that's not enough: they want labels on photos that have been debunked, or warnings that appear when users try to like or comment on those posts.

This news comes less than a week after Facebook and Instagram banned far-right extremists like Alex Jones for promoting or engaging in violence and hate. While Instagram has been spared some of the content problems Facebook has seen, it's clear the company is looking to take a stronger stance against hate and misinformation across both platforms.