Instagram could become a new platform for the sharing of disinformation around the 2020 election because of the way propagandists are relying on images and proxy accounts to create and circulate fake content, leading social intelligence experts tell Axios.

The big picture: "Disinformation is increasingly based on images as opposed to text," said Paul Barrett, the author of an NYU report that's prompted a renewed look at the problem. "Instagram is obviously well-suited for that kind of meme-based activity."

How it works: A false claim that the Odessa, Texas, shooter was a Beto O'Rourke supporter first appeared as a tweet from a far-right account called @UncleSamsChild, which has nearly 30,000 followers.

Screenshots of the tweet were quickly shared on Instagram by proxy accounts for @UncleSamsChild. The group's own Instagram account has zero posts, presumably because its content was taken down for violating Instagram's rules.

The group's hashtag #UncleSamsMisguidedChildren appears on over 31,000 posts, suggesting a healthy following on Instagram despite the empty account.

The upshot: a tweet, repackaged to look like an Instagram post, that was also shared by various people on Facebook, all as images and all by accounts other than the main disinformation culprit, @UncleSamsChild.

Why it matters: This makes it harder for platforms to enforce their rules, remove content and suspend accounts.

What they're saying: Jonathan Albright, director of the Digital Forensics Initiative at the Tow Center for Digital Journalism, said the Odessa example was the first time he's seen this proxy account strategy at work.

"They're keeping their main account and brand and being careful not to violate policies or get that account suspended, and using other proxy accounts to share screenshots. They still get the impact but not the attribution back to the main account, so they're circumventing the rules."

"Instagram isn’t built for virality in the same way as other platforms, so it does require other kinds of ingenuity to abuse the platform in other ways," said Joan Donovan, director of the Technology and Social Change Research Project at the Shorenstein Center.

Why Instagram matters: It's an engagement powerhouse that attracts far younger users than its parent company, Facebook.

And it drove more engagement with Russian disinformation in 2016 than Facebook, according to the NYU report.

In a statement, Instagram said: "We know that our adversaries are always changing their techniques so we are constantly working to stay ahead."

Experts say the tactics of people looking to spread disinformation around the 2020 election have grown savvier since 2016, making them harder to crack down on.

"No kind of competent information operation will be single-platform," said Ben Nimmo, head of investigations at Graphika, a social-media analytics company.

He studied one Russian operation that targeted the U.S. and spread false content across more than 30 different social networks and blogging platforms.

So now researchers are focused on the behavior online — not just specific platforms — when trying to identify and get ahead of disinformation.

They're watching how accounts use "backup" proxy accounts to share information, and whether the hashtags accounts use are incongruous with the content they post.

Researchers are also keeping an eye on activities like sharing celebrity gossip to build an audience and then pivoting to political content as an election nears.

What to watch, per Nimmo: "The more big platforms are cracking down on stuff on their platforms, the more they’re forcing the bad actors to look elsewhere."