In October 2018, a cellphone video of a Brooklyn woman calling 911 to claim a 9-year-old black boy grabbed her rear went viral on social media, becoming one of a series of videos that activists and journalists seized upon as an example of the everyday racism faced by minorities in America.

That woman became known as “Cornerstore Caroline.” Other individuals who became the subjects of similar stories would be known by nicknames such as “Basketball Becky” or “Taco Truck Tammy.”

Now, Clemson University researchers have found those videos received instrumental early social media promotion from inauthentic accounts, some of which have since been removed by Twitter and linked by U.S. intelligence to Russia’s efforts to stoke racial tensions in America.

The Russia-linked accounts adopted fake black personas, from phony profile photos to style of speech, to infiltrate an existing organic online movement that calls out and shames acts of alleged racial prejudice.

The tweets by the suspicious accounts drew 50 to 90 percent of the initial retweets before the stories took off, an indication that their content played a leading role in drumming up attention, according to the researchers' pre-publication findings.

Darren Linvill, an associate professor of communications, and Patrick Warren, an associate professor of economics, both at Clemson, identified more than 300 tweets from almost 30 suspicious Twitter accounts that appeared to look for and promote videos of racially tense incidents.

“Fundamentally, what they’re trying to do is put a spotlight and rub salt in the wound of these divisive issues,” Linvill told NBC News, referring to the strategy of using social media to exacerbate racial tensions.

The accounts reposted or tweeted the videos, framed the content as explicit discrimination, used emotionally loaded language, and employed calls to action, including encouraging people to publish the participants’ personal information — a strategy known as “doxxing.”

The October 2018 video was promoted by @BLK_Hermione, an account that has since been suspended for platform manipulation, according to a Twitter spokesperson. The account encouraged others to retweet the video until someone recognized the woman. Nine minutes later, the account tweeted the woman’s phone number, among the first to do so online.

The coordinated activity began in early 2018 and appeared to peak in the run-up to the midterm elections later that year, the researchers found.

“It’s clear in several examples that some of these stories would never have gone viral without the influence of Russian disinformation,” Linvill said.

Three of the accounts the researchers analyzed (@JEMISHAAAZZZ, @KANIJJACKSON and @QuartneyChante) were taken down by Twitter in 2018 and this year for being part of, or having the characteristics of, Kremlin-linked disinformation operations. Twitter confirmed it suspended other accounts reported by the researchers for platform manipulation.

Disinformation experts told NBC News that the goal of the inauthentic accounts is to highlight and inflame existing tensions to destabilize America, a strategy also used by Russia’s disinformation campaign leading up to the 2016 election.

“The real goal is to get the conflict off Twitter, to get it into the streets,” Philip Howard, director of the Oxford Internet Institute, said.

The researchers said they assessed the accounts as probably Russian in origin based on a forensic analysis of account information and behavior, which they shared with NBC News.

A Twitter spokesperson said the company is fully committed to “protect conversation around the 2020 elections and beyond.”

The stories the accounts focused on followed a pattern: a video uploaded online showing a white person calling, or threatening to call, the authorities on a minority. The objects of the outrage were often given alliterative nicknames, along the lines of “Taco Truck Tammy” or “Basketball Becky.”

As the accounts promoted the videos, online outrage would build, along with calls to expose and publish the person’s phone number, address and other personal information.

Sometimes, all the accounts added was a nudge, merely reposting a video and tweeting, “Twitter, do your thing.”

Some of the accounts tracked by the researchers were among the first to join the conversation, providing a key early boost that helped the stories go viral, the researchers say.

The researchers, however, did not make any judgment on the veracity of the videos.

In the case of “Cornerstore Caroline,” closed-circuit TV footage later showed the boy had accidentally made contact while walking past her. After the video racked up at least 18 million views and heaps of social media outrage, the woman, Teresa Klein, apologized.

In at least one instance, one of the suspended accounts coined a nickname. In April, a Hispanic taco truck worker in Texas recorded and uploaded a video of a white woman threatening to call U.S. Immigration and Customs Enforcement after the truck was parked in front of her house.

One of the suspicious accounts spotted by the researchers was the first to call her “Taco Truck Tammy.”

“Twitter, you know what to do,” the account tweeted as it posted the video, which drew at least 16 million views and regional media coverage. Memes later shared the woman’s full name, date of birth, phone number and home address. The account is now suspended.

According to the Clemson researchers’ analysis, the suspicious accounts drew about half of the retweets of the encounter between the homeowner and the taco truck worker in the first two days.

Despite takedowns by Twitter, the activity by the suspicious accounts continues. In July, inauthentic accounts spotted by the researchers helped the story of “Basketball Becky” to trend.

CLARIFICATION (Aug 13, 2019, 1:45 p.m. ET): An earlier version of this article incorrectly implied that Russia-linked accounts were the primary driver of online attempts to expose examples of racial prejudice. The article has been amended to clarify that there are many legitimate accounts participating in the same conversations.