A lousy mood and inflammatory debate can provoke anyone to transform from a friendly offline Jekyll into an evil online Hyde, according to new Stanford and Cornell research.

It’s widely assumed that internet “trolls” are different from the rest of us. Conventional wisdom holds that they’re innately sociopathic individuals whose taunting, derogatory or provocative internet posts disrupt cordial discussion.

But new research, published as part of the upcoming 2017 Conference on Computer-Supported Cooperative Work and Social Computing, reaches a different conclusion: Under the right circumstances, anyone — even ordinary, good people — can become a troll, changing their online behavior in radical ways.

“It’s not some idiot on the other side of the keyboard,” said Michael Bernstein, professor of computer science at Stanford and co-author of the paper. “It is probably someone like you, who’s having a bad day.”

Through experiments, data analysis and machine learning, the researchers homed in on simple situational factors that make the average person more likely to troll.

In one test, people who were in a bad mood after completing a difficult test and reading negative online comments were almost twice as likely to troll (for instance, by making a personal attack or cursing) as people who took an easy test and read neutral posts.

The time of day or week is also influential. The researchers analyzed more than 26 million posts from CNN’s comment sections, looking at the time stamp of each post, and found that incidents of “down-votes” and flagged posts were higher late at night and early in the week — times known to correspond with grouchy moods.
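The kind of aggregation described above can be sketched in a few lines. This is a hypothetical illustration, not the paper’s actual analysis pipeline; the input format (ISO timestamp strings paired with a flagged boolean) is an assumption for the example.

```python
from collections import Counter
from datetime import datetime

def flags_by_hour(posts):
    """Count flagged posts per hour of day.

    `posts` is an iterable of (iso_timestamp, flagged) pairs --
    a simplified stand-in for real comment records.
    """
    counts = Counter()
    for ts, flagged in posts:
        if flagged:
            # Bucket each flagged post by its local hour (0-23).
            counts[datetime.fromisoformat(ts).hour] += 1
    return counts
```

Grouping by `datetime.weekday()` instead of `.hour` would give the early-in-the-week view the researchers also report.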

Finally, others’ trolling can encourage us to behave badly, they found. Using a machine-learning algorithm, they found that the flag status of the previous post in the discussion was the strongest predictor of whether the next post would be flagged. The user’s history and user ID, although somewhat predictive, played a smaller role.
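The prediction setup described above can be sketched as a toy logistic model. Everything here is illustrative: the feature names, weights and bias are invented for the example and are not the authors’ trained model — they simply encode the reported finding that the previous post’s flag status matters more than the user’s history.

```python
import math
from dataclasses import dataclass

@dataclass
class Post:
    user_id: str
    flagged: bool

def features(thread, user_history):
    """Features for predicting whether the NEXT post gets flagged.

    `thread` is the list of posts so far; `user_history` maps a user_id
    to that user's past flag rate (a simplified stand-in for history).
    """
    prev = thread[-1]
    return {
        "prev_post_flagged": 1.0 if prev.flagged else 0.0,  # strongest predictor, per the study
        "user_flag_rate": user_history.get(prev.user_id, 0.0),  # weaker signal
    }

# Hypothetical weights: discussion context outweighs user history.
WEIGHTS = {"prev_post_flagged": 2.0, "user_flag_rate": 0.5}
BIAS = -1.5

def score(feats):
    """Logistic probability that the next post will be flagged."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in feats.items())
    return 1.0 / (1.0 + math.exp(-z))
```

With these toy weights, a thread whose last post was flagged scores well above one whose last post was clean — the cascading effect the researchers describe.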

“It takes less than you might think to get someone to engage in trolling online,” said Bernstein. “And if someone does that to you, you become more likely to do it — and this can just cascade into creating a negative environment.”

Jure Leskovec, associate professor of computer science at Stanford and senior author of the study, called it “a spiral of negativity.”

“Just one person waking up cranky can create a spark and, because of discussion context and voting, these sparks can spiral out into cascades of bad behavior. Bad conversations lead to bad conversations,” he said, in a prepared statement.

Social environments — such as teams at work — operate best when everyone is abiding by the implicit rules of that community, they noted. But it doesn’t take much to nudge that rhythm out of balance. The same thing happens online, they said.

“Everyone will fly off the handle occasionally. But when you put it all together, it produces an environment where it looks like everyone is going nuts,” said Bernstein. “And that’s really hard to recover from.”

To be sure, there are professional trolls like the Breitbart News editor Milo Yiannopoulos, who has been permanently banned from Twitter. Violent demonstrations prompted UC Berkeley to cancel his speaking event there last week.

But much of trolling behavior is not due to experienced provocateurs — it’s due to normal people who turn negative, they found.

“There are people who are expert trolls, and they are very effective at what they do,” said Bernstein. “But they’re only successful because they can provoke normal people.”

Their findings could lead to the creation of better online discussion spaces, they concluded.

Many forums and online communities have struggled to find ways to strike back. Too often, online networks try to root out trolls by focusing on small groups of people.

“Instead, we have to design systems that can deal with us on the days when we are not our best selves,” said Bernstein.

The team suggested several steps to reduce trolling: a “cooling-off period” for agitated commenters who have just had a post flagged, imposing a three-hour delay before they can comment again; systems that automatically alert moderators to posts likely to be trolling; and “shadow banning,” the practice of hiding troll posts from other users without notifying the troll.
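The cooling-off check among those suggestions is simple enough to sketch. This is a minimal illustration of the idea, not an implementation from the study; the function and variable names are invented for the example.

```python
from datetime import datetime, timedelta

# Three-hour delay, as suggested in the study.
COOL_OFF = timedelta(hours=3)

def can_post(last_flag_time, now):
    """Allow posting only once the cooling-off window has elapsed.

    `last_flag_time` is when the user's most recent post was flagged,
    or None if they have no flagged posts.
    """
    if last_flag_time is None:
        return True
    return now - last_flag_time >= COOL_OFF
```

A real system would track `last_flag_time` per user and could pair this check with the moderator-alert and shadow-banning mechanisms the team also proposed.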

“What I’m thinking about is designing a system that is the equivalent of saying ‘Hey, let’s take this offline,’” said Bernstein. “We need to understand what causes things to amplify, cascade and tip normal people into negative behaviors.

“The natural impulse, when we see this behavior, is to think: ‘It’s someone else,’” he said. “The most important outcome of this study is that ‘It’s not those people — it is us.’”