ALTENA, Germany — When you ask locals why Dirk Denkhaus, a young firefighter trainee who had been considered neither dangerous nor political, broke into the attic of a refugee group house and tried to set it on fire, they will list the familiar issues.

This small riverside town is shrinking and its economy declining, they say, leaving young people bored and disillusioned. Though most here supported the mayor's decision to accept an extra allotment of refugees, some found the influx disorienting. Fringe politics are on the rise.

But they'll often mention another factor not typically associated with Germany's spate of anti-refugee violence: Facebook.

Everyone here has seen Facebook rumors portraying refugees as a threat. They've encountered racist vitriol on local pages, a jarring contrast with Altena's public spaces, where people wave warmly to refugee families. Many here suspected — and prosecutors would later argue, based on data seized from his phone — that Mr. Denkhaus had isolated himself in an online world of fear and anger that helped lead him to violence.

This may be more than speculation. Little Altena exemplifies a phenomenon long suspected by researchers who study Facebook: that the platform makes communities more prone to racial violence. And, now, the town is one of 3,000-plus data points in a landmark study that claims to prove it.

Karsten Müller and Carlo Schwarz, researchers at the University of Warwick, scrutinized every anti-refugee attack in Germany, 3,335 in all, over a two-year span. In each, they analyzed the local community by any variable that seemed relevant. Wealth. Demographics. Support for far-right politics. Newspaper sales. Number of refugees. History of hate crime. Number of protests.

One thing stuck out. Towns where Facebook use was higher than average, like Altena, reliably experienced more attacks on refugees.
That held true in virtually any sort of community — big city or small town; affluent or struggling; liberal haven or far-right stronghold — suggesting that the link applies universally.

Their reams of data converged on a breathtaking statistic: Wherever per-person Facebook use rose to one standard deviation above the national average, attacks on refugees increased by about 50 percent. Nationwide, the researchers estimated in an interview, this effect drove one-tenth of all anti-refugee violence.

The uptick in violence did not correlate with general web use or other related factors; this was not about the internet as an open platform for mobilization or communication. It was particular to Facebook.

Other experts, asked to review the findings, called them credible, rigorous — and disturbing. The study bolstered a growing body of research, they said, finding that social media scrambles users' perceptions of outsiders, of reality, even of right and wrong.

Facebook declined to comment on the study, but a spokeswoman said in an email, "Our approach on what is allowed on Facebook has evolved over time and continues to change as we learn from experts in the field." The company toughened a number of restrictions on hate speech, including against refugees, during and after the study's sample period.

Still, experts believe that much of the link to violence doesn't come through overt hate speech, but rather through subtler and more pervasive ways that the platform distorts users' picture of reality and social norms. We visited Altena and other German towns to retrace each step from the site's algorithm-driven newsfeed to real-world attacks that its users might not otherwise commit — and that hint at subtle but profound ways that the social network reshapes societies.
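To make the headline statistic concrete, here is a minimal arithmetic sketch with entirely hypothetical numbers (the study's actual data and regression model are not reproduced here): a town one standard deviation above the national mean in per-person Facebook use would, under the reported effect size, be expected to see roughly 50 percent more attacks.

```python
# Illustrative sketch only — all figures below are hypothetical placeholders,
# not values from the Müller & Schwarz study.

mean_use = 0.30   # hypothetical national average of per-person Facebook use
std_use = 0.08    # hypothetical standard deviation across towns
town_use = 0.38   # a town sitting one standard deviation above the mean

# How far above average is this town, in standard deviations?
z_score = (town_use - mean_use) / std_use

baseline_attacks = 10.0   # hypothetical expected attacks at the national average
effect_per_sd = 0.50      # ~50 percent increase per standard deviation, per the study
expected_attacks = baseline_attacks * (1 + effect_per_sd * z_score)

print(round(z_score, 2))           # 1.0
print(round(expected_attacks, 2))  # 15.0
```

The point of the sketch is only the shape of the claim: the effect is expressed relative to how unusual a town's Facebook use is, not its absolute level.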

Separating right from wrong

When refugees first arrived, so many locals volunteered to help that Anette Wesemann, who runs Altena's refugee integration center, couldn't keep up. She'd find Syrian or Afghan families attended by entourages of self-appointed life coaches and German tutors. "It was really moving," she said.

But when Ms. Wesemann set up a Facebook page to organize food drives and volunteer events, it filled with anti-refugee vitriol of a sort she hadn't encountered offline. Some posts appeared to come from outsiders, joined by a handful of locals. Over time, their anger proved infectious, dominating the page.

Told about research linking Facebook to anti-refugee violence, Ms. Wesemann responded, "I would believe it immediately."

Such links would be indirect, researchers say, but begin with the algorithm that determines each user's newsfeed. That algorithm is built around a core mission: promote content that will maximize user engagement. Posts that tap into negative, primal emotions like anger or fear, studies have found, perform best and so proliferate.

That is how anti-refugee sentiment — which combines fear of social change with us-versus-them rallying cries, two powerful forces on the algorithm — can seem unusually common on Facebook, even in a pro-refugee town like Altena.

But even if only a minority of users express vehement anti-refugee views, once they dominate the newsfeed, this can have consequences for everyone else. People instinctively conform to their community's social norms, which are normally a brake on bad behavior. This requires intuiting what the people around us believe, something we do through subconscious social cues, according to research by Betsy Paluck, a Princeton University social psychologist.

Facebook scrambles that process. It isolates us from moderating voices or authority figures, siphons us into like-minded groups and, through its algorithm, promotes content that engages our base emotions.
A Facebook user in Altena, for instance, might reasonably, but wrongly, conclude that their neighbors were broadly hostile to refugees. "You can get this impression that there is widespread community support for violence," said Dr. Paluck. "And that changes your idea of whether, if you acted, you wouldn't be acting alone."

In his office, Gerhard Pauli, a grandfatherly local prosecutor, flipped through printouts of social media posts that the police had pulled from Mr. Denkhaus's cellphone. "He was very interested in Facebook," Mr. Pauli said. He paused over an image of wide-eyed, dark-skinned men, superimposed with the text, "The welfare ministry is out of money. It's back to work."

Mr. Denkhaus messaged near-constantly with friends to share articles and memes disparaging foreigners. At first they trafficked in provocations, ironically addressing one another as "mein Führer." Over time, they appeared to lose sight of the line separating trolling from sincere hate. Heavy social media users refer to this effect as "irony poisoning."

"He said to his partner one day, 'And now we have to do something,'" Mr. Pauli recalled. Mr. Denkhaus and a friend doused the attic of a refugee group house with gasoline and set it on fire. No one was hurt.

In court, his lawyer would argue that Mr. Denkhaus had shown no outward animus toward refugees before that night. It was only online that he'd dabbled in hate. Intended as exonerating — wasn't the real world what mattered? — this defense underscored how Facebook can provide a closed environment with its own moral rules.

Mr. Denkhaus had little opportunity to encounter anti-refugee hatred in the real Altena, where overwhelmingly tolerant social norms prevailed. But within his Facebook echo chamber, he could drift unchecked toward extremism.

Though Altena's residents condemned Mr. Denkhaus, his was not the last act of violence. Last year, the mayor was stabbed by a man said to be outraged by his pro-refugee policies. Mr. Pauli suspected a social media link: Local pages had filled with hateful comments toward the mayor just before the attack.

Distorted social norms

And these attacks may represent only the tip of a much larger iceberg, the University of Warwick researchers said. Each person nudged into violence, they believe, hints at a community that has become broadly more hostile to refugees. For most users, the effect will be subtler but, by playing out more widely, perhaps more consequential.

Traunstein, a Bavarian mountainside town, is, in most ways, quite different from Altena. Its tourist economy is thriving. Young people are active in the community. Though the town leans liberal, the surrounding region is solidly center-right. But, as in Altena, Facebook use and anti-refugee violence rates are both unusually high. Could that hint at more than a few isolated vigilantes?

We sought out a particular kind of user, known to researchers as a superposter, who is thought to embody the ways that Facebook can make a community incrementally more hostile to outsiders.

Rolf Wasserman, an artist whose studio overlooks Traunstein's quaint central square, is not politically influential in any traditional sense. Though conservative, he is hardly extremist. But he is furiously active on Facebook. He posts a steady stream of rumors, strident opinion columns and news reports on crimes committed by refugees. Though none crosses into hate speech or fake news, in the aggregate they portray Germany as beset by dangerous foreigners.

"On Facebook, it's possible to reach people who are not highly political, to bring information to them," he said. "You can build people's political views on Facebook."

Superposters tend to be "more opinionated, more extreme, more engaged, more everything," said Andrew Guess, a Princeton University social scientist. When more casual users open Facebook, often what they see is a world shaped by superposters like Mr. Wasserman. Their exaggerated worldviews play well on the algorithm, allowing them to collectively — and often unknowingly — dominate newsfeeds.

"That's something special about Facebook," Dr. Paluck said. "If you end up getting a lot of time on the feed, you are influential. It's a difference with real life."

In the offline world, people decide collectively whom to listen to and whom to ignore. Professional gatekeepers such as editors or party leaders decide which voices to elevate. Facebook overrides those practices.

In a recent study, Dr. Paluck found that schoolchildren decide whether bullying is right or wrong based largely on what they believe their classmates think. But the students, as shorthand for figuring this out, paid special attention to a handful of influential peers. By persuading the influential students to oppose bullying, Dr. Paluck could shift social norms in an entire school, reducing bullying by about a third. Isolating students who favor bullying and elevating those who oppose it can also reduce violence. By shuffling the students around just so, a few moderating voices could be made to set norms for the whole community.

Facebook's algorithm, engineered to maximize the amount of time spent on the site, does the opposite. It elevates a class of superposters like Mr. Wasserman who, in the aggregate, give readers the impression that social norms are more hostile to refugees and more distrustful of authority than they really are. Even if no one endorses violence, it can come to feel more justifiable.

Natascha Wolff has seen this firsthand, she said at a Traunstein church lunch for local Nigerian families. Ms. Wolff, who teaches at a vocational school, has found that young people like her students often express the most anti-refugee views. They seem to draw, she said, on things they saw on Facebook — and on a mistaken belief that everyone agrees. Any rumor or tidbit about foreigners, she said, "sure gets around fast. People feel confirmed in their viewpoint."
The ideological bubbles can be radicalizing, she added: "It's just, 'like, like, like.'"

Her refugee students, she said, have had coffee or other objects thrown at them from windows — casual, light-of-day violence one braves only on the assumption that it will be tolerated. But the police here aggressively pursue crimes against refugees, highlighting that some locals have a skewed perception of their own community's social norms.

A young woman who attends Ms. Wolff's vocational school, but asked not to be named so she could speak more freely, described lurid stories of refugee wrongdoing she'd read on Facebook. Everyone her age uses the site to discuss refugees, she said, and everyone agrees that they are a threat. She may have been misled by Facebook's tendency to sort people into like-minded groups; our interviews in Traunstein, along with voter records, suggest that the town is split but leans liberal.

Like most Germans, she is at little risk of committing violence. But her Facebook-tinged social norms show in other ways. She supported hardline anti-immigration policies, she said. When an African classmate was deported over an error in his paperwork, she'd hoped more would face similar fates.

German politics are divided, and even if only a small fraction of Germans harden their views through Facebook, that could make a difference. Here in Bavaria, polls show rising support for the far right, leading the dominant center-right party to adopt immigration policies so hardline that they sparked a national crisis in July.

Without Facebook, violence drops