In 2013, James Evans, a University of Chicago sociologist and computational scientist, launched a study to see if science forged a bridge across the political divide. Did conservatives and liberals at least agree on biology and physics and economics? Short answer: No. “We found more polarization than we expected,” Evans told me recently. People were even more polarized over science than sports teams. At the outset, Evans said, “I was hoping to find that science was like a Switzerland. When we have problems, we can appeal to science as a neutral arbiter to produce a solution, or pathway to a solution. That wasn’t the case at all.”

Evans started his study on Amazon. You know the heading that says, “Customers who bought this item also bought”? Evans and his colleagues analyzed the top 100 items in this list for two “seed” books: Barack Obama’s Dreams from My Father and Mitt Romney’s No Apology. They repeated this process for each book in the top-100 list until they ran out of new titles. “The resulting ‘snowball sample,’ ” Evans and company wrote in their 2017 Nature Human Behaviour paper, “contained virtually all books in the largest strongly connected component in Amazon’s directed co-purchase network,” or 1,303,504 unique titles.
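The crawl Evans describes is a breadth-first "snowball" traversal of the co-purchase graph. Here is a minimal sketch of that idea; the `co_purchased` function and the toy graph are illustrative stand-ins, not the study's actual Amazon data or code.

```python
from collections import deque

def snowball_sample(seeds, co_purchased):
    """Breadth-first 'snowball' crawl: start from seed items and repeatedly
    follow co-purchase links until no new titles appear.
    `co_purchased` maps an item to its co-purchase list (the role played
    by Amazon's "Customers who bought this item also bought" list)."""
    seen = set(seeds)
    queue = deque(seeds)
    while queue:
        item = queue.popleft()
        for neighbor in co_purchased(item):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

# Toy co-purchase graph standing in for Amazon's network.
graph = {
    "Dreams from My Father": ["Book A", "Book B"],
    "No Apology": ["Book B", "Book C"],
    "Book A": ["Dreams from My Father"],
    "Book B": ["Book C"],
    "Book C": [],
}
sample = snowball_sample(
    ["Dreams from My Father", "No Apology"],
    lambda title: graph.get(title, []),
)
```

Because the crawl only adds titles it has not seen before, it terminates once the reachable portion of the network is exhausted — in the study, roughly 1.3 million unique titles.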

After performing a co-purchase network analysis—the sort used to study co-citation and co-author networks—on this dataset, the scholars concluded that political ideology guided people to science books, with some curious results: liberal readers preferred basic science (physics, astronomy, zoology), while conservatives went for applied and commercial science (criminology, medicine, geophysics).

“It seems like conservatives are happy to draw on science associated with economic growth—that’s what they want from science,” Evans said. “Science is more like Star Trek for liberals: traveling through worlds, searching for new meanings, searching for yourself.” Science turned out to be “a huge example of confirmation bias,” Evans said. “You expect something to be true, you want it to be true, you read books that affirm and confirm those truths.”

Looking at the polarized results, Evans had an idea. What would happen if you put together a group of diverse people to produce information? What would the results look like? Evans knew just the place to conduct the experiment: Wikipedia. Evans and Misha Teplitskiy, a postdoctoral fellow at the Laboratory for Innovation Science at Harvard, and colleagues, studied 205,000 Wikipedia topics and their associated “talk pages,” where anybody can observe the debates and conversations that go on behind the scenes.

The scholars judged the quality of the articles on Wikipedia’s own assessments. “It’s based on internal quality criteria that are essentially: What do we want a good encyclopedia article to be? We want it to be readable, comprehensive, pitched at the right level, well-sourced, linked to other stuff,” Teplitskiy explained.

In their new Nature Human Behaviour paper, “The Wisdom of Polarized Crowds,” Evans and Teplitskiy concluded that polarization doesn’t poison the wells of information. On the contrary, they showed that politically diverse editor teams on Wikipedia put out better entries—articles with higher accuracy or completeness—than uniformly liberal, conservative, or moderate teams. It’s a surprising result, so I caught up with Evans and Teplitskiy to hear their interpretations.

What does Wikipedia tell us about diversity?

James Evans: People talk about the importance of diversity. It’s not diversity in general; it’s diversity in specific. If you have these different ideologies, it’s associated with different filters on the world, different intakes of information, and so when it comes to constructing reference knowledge on an encyclopedic web page that’s supposed to thoroughly characterize an area, you do a much better job because you have a lot more information that’s attended to by this ideologically diverse group.

Editors working on a social issues page said, “We have to admit that the position that was echoed at the end of the argument was much stronger and balanced.” Did they begrudgingly come to that? They did, and that’s the key. If they too easily updated their opinion, then they wouldn’t have been motivated to find counter-factual and counter-data arguments that fuel that conversation. We found that more diversity is associated with longer conversations. If they were immediately willing to give up on these things, then it wouldn’t have produced the sustained competition that ended up generating the balance that they, themselves, came to appreciate.

Which pages on Wikipedia benefit most from political diversity?

Evans: Political pages. The second most are social issues pages, which have substantial political content. Even science pages benefit because sciences resonate with different political ideologies. And there’s no question that the science articles that benefited the most were the science articles that are associated with political polarization. I would be surprised if we didn’t see that across the science pages associated with the environment, which include climate change, but likely a lot of other things, including biodiversity. Those are the kinds of science articles that benefited the most from political polarization because they’re the ones, unsurprisingly, for which diverse political perspectives end up offering really different filtered information.

Misha Teplitskiy: Psychologists and organizational scholars call this “task relevance.” It’s the idea that diversity in ideas should help only for tasks for which diversity is relevant. You expect ideological diversity to be most relevant for politics, less so for social issues, and less so for science. The surprising feature there is that it’s at all relevant to science, but generally we expect it to matter less and less the farther you get away from task relevance.

The evidence for our contribution to climate change is unimpeachable. So might a Wikipedia entry that encouraged diverse opinions on it fail to produce a high-quality result? Would that run counter to your study, to the importance of diversity?

Evans: You’re saying there are some things where diversity can just generate noise. In general, one could imagine this, and there’s a wonderful book called Merchants of Doubt which explores precisely this issue: companies in a number of areas manufactured apparent diversity of opinion around an increasingly settled consensus about, for example, smoking’s influence on lung cancer. That’s certainly taking place in the world. But for some reason, and this is a tribute to some of the standards in the context of Wikipedia, people discipline each other and are effectively disciplined by higher-level editors.

There’s also a whole host of different perspectives that people might take with respect to global warming. Even though there might be general agreement that human activity is increasing greenhouse gases and higher temperatures, it could be one assumption has you thinking there are human solutions to human problems, and another one has you thinking of the importance of human stewardship over the earth. So different perspectives aren’t just generating artificial conflict in these contexts.

At the same time, our experience is for broad topics. There are few places where there’s an enormous amount of certainty in the sciences. My guess is in places where there is strong certainty, we’re not going to see a big effect from political diversity. Political diversity is not a magical substance. If the distribution of political perspectives isn’t correlated with useful information about the topic at hand, then you’re not going to see a benefit. You’re going to see noise. You might even see a detriment.

What do you think about fake news?

Evans: The thing most disturbing to me is the onslaught of claims about fake information and fake news. In some sense, all information is fake. All of it has a purpose, an angle. But the fact that now it’s just so easy to claim that it’s fake without any particular support for that claim, and it’s popular to do so, means it’s easier to discount alternative information than ever before.

Angles are useful. They motivate people to look in a certain place, to search out information that you probably wouldn’t have searched out if you weren’t motivated by the possession of a belief. Angles end up having a lot of value, unless you discount them all. It begins with Trump arguing that everything’s fake news and then people arguing that Trump’s producing fake news all the time. There’s this cloud of fakery out there, and, of course, it’s exacerbated by the proliferation of bots and other things generating noise. I see that in mass media news in the same way I see it online. It’s a new level. It’s like we’ve just discovered that there’s bias in the system and so everything is biased, categorically, and we can agree or disagree with it at will.

Why are the highest quality articles overseen or written by an ideologically diverse group of people?

Evans: More collective insight is generated when you draw people who have non-random and minimally overlapping sets of information or knowledge exposures and you put them in a forum that’s well-regulated by a set of norms, which can be appealed to and are, in fact, appealed to. I was really struck by the fact that people often experience this. When they experience balanced debates on these sites, they really described the process as painful and beleaguered but the outcome as satisfying.

Teplitskiy: Ideologically diverse teams end up debating more. These people are carrying different bits of knowledge. When they bring it together, they’re spending more effort to aggregate it into good content. Even aside from increased effort, we’re also finding that the kinds of debates they have are a bit more focused. They zero in on a smaller set of issues and really hash out those issues that are presumably most problematic. They end up having more conflict and rely on policies more for regulating what we call their “task conflict,” or conflict that’s oriented around creating content, and they also have a lower relational conflict—they gang up on each other less and harass each other less on a personal level compared to more unbalanced teams. Those that are more balanced have a lower harassment prevalence.

What happens when editor teams are politically unbalanced, or overwhelmingly left- or right-wing?

Evans: When you have a single person going in and describing a set of pages as, “They look like Russian propaganda,” those people don’t recognize that they are entering a system of 30 or 40 people who have constructed this page in conversation, and they are coming in and really just trashing that characterization. Almost invariably, they just got beaten up, labeled as trolls, run out of the community on a rail, tarred and feathered. The homogenous group has a sense of, “We’ve built the social contract and then this person is coming in from the outside.” We found empirically, in our study, that there was a lot more toxic language when you have these imbalances.

Teplitskiy: Our data is suggestive. You would prefer diverse teams that are in a moderate position. Shifting in either direction away from the middle or the moderate position as a team is negatively associated with quality.

Are there some key lessons that groups that produce or evaluate ideas can take from your Wikipedia study?

Teplitskiy: One lesson that our work raises is around branding, or creating a culture and letting people know about it, and letting it be the mechanics of how you organize a platform. One interesting thing about Wikipedia is it’s got a very strong culture. If you want to play in the sandbox, you should be ready to back up your claims, cite your sources, cite sources that are reasonable, listen to others. That clearly discourages some people from joining, people who are not willing to play by reasonable rules. They do more filtering up front on who can play, not in a heavy-handed way, but more by signaling their culture strongly, and people who don’t like it don’t stick around.

Compared to the science-book study, the Wikipedia paper sure seems to hold out hope for consensus.

Evans: Yes, and I hope that we can begin to persuade people, with this kind of paper, to really value the importance of bias, that bias is critical to how we view things, that there isn’t an unbiased position. Only when we begin to demonstrate the value of bias can we battle the cloud that bias is bad. Everything’s biased, so we have to reach into our core values and use those to guide our way through this world. There’s a strong scientific value behind bias, so our hope is to begin a conversation about the value of polarized crowds.

What can scientists learn from your results?

Evans: My hope is that not just scientists, but people with opinions and political stakes in general, can seriously consider the fact that people who don’t share their political viewpoints have something valuable to say—and even if they don’t have something valuable to say about a particular political topic, that their different experience and perspective has likely given them access to other kinds of information that will be valuable and new to you. That’s the key to unlocking the potential of polarization: to allow people to constructively contribute to knowledge projects and other projects together. If you know enough about Wikipedia to open up the talk page, which anybody can do but almost nobody does, you’ll see extensive discussions going on. You’ll see people carefully, painstakingly employing diverse perspectives that are perceived by experts as being systematically better. It just produces more robust knowledge because there’s less ideological filtering going on.

Brian Gallagher is the editor of Facts So Romantic, the Nautilus blog. Follow him on Twitter @BSGallagher.