Abstract Social media aggregate people around common interests, eliciting collective framing of narratives and worldviews. However, in such a disintermediated environment misinformation is pervasive, and debunking attempts are often undertaken to counter this trend. In this work, we examine the effectiveness of debunking on Facebook through a quantitative analysis of 54 million users over a time span of five years (January 2010 to December 2014). In particular, we compare how users who usually consume proven (scientific) and unsubstantiated (conspiracy-like) information on US Facebook interact with specific debunking posts. Our findings confirm the existence of echo chambers in which users interact primarily with either conspiracy-like or scientific pages. However, both groups interact in a similar way with the information within their echo chamber. We then measure how users from both echo chambers interacted with 50,220 debunking posts, accounting for both users' consumption patterns and the sentiment expressed in their comments. Sentiment analysis reveals a dominant negativity in the comments on debunking posts. Furthermore, such posts remain mainly confined to the scientific echo chamber. Only a few conspiracy users engage with corrections, and their liking and commenting rates on conspiracy posts increase after the interaction.

Citation: Zollo F, Bessi A, Del Vicario M, Scala A, Caldarelli G, Shekhtman L, et al. (2017) Debunking in a world of tribes. PLoS ONE 12(7): e0181821. https://doi.org/10.1371/journal.pone.0181821
Editor: Jose Javier Ramasco, Instituto de Fisica Interdisciplinar y Sistemas Complejos, SPAIN
Received: February 21, 2016; Accepted: May 13, 2017; Published: July 24, 2017
Copyright: © 2017 Zollo et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: The entire data collection process was carried out exclusively by means of the Facebook Graph API, which is publicly available. Further details about data collection are provided in the Methods section of the paper, together with the complete list of pages.
Funding: Funding for this work was provided by EU FET project MULTIPLEX nr. 317532, SIMPOL nr. 610704, DOLFINS nr. 640772, SOBIGDATA 654024, IMT/eXtrapola Srl (P0082). SH and LS were supported by the Israel Ministry of Science and Technology, the Japan Science and Technology Agency, the Italian Ministry of Foreign Affairs and International Cooperation, the Israel Science Foundation, ONR and DTRA. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing interests: The authors have declared that no competing interests exist.

Introduction Socio-technical systems and microblogging platforms such as Facebook and Twitter have created a direct path from producers to consumers of content, changing the way users get informed, debate ideas, and shape their worldviews [1–6]. Misinformation on online social media is pervasive and, according to the World Economic Forum, represents one of the main threats to our society [7, 8]. The diffusion of false rumors affects public perception of reality as well as the political debate [9]. Indeed, alleged links between vaccines and autism, the belief that 9/11 was an inside job, or the more recent case of Jade Helm 15—a simple military exercise that was perceived as the imminent threat of civil war in the US—are just a few examples of a substantial body of collective narratives grounded in unsubstantiated information. Confirmation bias plays a pivotal role in cascade dynamics and facilitates the emergence of echo chambers [10]. Indeed, users online show the tendency a) to select information that adheres to their system of beliefs, even when it contains parodistic jokes; and b) to join polarized groups [11]. Recently, researchers have shown [12–17] that continued exposure to unsubstantiated rumors may be a good proxy for detecting gullibility—i.e., jumping the credulity barrier by accepting highly implausible theories—on online social media. Narratives, especially those grounded in conspiracy theories, play an important cognitive and social function in simplifying causation. They are formulated in a way that reduces the complexity of reality and tolerates a certain level of uncertainty [18–20]. However, conspiracy thinking creates or reflects a climate of disengagement from mainstream society and recommended practices [21]. Several efforts, ranging from algorithmic solutions to tailored communication strategies [22–27], strive to counter the spread of misinformation, but not much is known about their efficacy.
In this work we characterize the consumption of debunking posts on Facebook and, more generally, the reaction of users to dissenting information. We perform a thorough quantitative analysis of 54 million US Facebook users and study how they consume scientific and conspiracy-like content. We identify two main categories of pages: conspiracy news—i.e., pages promoting content neglected by mainstream media—and science news. Using an approach based on [12, 14, 15], we further explore Facebook pages that are active in debunking conspiracy theories (see section Materials and methods for further details about data collection). Notice that we do not focus on the quality of the information but rather on the possibility of verification. Indeed, for scientific news it is easy to identify the authors of the study, the university where the study took place, and whether the paper underwent a peer-review process. On the other hand, conspiracy-like content is difficult to verify because it is inherently based upon suspect information and derives from allegations and a belief in secrets kept from the public. The self-description of many conspiracy pages on Facebook, indeed, claims that they inform people about topics neglected by mainstream media and science. Pages like I don’t trust the government, Awakening America, or Awakened Citizen promote wide-ranging content, from aliens and chem-trails to an alleged causal relation between vaccinations and autism or homosexuality. Conversely, science news pages—e.g., Science, Science Daily, Nature—are active in diffusing posts about the most recent scientific advances. The list of pages was compiled with the support of very active debunking groups (see section Materials and methods for more details). The final dataset contains pages reporting on scientific and conspiracy-like news.
Over a time span of five years (January 2010 to December 2014) we downloaded all public posts (with the related lists of likes and comments) of 83 scientific and 330 conspiracy pages. In addition, we identified 66 Facebook pages aimed at debunking conspiracy theories. Our analysis shows that two well-formed and highly segregated communities exist around conspiracy and scientific topics—i.e., users are mainly active in only one category. Focusing on users' interactions with their preferred content, we find similarities in the consumption of posts. Different kinds of content aggregate polarized groups of users (echo chambers). At this stage we want to test the role of confirmation bias with respect to dissenting (resp., confirmatory) information from the conspiracy (resp., science) echo chamber. Focusing on a set of 50,220 debunking posts, we measure the interaction of users from both the conspiracy and science echo chambers. We find that such posts remain mainly confined to the scientific echo chamber. Indeed, the majority of likes on debunking posts are left by users polarized towards science (∼67%), while only a small minority (∼7%) are left by users polarized towards conspiracy. However, independently of the echo chamber, the sentiment expressed by users when commenting on debunking posts is mainly negative.
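The notion of a user "polarized towards" one narrative can be operationalized as a thresholded fraction of likes. The following is a minimal sketch of that idea; the 95% threshold and the per-user counts are illustrative assumptions of ours, not necessarily the paper's exact implementation:

```python
# Sketch: labeling a user as science- or conspiracy-polarized from the
# distribution of their likes across the two page categories.
# The 0.95 threshold is an illustrative assumption, not the paper's value.

def polarization(science_likes: int, conspiracy_likes: int) -> float:
    """Fraction of a user's likes left on conspiracy pages
    (0 = purely science, 1 = purely conspiracy)."""
    total = science_likes + conspiracy_likes
    if total == 0:
        raise ValueError("user has no likes in either category")
    return conspiracy_likes / total

def classify(science_likes: int, conspiracy_likes: int,
             threshold: float = 0.95) -> str:
    """Label a user as polarized when at least `threshold` of their
    likes fall in a single category; otherwise label them 'mixed'."""
    rho = polarization(science_likes, conspiracy_likes)
    if rho >= threshold:
        return "conspiracy"
    if rho <= 1 - threshold:
        return "science"
    return "mixed"
```

Under this scheme a user with 40 likes on science pages and 1 on conspiracy pages would be labeled science-polarized, while a user splitting likes evenly would be labeled mixed.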

Conclusions Users online tend to focus on specific narratives and select information adhering to their system of beliefs. Such a polarized environment might foster the proliferation of false claims. Indeed, misinformation is pervasive and very difficult to correct. To curb the proliferation of unsubstantiated rumors, major corporations such as Facebook and Google are studying specific solutions. Examining the effectiveness of online debunking campaigns is therefore crucial for understanding the processes and mechanisms behind misinformation spreading. In this work we show the existence of social echo chambers around different narratives on Facebook in the US. Two well-formed and highly segregated communities exist around conspiracy and scientific topics—i.e., users are mainly active in only one category. Furthermore, by focusing on users' interactions with their preferred content, we find similarities in the way in which both forms of content are consumed. Our findings show that debunking posts remain mainly confined within the scientific echo chamber, and only a few users usually exposed to unsubstantiated claims actively interact with the corrections. Dissenting information is mainly ignored and, if we look at the sentiment expressed by users in their comments, we find a rather negative environment. Furthermore, we show that the few users from the conspiracy echo chamber who interact with the debunking posts manifest a higher tendency to comment in general. However, if we look at their commenting and liking rates—i.e., the daily number of comments and likes—we find that their activity in the conspiracy echo chamber increases after the interaction.
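The before/after comparison of daily liking and commenting rates can be sketched as follows. The windowing scheme, the data layout (a list of timestamped actions per user), and the function names are illustrative assumptions, not the paper's actual pipeline:

```python
# Sketch: a user's daily activity rate in the conspiracy echo chamber
# before vs. after their first interaction with a debunking post.
# Data layout (one date per like/comment) is an illustrative assumption.
from datetime import date, timedelta

def daily_rate(action_dates: list, start: date, end: date) -> float:
    """Actions per day over the inclusive window [start, end]."""
    days = (end - start).days + 1
    n = sum(1 for d in action_dates if start <= d <= end)
    return n / days

def rate_change(action_dates: list, first_debunk: date,
                window_start: date, window_end: date) -> tuple:
    """Return (rate_before, rate_after) around the user's first
    interaction with a debunking post."""
    before = daily_rate(action_dates, window_start,
                        first_debunk - timedelta(days=1))
    after = daily_rate(action_dates, first_debunk, window_end)
    return before, after
```

An increase in the conspiracy echo chamber after the interaction would then show up as `rate_after > rate_before`, aggregated over the few conspiracy users who engaged with corrections.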
Moreover, the interaction seems to lead to an increased interest in conspiracy-like content. In our perspective, the diffusion of bogus content is somehow related to the growing mistrust of people towards institutions, to the increasing level of functional illiteracy—i.e., the inability to understand information correctly—affecting western countries, as well as to the combined effect of confirmation bias operating on an enormous pool of information whose quality is poor. In this setting, current debunking campaigns as well as algorithmic solutions do not seem to be the best options. Our findings suggest that the main problem behind misinformation is conservatism rather than gullibility. Moreover, our results also seem consistent with the so-called inoculation theory [34], according to which exposure to repeated, mild attacks can make people more resistant to changing their ordinary beliefs. Indeed, being repeatedly exposed to relatively weak arguments (the inoculation procedure) can result in greater resistance to a later persuasive attack, even if the latter is stronger and uses arguments different from those presented during the inoculation phase. Therefore, when users are faced with untrusted opponents in online discussions, the result is a stronger commitment to their own echo chamber. Thus, a more open and smoother approach, promoting a culture of humility aimed at demolishing walls and barriers between tribes, could represent a first step to counter misinformation spreading and its persistence online.

Acknowledgments We would like to thank: Dr. Igor Mozetič for his precious help with the sentiment analysis task; Geoff Hall and “Skepti Forum”, for providing fundamental support in defining the atlas of news sources on US Facebook; Francesca Pierri for her valuable advice and suggestions.

Author Contributions Conceptualization: WQ FZ. Formal analysis: WQ FZ AB MDV. Investigation: WQ FZ. Methodology: WQ FZ. Software: WQ FZ AB MDV. Supervision: WQ. Validation: WQ FZ AB MDV AS GC LS SH. Visualization: WQ FZ. Writing – original draft: WQ FZ AB MDV AS GC LS SH. Writing – review & editing: WQ FZ.