The wide availability of user-provided content on online social media facilitates the aggregation of people around shared beliefs, interests, worldviews and narratives. In spite of the enthusiastic rhetoric about the so-called collective intelligence, unsubstantiated rumors and conspiracy theories—e.g., chemtrails, reptilians or the Illuminati—are pervasive in online social networks (OSN). In this work we study, on a sample of 1.2 million individuals, how information related to very distinct narratives—i.e., mainstream scientific news and conspiracy news—is consumed and shapes communities on Facebook. Our results show that polarized communities emerge around distinct types of content, and that usual consumers of conspiracy news turn out to be more focused and self-contained with respect to their specific content. To test potential biases induced by continued exposure to unsubstantiated rumors on users' content selection, we conclude our analysis by measuring how users respond to 4,709 troll posts—i.e., parodic and sarcastic imitations of conspiracy theories. We find that 77.92% of likes and 80.86% of comments are from users usually interacting with conspiracy stories.

Introduction

The World Wide Web has changed the dynamics of information transmission as well as the agenda-setting process [1]. Relevant facts, in particular those related to socially relevant issues, mingle with half-truths and untruths to create informational blends [2, 3]. In such a scenario, as pointed out by [4], individuals can be uninformed or misinformed, and corrections are not effective against the diffusion and formation of biased beliefs. In particular, in [5] online debunking campaigns have been shown to create a reinforcement effect in usual consumers of conspiracy stories. In this work, we address users' consumption patterns with respect to very distinct types of content—i.e., mainstream scientific news and conspiracy news. The former diffuse scientific knowledge, and their sources are easy to access. The latter aim at diffusing what is neglected by manipulated mainstream media. Specifically, conspiracy theories tend to reduce the complexity of reality by explaining significant social or political aspects as plots conceived by powerful individuals or organizations. Since these kinds of arguments can sometimes involve the rejection of science, alternative explanations are invoked to replace the scientific evidence. For instance, people who reject the link between HIV and AIDS generally believe that AIDS was created by the U.S. Government to control the African American population [6]. The spread of misinformation in such a context might be particularly difficult to detect and correct because of social reinforcement—i.e., people are more likely to trust information that is somehow consistent with their system of beliefs [7–17]. The growth of knowledge fostered by an interconnected world, together with the unprecedented acceleration of scientific progress, has exposed society to an increasing level of complexity in explaining reality and its phenomena.
Indeed, a paradigm shift in the production and consumption of content has occurred, dramatically increasing the volume as well as the heterogeneity of information available to users. Everyone on the Web can produce, access and diffuse content, actively participating in the creation, diffusion and reinforcement of different narratives. Such a large heterogeneity of information has fostered the aggregation of people around common interests, worldviews and narratives.

Narratives grounded in conspiracy theories tend to reduce the complexity of reality and are able to contain the uncertainty they generate [18–20]. They are able to create a climate of disengagement from mainstream society and from officially recommended practices [21]—e.g., vaccinations, diet, etc. Despite the enthusiastic rhetoric about collective intelligence [22, 23], the role of socio-technical systems in fostering informed debates, and their effects on public opinion, still remain unclear. Indeed, the World Economic Forum has listed massive digital misinformation as one of the main risks for modern society [24].

A multitude of mechanisms animates the flow and acceptance of false rumors, which in turn create false beliefs that are rarely corrected once adopted by an individual [8, 10, 25, 26]. The acceptance of a claim (whether documented or not) may be altered by normative social influence or by its coherence with the individual's system of beliefs [27, 28]. A large body of literature addresses the study of social dynamics on socio-technical systems, from social contagion up to social reinforcement [12–15, 17, 29–41].

Recently, in [42, 43] it has been shown that online unsubstantiated rumors—such as the link between vaccines and autism, global warming induced by chemtrails, or the secret alien government—and mainstream information—such as scientific news and updates—reverberate in a comparable way. The pervasiveness of unreliable content might lead users to mix up unsubstantiated stories with their satirical counterparts—e.g., the presence of sildenafil-citratum (the active ingredient of Viagra™) in chemtrails [44], or the anti-hypnotic effects of lemons (more than 45,000 shares on Facebook) [45, 46]. In fact, there are very distinct groups, namely trolls, building Facebook pages as caricatural versions of conspiracy news. Their activities range from controversial comments and the posting of satirical content mimicking conspiracy news sources, to the fabrication of purely fictitious statements, heavily unrealistic and sarcastic. Not rarely, these memes have gone viral and have been used as evidence in online debates by political activists [47].

In this work we target the consumption patterns of users with respect to very distinct types of information. Focusing on the Italian context, and helped by pages very active in debunking unsubstantiated rumors (see the acknowledgments section), we build an atlas of scientific and conspiracy information sources on Facebook. Our dataset contains 271,296 posts created by 73 Facebook pages. Pages are classified, according to the kind of information disseminated and their self-description, into conspiracy news—alternative explanations of reality aiming at diffusing content neglected by mainstream information—and scientific news. For further details about the data collection and the dataset, refer to the Methods section. Notice that we do not claim that conspiracy information is necessarily false. Our focus is on how communities formed around different information and narratives interact and consume their preferred information.

In the analysis, we account for user interactions with pages' public posts—i.e., likes, shares, and comments. Each of these actions has a particular meaning [48–50]. A like stands for positive feedback on a post; a share expresses the will to increase the visibility of a given piece of information; and a comment is the way in which online collective debates take form around the topic promoted by a post. Comments may contain negative or positive feedback with respect to the post. Our analysis starts with an outline of information consumption patterns and the community structure of pages according to their common users. We label polarized users—users whose like activity (positive feedback) is almost exclusively (at least 95%) on the pages of one category—and find similar interaction patterns in the two communities with respect to their preferred content. Consistently with the literature on opinion dynamics [37], in particular that related to the bounded confidence model (BCM) [51]—in which two individuals are able to influence each other only if the distance between their opinions is below a given threshold—users consuming different and opposite information tend to aggregate into isolated clusters (polarization). Moreover, we measure their commenting activity on the opposite category, finding that polarized users of conspiracy news are more focused on posts of their own community and more oriented toward the diffusion of their content—i.e., they are more prone to like and share posts from conspiracy pages. On the other hand, usual consumers of scientific news turn out to be less committed to diffusion and more prone to comment on conspiracy pages. Finally, we test the response of polarized users to the exposure to 4,709 satirical and caricatural versions of conspiracy stories, finding that, out of 3,888 users labeled according to likes and 3,959 according to comments, most of them are usual consumers of conspiracy stories (77.92% of likes and 80.86% of comments).
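As a minimal sketch, the 95% polarization rule described above can be expressed in a few lines of Python. The user IDs, like records, threshold default, and function name here are invented for illustration; the actual labeling pipeline is described in the Methods section.

```python
from collections import defaultdict

# Hypothetical like records: (user_id, page_category) pairs, where each
# page has already been classified as "science" or "conspiracy".
likes = [
    ("u1", "conspiracy"), ("u1", "conspiracy"), ("u1", "science"),
    ("u2", "science"), ("u2", "science"),
]

def label_polarized_users(likes, threshold=0.95):
    """Label a user as polarized toward a category when at least
    `threshold` of their likes fall on pages of that category."""
    counts = defaultdict(lambda: defaultdict(int))
    for user, category in likes:
        counts[user][category] += 1
    labels = {}
    for user, per_cat in counts.items():
        total = sum(per_cat.values())
        for category, n in per_cat.items():
            if n / total >= threshold:
                labels[user] = category
    return labels

print(label_polarized_users(likes))
# → {'u2': 'science'}
# u1 (2/3 conspiracy likes, below 0.95) remains unlabeled;
# u2 (2/2 science likes) is polarized toward science.
```

Under this rule, users who spread their likes across both categories receive no label and are excluded from the polarized-user analysis.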
Our findings, coherently with [52–54], indicate that the relationship between belief in conspiracy theories and the need for cognitive closure—i.e., the tendency of conspiracists to avoid profound scrutiny of the evidence for a given matter of fact—is a driving factor for the diffusion of false claims.