After an anonymous source alleged that Facebook's Trending News algorithm (and the human staff behind it) was intentionally hiding conservative news from the social network, all hell broke loose. Congressional hearings have been called. But whether or not the reports are right, and whether or not hearings are justified, underneath the uproar is a largely unspoken truth: The algorithms that drive social networks are shifting the reality of our political systems, and not for the better.

The filter bubble, the idea that online recommendation engines learn what we like and keep us reading only things we agree with, has evolved. Algorithms, network effects, and zero-cost publishing are enabling crackpot theories to go viral, and, unchecked, these ideas are shaping public opinion and influencing the decisions of policy makers, whether they are verified or not.

Fiction As Fact

First, it is important to understand the technology that drives the system. Most recommendation algorithms work simply: web companies tailor their content, including news and search results, to match the tastes and interests of each reader. But as online organizer and author Eli Pariser warned in the TED Talk that popularized the idea of the filter bubble: "There's a dangerous unintended consequence. We get trapped in a 'filter bubble' and don't get exposed to information that could challenge or broaden our worldview."

Facebook's news feed and personalized search deliver results tailored just to us because a social network's business is to keep us interested and happy. Feeling good drives engagement and more time spent on a site, and that keeps a user targetable with advertisements for longer. Pariser argues that this nearly invisible editing of the Internet limits what we see, and that it will "ultimately prove to be bad for us and bad for democracy."

In his 1962 book, The Image: A Guide to Pseudo-Events in America, former Librarian of Congress Daniel J. Boorstin describes a world where our ability to technologically shape reality is so sophisticated that it overcomes reality itself. "We risk being the first people in history," he writes, "to have been able to make their illusions so vivid, so persuasive, so 'realistic' that they can live in them."

Since Pariser's TED Talk, we have reached the point where social networks are used as primary news outlets. Seeking news from traditional sources, newspapers and magazines, has been replaced by a new model: getting all of one's news from trending stories on social networks. The people we know best are the most likely to influence us because we trust them. Their ideas and beliefs shape ours, and the technology behind social networks is built to enhance this effect.

Where "proximity" used to mean knowing the people next door or down the street, it now includes online communities, and it is easier than ever to find like-minded people independently of geography. Once a user joins a single group on Facebook, the social network suggests dozens of others on that topic, as well as groups on tangential topics that people with similar profiles also joined. That is smart business. With unchecked content, however, it means that once people join a single conspiracy-minded group, they are algorithmically routed to a plethora of others. Join an anti-vaccine group, and your suggestions will include anti-GMO, chemtrail watch, flat Earther (yes, really), and "curing cancer naturally" groups. Rather than pulling a user out of the rabbit hole, the recommendation engine pushes them further in. We are long past merely partisan filter bubbles and well into the realm of siloed communities that experience their own reality and operate with their own facts.
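
To make that dynamic concrete, here is a minimal sketch of an interest-based group recommender of the kind described above. The group names, tags, scoring rule, and reinforcement step are all assumptions chosen for illustration; this is a toy model of engagement-driven suggestion, not Facebook's actual system.

```python
# Toy model: a recommender that ranks groups by overlap with a user's
# accumulated interest tags. Hypothetical data and scoring, for illustration only.
from collections import Counter

GROUPS = {
    "anti-vaccine": {"health", "conspiracy"},
    "anti-GMO": {"food", "conspiracy"},
    "chemtrail watch": {"weather", "conspiracy"},
    "local civic forum": {"civic"},
    "science explainers": {"health", "science"},
}

def recommend(interests: Counter, joined: set, k: int = 3) -> list[str]:
    """Rank groups the user has not yet joined by overlap with their interest counts."""
    candidates = [g for g in GROUPS if g not in joined]
    return sorted(candidates,
                  key=lambda g: sum(interests[t] for t in GROUPS[g]),
                  reverse=True)[:k]

def join(interests: Counter, joined: set, group: str) -> None:
    """Joining a group reinforces every interest tag attached to it."""
    joined.add(group)
    interests.update(GROUPS[group])

interests, joined = Counter(), set()
join(interests, joined, "anti-vaccine")        # a single conspiracy-minded join...
for _ in range(3):
    suggestions = recommend(interests, joined)
    print(suggestions)
    join(interests, joined, suggestions[0])    # ...and each top suggestion compounds it
```

Even in this toy model, one conspiracy-minded join tilts every subsequent round of suggestions toward conspiracy-tagged groups, because the recommender optimizes for similarity to past engagement rather than for breadth.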

How Do We Fix This? Address The Underlying Tech

Ultimately, we need our best minds playing both offense and defense if we are going to reduce the prevalence and impact of conspiracist influence, both online and in real life. How do we do that? We need a shift in how we design our ever-evolving social platforms, because the very structure and incentives of social networks have brought us to this point. The platform designers themselves should be considering what steps they can take to bring us back. Perhaps we should have more conversations about ethics in design; maybe these Facebook allegations will kick that off. Their product design is having a dramatic impact on public policy, and the effects are only going to get stronger. What responsibility do the designers of those products have to civil discourse?

Platforms have the power here. Some have begun to introduce algorithms that warn readers when a share is likely a hoax or satire. Google is investigating the possibility of "truth ranking" for web searches, which is promising. These are great starting points, but regulation by algorithm has its own set of advantages and pitfalls. The primary concern is that turning companies into arbiters of truth is a slippery slope, particularly where politically rooted conspiracies are concerned. But we have to start somewhere. As Eli Pariser said: "We really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters."

The Internet isn't separate from the real world; building the web we want is building the future we want.

Renee DiResta works in technology in San Francisco. She volunteers on public health policy issues with Vaccinate California. Follow her on Twitter.