New media—news and entertainment entities native to the Internet, whether companies or individuals—are gaining a share of customer attention and wallet at the expense of legacy media—newspapers, television stations and magazines that have mostly been around for decades, even if they now exist, at least in part, online. This shift may seem natural, considering that the consumer Internet made new media possible by enormously lowering the costs of broadcasting content. This threatened legacy media not only by encouraging competition for consumer attention on a massive scale, but also by allowing the generation of content quicker and closer to the source of a story than legacy media could manage, and by challenging established legacy media price points and hence business models. However, the shift is actually beginning to accelerate beyond its natural rate, due to legacy media’s abject failure to effectively deal with the new environment. Furthermore, the likely direction of further technological and demographic change will only make things worse.

The Gell-Mann Amnesia Effect

In a fantastically funny talk on the prevalence of speculation, Michael Crichton coined the now infamous term, the “Gell-Mann amnesia effect,” which he drolly named after the renowned physicist in order to ascribe more importance to the effect:

You open the newspaper to an article on some subject you know well. In Murray’s case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the “wet streets cause rain” stories. Paper’s full of them.

In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.

This is amusing, yet obvious. If we are not to believe the newspaper on a topic we know nothing about, then who or what should we believe? Do you personally know any experts on Palestine, or people with relevant first-hand experience? I’m sure some of us do, but the fact that such people—or people with any other expertise whatsoever—are in short supply strikes me as a reasonable explanation for the existence of newspapers in the first place, and of media in general. If done well, intermediation of information is extremely efficient, because nobody has the time to source the truth on absolutely everything on their own. Some of us have jobs. So, is it funny because it’s true? No, it’s funny because its truth is a silly one. It needn’t be true, and in the age of the Internet it is becoming less true by the day.

Now, the Internet by no means makes anybody an expert on anything—in fact, it dupes many into revealing that they are experts on precisely nothing—but it has provided a seemingly unbounded amount of information, should people be motivated to seek it out.

This provides a hint as to why the Gell-Mann amnesia effect might be decaying, if not yet fully dead. Purely educational material is valuable, certainly, in building the kind of esoteric knowledge that might eventually enable one to debunk the odd article here or there, or, in time, add an entire discipline to one’s arsenal. But there is another possible sequence of events that is much faster and more dramatic: you can debunk articles by accessing knowledge in direct contradiction to them, without expertise, from anywhere, for free.

My First Foray into the Battle Between New and Old Media

This phenomenon, as it pertains to YouTube sensation PewDiePie, was the subject of a recent article of mine. Vox libeled PewDiePie in a manner so easily refutable and so likely to cause a backlash that it makes one wonder whether the staff at Vox really understand what YouTube is and how it works. Almost a third of all commenters on my article who mentioned my name—on Quillette, Twitter, Reddit and elsewhere—assumed I was a journalist. Almost all the messages I received did so too. I’m not. I have a regular 9–5 job that requires a threshold understanding of my subject matter. Were I ever to lie or otherwise fabricate information, I would be disappointed if I weren’t summarily fired, and possibly even sued.

I answered some of these emails, informing people that I was not a journalist. I wasn’t sure how this information would be received. Would people be devastated, their new-found faith in journalism immediately shattered? Or would they be intrigued by the changes to the discipline and industry of journalism to which this misunderstanding pointed? I certainly hoped the latter, because it would demonstrate, if nothing else, an optimism ultimately rooted in technological progress. Prior to the Internet, none of this could have happened. Today, I can be pointed towards informational resources by friendly Internet strangers who could be anybody, anywhere in the world. If I’m lucky, I can sit back and watch PewDiePie debunk his detractors for me. This can all be very entertaining for those inhabiting one small corner of the Internet—while the wider media and public at large take little notice.

Then came Covington.

Covington: This Time It’s Different

The rumbling aftermath of the Covington debacle has thrown up the prospect of a defamation lawsuit against the news organizations and twitterati that promulgated false information about the boys involved. Unfortunately, the would-be defendants were offered a 48-hour grace period to delete the offending tweets and many took advantage of this, although C. J. Pearson and Ali Alexander, founders of the #VerifiedBully campaign, are creating a database of relevant tweets and media coverage, deleted or not.

But the more important judgment has likely already been pronounced in the court of public opinion. This does not sound the death knell of the entire legacy media establishment, but it is a blow from which it will need to recover. I do not think that the importance of media in general will decline, but the incident is the best example yet of the ideologically driven and yet inexplicably poor tactics of legacy media, which have prompted a gradual consumer shift to new media platforms. The Covington narrative was arguably reversed in the first place because new media flexed its decentralized muscles and unearthed irrefutable evidence that legacy media had collectively perpetrated what can most charitably be described as spontaneously coordinated negligence.

After the initial outrage died down, legacy media had nowhere to hide. The misinformation was so vast in scale and so clearly politically motivated, and its targets were children (children who were called faggots and crackers by one group of screaming adults and then had another group chant directly in their faces—both of which groups of adults later lied about the incident). The likely defenders of the boys are not restricted to fringe groups, like fans of PewDiePie, science fiction or video games: they include anybody who is alarmed by the lustful destruction of the lives of children. That is, everyone.

Don’t They Know How Bad This Looks?

So, why doesn’t anybody who works at these institutions step in? The New York Times, CNN and Vox are businesses, however nominally. Surely at least some of the employees are aware of the need to continue to be profitable, which would require retaining their customers, which would require upholding their reputations, which would require avoiding easily refutable lies.

Well, no. To understand why, we must turn to Albert Hirschman’s classic of political science, Exit, Voice and Loyalty. Hirschman identifies the options for individuals to influence institutions as voice—exerting influence through persuasion of other members—and exit—cutting ties altogether and possibly joining a rival. His central thesis is that economists tend to value exit as an explanatory tool but not voice, while political scientists value voice but not exit. This is a shame, Hirschman claims, because almost every institution has both economic and political characteristics, and examining their interplay is far more fruitful in explaining real-world phenomena than is ignoring one or the other altogether.

In the case of legacy versus new media, exit used to be possible, but inadvisable, as there were no rivals from which to receive the same service. Or at least none with enough of a difference in service to make exit worthwhile. With minimal exit, voice was valued. However, the Internet has given rise not only to legitimate rivals to legacy media, but has made the entire model partially redundant as—in an increasing variety of circumstances—the audience can go directly to the source material. Intermediation adds nothing. This isn’t necessarily positive. Given the natural human propensity for overreaction and confirmation bias, effective intermediation of information can be extremely valuable, not to mention efficient, for the audience. Regardless, there have been vast exits. My hunch is that voice doesn’t really matter, because everybody who thinks differently from the consensus has likely already exited.

Technology and Demographics May Solve This on Their Own

Setting aside the debate regarding the social utility of information intermediaries, their business model, too, is almost certainly redundant. Handwringing about how things should really work aside—and with one eye on the possibility of censorship-resistant micropayments and general computing resources—Google and Facebook are the only ones making decent money under the new paradigm. The effect of this unsavory economic reality has in almost all cases been a shift to reliance on ads, which in turn means a reliance on clicks, which in turn very strongly incentivizes rapidly produced, poorly researched, maximally outraged clickbait garbage.

There is good reason to believe that this model will not outlast the primacy of the millennials. As early as 2015, market analysis from Business Insider suggested that Gen Z had numerous markedly different characteristics from its millennial predecessors. This idea has resurfaced in the wake of Trump and has taken on a political tone that is at least distracting, if not flat-out wrong. The Huffington Post does a good job of debunking this trope and presenting a more thoughtful analysis. What is worth focusing on from the Business Insider article is that Gen Z is “better at using the Internet,” “more entrepreneurial” and “less idealistic”—i.e. likely to be suspicious of outraged clickbait, eager to find the source material for themselves, and admiring of internet entrepreneurs—and hence inevitably drawn to the YouTube personalities gleefully smeared by legacy media. They certainly will not like the look of fellow children having their lives ritualistically destroyed by legacy media for the alleged crime of privileged smirking. I would characterize Gen Z as the troll generation. They can both make and take a joke. In the current climate, this naturally pushes them further away from most intermediated information.

Gen Z has no loyalty. To call theirs an exit from legacy media is overdramatic. They were never a part of it. Nor are they likely to become so, because there is no opportunity for voice. The Internet, on the other hand, is perhaps the greatest amplifier of voice in the history of technology. If it doesn’t edge out the printing press, it will certainly give it a good run for its money. YouTube’s founding motto was “Broadcast Yourself.” A similar spirit is an inextricable part of the fabric of Blogger, Tumblr, WordPress and Medium; as well as, less directly, of Shopify, Stripe, Patreon and Squarespace, and even of traditional social media, with its less polished formats. Intermediated information is slowly being decentralized away.

There may be nascent concerns about the willingness of these platforms to stick to their missions, but if they fail to do so, they will simply be replaced. If this replacement is not by a competitor it may ultimately be by improvements in technology itself. Things are heading in one direction. Journalism must adapt if it is to have any social value, economic basis or generational appeal: if it is to stoke loyalty and prevent exit.

This Areo article you’ve just read was, of course, not written by a journalist. I’m just a regular guy.