Anybody who communicates for a living faces a certain degree of frustration. No matter how hard you try, some people just don't get the intended message. In areas like politics and economics, this typically means that some people don't buy into your interpretation of things. But when it comes to science, the failures involve an inability to convey basic facts. No matter how hard brilliant writers and speakers have tried to explain the evidence, there are still plenty of people who refuse to accept that evolution can shape life on Earth.

Perhaps because there's a very obvious and very measurable failure there, science communicators tend to put a lot of thought into how we could be doing a better job. One area that has been receiving a lot of attention lately goes by the handle "the science of science communication." It comes up frequently at various science communication meetings; the National Academy of Sciences held a two-day conference dedicated to it last year and is planning another.

Communications breakdown

The basic idea is that humans don't have some specialized system dedicated to processing facts. Instead, facts, like any other form of information, are absorbed through a system that isn't always paying full attention. They're then filtered through a variety of cultural lenses before being assimilated with previous beliefs. (We've covered two examples of how this plays out recently.) Once they're in the brain, the facts compete with other ideas that are sometimes contradictory—different portions of the US population will tell you that the Earth is billions of years old depending on how, precisely, the question is asked.

The science of science communication is an attempt to take what we know about human information processing and use that to get better at communicating science. In general, it's a really good idea. Take, for example, what we know about working memory, where our brains temporarily stuff things—images, numbers, abbreviations, what have you. Working memory helps us complete a task (like, say, reading an article) without wasting the resources needed to commit something to permanent memory. Even the best of us have only about a half-dozen places to park things in working memory, so there's a limit to how much we can handle at once. If you read a typical scientific paper, though, or even some writing meant for a broad audience, you'll find it's a blizzard of acronyms and technical terms.

In other words, in communicating science, we're sometimes ignoring things that science told us years ago.

The problem I see is that things start to break down as soon as you get past basic communication principles. To start, consider what's called the deficit model, which tries to explain some of the barriers to communication. That's the assumption that people who don't believe basic facts—that the world is four billion years old, that the Earth has warmed over the last century, etc.—reject them because they haven't been sufficiently exposed to the compendium of information we've used to construct these large, overarching facts. If we fix this informational deficit, the reasoning goes, more people will accept reality.

But it doesn't work that way. People don't think the Earth is young because they haven't been exposed to sufficient evidence of its age; they want to believe that it's young because they feel a cultural affinity for other people who believe that way (or, in some cases, for people they think believe that way). If you just dump more facts on them, they'll simply undertake a biased process of assimilating those facts in order to protect their beliefs. In many cases, a strict fact dump actually leaves people less likely to believe the facts, since the protective processes are triggered so strongly.

There are ways around this, like using culturally resonant phrasing and finding authorities who belong to the same cultural groups but accept the facts. But it's clear that simply attempting to fix the informational deficit won't do the job. The deficit model is a dumb way to go about shifting public understanding, and people who care about science communication are regularly warned against it.

Mistargeted communication

I'm a full-time science communicator, and the crusade against the deficit approach creates all sorts of problems for me. To begin with, my job is largely to convey basic facts. I also try to provide context and a degree of analysis where appropriate, but it's ultimately the latest research findings that make up the majority of our science content. More generally, I'm not attempting to resonate with any particular cultural group, unless you consider an interest in the natural world a cultural affinity. So I'm often told that my job goes against the best practices of science communication.

I'm not the only one who faces that problem. If a grad student wants to describe their research to the public, they're not going to try to resonate culturally with anyone; instead, they need skills in distilling complex science down to things that are easy to grasp. In contrast, someone advocating for science-based policies doesn't necessarily need the ability to distill complex science but will need to know how to use culturally resonant phrasing if they want to get everyone on board.

Unfortunately, all too often, different types of communicators are getting this one-size-fits-all advice. Sure, the deficit model is wrong, but does that matter to you? If you are planning on talking about your own research, how do you find out what's effective?

Another problem with the science of science communication is one that it shares with other fields of science: it's tentative. Working memory is pretty well-studied, and what we know about it is unlikely to undergo a major overhaul. But that's not true for everything else we're finding out. For example, a few years back, a study found that acceptance of nanotechnology was lower among the religious, and it speculated on why that might be the case. This year, a study looking at a different question found that the original results didn't hold up.

If someone had been trying to reach a broad audience about nanotechnology a few years ago, they might have been advised to craft a message that resonated with religious people. Now, they may be told that this sort of targeting may not be necessary—and it could possibly be harmful.

Something’s missing here...

So what we have is a situation where people who are actually doing the science communicating are getting a mess of advice, some of which doesn't apply and some of which doesn't stand the test of time. What seems to be missing is an applied science of science communication—the communication equivalent of the engineers and biotechnicians who take the latest research and figure out how to put it to use.

What I suggest we need is a discipline that's filled with people who are willing to comb the literature for the most up-to-date findings and figure out whether they're likely to be reliable. They then need to determine which groups of communicators the results might apply to and test whether applying them actually leads to better communication. Ultimately, these applied scientists need to try to make sure that the communicators themselves actually know about what they find. As far as I know, an applied communications discipline doesn't really exist, and I've not seen any group make a big deal about funding it—even groups that have a vested interest in effective science communications.

Right now, what we tend to have instead is science communicators being told that they're doing things wrong (or they're being given advice that doesn't apply to them). It would be nice if we could have some more focused advice on how different types of communicators could do things right.