We live in a complex world, and it’s impossible to be an expert on everything that impacts our lives. In many domains, we have to trust the expertise of others to guide our decisions. Yet not all experts hold rational beliefs, and many people who are framed as experts in media are not actually experts. How do we separate the wheat from the chaff, focusing on high-quality sources of information and ignoring low-quality sources?

Over the years I’ve observed the behavior of various public figures in my areas of expertise, and I feel that I’ve developed a pretty good nose for quickly sniffing out the credibility of public experts. In this post, I’ve attempted to put my intuitions in writing so I can pass at least some of them on to others.

Here is a list of some of the heuristics I use when I’m assessing the credibility of people who are framed as experts. None of these are ironclad rules, but together they can help you get a quick sense of whether to listen to someone:

1. Does he have training or experience in the subject? This is particularly relevant when someone disagrees with expert consensus. Sometimes people get a platform just because they have a novel or interesting idea, even if that idea is unconvincing to a knowledgeable person. It is of course possible that the non-expert is right and the experts are wrong, but it’s unlikely. This is just a heuristic, since in some areas the experts are truly not knowledgeable. For example, economists are barely better than chance at predicting the short-term behavior of the economy, but this doesn’t prevent some of them from prognosticating about it.

2. Does she try to explain everything with one idea? The work of Philip Tetlock, PhD, and others has shown that people who have one big idea to explain everything (“hedgehogs” or ideologues) are very bad at accurately modeling the world, predicting outcomes, and recommending effective actions. These people are often selected for media attention because they are clear and confident about their beliefs. The world is a complex place, and people who are able to model that complexity in their minds have better information than those who aren’t. Look for people who tend to use multi-factor explanations.

3. Does he show nuance? Related to #2. There are usually exceptions and nuances. Is this person able to build them into his mental model?

4. Is she able to change her mind? Inability to change one’s mind is a hallmark of an irrational belief system. Can you find examples of this person changing her mind in the past when presented with new evidence or a better interpretation of existing evidence?

5. Does he lean toward conspiracy explanations and/or blaming the government? Large-scale conspiracies are improbable, and the government is a favorite punching bag for cranks.

6. Does she cite evidence that isn’t representative of the literature as a whole? This is common among experts, less common among the best experts, and very common among non-experts. It’s also often easy to spot with a little effort. Do a quick Google Scholar search for a recent meta-analysis on the topic, ideally from the Cochrane Collaboration. Are the conclusions of the meta-analysis consistent with the evidence she is citing, or is she cherry-picking individual studies that support her position?

7. Does he portray experts who disagree with him as corrupt, stupid, or dishonest? Corruption, irrationality, and dishonesty do exist among experts, but if someone makes these claims without presenting clear evidence for them, it’s a bad sign. Accusing opposing experts of “lies”, “scams”, and the like is a red flag that the person making the claim is not objective. Conflicts of interest are common among experts and good to keep in mind, but it’s also useful to consider possible conflicts of interest of the person pointing them out.

8. Does the topic relate to personal identity, or is it otherwise highly political or controversial? Domains such as religion, politics, and nutrition relate strongly to people’s personal identities. In this context, beliefs are often driven by group affiliation rather than rational consideration of evidence. The likelihood that someone is providing high-quality information in these domains is lower than in less controversial areas like physics or neuroscience.

What did I miss?