OMFSM, we didn’t know what we didn’t know until 1972? As a humanist, I have great faith in the capacity of humans to do good and to be responsible for our own destiny. I use that word ‘faith’ specifically because I believe* these kind things about my fellow humans despite mountains of evidence to the contrary. Every once in a while, that faith is shaken by some troubling news about just how bumbling we humans are.

On my previous post about Pope Benedict, commenter playonwords posted the following: "The debate over what the term 'truth' means has festered in philosophy for nearly two and a half millennia and they are still no closer to resolving it." In his defense, he specified "in philosophy," which is a whole different ballgame. But my initial reaction was to be outraged at the concept that we just don't know what 'truth' is. To explain the point, I first pointed to Francis Bacon, who, in 1620, pioneered the scientific method. His approach of hypothesis, experimentation, and observation is the foundation for how we understand truth. Much love to Aristotle and Archimedes and Da Vinci and Magellan, but up until Francis Bacon, we were all just winging it.

But I also wanted to talk about cognitive biases. Cognitive biases cover the wide range of systematic psychological failures we make when trying to learn things. We humans very often pick out the data that already support what we believe to be true. We blame others for what is really our own fault, and take credit when unrelated causes are the real culprit. We pretend we knew something all along when we really didn't have any idea. What blew me away was that these ideas weren't even codified until 1972. It's absolutely outrageous and embarrassing that Freud didn't start with cognitive biases and leave the Oedipus complex for someone else to figure out.
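That first failure, cherry-picking data that supports what we already believe, is easy to see in a toy simulation. This is hypothetical illustrative code, not anything from Tversky and Kahneman: an observer estimates how often a fair coin lands heads, but, believing the coin is heads-heavy, quietly discards half of the tails outcomes. The biased estimate drifts well away from the truth even with enormous amounts of data.

```python
import random

def estimate_heads_rate(n_flips, keep_tails_prob, seed=0):
    """Flip a fair coin n_flips times and estimate P(heads).

    keep_tails_prob models confirmation bias: heads (confirming
    evidence) is always recorded, but each tails outcome is only
    recorded with this probability.
    """
    rng = random.Random(seed)
    kept = []
    for _ in range(n_flips):
        heads = rng.random() < 0.5  # fair coin
        if heads or rng.random() < keep_tails_prob:
            kept.append(heads)
    return sum(kept) / len(kept)

# An honest observer keeps everything; the biased one drops
# half the disconfirming flips.
honest = estimate_heads_rate(100_000, keep_tails_prob=1.0)
biased = estimate_heads_rate(100_000, keep_tails_prob=0.5)
print(f"honest estimate: {honest:.3f}")  # close to 0.5
print(f"biased estimate: {biased:.3f}")  # drifts toward 2/3
```

The punchline is that more data doesn't save the biased observer: if heads is always kept and tails is kept half the time, the recorded sample converges to 0.5 / (0.5 + 0.25) = 2/3 heads, no matter how many flips you run.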

From Wikipedia (citations omitted): "The notion of cognitive biases was introduced by Amos Tversky and Daniel Kahneman in 1972 and grew out of their experience of people's innumeracy, or inability to reason intuitively with the greater orders of magnitude. Tversky, Kahneman and colleagues demonstrated several replicable ways in which human judgments and decisions differ from rational choice theory. Tversky and Kahneman explained human differences in judgement and decision making in terms of heuristics. Heuristics involve mental shortcuts which provide swift estimates about the possibility of uncertain occurrences. Heuristics are simple for the brain to compute but sometimes introduce 'severe and systematic errors.'"

So yes, folks, we humans have a pretty solid grasp on truth. That's how we can travel billions of miles and land on a tiny rock hurtling through space. That's how we are beating AIDS through treatment and, hopefully one day, a cure. But we can't forget that our douchebag brain is always trying to fool us. And we can't forget that with cognitive biases having been formalized only in 1972, we've got a long way to go to understand and prevent those biases in science, reason, and ethics. Be on guard against your own biases, or suffer "severe and systematic errors."

—

* disclaimer: the 'faith in humanity' bit above is mostly for effect. I'm not a humanist because I have 'faith' humanity will do well; I'm a humanist because we're all human and we're stuck with each other and that's the only option, despite our many flaws. If we must cut down the mightiest tree in the forest, and we have only a herring to do so, I guess that's what we'll have to work with.

* images: Kahneman and Tversky from grawemeyer.org; others Fair Use adaptations of generalized public images.